
ValueError: A KerasTensor is symbolic: it's a placeholder for a shape an a dtype. It doesn't have any actual numerical value. You cannot convert it to a NumPy array. #1401

accioharshita opened this issue Mar 26, 2024 · 3 comments

@accioharshita

Hey, so I've downloaded the preprocessing & encoder layers of BERT in order to build a simple email classification model. When I finally build the model and pass it the training data, it throws this error. Can someone tell me what's wrong?

(Screenshots attached: the model-building code and the full error traceback.)
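
For reference, this error typically shows up with the TF Hub pattern from the "Classify text with BERT" tutorial once Keras 3 is the active Keras. The screenshots are not transcribed above, so the sketch below is an assumed reconstruction of that pattern (the hub URLs and wiring are the tutorial defaults, not copied from the post):

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # registers the ops the preprocessing SavedModel needs

text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
preprocessor = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
encoder_inputs = preprocessor(text_input)
encoder = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4",
    trainable=True)
outputs = encoder(encoder_inputs)
pooled_output = outputs["pooled_output"]
# Under Keras 3, calling a hub.KerasLayer on a symbolic KerasTensor raises the
# "A KerasTensor is symbolic" ValueError, because the layer tries to convert
# the placeholder into a concrete tensor.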


LeviBen commented Mar 31, 2024

I have the same problem. It was working fine with the TF Hub URL, but with the model loaded locally it crashes for some reason.

@SoumyaCodes2020

@accioharshita Hi, I was having the same problem; in fact, I was using the exact same code as you. I managed to solve it by importing BERT through the keras_nlp library. Here is the code I ended up with:

import tensorflow as tf
import keras_nlp

text_input = tf.keras.layers.Input(shape=(), dtype=tf.string)
preprocessor = keras_nlp.models.BertPreprocessor.from_preset("bert_base_en_uncased", trainable=True)
encoder_inputs = preprocessor(text_input)
encoder = keras_nlp.models.BertBackbone.from_preset("bert_base_en_uncased")
outputs = encoder(encoder_inputs)
pooled_output = outputs["pooled_output"]      # [batch_size, 768]
sequence_output = outputs["sequence_output"]  # [batch_size, seq_length, 768]
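
To finish the classifier from here, one option (a minimal sketch, not from the original post: the dropout rate, binary sigmoid head, and learning rate are assumptions) is to put a dense head on pooled_output and wrap everything in a functional Model:

# Assumed classification head on top of the keras_nlp backbone above.
x = tf.keras.layers.Dropout(0.1)(pooled_output)
output = tf.keras.layers.Dense(1, activation="sigmoid", name="classifier")(x)

model = tf.keras.Model(inputs=text_input, outputs=output)
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_texts, train_labels, epochs=3)  # train_texts: raw strings, train_labels: 0/1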


BismaAyaz commented Apr 17, 2024

@SoumyaCodes2020 can you please let me know how you saved the model with this approach? I'm saving with model3.save("model3.keras") and loading with model3 = keras.models.load_model("model3.keras"), but I get this error:

No vocabulary has been set for WordPieceTokenizer. Make sure to pass a `vocabulary` argument when creating the layer.
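
One possible workaround (a sketch, not an answer from this thread: it assumes the failure is the preprocessor's tokenizer not being deserialized with its vocabulary) is to persist only the weights and rebuild the architecture from the presets before loading them back, so the vocabulary is restored by from_preset() rather than by deserialization. build_model() is a hypothetical helper that contains the model definition shown above.

# Save weights only (Keras 3 requires the .weights.h5 suffix here).
model3.save_weights("model3.weights.h5")

# Rebuild preprocessor + backbone + head from presets, then restore weights.
restored = build_model()
restored.load_weights("model3.weights.h5")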
