
ValueError: Input 0 is incompatible with layer layer_1: expected ndim=3, found ndim=2

I am trying to build a text summarizer using word embeddings and an encoder-decoder architecture. This is my first shot at Keras, and I am not able to understand why layer_1 requires ndim=3 input when it is receiving ndim=2.

Solution 1:

Your problem lies in these lines:

for i in range(3):
    lstm = LSTM(rnn_size, name="layer_%s" % (i))
    model.add(lstm)
    model.add(Dropout(prob, name="drop_%s" % (i)))

By default, LSTM returns only the last step of its predictions, so the data loses its sequential nature. In your example, after the first iteration the LSTM outputs a single vector instead of a sequence of vectors, and that is why an error is raised at the next LSTM layer.
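To see why stacking fails, here is a minimal, framework-free sketch of how the shapes propagate. The helper lstm_output_shape is hypothetical (it only mimics Keras's shape rule, not the layer itself), and None stands for the batch axis:

```python
def lstm_output_shape(input_shape, units, return_sequences=False):
    """Mimic how an LSTM layer transforms shapes (illustrative only)."""
    if len(input_shape) != 3:
        # This is the situation behind "expected ndim=3, found ndim=2"
        raise ValueError("Input is incompatible: expected ndim=3, "
                         "found ndim=%d" % len(input_shape))
    batch, timesteps, _ = input_shape
    if return_sequences:
        return (batch, timesteps, units)  # still a sequence (ndim=3)
    return (batch, units)                 # last step only (ndim=2)

# First LSTM (default return_sequences=False) collapses the time axis...
shape = lstm_output_shape((None, 10, 50), units=128)
print(shape)  # (None, 128) -- ndim=2

# ...so a second stacked LSTM rejects its input, just like layer_1 did.
try:
    lstm_output_shape(shape, units=128)
except ValueError as e:
    print(e)
```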

In order to fix that try:

for i in range(2):
    lstm = LSTM(rnn_size, name="layer_%s" % (i), return_sequences=True)
    model.add(lstm)
    model.add(Dropout(prob, name="drop_%s" % (i)))
lstm = LSTM(rnn_size, name="layer_2", return_sequences=False)
model.add(lstm)

(Note that the final LSTM needs its own name, here "layer_2" -- reusing the loop variable i after the loop would produce a duplicate "layer_1" name.)

Another thing I've noticed is that you are using Dense incorrectly: you should provide the number of output neurons:

model.add(Dense(nb_of_output_neurons))
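Putting both fixes together, this dependency-free trace (again with hypothetical shape helpers; rnn_size and nb_of_output_neurons are given assumed values purely for illustration) confirms the stack stays ndim=3 until the final LSTM, after which Dense maps to the output size:

```python
def lstm_shape(shape, units, return_sequences=False):
    batch, timesteps, _ = shape  # requires ndim=3 input
    return (batch, timesteps, units) if return_sequences else (batch, units)

def dense_shape(shape, units):
    # Dense replaces the last axis with the number of output neurons
    return shape[:-1] + (units,)

rnn_size = 128
nb_of_output_neurons = 5000   # e.g. vocabulary size (assumed)

shape = (None, 10, 50)        # (batch, timesteps, embedding_dim)
for i in range(2):
    shape = lstm_shape(shape, rnn_size, return_sequences=True)
    # Dropout does not change the shape
shape = lstm_shape(shape, rnn_size, return_sequences=False)
print(shape)                  # (None, 128)
shape = dense_shape(shape, nb_of_output_neurons)
print(shape)                  # (None, 5000)
```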

Cheers.
