Hi everyone, this is my first time posting so if I make any glaring mistakes or misconceptions please forgive me.
I'm attempting to convert the provided univariate GluonTS Transformer estimator to accept multivariate data.
I did the same thing with the SimpleFeedForward estimator quite easily, by switching to point forecasting instead of a distribution output and adjusting the output shape of self.mlp.
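For context, the feed-forward change was roughly along these lines (this is a sketch of the idea rather than my exact code; the class name, hidden sizes, and L1 loss are illustrative):

import mxnet as mx
from mxnet.gluon import nn

# Sketch only: class name, hidden sizes, and the L1 loss are illustrative,
# not the actual gluonts SimpleFeedForward code.
class MultivariatePointFFN(mx.gluon.HybridBlock):
    def __init__(self, prediction_length, target_dim, hidden_units=(40, 40), **kwargs):
        super().__init__(**kwargs)
        self.prediction_length = prediction_length
        self.target_dim = target_dim
        with self.name_scope():
            self.mlp = nn.HybridSequential()
            for units in hidden_units:
                self.mlp.add(nn.Dense(units, activation="relu"))
            # final layer emits one value per (future step, variate)
            # instead of distribution parameters
            self.mlp.add(nn.Dense(prediction_length * target_dim))

    def hybrid_forward(self, F, past_target, future_target):
        # past_target: (batch, context_length, target_dim); the first Dense flattens it
        out = self.mlp(past_target)
        # reshape to (batch, prediction_length, target_dim) to match future_target
        out = F.reshape(out, shape=(-1, self.prediction_length, self.target_dim))
        # L1 point-forecast loss per series
        return F.mean(F.abs(out - future_target), axis=(1, 2))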
Following a similar procedure for the transformer, I'm trying to switch to point forecasting and adjust the output shape of the decoder so that I have the control to feed in multivariate input. However, I end up with unexpected shapes, and I'm finding it very difficult to pinpoint where exactly a given shape comes from, since I cannot print symbol shapes and the inferred shapes show up as None.
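This is what I'm appending to the decoder output: essentially a small projection head, something like the following (the exact layer I used may differ; target_dim here stands for the number of variates, and flatten=False applies the projection per decoder time step):

# projection head defined in __init__; maps each decoder step to target_dim values
self.mlp = mx.gluon.nn.Dense(units=target_dim, flatten=False)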
and this is how I use it to get a point forecast loss instead:
dec_output = self.decoder(
    dec_input,
    enc_out,
    self.upper_triangular_mask(F, self.prediction_length),
)
# pass through one more layer to reshape it to the future_target shape for the point forecast
dec_output = self.mlp(dec_output)
loss = (dec_output - future_target).abs().mean(axis=-1)
And the resulting shape error is as follows:
ValueError: Deferred initialization failed because shape cannot be inferred. MXNetError: Error in operator transformertrainingnetwork14__mul0: [15:17:33] ../src/ndarray/./../operator/tensor/../elemwise_op_common.h:134: Check failed: assign(&dattr, vec.at(i)): Incompatible attr in node transformertrainingnetwork14__mul0 at 1-th input: expected [32], got [32,8]
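For what it's worth, the only way I've found to inspect shapes at all is to skip hybridization and push dummy NDArrays through the blocks imperatively. A toy check like this (all sizes are placeholders I made up, and the projection layer is just a stand-in) at least shows where a [32, 8]-shaped loss can come from:

import mxnet as mx
from mxnet.gluon import nn

# Quick imperative shape check: a HybridBlock that has NOT been hybridized runs
# on NDArrays, so .shape can be printed at every step. All sizes below are
# placeholders (batch=32, prediction_length=8, model_dim=32, target_dim=8).
proj = nn.Dense(units=8, flatten=False)
proj.initialize()

dec_output = mx.nd.random.normal(shape=(32, 8, 32))   # (batch, prediction_length, model_dim)
out = proj(dec_output)
print(out.shape)                                       # (32, 8, 8)

future_target = mx.nd.random.normal(shape=(32, 8, 8))
loss = (out - future_target).abs().mean(axis=-1)
print(loss.shape)                                      # (32, 8): the same [32, 8] the error above complains about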
If it helps at all I can paste the entirety of the Transformer code with its changes.
I understand this is a very messy explanation and would be happy to provide any more details that can help.
The reason for wanting to change it is that multivariate input seems to improve performance overall, even if it's slower, as was the case with the feed-forward estimator.
Any guidance or advice would be greatly appreciated!