
embed_tokens #59

Open
CodeMiningCZW opened this issue Aug 16, 2023 · 4 comments

@CodeMiningCZW
In the RetNet model, embed_tokens is not given, so I can't run the code. When I use this model, what should I pass for the parameter token_embeddings? Or how do I define embed_tokens?

donglixp self-assigned this Aug 16, 2023
@donglixp
Contributor

I found a blog post (in Japanese) that might be useful: https://zenn.dev/selllous/articles/retnet_tutorial.

@shumingma
Contributor

A simple nn.Embedding(vocab_size, embedding_size) will work.
Or you can refer to our example on language modeling.
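For anyone hitting the same error, here is a minimal sketch of wiring a plain nn.Embedding into the RetNet decoder. It assumes the torchscale RetNetConfig/RetNetDecoder API from the language-modeling example; the exact config field names (e.g. decoder_embed_dim, decoder_retention_heads) are assumptions that may differ between versions, so check torchscale/architecture/config.py in your install.

```python
import torch
import torch.nn as nn

from torchscale.architecture.config import RetNetConfig
from torchscale.architecture.retnet import RetNetDecoder

# Assumed config field names -- verify against
# torchscale/architecture/config.py in your version.
vocab_size = 32000
embed_dim = 512

config = RetNetConfig(
    vocab_size=vocab_size,
    decoder_embed_dim=embed_dim,
    decoder_value_embed_dim=embed_dim,
    decoder_retention_heads=8,
    decoder_ffn_embed_dim=2048,
    decoder_layers=6,
)

# A plain nn.Embedding is enough for embed_tokens.
embed_tokens = nn.Embedding(vocab_size, embed_dim)

model = RetNetDecoder(config, embed_tokens=embed_tokens)

tokens = torch.randint(0, vocab_size, (1, 16))  # (batch, seq_len)
out, extra = model(tokens)  # returns (output, extra-state dict)
print(out.shape)
```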

@egoistor

egoistor commented Sep 1, 2023

I also encountered this problem. When I try to use the encoder and decoder modules separately, the code reports an error. I'd also like to know where the problem is and how to solve it.

@DaZhUUU

DaZhUUU commented Oct 31, 2023

> A simple nn.Embedding(vocab_size, embedding_size) will work. Or you can refer to our example on language modeling.

from fairseq.models.transformer import DEFAULT_MIN_PARAMS_TO_WRAP, Embedding

I can't find fairseq.models.transformer, so this import fails.
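If fairseq isn't installed (or the module path moved in your version), the two imported names are easy to replace: DEFAULT_MIN_PARAMS_TO_WRAP is just a constant, and Embedding is a thin wrapper around nn.Embedding with fairseq's initialization. A minimal stand-in is sketched below, based on fairseq's transformer code; verify against your fairseq version if you need exact parity.

```python
import torch.nn as nn

# Stand-in for fairseq's FSDP wrapping threshold constant
# (fairseq defines it as int(1e8)).
DEFAULT_MIN_PARAMS_TO_WRAP = int(1e8)

def Embedding(num_embeddings, embedding_dim, padding_idx=None):
    """nn.Embedding with the init fairseq's transformer uses."""
    m = nn.Embedding(num_embeddings, embedding_dim, padding_idx=padding_idx)
    # Fairseq draws embedding weights from N(0, embed_dim^-0.5)
    # and zeros the padding vector.
    nn.init.normal_(m.weight, mean=0, std=embedding_dim ** -0.5)
    if padding_idx is not None:
        nn.init.constant_(m.weight[padding_idx], 0)
    return m
```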
