
[Question]: High PPL on wikitext2 of ReLU-LLAMA-7B for language modeling tasks #162

Open · 3 tasks done
llCurious opened this issue Mar 11, 2024 · 2 comments
Labels: question (Further information is requested)
llCurious commented Mar 11, 2024

Prerequisites

Before submitting your question, please ensure the following:

  • I am running the latest version of PowerInfer. Development is rapid, and as of now, there are no tagged versions.
  • I have carefully read and followed the instructions in the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).

Question Details

I use the sparsified LLaMA model from SparseLLM on Huggingface, named ReluLLaMA-7B. I compute the perplexity (PPL) on the wikitext2 dataset (also from Huggingface) with a max_seq_length of 512. However, the PPL reaches 16003, while the original dense LLaMA-2 model, Llama-2-7b-hf, has a PPL of 54 under the same setup.

There seems to be a huge PPL degradation due to the ReLU activation. Do you have any ideas about this phenomenon?

Additional Context

All packages are at their latest versions, and all models and datasets are from Huggingface.
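The issue does not include the evaluation script itself. A minimal sketch of a standard non-overlapping-window perplexity measurement under the stated settings (max_seq_length = 512, wikitext2 test split) might look like the following; the exact model id, dtype, and windowing scheme are assumptions, and differences there (e.g. an overlapping-stride evaluation, or loading a converted checkpoint in the wrong dtype) can change PPL substantially.

```python
import math

import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed model id based on the names in the issue; swap in
# "meta-llama/Llama-2-7b-hf" to reproduce the dense baseline.
model_id = "SparseLLM/ReluLLaMA-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # dtype is an assumption
    device_map="auto",          # requires the `accelerate` package
)
model.eval()

# Tokenize the whole wikitext2 test split as one long sequence.
test = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
input_ids = tokenizer("\n\n".join(test["text"]), return_tensors="pt").input_ids

max_seq_length = 512
nll_sum, n_tokens = 0.0, 0

# Score non-overlapping windows of max_seq_length tokens each. Note the
# first token of each window gets no preceding context, which slightly
# inflates PPL compared to an overlapping-stride evaluation.
for begin in range(0, input_ids.size(1), max_seq_length):
    window = input_ids[:, begin : begin + max_seq_length].to(model.device)
    if window.size(1) < 2:  # nothing left to predict in this window
        break
    with torch.no_grad():
        # With labels == input_ids, the model returns the mean
        # cross-entropy over the window's shifted next-token targets.
        loss = model(window, labels=window).loss
    n_predicted = window.size(1) - 1
    nll_sum += loss.item() * n_predicted
    n_tokens += n_predicted

print(f"PPL: {math.exp(nll_sum / n_tokens):.2f}")
```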

llCurious added the question label on Mar 11, 2024

llCurious (Author) commented:

Hi @hodlen, do you have any ideas?

hodlen (Collaborator) commented Apr 6, 2024

Sorry for the late reply. That is a bit unexpected, since we have tested its perplexity under both transformers/torch and PowerInfer. Could you provide a minimal reproducible example so we can investigate further?
