[SOLVED] Running Llama3 with Ctranslate2 #1688
Seems to work fine here with the converted Llama-3-8B-Instruct:
|
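The snippet referenced above is collapsed in this page, so here is a minimal sketch of how a Llama-3-8B-Instruct model converted with `ct2-transformers-converter` is typically run. The model directory `llama3_ct2` and the `format_llama3_prompt` helper are assumptions; the helper follows the special-token chat layout from the Llama 3 model card.

```python
def format_llama3_prompt(system: str, user: str) -> str:
    """Build a single-turn Llama-3 chat prompt (hypothetical helper,
    following the special-token layout from the Llama 3 model card)."""
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

def generate(model_dir: str, system: str, user: str) -> str:
    """Run one generation against a converted model (assumed paths/names)."""
    # Imported lazily so the prompt helper stays usable without the libraries.
    import ctranslate2
    import transformers

    generator = ctranslate2.Generator(model_dir, device="cuda")
    tokenizer = transformers.AutoTokenizer.from_pretrained(
        "meta-llama/Meta-Llama-3-8B-Instruct"
    )
    prompt = format_llama3_prompt(system, user)
    # add_special_tokens=False: the prompt string already carries
    # <|begin_of_text|>, so we avoid a doubled BOS token.
    tokens = tokenizer.convert_ids_to_tokens(
        tokenizer.encode(prompt, add_special_tokens=False)
    )
    results = generator.generate_batch(
        [tokens],
        max_length=256,
        sampling_temperature=0.7,
        include_prompt_in_result=False,
    )
    return tokenizer.decode(results[0].sequences_ids[0])
```

Something like `generate("llama3_ct2", "You are a helpful assistant.", "Hello!")` would then print a single reply; the exact flags in the original snippet may differ.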
OK, let me retry it...thanks. |
Strange...it did the same thing again. Below I am including (1) the full response, (2) the command I used to run the script, (3) a modified script I created that (a) hard-codes the system message and the model path, (b) uses the `apply_chat_template` method from transformers, and (c) adds an "exit" command to end the conversation, and (4) the output from my script, which is basically producing the same thing as your results...any idea? 1. FULL RESPONSE
2. COMMAND
3. MODIFIED SCRIPT
4. RESPONSE FROM MODIFIED SCRIPT
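The modified script's body is collapsed above, so here is a hedged sketch of the approach it describes: a hard-coded system message and model path, `apply_chat_template` from transformers, and an "exit" command to leave the loop. The constants and helper names are assumptions.

```python
SYSTEM_MESSAGE = "You are a helpful assistant."  # hard-coded, as described
MODEL_DIR = "llama3_ct2"                         # assumed model path

def build_messages(history, user_input):
    """Assemble the message list expected by apply_chat_template."""
    return (
        [{"role": "system", "content": SYSTEM_MESSAGE}]
        + history
        + [{"role": "user", "content": user_input}]
    )

def chat_loop():
    # Imported lazily so build_messages stays usable without the libraries.
    import ctranslate2
    import transformers

    generator = ctranslate2.Generator(MODEL_DIR, device="cuda")
    tokenizer = transformers.AutoTokenizer.from_pretrained(
        "meta-llama/Meta-Llama-3-8B-Instruct"
    )
    history = []
    while True:
        user_input = input("You: ")
        if user_input.strip().lower() == "exit":  # the added exit command
            break
        # Let the tokenizer's own chat template build the prompt string.
        prompt = tokenizer.apply_chat_template(
            build_messages(history, user_input),
            tokenize=False,
            add_generation_prompt=True,
        )
        tokens = tokenizer.convert_ids_to_tokens(
            tokenizer.encode(prompt, add_special_tokens=False)
        )
        result = generator.generate_batch(
            [tokens], max_length=512, include_prompt_in_result=False
        )[0]
        reply = tokenizer.decode(result.sequences_ids[0])
        history += [
            {"role": "user", "content": user_input},
            {"role": "assistant", "content": reply},
        ]
        print("Assistant:", reply)
```

As written, this reproduces the symptom discussed in the thread: without a stop token override, generation can run past the end of the assistant's turn.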
Also, I'm getting these warning messages immediately before it states "You:" where I enter my prompt:
|
I solved the issue by using the "end_token" parameter. Here's the script for everyone's benefit:
|
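The solved script is collapsed above, but the key change it names is passing `end_token` to `generate_batch`. Llama-3-Instruct marks the end of each assistant turn with `<|eot_id|>` rather than `<|end_of_text|>`, so telling CTranslate2 to stop there is what prevents the model from carrying on the conversation by itself. A sketch of that fix; everything except the `end_token` parameter itself is an assumption:

```python
EOT = "<|eot_id|>"  # Llama-3's end-of-turn token

def strip_end_token(text: str) -> str:
    """Truncate a decoded reply at the first end-of-turn token
    (hypothetical helper, in case the token leaks into the output)."""
    return text.split(EOT, 1)[0].strip()

def generate_reply(generator, tokenizer, prompt_tokens):
    """Generate one assistant turn, stopping at <|eot_id|>."""
    # end_token makes CTranslate2 stop at <|eot_id|> instead of the default
    # <|end_of_text|>, which is the fix for the model "conversing with itself".
    result = generator.generate_batch(
        [prompt_tokens],
        max_length=512,
        end_token=EOT,
        include_prompt_in_result=False,
    )[0]
    return strip_end_token(tokenizer.decode(result.sequences_ids[0]))
```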
I ran the new Llama3 sample script and it seems to be conversing with itself, so I think there's a problem with how the prompt is being constructed. See below: