checkpoint str has no attribute 'get' #3444
@Jack-Khuu is the on-device evaluation ready? Edit: Actually, CoreML should be able to run on Mac too. @antmikinka, are you looking for on-device evaluation, or just to evaluate the CoreML model on either Mac or iPhone?
Yes, I'm trying to see an evaluation for the model on the Mac. I would like to put the model on my iPhone (iPhone 13 Pro) as well. I was trying to determine what hardware (cpu/gpu/ane) was being utilized to compute the model.
Seems like an issue with the executorch setup.
Eval is ready, but this error doesn't seem to be related to eval. It fails during load_llama_model, prior to eval.
I think it's related to how we expect eval to work with delegated models, in this case CoreML.
Just as an update so this doesn't go stale: investigating CoreML eval is on our plate. Will update as things flesh out.
I was following the llama2 7b guide; the consensus was not enough RAM, among other issues.
I tried the stories110M guide, and it worked all the way until I went to test it.
I remember lm_eval possibly not being installed (it's what my terminal said); not sure if that could be causing anything.
I am trying to eval model accuracy, and that is where this error is stemming from.
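For context, an error like `'str' object has no attribute 'get'` usually means a checkpoint *path* string reached code that expected an already-loaded state dict. The sketch below is a hedged illustration of that failure mode, not the project's actual code; `extract_state_dict` and the `"model"` key are hypothetical names for illustration only.

```python
# Hypothetical sketch of the likely failure mode (not ExecuTorch's code):
# calling .get() on a checkpoint that is still a path string raises
# AttributeError: 'str' object has no attribute 'get'.

def extract_state_dict(checkpoint):
    """Return the weights dict, guarding against an unloaded path string."""
    if isinstance(checkpoint, str):
        # In a real PyTorch pipeline the fix would be to call
        # torch.load(checkpoint) before reaching this point.
        raise TypeError(
            "checkpoint is a path string; load it (e.g. with torch.load) "
            "before passing it to the model loader"
        )
    # "model" is an assumed key; fall back to the dict itself otherwise.
    return checkpoint.get("model", checkpoint)

# A loaded dict works fine:
weights = extract_state_dict({"model": {"layer.weight": [0.0]}})
```

If the stack trace points at load_llama_model, checking whether the checkpoint argument is still a raw path at that point would be the first thing to verify.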
The file I am using to save the .pte:
Script and terminal info: