Reproducing model conversions #714
It looks as though the initial model is missing

The (working) model:

(image elided)

Whereas the converted model:

(image elided)

Is there some step in the conversion I'm missing that includes these?
@thekevinscott - I am not that experienced in this field, but I've been doing some experimenting and am running into the same issue with almost all of Xenova's models. Hopefully this helps you here. @xenova - hoping you see this: various models you have deployed are running into this issue. (I am grateful for your work, by the way!)
Hi there 👋 The correct task is
@MarketingPip can you provide a list of these models?
Thanks for the discussion and the responses. I've been trying to implement this updated command, but dependencies seem to have shifted since I last posted. I'm trying to move these commands into a

I have to step away from this but will pick it up later today; maybe it's helpful to share my progress so far. The main challenges I'm seeing are:
This last step fails with:
I'll pick up the investigation thread later this week. Thanks for all the help and input so far!
@xenova - I have run into this using

and those are just a few to list. @thekevinscott - I am assuming you are using a version of Python lower than 3.8? If so, may I advise upgrading, or using an upgraded environment, for running Transformers (Python)? It seems you are running into model issues on top of general Transformers errors that should be solved by a Torch and Python upgrade (as far as I know). Cheers.

Edit: I see you are using 3.9. Try upgrading / purging both Torch and Transformers.
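For example, something like this (a sketch - standard PyPI package names assumed):

```shell
# Remove possibly-conflicting installs, then pull the latest releases.
pip uninstall -y torch transformers
pip install --upgrade torch transformers
```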
I've landed on a working implementation here: https://github.com/thekevinscott/reproducing-phi-1-5-conversion This appears to convert Phi 1.5 successfully from the original repository. To summarize the issues I ran into along the way:
It would be amazing if the Hugging Face model cards contained some of this information on the steps necessary to reproduce model conversions - that way more people could help contribute new models to this awesome library. Cheers to both of you for your help. I'll leave this issue open since it sounds like @MarketingPip also has some issues, but feel free to close it since my original query is now solved.
Question
I'm trying to reproduce the conversion of `phi-1_5_dev` to better understand the process. I'm running into a few bugs / issues along the way that I thought it'd be helpful to document.

The model `@Xenova/phi-1_5_dev` states:

I'm doing the following:
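Roughly, the conversion attempt looks like this - a sketch only, since the script path and model id here are my assumptions based on the transformers.js conversion docs rather than confirmed details:

```shell
# From the root of a transformers.js checkout (assumed layout):
pip install -r scripts/requirements.txt
python -m scripts.convert --quantize --model_id microsoft/phi-1_5
```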
Here, I hit my first issue - it looks like `transformers` on PyPI does not support Phi:

So I install from GitHub:
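A from-source install along these lines - the URL is the main Hugging Face transformers repository; whether the default branch is the right ref for Phi support is an assumption:

```shell
pip install --upgrade "git+https://github.com/huggingface/transformers.git"
```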
That produces:

I believe `optimum` is also out of date:

With those two dependencies updated, this command now works:
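(For completeness, the `optimum` upgrade mentioned above was done the same way - a sketch, assuming the main branch carries the needed export support:)

```shell
pip install --upgrade "git+https://github.com/huggingface/optimum.git"
```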
Though there are a few warnings I'm assuming I can ignore:

However, out of the box it can't find the right `onnx` file:

I see in the `@Xenova` repo history that the files were manually renamed; I'll try that too:

I then try to run the model with:
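Concretely, the rename I tried was along these lines - the filenames here are hypothetical stand-ins, since I'm inferring them from the repo history rather than recording the exact names:

```shell
# Hypothetical filenames - adjust to whatever the conversion script actually emitted.
mv onnx/decoder_model_merged.onnx onnx/model.onnx
mv onnx/decoder_model_merged_quantized.onnx onnx/model_quantized.onnx
```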
The model loads, but upon generating I see:

I'm not entirely sure how to proceed from here. Any suggestions? It seems to be something specific to the `.onnx` file, since replacing it with the `.onnx` file from the `@Xenova` repo makes it work perfectly.