
No Model selectable - req.model.split(".") ERROR #334

Open
FreshImmuc opened this issue Mar 31, 2023 · 14 comments · May be fixed by #348

Comments

@FreshImmuc

I can't select a model in the web GUI. How do I fix that?
Here is my web GUI:
[Screenshot: web GUI with no model selectable]

And here is the log when I try to submit a prompt:
[Screenshot: terminal log after submitting a prompt]

@Azilone

Azilone commented Mar 31, 2023

When I follow the instructions to run dalai on Docker, I can see the model, but I get the same error. It seems to come from socket.io.

@lucas-strummer

I had the same problem and bypassed it by installing a second model.

@FreshImmuc
Author

@lucas-strummer which models did you install? I installed llama 7B and 13B and it doesn't work...

@abulka

abulka commented Mar 31, 2023

I get this too. I ran it fine a few weeks ago on another Mac (Intel), but now, trying on my M1 Mac, I get:

/Users/andy/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219
    let [Core, Model] = req.model.split(".")
                                  ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (/Users/andy/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219:35)
    at Socket.<anonymous> (/Users/andy/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:534:20)
    at Socket.emit (node:events:513:28)
    at Socket.emitUntyped (/Users/andy/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/typed-events.js:69:22)
    at /Users/andy/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

Node.js v18.15.0

Could it be the model isn't downloaded properly? My ~/dalai directory is not filled with gigabytes of model data. So I ran npx dalai alpaca install 7B and now something big is downloading...
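The crash happens because `req.model` arrives as `undefined`, so `.split(".")` throws. A minimal sketch (not the actual dalai source; `parseModel` is a hypothetical helper) of a guard that would fail with a readable message instead of a TypeError:

```javascript
// Hypothetical guard around the failing line 219. If the web UI sends no
// model (as in this thread), report it clearly instead of crashing with
// "Cannot read properties of undefined (reading 'split')".
function parseModel(req) {
  if (typeof req.model !== "string" || !req.model.includes(".")) {
    throw new Error(
      "No model selected. Install one (e.g. `npx dalai llama install 7B`) " +
      "and pick it in the web UI."
    );
  }
  const [Core, Model] = req.model.split(".");
  return { Core, Model };
}

// Example: a well-formed request like { model: "llama.7B" } parses into
// { Core: "llama", Model: "7B" }; an empty request throws the clear error.
```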

@Hayden2018

I have this exact same error.

@DevWael linked a pull request Apr 2, 2023 that will close this issue
@Mikecodex

Same error on Windows and in an Ubuntu Linux VM.

@EnguerrandDeclercq

EnguerrandDeclercq commented Apr 6, 2023

I have the same issue. Hardcoded the Core and Model variables in the index.js file to bypass it (line 219). For instance, if you installed llama 7B:

let [Core, Model] = ['llama', '7B']

@Wagner-Peter

This is the one that worked for me:

I have the same issue. Hardcoded the Core and Model variables in the index.js file to bypass it (line 219). For instance, if you installed llama 7B:

let [Core, Model] = ['llama', '7B']

Although, remember to switch "llama" to "alpaca" if that's what you are using. Worked a charm, thanks man.

@d-tool

d-tool commented May 22, 2023

If you have installed at least two models, the normal line 219 in index.js works:

let [Core, Model] = req.model.split(".")

If you only have one model installed, you should edit the line to:

let [Core, Model] = req.models[0].split(".")

Hardcoding the models is not a good solution, I think, and it also didn't work for me, as far as I remember.

Llama/Alpaca still doesn't work, though: it gives me no results, it just computes forever, or until I hit the Stop button. But the model-selection problem is fixed.
It's said that the models are corrupt, which would explain the lack of results (#432), as joelduerksen already mentioned.

It also didn't work with other models...
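The two workarounds in this thread (`req.model` vs. `req.models[0]`) could be combined into one defensive resolver, so the same code handles both request shapes. A sketch based only on the field names reported here (the real request shape may differ, and `resolveModel` is a hypothetical name):

```javascript
// Hypothetical combined fix for line 219: prefer req.model, fall back to
// the first entry of req.models, and return null instead of crashing when
// neither is usable.
function resolveModel(req) {
  const raw =
    typeof req.model === "string"
      ? req.model
      : Array.isArray(req.models) && req.models.length > 0
        ? req.models[0]
        : null;
  if (raw === null || !raw.includes(".")) return null;
  const [Core, Model] = raw.split(".");
  return { Core, Model };
}
```

A null return would let the caller respond with "no model selected" rather than the TypeError seen in the logs above.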

@xero-q

xero-q commented Jul 7, 2023

I had one model and none of the solutions worked for me. After hardcoding the model and core, it showed no error when running, but it never finished. Now I have installed a second model and still get the same error. I think the primary cause is that the GUI does not correctly receive the list of installed models, so none can be selected; this can be seen in the query log, where the "models" param is empty.

@xero-q

xero-q commented Jul 7, 2023

@lucas-strummer which models did you install? I installed llama 7B and 13B and it doesn't work...

@FreshImmuc have you been able to solve this?? I also installed those same two models and I can't get it to work.

@gavin1818

gavin1818 commented Jul 9, 2023

After changing to let [Core, Model] = req.model[0].split(".")
I ran into a new issue:

    let [Core, Model] = req.model[0].split(".")
                                 ^

TypeError: Cannot read properties of undefined (reading '0')
    at Dalai.query (/Users/xxee/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219:34)

It looks like my model is empty. It's my first time using the app, and I'm not sure what's going wrong.

Also, my model dropdown is empty in the UI:
[Screenshot: empty model dropdown in the UI]

@xero-q

xero-q commented Jul 9, 2023

Same happens to me; no model appears in the UI.

@LizsDing

LizsDing commented Aug 28, 2023

Fixed on my end, for Windows.
It turned out that my llama model was not installed (quantized) correctly.
After I applied the fix in #241, the error was gone for me.
