
LocalAI with LocalAI-frontend? #7

Open

scott-mackenzie opened this issue Sep 21, 2023 · 5 comments

scott-mackenzie commented Sep 21, 2023

[Screenshot 2023-09-21 at 3:23:53 PM: the frontend container up and running in Docker]

The objective is to get your project working as an overlay on a LocalAI instance that runs separately. I commented out the LocalAI service in docker-compose.yaml:

```
❯ cat docker-compose.yaml
version: '3.6'

services:
  frontend:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - 3000:3000
```

```
❯ netstat -an | grep LISTEN
tcp46      0      0  *.3000                 *.*                    LISTEN
```

The Docker container is up and running, as shown in the image above.

LocalAI itself is running separately as a standalone project and works independently. See below:

```
❯ curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
    "model": "llama-2-7b-chat",
    "prompt": "What is the expected population of Ghana by the year 2100",
    "temperature": 0.7
}'

{"object":"text_completion","model":"llama-2-7b-chat","choices":[{"index":0,"finish_reason":"stop","text":"?\nlazarus May 3, 2022, 1:49pm #1\nThe population of Ghana is projected to continue growing in the coming decades. According to the United Nations Department of Economic and Social Affairs Population Division, Ghana’s population is expected to reach approximately 47 million by the year 2100. This represents a more than fivefold increase from the country’s estimated population of around 8.5 million in 2020.\nHowever, it is important to note that population projections are subject to uncertainty and can be influenced by various factors such as fertility rates, mortality rates, and migration patterns. Therefore, actual population growth may differ from projected values."}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}
```

My question is: how can the "Select Model" and "Model Gallery" features integrate with a LocalAI instance that runs separately, rather than one bundled directly into your project? Is this possible?

I love the project concept of being able to change "model" and have "model galleries".
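For reference, the model dropdown presumably only needs the /v1/models listing, and, if I'm reading the LocalAI docs right, the gallery side is served by /models/available and /models/apply on the same port, so both can be probed directly against a standalone instance. A sketch (endpoint paths per the LocalAI gallery docs as I understand them; the gallery id is illustrative):

```
# List installed models (what a "Select Model" dropdown would need):
❯ curl http://localhost:8080/v1/models

# List models available from configured galleries (verify against your LocalAI version):
❯ curl http://localhost:8080/models/available

# Install a model from a gallery (the "id" value here is only an example):
❯ curl http://localhost:8080/models/apply -H "Content-Type: application/json" \
    -d '{"id": "model-gallery@llama-2-7b-chat"}'
```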

scott-mackenzie (Author) commented:

Direct request to LocalAI for models returns the model list:

```
❯ curl http://localhost:8080/v1/models
{"object":"list","data":[{"id":"Dollyv2-3B","object":"model"},{"id":"GPT4All-J-13B-Snoozy","object":"model"},{"id":"MPT-7B-Chat","object":"model"},{"id":"RedPajama-INCITE-Chat-3B","object":"model"},{"id":"ggml-gpt4all-j","object":"model"},{"id":"llama-2-13b-chat","object":"model"},{"id":"llama-2-7b-chat","object":"model"}]}
```
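If jq is available, the same listing is easier to scan by extracting just the ids (output derived from the JSON above):

```
❯ curl -s http://localhost:8080/v1/models | jq -r '.data[].id'
Dollyv2-3B
GPT4All-J-13B-Snoozy
MPT-7B-Chat
RedPajama-INCITE-Chat-3B
ggml-gpt4all-j
llama-2-13b-chat
llama-2-7b-chat
```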

scott-mackenzie (Author) commented:

Also, I noticed CORS errors in the browser console:

```
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://localhost:8080/v1/models. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing). Status code: 200.

Error: TypeError: NetworkError when attempting to fetch resource. ChatGptInterface.js:118:14
    e ChatGptInterface.js:118
    Babel 9
    N ChatGptInterface.js:112
    p ChatGptInterface.js:122
    React 3
    S scheduler.production.min.js:13
    L scheduler.production.min.js:14
    (Async: EventHandlerNonNull)
    813 scheduler.production.min.js:14
    Webpack
```
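The same failure can be reproduced outside the browser by sending an Origin header and inspecting the response headers; with CORS unconfigured, no Access-Control-Allow-Origin header comes back even though the status is 200, which matches the console error above:

```
# Dump response headers only; note the 200 status but no Access-Control-Allow-Origin
❯ curl -s -D - -o /dev/null \
    -H "Origin: http://localhost:3000" \
    http://localhost:8080/v1/models
```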

s-KaiNet commented Oct 4, 2023

I guess #5 is related.

Mist-Hunter commented:

Just chiming in to say thank you for this project.

I've been chasing the trail of a working front-end for LocalAI and thought this might rescue me; most web UIs seem dead or only work in a limited capacity. This one looks very straightforward and is custom-written for LocalAI, so it seemed like a good option, but I'm running into the same thing as everyone else: no model list.

I'm not entirely clear, from looking at the Dockerfile and the docker-compose.yml, how this is supposed to get the model list. I know LocalAI serves a model list at /v1/models, but I don't see any queries headed there in my LocalAI logs.
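One way to check whether the frontend's request ever reaches the server is to tail the LocalAI logs while reloading the page (the container name localai below is an assumption; adjust to your setup):

```
# Watch for /v1/models requests while reloading the frontend;
# "localai" is a placeholder container name.
❯ docker logs -f localai 2>&1 | grep -i models
```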

guilhermeprokisch commented Oct 18, 2023

If you set the CORS options on the LocalAI server side, it will work.

Put this in the .env:

```
# CORS settings
CORS=true
CORS_ALLOW_ORIGINS=*
```
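After restarting LocalAI with those settings, the header probe from the earlier comment should now show the allow-origin header:

```
❯ curl -s -D - -o /dev/null \
    -H "Origin: http://localhost:3000" \
    http://localhost:8080/v1/models | grep -i access-control-allow-origin
# expected: Access-Control-Allow-Origin: *
```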
