
Great Updates for the Local Cat #9

Open · BW-KING opened this issue Dec 13, 2023 · 3 comments

Comments

BW-KING commented Dec 13, 2023

Since Ollama has been updated and will continue to be updated more frequently, it would be really good to have these things added to the admin:

  1. A dropdown menu for selecting a model that has already been downloaded locally, either for the current session or for all sessions.
  2. Simple upgradeability of Ollama, like a git pull command, because there are going to be a lot of updates soon.
  3. The admin should show which version of the Ollama software is being used (a manual check is sketched after this comment).

I think this would make the local cat much better.
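
(Until the admin exposes it, the running version can be checked by hand. A minimal sketch, assuming the Ollama container is named ollama_cat as in the commands later in this thread:)

    docker exec ollama_cat ollama --version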


pieroit commented Dec 14, 2023

1 - Dropdown can be done
2 - you upgrade from the compose, by just changing the Ollama container version (a sketch follows below)
3 - more integration with Ollama is risky. For the moment, let's make sure the connection works ;)
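
As a sketch of point 2, a compose excerpt might look like this (the service name, volume name, and tag are illustrative assumptions, not necessarily this project's actual compose file):

    # docker-compose.yml (illustrative excerpt)
    services:
      ollama:
        image: ollama/ollama:latest   # pin a specific tag here and bump it to upgrade
        volumes:
          - ollama_data:/root/.ollama # keeps downloaded models across upgrades
        ports:
          - "11434:11434"             # Ollama's default API port

    volumes:
      ollama_data:

After changing the tag, docker compose pull ollama followed by docker compose up -d ollama recreates just that service.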

Thanks for your suggestions @BW-KING !

jimmyjam-50066 commented:

To support GGUF files, could we have a script in the Docker container that takes the arguments and creates the Modelfile for Ollama to use?

docker exec ollama_cat pull_gguf_from_url.sh solar-10.7b https://huggingface.co/TheBloke/SOLAR-10.7B-Instruct-v1.0-GGUF

  • magic happens here (the script would really create a new Modelfile from the first argument and download the second into a gguf directory or something), then:

docker exec ollama_cat ollama create solar-10.7b -f solar-10.7bModel
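
A minimal sketch of what such a script could look like. Everything here is hypothetical: the script name, the /gguf directory, and the availability of curl inside the container; note also that the example URL points at a Hugging Face repo page, so a direct link to a specific .gguf file would be needed.

    #!/bin/sh
    # pull_gguf_from_url.sh (hypothetical sketch, not an existing script)
    # Usage: pull_gguf_from_url.sh <model-name> <direct-gguf-url>
    # Assumes it runs inside the Ollama container with curl and ollama on PATH.
    set -eu

    MODEL_NAME="$1"   # e.g. solar-10.7b
    GGUF_URL="$2"     # direct URL to a .gguf file

    # Download the weights into a dedicated directory.
    mkdir -p /gguf
    curl -fL -o "/gguf/${MODEL_NAME}.gguf" "$GGUF_URL"

    # Write a minimal Modelfile pointing at the downloaded weights.
    printf 'FROM /gguf/%s.gguf\n' "$MODEL_NAME" > "/gguf/${MODEL_NAME}.Modelfile"

    # Register the model with Ollama under the given name.
    ollama create "$MODEL_NAME" -f "/gguf/${MODEL_NAME}.Modelfile"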


pieroit commented Dec 15, 2023

> To support GGUF files, could we have a script in the Docker container that takes the arguments and creates the Modelfile for Ollama to use?
>
> docker exec ollama_cat pull_gguf_from_url.sh solar-10.7b https://huggingface.co/TheBloke/SOLAR-10.7B-Instruct-v1.0-GGUF
>
> • magic happens here (the script would really create a new Modelfile from the first argument and download the second into a gguf directory or something), then:
>
> docker exec ollama_cat ollama create solar-10.7b -f solar-10.7bModel

This stuff is related to the Ollama container, which we do not control.
Please open an issue over there :)
