LLMs as alternatives to ChatGPT #85
Comments
Hi @brunoalbertopeixoto, to clarify, there is no monthly fee to use this plugin. However, you do have to pay OpenAI for GPT API usage on a pay-per-use basis. Having said that, I do agree it would be nice to support other LLMs, especially open-source ones.
I am not a programmer, but I have been able to tinker around with some projects and use open-source LLMs. Most open-source LLM frameworks are compatible with OpenAI API calls. I believe that if you allow the base URL to be modified, it would easily allow people to use their own models (through Ollama, LM Studio, or other cloud providers).
Hi @Ehesh, you can change the API Base URL under Zotero Preferences > Aria > Model Configuration. If an open-source LLM uses the same API signature as OpenAI's, it might work. Good luck and let me know!
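To illustrate what "same API signature as OpenAI's" means, here is a minimal sketch of the request that an OpenAI-style client would send to a local server. The base URL `http://localhost:11434/v1` and the model name `llama3` are assumptions (Ollama's default OpenAI-compatible endpoint); substitute whatever you enter under Zotero Preferences > Aria > Model Configuration.

```python
import json

# Assumed local endpoint (Ollama's OpenAI-compatible server). Any server
# exposing the same /chat/completions route and JSON schema should work.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{base_url}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

url, body = build_chat_request(BASE_URL, "llama3", "Summarize this paper in one line.")
print(url)  # http://localhost:11434/v1/chat/completions
```

Sending `body` as a POST to `url` (with any placeholder API key, since local servers typically ignore it) should return the same response shape as OpenAI's API, which is why simply swapping the base URL can be enough.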
Please create a tutorial on how to do this. I bought OpenAI API access, but I would like to work with an open-source alternative without limits. I would like to work in Brazilian Portuguese.
The system is currently optimized for the OpenAI GPT models. I'd like to see broader model support, especially open-source ones. However, I really don't have the time to work on it. Community effort is welcome 🤗
I have an OpenAI-compatible API endpoint, and I get this error:

```
{
  "name": "Error",
  "message": "input values have 2 keys, you must specify an input key or pass only 1 key as input",
  "stack": "getInputValue@resource://gre/modules/addons/XPIProvider.jsm -> jar:file://extensions/aria@apex974.com.xpi!/bootstrap.js -> jar:file://extensions/aria@apex974.com.xpi!/chrome/content/scripts/index.js:53279:11
saveContext@resource://gre/modules/addons/XPIProvider.jsm -> jar:file://extensions/aria@apex974.com.xpi!/bootstrap.js -> jar:file://extensions/aria@apex974.com.xpi!/chrome/content/scripts/index.js:53398:43
invoke@resource://gre/modules/addons/XPIProvider.jsm -> jar:file://extensions/aria@apex974.com.xpi!/bootstrap.js -> jar:file://extensions/aria@apex974.com.xpi!/chrome/content/scripts/index.js:20784:17"
}
```

I don't need an API key, so I simply pass a random one.
Hi @dcmumby, I suspect this error is not related to your API endpoint. Do you always get the same error, even with different questions?
Same error regardless of the question.
I leave as a suggestion the possibility of adding open-source and free LLM APIs, such as Gemini, as alternatives. Paying 20 dollars a month is unfeasible.