Models: REQUEST_VALIDATION_ERROR when adding custom models #33

Open · maxjacu opened this issue Feb 8, 2024 · 7 comments
Labels: bug (Something isn't working), enhancement (New feature or request)

maxjacu commented Feb 8, 2024

Describe the bug
When attempting to add a custom model, a properties validation error is raised. This happens with any combination of properties (toggled on, off, etc.).

{
  "status": "error",
  "error": {
    "code": "REQUEST_VALIDATION_ERROR",
    "message": "Properties are not allowed for this model schema."
  }
}

To Reproduce
Steps to reproduce the behavior:

  1. Navigate to New Model
  2. Select Custom Model
  3. Select any custom hosted model (same error on all three)
  4. Fill out the form and click the green button (an equivalent API call is sketched below)
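
For reference, the same creation can be attempted directly against the API; a minimal sketch in Python, assuming a local deployment on port 8080 and a placeholder API key (the payload shape mirrors the curl examples further down this thread):

import requests  # assumes the requests package is installed

# Hypothetical reproduction of the UI request; host, port, key, and credential
# values are placeholders taken from the curl examples later in this thread.
resp = requests.post(
    "http://localhost:8080/api/v1/models",
    headers={"Authorization": "Bearer {key}"},
    json={
        "name": "llama",
        "model_schema_id": "custom_host/openai-function-call",
        "credentials": {
            "CUSTOM_HOST_ENDPOINT_URL": "http://custom_url:8080",
            "CUSTOM_HOST_MODEL_ID": "cpp",
            "CUSTOM_HOST_API_KEY": "no-key",
        },
        "properties": {"function_call": False, "streaming": False},
    },
)
print(resp.status_code, resp.json())  # returns the REQUEST_VALIDATION_ERROR shown above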

Expected behavior
Creates a custom hosted model; in my case, a litellm endpoint.

Screenshots
(screenshot attached)

Desktop (please complete the following information):

  • OS: Mac
  • Browser: Brave
  • Version: 0.1.3

Additional context
I would like to access models behind a litellm proxy (e.g., a llama2_70b text completion model), but can't get it to work on my machine or my colleagues' machines.
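
As a sanity check independent of TaskingAI, the litellm proxy itself can be queried with the standard OpenAI client, since the proxy exposes an OpenAI-compatible API. A minimal sketch, where the base URL, port, key, and model alias are all assumptions about the local setup:

from openai import OpenAI  # openai >= 1.0

# Assumed values: litellm proxy listening locally on port 4000, a model alias
# "llama2_70b" configured on the proxy, and no real API key required.
client = OpenAI(base_url="http://localhost:4000", api_key="no-key")
resp = client.chat.completions.create(
    model="llama2_70b",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)

If this call succeeds, the proxy side is working and the failure is isolated to the model registration step above.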

jameszyao (Contributor)

@maxjacu Thank you for your feedback. We'll try to reproduce and fix it asap if needed.

ishaan-jaff

@maxjacu I'm the litellm maintainer - thanks for using us. Can we hop on a call to better understand your problem?
Sharing my calendly for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat


sihyeonn commented Feb 15, 2024

@jameszyao
Hi, I have the same issue here.
The models API with custom_host/openai-function-call cannot handle a request whether the properties field is null, empty, or populated.
Also, I saw the issue about taskingai-inference (#28). When can I check that repository?
I'd like to work with a custom host and a custom body, so it would be nice if I could check the taskingai-inference repo, since I cannot find that flexibility in this repository!

# when properties field is null
$ curl 'http://localhost:8080/api/v1/models' \
  -H 'Authorization: Bearer {key}' \
  -H 'Connection: keep-alive' \
  -H 'Content-Type: application/json' \
  --data-raw '{"name":"llama","model_schema_id":"custom_host/openai-function-call","credentials":{"CUSTOM_HOST_ENDPOINT_URL":"http://custom_url:8080","CUSTOM_HOST_MODEL_ID":"cpp","CUSTOM_HOST_API_KEY":"no-key"},"properties":null}' \
  --compressed
{"status":"error","error":{"code":"REQUEST_VALIDATION_ERROR","message":"Model properties are required for openai-function-call"}}

# when properties field is not empty
$ curl 'http://localhost:8080/api/v1/models' \
  -H 'Authorization: Bearer {key}' \
  -H 'Connection: keep-alive' \
  -H 'Content-Type: application/json' \
  --data-raw '{"name":"llama","model_schema_id":"custom_host/openai-function-call","credentials":{"CUSTOM_HOST_ENDPOINT_URL":"http://custom_url:8080","CUSTOM_HOST_MODEL_ID":"cpp","CUSTOM_HOST_API_KEY":"no-key"},"properties":{"function_call":false,"streaming":false}}' \
  --compressed
{"status":"error","error":{"code":"REQUEST_VALIDATION_ERROR","message":"Properties are not allowed for this model schema."}}

# when properties field is empty
$ curl 'http://localhost:8080/api/v1/models' \
  -H 'Authorization: Bearer {key}' \
  -H 'Connection: keep-alive' \
  -H 'Content-Type: application/json' \
  --data-raw '{"name":"llama","model_schema_id":"custom_host/openai-function-call","credentials":{"CUSTOM_HOST_ENDPOINT_URL":"http://custom_url:8080","CUSTOM_HOST_MODEL_ID":"cpp","CUSTOM_HOST_API_KEY":"no-key"},"properties":{}}' \
  --compressed
{"status":"error","error":{"code":"REQUEST_VALIDATION_ERROR","message":"Properties are not allowed for this model schema."}}


IVTore commented Feb 21, 2024

+1
LM Studio ...

Asongguo

+1 I ran into the same issue.

SimsonW (Contributor) commented Mar 1, 2024

We have resolved this issue in the latest version (to be released soon). Additionally, we have separated local model providers such as LM Studio and Ollama into individual providers, which allows local models to be integrated with TaskingAI.

Custom Host will still remain a valuable option for those wishing to use any provider not explicitly listed by us.

fenglvming

{
  "status": "success",
  "data": [
    {
      "object": "ModelSchema",
      "model_schema_id": "custom_host/openai-function-call",
      "name": "OpenAI Function Call",
      "description": "",
      "provider_id": "custom_host",
      "provider_model_id": "openai-function-call",
      "type": "chat_completion",
      "properties": null
    },
    {
      "object": "ModelSchema",
      "model_schema_id": "custom_host/openai-text-embedding",
      "name": "OpenAI Text Embedding",
      "description": "",
      "provider_id": "custom_host",
      "provider_model_id": "openai-text-embedding",
      "type": "text_embedding",
      "properties": null
    },
    {
      "object": "ModelSchema",
      "model_schema_id": "custom_host/openai-tool-calls",
      "name": "OpenAI Tool Call",
      "description": "",
      "provider_id": "custom_host",
      "provider_model_id": "openai-tool-calls",
      "type": "chat_completion",
      "properties": null
    }
  ],
  "fetched_count": 3,
  "total_count": 3,
  "has_more": false
}

It seems like this model schema is wrong: the openai-function-call schema reports "properties": null, yet the API insists that model properties are required for it.
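
Not the actual TaskingAI code, but a rough sketch of validation logic that would produce both of the errors seen in this thread when the schema's properties field is null:

# Hypothetical reconstruction: with schema_properties = None (as in the listing
# above), the first check rejects any request that sends properties at all,
# while the second still demands them for openai-function-call, so no request
# body can satisfy both checks.
def validate_properties(schema_properties, request_properties, provider_model_id):
    if schema_properties is None and request_properties is not None:
        raise ValueError("Properties are not allowed for this model schema.")
    if provider_model_id == "openai-function-call" and not request_properties:
        raise ValueError("Model properties are required for openai-function-call")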

jameszyao mentioned this issue on Mar 10, 2024 and added the bug and enhancement labels on Mar 14, 2024.