
Unable to use non-Gemini models through ChatVertexAI #5196

Open
jarib opened this issue Apr 24, 2024 · 2 comments
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

jarib (Contributor) commented Apr 24, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { ChatVertexAI } from '@langchain/google-vertexai'

const chat = new ChatVertexAI({
    model: 'claude-3-opus@20240229',
    temperature: 0.0,
    maxOutputTokens: 1200,
})

const result = await chat.invoke('Tell me a story')

console.log(result)

Error Message and Stack Trace (if applicable)

Error: Unable to verify model params: {"lc":1,"type":"constructor","id":["langchain","chat_models","chat_integration","ChatVertexAI"],"kwargs":{"model":"claude-3-opus","temperature":0,"max_output_tokens":1200,"platform_type":"gcp"}}
    at validateModelParams (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-common/dist/utils/common.js:100:19)
    at copyAndValidateModelParamsInto (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-common/dist/utils/common.js:105:5)
    at new ChatGoogleBase (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-common/dist/chat_models.js:191:9)
    at new ChatGoogle (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-gauth/dist/chat_models.js:12:9)
    at new ChatVertexAI (file:///Users/redacted/src/langchain-repro/node_modules/@langchain/google-vertexai/dist/chat_models.js:10:9)
    at main (file:///Users/redacted/src/langchain-repro/test.js:5:22)
    at file:///Users/redacted/src/langchain-repro/test.js:22:1
    at ModuleJob.run (node:internal/modules/esm/module_job:218:25)
    at async ModuleLoader.import (node:internal/modules/esm/loader:329:24)
    at async loadESM (node:internal/process/esm_loader:28:7)

Description

I am trying to use ChatVertexAI with Anthropic Claude 3, but this class appears to support only Gemini models and throws the error above for anything else.

This appears to be a deliberate choice in the code:

switch (modelToFamily(model)) {
    case "gemini":
        return validateGeminiParams(testParams);
    default:
        throw new Error(
            `Unable to verify model params: ${JSON.stringify(params)}`
        );
}

I've verified that using Claude on Vertex AI through Anthropic's Vertex SDK works fine:

import { AnthropicVertex } from '@anthropic-ai/vertex-sdk'

const projectId = 'my-project-id'
const region = 'us-east5'

// Goes through the standard `google-auth-library` flow.
const client = new AnthropicVertex({
    projectId,
    region,
})

async function main() {
    const result = await client.messages.create({
        model: 'claude-3-opus@20240229',
        max_tokens: 100,
        messages: [
            {
                role: 'user',
                content: 'Hey Claude!',
            },
        ],
    })
    console.log(JSON.stringify(result, null, 2))
}

main()

Output:

{
  "id": "msg_vrtx_01M1yGR5LiteHznRmyK2MaPG",
  "type": "message",
  "role": "assistant",
  "model": "claude-3-opus-20240229",
  "stop_sequence": null,
  "usage": {
    "input_tokens": 10,
    "output_tokens": 12
  },
  "content": [
    {
      "type": "text",
      "text": "Hello! How can I assist you today?"
    }
  ],
  "stop_reason": "end_turn"
}

System Info

    "@anthropic-ai/vertex-sdk": "^0.3.5",
    "@google-cloud/vertexai": "^1.1.0",
    "@langchain/google-vertexai": "^0.0.11",
    "langchain": "^0.1.35"

Platform: Mac
Node: 20.11.0

dosubot bot commented Apr 24, 2024

To address the issue of using Anthropic Claude 3 with LangChain.js, particularly with the ChatVertexAI class, it's important to note that the current implementation indeed focuses on Gemini models, leading to the error you encountered. However, you can integrate Claude 3 by adapting the approach used for Anthropic models in LangChain.js, specifically through a custom implementation similar to the ChatAnthropic class.

Here's a concise guide to get you started:

  1. Ensure Dependencies: Make sure you have both LangChain and the Anthropic SDK installed in your project.

  2. Use the ChatAnthropic Class: Import and configure the ChatAnthropic class from LangChain to work with Claude 3, specifying the model name, API key, and settings such as temperature and max tokens (see the sketch after this list).

  3. Invoke the Model: Utilize the invoke method for sending prompts to Claude 3 and receiving responses. For interactive applications, consider exploring streaming responses.

  4. Customize as Needed: Tailor the integration to fit your application's needs, adjusting prompt formatting, response handling, or integrating with other services as required.
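
For example, a minimal sketch of that workaround, assuming a recent @langchain/anthropic package and an ANTHROPIC_API_KEY in the environment (note this calls Anthropic's API directly rather than going through Vertex AI):

import { ChatAnthropic } from '@langchain/anthropic'

// Reads ANTHROPIC_API_KEY from the environment by default.
const chat = new ChatAnthropic({
    model: 'claude-3-opus-20240229',
    temperature: 0.0,
    maxTokens: 1200,
})

const result = await chat.invoke('Tell me a story')
console.log(result.content)

// For interactive use, responses can also be streamed:
// for await (const chunk of await chat.stream('Tell me a story')) {
//     process.stdout.write(chunk.content)
// }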

This approach allows you to leverage Claude 3's capabilities within LangChain.js applications. For detailed implementation and examples, refer to the ChatAnthropic class in the LangChain.js repository and the associated documentation.

Keep in mind, this solution circumvents the ChatVertexAI class's current limitations with non-Gemini models. The LangChain.js team is working on expanding support for a broader range of models, so staying updated with the repository may provide more streamlined solutions in the future.


dosubot added the auto:bug label Apr 24, 2024
afirstenberg (Contributor) commented

Similar to #2562

The issue is that Claude on Vertex AI speaks neither the same protocol as Gemini nor quite the same one Anthropic offers directly, so we need something that combines the two.

Doable, but it hasn't risen in priority.

My general thoughts about how to do this:

  • The Anthropic library (and possibly the @langchain/anthropic library) would be a build dependency and an optional dependency at runtime, used mostly for the types and for parsing results.
  • Add a utility class for Anthropic.
    • The things that call baseMessageToContent, safeResponseToChatGeneration, and safeResponseToChatResult need to call their equivalents in this class.
  • Changes in GoogleAIConnection:
    • modelFamily() should add "claude" or something along those lines, and GoogleLLMModelFamily should probably be adjusted as well (see the sketch below).
    • buildUrlVertex() needs a change to handle non-Google models (possibly determined based on the model family?).
  • Probably make a subclass of GoogleAIConnection or AbstractGoogleLLMConnection to handle the Claude layout. But I haven't dug into the details.

The good news is that the authentication part is handled by default with the Connection classes.
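
A purely hypothetical sketch of the model-family piece, to make the shape concrete. The names modelToFamily, validateModelParams, and validateGeminiParams come from the stack trace and snippet above; the "claude" branch and validateClaudeParams are assumptions, not actual @langchain/google-common code:

// Hypothetical stand-ins for the real validators.
const validateGeminiParams = (testParams) => { /* existing Gemini checks */ };
const validateClaudeParams = (testParams) => { /* new Anthropic-style checks: max_tokens, anthropic_version, ... */ };

// Extend family detection to recognize Claude models.
function modelToFamily(model) {
    if (!model) return null;
    if (model.startsWith("gemini")) return "gemini";
    if (model.startsWith("claude")) return "claude";
    return null;
}

function validateModelParams(params) {
    const testParams = params ?? {};
    switch (modelToFamily(testParams.model)) {
        case "gemini":
            return validateGeminiParams(testParams);
        case "claude":
            return validateClaudeParams(testParams);
        default:
            throw new Error(
                `Unable to verify model params: ${JSON.stringify(params)}`
            );
    }
}

The real change would live next to the existing Gemini validation in @langchain/google-common's utils/common, with the URL-building and connection-subclass changes layered on top.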
