
How to use with node-llama-cpp for backend #238

Open
TimeLordRaps opened this issue Feb 22, 2024 · 5 comments
Labels
documentation Improvements or additions to documentation good first issue Good for newcomers

Comments


TimeLordRaps commented Feb 22, 2024

Documentation description
Just a relevant example of how to use CopilotKit with node-llama-cpp as the backend.
Steps include installing node-llama-cpp,
installing @langchain/community,
and here's the relevant code:

```ts
import { CopilotBackend, LangChainAdapter } from "@copilotkit/backend";
import { ChatLlamaCpp } from "@langchain/community/chat_models/llama_cpp";

const modelPath = "";
const contextLength = 8192;

export async function POST(req: Request): Promise<Response> {
  const copilotKit = new CopilotBackend();
  return await copilotKit.response(
    req,
    new LangChainAdapter(async (forwardedProps) => {
      const model = new ChatLlamaCpp({
        modelPath: modelPath,
        contextSize: contextLength,
        // Workaround: node-llama-cpp crashes if batchSize isn't big enough,
        // so match it to the context size.
        batchSize: contextLength,
      });
      return model.stream(forwardedProps.messages, {
        tools: forwardedProps.tools,
      });
    })
  );
}
```
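The install steps mentioned above would look roughly like this (a sketch; the package names are taken straight from the snippet's imports, versions left unpinned):

```shell
# Install the inference backend, the LangChain community bindings,
# and the CopilotKit backend package used by the snippet above.
npm install node-llama-cpp @langchain/community @copilotkit/backend
```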

Relevant Context
Where this documentation fits in

  1. Existing documentation (see docs.copilotkit.ai): either https://docs.copilotkit.ai/reference/CopilotBackend or https://docs.copilotkit.ai/getting-started/quickstart-backend
@TimeLordRaps TimeLordRaps added the documentation Improvements or additions to documentation label Feb 22, 2024
Collaborator

mme commented Feb 23, 2024

@TimeLordRaps that's awesome, thanks for figuring this out!

Would you mind contributing this to our documentation?

@TimeLordRaps
Author

So I'm still testing around with this; it might not be fully functional, as the ChatLlamaCpp custom LLM from LangChain has trouble with tools. Could someone familiar with the matter point me to where in the code to look to see the downstream effects of failing tool calls?

@ataibarkai
Collaborator

@TimeLordRaps at a high level the code lives here and here.

But it can be a bit challenging to come into it cold. It's a complex part of the codebase (and will actually get more complex in the near future as we repurpose the underlying function-calling capability to do more and more).

It might be easiest to test empirically: keep 2 identical examples, switch only node-llama-cpp, and observe the differences.
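A minimal sketch of that A/B setup (hypothetical stubs stand in for ChatLlamaCpp and a hosted model such as ChatOpenAI; the point is that only the model factory differs between the two otherwise-identical examples, so any behavioral difference isolates node-llama-cpp):

```typescript
// Shared shape both examples program against.
interface ChatModelLike {
  name: string;
  stream(messages: string[]): AsyncGenerator<string>;
}

// Stub standing in for `new ChatLlamaCpp({ modelPath, ... })`.
function makeLlamaCppModel(): ChatModelLike {
  return {
    name: "llama-cpp",
    async *stream(messages: string[]) {
      for (const m of messages) yield `[local] ${m}`;
    },
  };
}

// Stub standing in for a hosted model, e.g. `new ChatOpenAI(...)`.
function makeHostedModel(): ChatModelLike {
  return {
    name: "hosted",
    async *stream(messages: string[]) {
      for (const m of messages) yield `[hosted] ${m}`;
    },
  };
}

// The single switch point between the two otherwise-identical handlers.
function pickModel(useLocal: boolean): ChatModelLike {
  return useLocal ? makeLlamaCppModel() : makeHostedModel();
}
```

Everything downstream (the LangChainAdapter callback, the messages, the tools) stays byte-identical between the two runs; only `pickModel`'s argument changes.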

I actually think this should live more than just in documentation, but in ready-made adapter factories / instances.

I'd love to collaborate / pair program with you to help you make this first addition of its kind if you are interested. And I can also help you get oriented with tool calls if you want to dig deeper.

btw, cool username

@TimeLordRaps
Author

I'd love the opportunity to pair program an implementation of some contribution to this project. However, I'm not sure this issue would be my first choice. I'll reach out on Discord though.

@mme mme added the good first issue Good for newcomers label Mar 14, 2024
@mme
Copy link
Collaborator

mme commented Mar 14, 2024

This would make a great first issue: Add the code to the langchain docs to explain how to use other LLMs.

3 participants