
[BUG] java.lang.NullPointerException - "tokenUsageAccumulator" is null - when using LocalAI function calling #935

Closed
rprabhu opened this issue Apr 14, 2024 · 10 comments · Fixed by #1119
Labels: bug (Something isn't working), P3 (Medium priority)

Comments

rprabhu commented Apr 14, 2024

Describe the bug

When implementing function calling with LocalAI, an NPE is thrown stating that "tokenUsageAccumulator" is null.

Log and Stack trace

Exception in thread "main" java.lang.NullPointerException: Cannot invoke "dev.langchain4j.model.output.TokenUsage.add(dev.langchain4j.model.output.TokenUsage)" because "tokenUsageAccumulator" is null
        at dev.langchain4j.service.DefaultAiServices$1.invoke(DefaultAiServices.java:175)
        at $Proxy3.chat(Unknown Source)
        at AiServicesFunctionCalling.main(AiServicesFunctionCalling.java:71)

To Reproduce

The following JBang script reproduces the issue:

//DEPS dev.langchain4j:langchain4j:0.29.1
//DEPS dev.langchain4j:langchain4j-local-ai:0.29.1


import java.time.Duration;
import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.localai.LocalAiChatModel;
import dev.langchain4j.service.AiServices;

class AiServicesFunctionCalling {

    private static final String MODEL = "gpt-4";
    private static final String BASE_URL = "http://localhost:8080";
    private static final Duration timeout = Duration.ofSeconds(120);
    private static String question;

    static class Calculator {

        @Tool("Calculates the length of a string")
        int stringLength(String s) {
            System.out.println("Called stringLength with s='" + s + "'");
            return s.length();
        }

        @Tool("Calculates the sum of two numbers")
        int add(int a, int b) {
            System.out.println("Called add with a=" + a + ", b=" + b);
            return a + b;
        }

        @Tool("Calculates the square root of a number")
        double sqrt(int x) {
            System.out.println("Called sqrt with x=" + x);
            return Math.sqrt(x);
        }
    }

    interface ChatMinion {
        String chat(String message);
    }

    public static void main(String[] args) {

        ChatLanguageModel model = LocalAiChatModel.builder()
                .baseUrl(BASE_URL)
                .modelName(MODEL)
                .timeout(timeout)
                .temperature(0.0)
                .build();
        ChatMemory memory = MessageWindowChatMemory.withMaxMessages(10);

        ChatMinion minion = AiServices.builder(ChatMinion.class)
                .chatLanguageModel(model)
                .tools(new Calculator())
                .chatMemory(memory)
                .build();
        question = "What is the square root of 226?";
        System.out.println("Asking question: " + question);
        String text = minion.chat(question);
        System.out.println("\n\nResponse:\n\n" + text);
    }
}

Expected behavior

The tool should be called and its result returned in the response.

Please complete the following information:

  • LangChain4j version: 0.29.1
  • LLM(s) used: LocalAI (gpt-4)
  • Java version: 21
  • Spring Boot version (if applicable): NA

Additional context

rprabhu added the bug label Apr 14, 2024
Kugaaa (Contributor) commented Apr 15, 2024

dev.langchain4j.model.localai.LocalAiChatModel#generate(java.util.List<dev.langchain4j.data.message.ChatMessage>, java.util.List<dev.langchain4j.agent.tool.ToolSpecification>, dev.langchain4j.agent.tool.ToolSpecification)

private Response<AiMessage> generate(List<ChatMessage> messages,
                                     List<ToolSpecification> toolSpecifications,
                                     ToolSpecification toolThatMustBeExecuted) {
    ChatCompletionRequest.Builder requestBuilder = ChatCompletionRequest.builder()
            .model(modelName)
            .messages(toOpenAiMessages(messages))
            .temperature(temperature)
            .topP(topP)
            .maxTokens(maxTokens);
    ...
    return Response.from(
            aiMessageFrom(response),
            null, // token usage is always passed as null here
            finishReasonFrom(response.choices().get(0).finishReason())
    );
}

The Response returned by generate has a null tokenUsage.
So, is null the correct value here? I have considered two approaches (a sketch of the first one follows this list):

  • make the proxy method invocation tolerate a null value
  • use a different TokenUsage implementation

I would like to try to solve it.
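
For the first option, here is a minimal sketch of what a null-tolerant accumulation could look like; the class and method names are illustrative, not the actual DefaultAiServices internals:

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.output.Response;
import dev.langchain4j.model.output.TokenUsage;

// Illustrative only: NOT the actual DefaultAiServices code, just the shape a
// null-tolerant token-usage accumulation inside the AI Service proxy could take.
class NullSafeAccumulationSketch {

    static TokenUsage accumulate(TokenUsage accumulator, Response<AiMessage> response) {
        TokenUsage current = response.tokenUsage();
        if (current == null) {
            return accumulator;          // this call reported no token usage
        }
        if (accumulator == null) {
            return current;              // first usage value seen so far
        }
        return accumulator.add(current); // both present: add them together
    }
}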

langchain4j (Owner) commented

Related to #921

langchain4j (Owner) commented

@Kugaaa I think the AI Service should be changed to handle the TokenUsage == null case gracefully; not all LLM providers return it.

Ideally, LocalAiChatModel should return a TokenUsage if LocalAI returns one (this needs to be checked).
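
For reference, a rough sketch of how such a mapping could look, assuming LocalAI's OpenAI-compatible response carries a usage block with prompt/completion token counts; the Usage record below is a hypothetical stand-in for the client's real type, and the two-argument TokenUsage constructor is an assumption:

import dev.langchain4j.model.output.TokenUsage;

// Sketch only: the Usage record is a hypothetical stand-in for the OpenAI-compatible
// client's real usage type; it is not the actual API of the local-ai module.
class UsageMappingSketch {

    record Usage(Integer promptTokens, Integer completionTokens) {}

    // Map the usage block to LangChain4j's TokenUsage, or return null when it is absent.
    static TokenUsage tokenUsageFrom(Usage usage) {
        if (usage == null) {
            return null; // LocalAI did not report token usage for this response
        }
        return new TokenUsage(usage.promptTokens(), usage.completionTokens());
    }
}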

ohadpinch commented

Please raise the priority.

langchain4j (Owner) commented

@ohadpinch it should have been fixed by https://github.com/langchain4j/langchain4j/pull/939/files, could you please check the latest version (or snapshot)?

ohadpinch commented

> @ohadpinch it should have been fixed by https://github.com/langchain4j/langchain4j/pull/939/files, could you please check the latest version (or snapshot)?

I checked it, it's fixed with those changes.

langchain4j (Owner) commented

@ohadpinch thank you!

langchain4j (Owner) commented May 10, 2024

Fixed by #939

ohadpinch commented

@langchain4j sorry for the mix-up!
The issue still exists with #939; I meant that I checked it with #942, and that seems to solve it.

langchain4j reopened this issue May 12, 2024
Kugaaa (Contributor) commented May 12, 2024

> @Kugaaa I think the AI Service should be changed to handle the TokenUsage == null case gracefully; not all LLM providers return it.
>
> Ideally, LocalAiChatModel should return a TokenUsage if LocalAI returns one (this needs to be checked).

@langchain4j I think #939 does not cover this situation.

If the fix in #942 is to just return a null value, it provides too little information. I would rather use the Null Object pattern and return a NullTokenUsage that throws an exception with more information, making it clear that the LLM in use does not support token usage (see the sketch below).
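
A rough sketch of that Null Object idea (not what was eventually merged; it assumes TokenUsage is non-final, exposes an (input, output, total) constructor, and allows add(...) to be overridden):

import dev.langchain4j.model.output.TokenUsage;

// Sketch of the Null Object idea: a TokenUsage stand-in that fails loudly and
// informatively when code tries to combine token counts the provider never supplied.
// Assumes TokenUsage is non-final and that add(...) can be overridden.
class NullTokenUsage extends TokenUsage {

    NullTokenUsage() {
        super(null, null, null); // no counts: the provider reported nothing
    }

    @Override
    public TokenUsage add(TokenUsage that) {
        throw new UnsupportedOperationException(
                "The LLM provider in use did not return token usage information");
    }
}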

Kugaaa added a commit to Kugaaa/langchain4j that referenced this issue May 13, 2024: "…Usage instance with null fields when openAi Usage is null"
langchain4j pushed a commit that referenced this issue May 17, 2024
langchain4j pushed a commit that referenced this issue May 17, 2024
langchain4j mentioned this issue May 17, 2024
langchain4j added a commit that referenced this issue May 21, 2024
## Issue
Fix #935

## Change
Replaced `TokenUsage.add(TokenUsage)` with static `TokenUsage.sum(TokenUsage, TokenUsage)` to better handle cases when one of the `TokenUsage` values is absent (null).

## General checklist
- [X] There are no breaking changes
- [X] I have added unit and integration tests for my change
- [X] I have manually run all the unit and integration tests in the module I have added/changed, and they are all green
- [X] I have manually run all the unit and integration tests in the [core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core) and [main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j) modules, and they are all green
- [ ] I have added/updated the [documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples repo](https://github.com/langchain4j/langchain4j-examples) (only for "big" features)