Releases: zhudotexe/kani

v0.6.1

08 Nov 20:54
Pre-release
  • Internal changes to the OpenAIEngine to make extending it easier
  • No consumer-facing changes

v0.6.0

08 Nov 18:23
Pre-release

As of Nov 6, 2023, OpenAI added the ability for a single assistant message to request multiple function calls in parallel, and wrapped all function calls in a ToolCall wrapper. To support this in kani while maintaining backwards compatibility with OSS function calling models, ChatMessage now maintains the following internal representation:

ChatMessage.function_call is an alias for ChatMessage.tool_calls[0].function. If a message contains more than one tool call, accessing this property raises an exception.
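The aliasing rule above can be sketched as follows. This is a minimal, hypothetical model of the behavior; the real kani ChatMessage is a richer class with validation and more fields.

```python
from dataclasses import dataclass, field

@dataclass
class FunctionCall:
    name: str
    arguments: str

@dataclass
class ToolCall:
    id: str
    function: FunctionCall

@dataclass
class ChatMessage:
    tool_calls: list = field(default_factory=list)

    @property
    def function_call(self):
        # alias for the single tool call's function; ambiguous when there
        # are two or more tool calls, so raise in that case
        if not self.tool_calls:
            return None
        if len(self.tool_calls) > 1:
            raise ValueError("message has multiple tool calls; use .tool_calls")
        return self.tool_calls[0].function

msg = ChatMessage(tool_calls=[ToolCall("call_1", FunctionCall("get_weather", "{}"))])
```

With a single tool call, `msg.function_call` behaves exactly like the pre-0.6.0 attribute, which is what keeps older engine code working.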

To translate kani's FUNCTION message type to OpenAI's TOOL message type, the OpenAIEngine deterministically binds each free tool call ID to the FUNCTION message that follows it.

Breaking Changes

To the kani end user, there should be no change to how functions are defined and called. One breaking change was necessary:

  • Kani.do_function_call and Kani.handle_function_call_exception now take an additional tool_call_id parameter, which may break overridden implementations. The documentation has been updated to encourage overriders to accept *args and **kwargs to prevent this from happening again.
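The recommended defensive override shape can be sketched like this. The base class below is a stand-in for kani.Kani, and the tool name is made up; the point is only the *args/**kwargs forwarding pattern.

```python
import asyncio

class Base:
    # stand-in for kani.Kani: do_function_call now also receives tool_call_id
    async def do_function_call(self, call, tool_call_id=None, **kwargs):
        return f"ran {call}"

class MyKani(Base):
    # accepting and forwarding *args/**kwargs keeps this override working
    # even if the base signature gains new parameters in a later version
    async def do_function_call(self, call, *args, **kwargs):
        return await super().do_function_call(call, *args, **kwargs)

result = asyncio.run(MyKani().do_function_call("get_weather", tool_call_id="call_1"))
```

Because the override never names tool_call_id explicitly, it is insulated from this kind of signature change.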

New Features

kani can now handle making multiple function calls in parallel if the model requests it. Rather than returning an ASSISTANT message with a single function_call, an engine can now return a list of tool_calls. kani will resolve these tool calls in parallel using asyncio, and add their results to the chat history in the order of the list provided.

Returning a single function_call will continue to work for backwards compatibility.
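The order-preserving parallel resolution described above can be sketched with plain asyncio (the tool names and delays here are invented for illustration):

```python
import asyncio

# toy stand-in for executing one requested tool call
async def run_tool(name, delay):
    await asyncio.sleep(delay)
    return f"{name} done"

async def resolve_in_parallel(tool_calls):
    # asyncio.gather runs the calls concurrently but returns their results
    # in the order they were passed in, matching the engine's list
    return await asyncio.gather(*(run_tool(n, d) for n, d in tool_calls))

results = asyncio.run(resolve_in_parallel([("slow_tool", 0.05), ("fast_tool", 0.0)]))
```

Even though fast_tool finishes first, its result is still placed second, so the chat history stays in the order the model requested.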

v0.5.1

25 Oct 17:53
Pre-release
  • OpenAI: The OpenAIClient (internal class used by OpenAIEngine) now expects OpenAIChatMessages as input rather than kani.ChatMessage in order to better type-validate API requests
  • OpenAI: Updated token estimation to better reflect current token counts returned by the API

v0.5.0

24 Oct 17:06
Pre-release

New Feature: Message Parts API

The Message Parts API is intended to provide a foundation for future multimodal LLMs and other engines that require engine-specific input without compromising kani's model-agnostic design. This is accomplished by allowing ChatMessage.content to be a list of MessagePart objects, in addition to a string.

This change is fully backwards-compatible and will not affect existing code.

When writing code with compatibility in mind, the ChatMessage class exposes ChatMessage.text (always a string or None) and ChatMessage.parts (always a list of message parts), which we recommend using instead of ChatMessage.content. These properties are dynamically generated based on the underlying content, and it is safe to mix messages with different content types in a single Kani.

Generally, message part classes are defined by an engine, and consumed by the developer. Message parts can be used in any role’s message - for example, you might use a message part in an assistant message to separate out a chain of thought from a user reply, or in a user message to supply an image to a multimodal model.
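The relationship between .content, .text, and .parts can be sketched with a simplified model. This is an illustrative reimplementation, not kani's actual code; details such as how non-string parts contribute to .text may differ.

```python
class MessagePart:
    """Stand-in for an engine-defined message part (e.g. an image)."""

class ChatMessage:
    def __init__(self, content):
        # content may be a string, a list of strings/MessageParts, or None
        self.content = content

    @property
    def parts(self):
        # always a list, regardless of the underlying content type
        if self.content is None:
            return []
        if isinstance(self.content, str):
            return [self.content]
        return list(self.content)

    @property
    def text(self):
        # always a string (or None), keeping only the textual pieces
        if self.content is None:
            return None
        if isinstance(self.content, str):
            return self.content
        return "".join(p for p in self.content if isinstance(p, str))

plain = ChatMessage("hello")
mixed = ChatMessage(["look at this image: ", MessagePart()])
```

Code written against .text and .parts works identically for both messages, which is why they are the recommended accessors.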

For more information, see the Message Parts documentation.

Up next: we're adding support for multimodal vision-language models like LLaVA and GPT-Vision through a kani extension!

Improvements

  • LLaMA 2: Improved the prompting in non-strict mode to group consecutive user/system messages into a single [INST] wrapper. See the tests for how kani translates consecutive message types into the LLaMA prompt.
  • Other documentation and minor improvements

v0.4.0

05 Oct 18:43
Pre-release

BREAKING CHANGES

  • Kani.full_round now emits every message generated during the round, not just assistant messages
    • This means that you will need to handle FUNCTION messages, and potentially SYSTEM messages from a function exception handler.
    • Kani.full_round_str's default behaviour is unchanged.
  • Kani.full_round_str now takes in a message_formatter rather than a function_call_formatter
    • By default, this handler only returns the contents of ASSISTANT messages.
  • Kani.do_function_call now returns a FunctionCallResult rather than a bool
    • To migrate any overriding functions, you should change the following:
    • Rather than calling Kani.add_to_history in the override, save the ChatMessage to a variable
    • Update the return value from a boolean to FunctionCallResult(is_model_turn=<old return value>, message=<message from above>)
  • Kani.handle_function_call_exception now returns an ExceptionHandleResult rather than a bool
    • To migrate any overriding functions, you should change the following:
    • Rather than calling Kani.add_to_history in the override, save the ChatMessage to a variable
    • Update the return value from a boolean to ExceptionHandleResult(should_retry=<old return value>, message=<message from above>)
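The do_function_call migration looks roughly like this. FunctionCallResult is stubbed below, and MyKani and run_tool are hypothetical names; only the return-value shape is the point.

```python
import asyncio
from dataclasses import dataclass

# stand-in for kani.FunctionCallResult
@dataclass
class FunctionCallResult:
    is_model_turn: bool
    message: str

class MyKani:
    async def run_tool(self, call):
        return f"result of {call}"

    # v0.3.x style override (no longer valid):
    #     msg = await self.run_tool(call)
    #     self.add_to_history(msg)
    #     return True
    async def do_function_call(self, call):
        # v0.4.0 style: keep the message in a variable and wrap both
        # values in a FunctionCallResult instead of adding it yourself
        msg = await self.run_tool(call)
        return FunctionCallResult(is_model_turn=True, message=msg)

res = asyncio.run(MyKani().do_function_call("get_weather"))
```

The same shape applies to handle_function_call_exception, substituting ExceptionHandleResult(should_retry=..., message=...).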

Improvements

  • Added kani.utils.message_formatters
  • Added kani.ExceptionHandleResult and kani.FunctionCallResult
  • Documentation improvements

Fixes

  • Fixed an issue where ChatMessage.copy_with could cause unset values to appear in JSON serializations

v0.3.4

26 Sep 18:53
08b5968
Pre-release

Improvements

  • Updated dependencies to allow more recent versions
  • The documentation now shows fully-qualified class names in reference sections
  • Added .copy_with method to ChatMessage and FunctionCall to make updating chat history easier
  • Various documentation updates
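The .copy_with pattern can be sketched with dataclasses.replace semantics. kani's ChatMessage is not a plain dataclass, but the idea is the same: produce a new object with selected fields swapped, leaving the original untouched.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ChatMessage:
    role: str
    content: str

    def copy_with(self, **updates):
        # return a new message with the given fields replaced;
        # the original message is not mutated
        return replace(self, **updates)

msg = ChatMessage(role="user", content="hello")
edited = msg.copy_with(content="hello, world")
```

This makes it easy to rewrite one message in a chat history without mutating shared state.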

v0.3.3

18 Sep 18:08
Pre-release

Improvements

  • Added a warning in Kani.chat_round to use Kani.full_round when AI functions are defined
  • Added examples in Google Colab
  • Other documentation improvements

v0.3.2

11 Sep 17:41
Pre-release

Improvements

  • Made chat_in_terminal work in Google Colab, rather than having to use await chat_in_terminal_async

v0.3.1

11 Sep 15:27
Pre-release
  • HuggingFace Engine: Fixed an issue where completion message lengths were overreported by an amount equal to the prompt length.
  • Other documentation improvements

v0.3.0

06 Sep 16:15
Pre-release

Improvements

  • Added Kani.add_to_history, a method that is called whenever kani adds a new message to the chat context
  • httpclient.BaseClient.request now returns a Response to aid low-level implementation
    • .get() and .post() are unchanged
  • Added additional documentation about GPU support for local models
  • Other documentation improvements
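An add_to_history hook can be sketched like this. The base class is a stand-in for kani.Kani's chat-history bookkeeping, and the audit log is an illustrative use, not part of kani.

```python
import asyncio

class Base:
    # stand-in for kani.Kani: every new message passes through add_to_history
    def __init__(self):
        self.chat_history = []

    async def add_to_history(self, message):
        self.chat_history.append(message)

class AuditedKani(Base):
    def __init__(self):
        super().__init__()
        self.audit_log = []

    async def add_to_history(self, message):
        # observe every message kani adds, then defer to the default behavior
        self.audit_log.append(message)
        await super().add_to_history(message)

k = AuditedKani()
asyncio.run(k.add_to_history("hello"))
```

Because the hook is called for every message, it is a natural place for persistence or logging without touching the engine layer.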