
Allow local function calling? #889

Open · 3 tasks done
jeffometer opened this issue Mar 27, 2024 · 3 comments
Labels: question (Further information is requested)

Comments

@jeffometer

First check

  • I added a descriptive title to this issue.
  • I used the GitHub search to look for a similar issue and didn't find it.
  • I searched the Marvin documentation for this feature.

Describe the current behavior

Hello, I really like Marvin, especially the AI functions, but I have a need that it doesn't cover (from all the docs I've seen), and I'm curious whether it's on the roadmap or not. Specifically, I would like local function calling.
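
For context, this is the style of AI function Marvin already supports today, as a minimal sketch using `marvin.fn` (the decorated body is just a docstring; Marvin generates the return value at call time). What's missing is a way to hand such a function local Python tools that the LLM can choose to call:

```python
import marvin

# what Marvin supports today: the LLM produces the return value
# from the signature and docstring alone
@marvin.fn
def summarize(text: str) -> str:
    """Summarize `text` in one sentence."""
```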

Describe the proposed behavior

I've been using Fructose in the meantime because its decorated functions are essentially the same as Marvin's AI functions, but it also supports local function calling, which lets me augment my LLM calls in the same clean interface that brought me to Marvin/Fructose in the first place.

Example Use

See https://github.com/bananaml/fructose?tab=readme-ov-file#local-function-calling
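
Roughly the pattern from that README, as a hedged sketch (the `uses=[...]` parameter is my reading of Fructose's docs and may not be exact):

```python
from fructose import Fructose

ai = Fructose()

def get_temperature(city: str) -> float:
    """An ordinary local function the LLM may call."""
    return 21.5  # stubbed for illustration

# the decorated function can call get_temperature while answering,
# with the LLM choosing when to call it and with what arguments
@ai(uses=[get_temperature])
def plan_outfit(city: str) -> str:
    """Suggest an outfit appropriate for the current weather in `city`."""
```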

Additional context

Ultimately I'd like to know if this is on the roadmap or considered "in scope" for Marvin. If so, I'm hoping to see how it could be prioritized. Thank you!

@jeffometer jeffometer added the enhancement New feature or request label Mar 27, 2024
@jeffometer (Author)

BTW, I am aware of the returned-values feature, but as far as I can tell it's different: it doesn't let the LLM choose between multiple "tools" and also decide on the correct arguments to pass.

@zzstoatzz (Collaborator)

zzstoatzz commented Mar 27, 2024

hi @jeffometer - to clarify, by "local function calling" you mean an LLM calls a python function you wrote when appropriate? the easiest way to do that is with assistants

[screenshot: example of an Assistant calling a local python function as a tool]
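
something like this rough sketch (assuming the beta assistants API, i.e. `marvin.beta.assistants.Assistant` with a `tools` parameter - check the current docs for the exact import path):

```python
import random

from marvin.beta.assistants import Assistant

def roll_die(sides: int = 6) -> int:
    """A local python function the assistant may call as a tool."""
    return random.randint(1, sides)

# the assistant decides when to call roll_die and with what arguments
ai = Assistant(name="dice-bot", tools=[roll_die])
ai.say("Roll a 20-sided die and tell me the result.")
```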

here's a more complex example where we have many (10-15) tools for it to choose from

is that what you're looking for?

@zzstoatzz zzstoatzz added question Further information is requested and removed enhancement New feature or request labels Mar 27, 2024
@jeffometer (Author)

Hi @zzstoatzz, thanks for sharing that. It looks close to what I want and might work.

The only part I'm not sure about is how well that works with composition. What I mean is that I really like having "normal" functions that call "ai" functions, whose results might be passed to other "normal" functions, and I'd like to be able to use tools in the "ai" function in the middle. I'm unclear whether the assistants approach works in such a nice, ergonomic way; see the sketch below.
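
To make that concrete, here's a rough sketch of the composition I have in mind (hypothetical: `marvin.fn` does not currently accept a `tools=` argument; that is exactly the feature I'm asking about):

```python
import marvin

def lookup_price(ticker: str) -> float:
    """A local tool the LLM could call while answering."""
    return 123.45  # stubbed for illustration

# hypothetical syntax: marvin.fn does not take tools today;
# this is the requested feature, not the existing API
@marvin.fn(tools=[lookup_price])
def analyze(question: str) -> str:
    """Answer the question, calling tools as needed."""

def report(question: str) -> str:
    # a "normal" function composing with the "ai" function above
    answer = analyze(question)
    return f"Analysis: {answer}"
```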

Does that clarify what I'm looking for better?

Thanks again!
