
Streaming Function Calls #47

Open
rodrigoGA opened this issue Apr 10, 2024 · 0 comments
Labels
component:quickstarts Issues/PR referencing quickstarts folder
type:feature request New feature request/enhancement

Comments

@rodrigoGA

I think an example of function calling with streaming would be helpful, since I believe it's the most common case for a chatbot. I've written a test script, though I'm not sure it's the correct way to do it, and several questions have come up.
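
For context, the script uses the find_movies / find_theaters / get_showtimes helpers and the call_function dispatcher from the function-calling quickstart. Minimal placeholder versions are sketched below so the script is self-contained; the signatures and return values are assumptions, not the quickstart's actual code.

def find_movies(description: str, location: str = ''):
  # Hypothetical stub: always return the same two titles.
  return ['Barbie', 'Oppenheimer']

def find_theaters(location: str, movie: str = ''):
  # Hypothetical stub: always return the same two theaters.
  return ['Googleplex 16', 'Android Theatre']

def get_showtimes(location: str, movie: str, theater: str, date: str):
  # Hypothetical stub: always return the same showtimes.
  return ['10:00', '11:00']

def call_function(function_call, functions):
  # Dispatch a FunctionCall to the matching Python function by name,
  # passing its arguments as keyword arguments.
  return functions[function_call.name](**function_call.args)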


import google.generativeai as genai
import google.ai.generativelanguage as glm
from google.protobuf.struct_pb2 import Struct

functions = {
    'find_movies': find_movies,
    'find_theaters': find_theaters,
    'get_showtimes': get_showtimes,
}
instruction = "You will talk just like Yoda in Star Wars."
model = genai.GenerativeModel(
    "models/gemini-1.5-pro-latest",
    system_instruction=instruction,
    generation_config=genai.GenerationConfig(temperature=0),
    tools=functions.values()
)


def generate_response(messages):
  functions_to_call = []
  complete_response = ''

  response = model.generate_content(messages, stream=True)
  for chunk in response:
    # Only the first part of each chunk is inspected here (see question 3 below).
    part = chunk.candidates[0].content.parts[0]
    if part.function_call:
      functions_to_call.append(part.function_call)

    if part.text:
      print('response part:', chunk.text)
      complete_response = complete_response + chunk.text

  if len(complete_response) > 0:
    # The streamed text is a model turn, so store it with the 'model' role.
    messages.append({'role': 'model', 'parts': [complete_response]})

  if len(functions_to_call) > 0:
    # Record the model turn that requested the calls once, not once per call.
    messages.append({'role': 'model', 'parts': response.candidates[0].content.parts})
    for function_call in functions_to_call:
      result = call_function(function_call, functions)
      s = Struct()
      s.update({'result': result})
      # Update this after https://github.com/google/generative-ai-python/issues/243
      function_response = glm.Part(
          function_response=glm.FunctionResponse(name=function_call.name, response=s))
      messages.append({'role': 'user', 'parts': [function_response]})
    generate_response(messages)


messages = []

while True:
  print("_" * 80)
  user_input = input()
  messages.append({'role': 'user', 'parts': [user_input]})
  generate_response(messages)


The questions are as follows:

  • Can the model respond with text and call functions in the same turn? And can it call more than one function at a time? How should I structure the response messages in those cases? (See the first sketch after this list.)
  • When the model calls a function, does streaming effectively not apply? That is, does the call arrive all at once, or is there some way to receive it in parts? I want to know as early as possible that a function will be called, because at that moment the user can be warned that the response will take longer than normal, or, in systems with speech, something like 'mmm' can be said. If, on the other hand, the call is only surfaced once it is complete with all its data and parameters, it doesn't help much, since by then the waiting time can no longer be reduced.
  • In each streaming chunk I keep part = chunk.candidates[0].content.parts[0]. My question is whether I need to iterate through all the parts, that is, could the model return both a function call and text in the same chunk? (See the parts-iteration sketch after this list.)
  • Another doubt is about how the system instruction is managed. In a system with many users or chats, it forces you to keep one model instance in memory per instruction, and the instruction is often different for each user. It would be much better if system_instruction could be passed as a parameter when calling generate_content. Is there any way to do this? (See the last sketch after this list.)
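
Regarding the first question, if several function calls do arrive in one model turn, one way the history could be structured is to echo the model turn once and then answer every call with its own function_response part inside a single user turn. This is only a sketch built from the same glm.Part / glm.FunctionResponse pattern as the script above; whether the service actually accepts several function responses in one turn is exactly what I'd like confirmed.

def reply_to_function_calls(messages, model_parts, function_calls, functions):
  # Sketch only: assumes one user turn may contain one function_response
  # part per pending call. Not confirmed against the API.
  # Echo the model turn that requested the calls, once.
  messages.append({'role': 'model', 'parts': list(model_parts)})

  response_parts = []
  for function_call in function_calls:
    result = call_function(function_call, functions)
    s = Struct()
    s.update({'result': result})
    response_parts.append(glm.Part(
        function_response=glm.FunctionResponse(name=function_call.name, response=s)))

  # Send every function result back in a single user turn.
  messages.append({'role': 'user', 'parts': response_parts})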
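
For the third question, this is how I would write the streaming loop defensively if chunks can mix text and function-call parts; whether they actually do is what I'm asking:

def collect_parts(response):
  # Defensive sketch: walk every part of every chunk instead of only
  # parts[0], collecting text and any number of function calls.
  function_calls = []
  text_pieces = []
  for chunk in response:
    for part in chunk.candidates[0].content.parts:
      if part.function_call:
        function_calls.append(part.function_call)
      if part.text:
        text_pieces.append(part.text)
  return ''.join(text_pieces), function_calls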
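
For the last question, as far as I can tell system_instruction is only a constructor argument of genai.GenerativeModel in this SDK version, not a generate_content parameter. A workaround I'm considering, assuming that constructing the client-side model object is cheap and makes no network call, is to build a short-lived model per request:

def generate_for_user(user_instruction, messages):
  # Assumption: GenerativeModel construction is a local, lightweight
  # operation, so no long-lived per-user instance needs to be kept.
  per_user_model = genai.GenerativeModel(
      "models/gemini-1.5-pro-latest",
      system_instruction=user_instruction,
      generation_config=genai.GenerationConfig(temperature=0),
      tools=functions.values(),
  )
  return per_user_model.generate_content(messages, stream=True)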
ymodak added the component:examples Issues/PR referencing examples folder, type:feature request New feature request/enhancement, and component:quickstarts Issues/PR referencing quickstarts folder labels, and removed the component:examples Issues/PR referencing examples folder label, on Apr 13, 2024