How to make streaming calls to an already established Assistant? #85
Comments
The code looks mostly correct to me. You can substitute `print` with your own code to display or forward each message chunk.
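As an illustration of that suggestion (a minimal sketch with stand-in names; `FakeChunk` and `consume_stream` are illustrative, not part of the TaskingAI SDK), each chunk's text can be handed to any callback instead of printed:

```python
from typing import Callable, Iterable

class FakeChunk:
    """Stand-in for a streamed message chunk; only the `delta` field matters here."""
    def __init__(self, delta: str):
        self.delta = delta

def consume_stream(chunks: Iterable, on_delta: Callable[[str], None]) -> str:
    """Pass each chunk's text to a callback and return the assembled message."""
    parts = []
    for chunk in chunks:
        delta = getattr(chunk, "delta", None)
        if delta:
            on_delta(delta)  # e.g. print it, push it over a websocket, append to a UI buffer
            parts.append(delta)
    return "".join(parts)

received = []
full = consume_stream([FakeChunk("Hel"), FakeChunk("lo")], received.append)
```

The callback receives each delta as it arrives, while the return value gives the complete message for logging or storage.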
Here is my own code, written by myself:
Is there something wrong with my `MessageChunk` class?
If I remove the following conditional statement, the code returns the expected result for the given question.
However, the result does not arrive in a streaming manner: I have to wait for a while and then receive all the items at once before they are printed.
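One likely culprit (an assumption on my part, not confirmed against the SDK source): the items yielded by `generate_message` are instances of the SDK's own chunk class, so an `isinstance` check against a locally defined `MessageChunk` never matches, even when the class names coincide. A minimal demonstration with stand-in classes:

```python
# Two classes may share the name "MessageChunk" but are still distinct types,
# so isinstance() against the wrong one always fails.

class SDKMessageChunk:  # stands in for the chunk type the SDK actually yields
    def __init__(self, delta: str):
        self.delta = delta

class MessageChunk:  # stands in for a locally defined lookalike
    pass

item = SDKMessageChunk("hi")
assert not isinstance(item, MessageChunk)   # the local lookalike never matches
assert isinstance(item, SDKMessageChunk)    # only the class that created the object does
```

If that is what is happening, removing the conditional "fixes" the output because every yielded item is then printed, regardless of its class.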
@Hapluckyy @TomZhou2024
Much appreciated!
How can I use the TaskingAI Python SDK to make streaming calls to an already established Assistant? Currently, I am using the following code:
```python
# Start an async chat completion task with streaming
import taskingai
from enum import Enum
from pydantic import BaseModel, Field
from taskingai.assistant import Assistant

class MessageRole(str, Enum):
    USER = "user"
    ASSISTANT = "assistant"

# My own chunk model, defined locally
class MessageChunk(BaseModel):
    role: MessageRole = Field(...)
    index: int = Field(...)
    delta: str = Field(...)
    created_timestamp: int = Field(..., ge=0)

taskingai.init(api_key='tkOb2nWY19LJq2IpyXRnH82YRL4tv2sD', host='https://taskai.bnuzh.free.hr')

# Retrieve the existing assistant and open a new chat session
assistant: Assistant = taskingai.assistant.get_assistant(
    assistant_id="X5lMiuMf1AspX3GBKm1YIVN0"
)
chat = taskingai.assistant.create_chat(
    assistant_id=assistant.assistant_id,
)

# Request a streamed response
assistant_message_response = taskingai.assistant.generate_message(
    assistant_id=assistant.assistant_id,
    chat_id=chat.chat_id,
    stream=True,
)

print("Assistant:", end=" ", flush=True)
for item in assistant_message_response:
    if isinstance(item, MessageChunk):  # checks against the locally defined class
        print(item.delta, end="", flush=True)
```
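One way to sidestep the class-identity question altogether (a hedged sketch; `FakeChunk` is a stand-in, and in real use the iterable would come from `generate_message`) is to duck-type on the `delta` attribute instead of using `isinstance`:

```python
class FakeChunk:
    """Stand-in for whatever chunk objects the streamed response yields."""
    def __init__(self, delta: str):
        self.delta = delta

def print_stream(response) -> str:
    """Print each chunk's delta as it arrives and return the full text."""
    parts = []
    for item in response:
        # Duck-type: any object carrying a string `delta` is treated as a chunk,
        # regardless of which module its class was defined in.
        delta = getattr(item, "delta", None)
        if isinstance(delta, str):
            print(delta, end="", flush=True)
            parts.append(delta)
    print()
    return "".join(parts)

text = print_stream([FakeChunk("Hello"), FakeChunk(", world")])
```

This works whether the chunks are instances of the SDK's chunk class or any other object exposing a `delta` string, so the check cannot silently fail on a name collision.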