
feature: add streaming response in production env. #27

Closed
playertk opened this issue Mar 24, 2024 · 5 comments · Fixed by #53
Labels
enhancement New feature or request

Comments

@playertk

I built the app, but the chat box cannot dynamically display one word at a time; in `run dev` it works.
When there is too much content, the chat box stays black and I have to wait a long time.

@playertk changed the title from "in Build app ,The chat box cannot dynamically display word one by one" to "Bug:in Build app ,The chat box cannot dynamically display word one by one" on Mar 24, 2024
@jakobhoeg
Owner

jakobhoeg commented Mar 24, 2024

Thanks for the feedback.

I am aware that in production the response is not being streamed as of right now. I am working on this.

As for the other issue of the chat bubble being black, can you elaborate? Does it only happen when there is too much content?

@playertk
Author

What I mean is that it's the same issue. The word-by-word process of the AI generating the text is not shown; users can only wait for all of the content to be generated before seeing the result.

@jakobhoeg added the enhancement (New feature or request) label on Mar 24, 2024
@jakobhoeg changed the title from "Bug:in Build app ,The chat box cannot dynamically display word one by one" to "feature: add streaming response in production env." on Mar 24, 2024
@malteneuss
Contributor

This is something that was bugging me as well for #50; no showstopper, though.

It probably has to do with the post-processing of the LangchainJS stream in React.
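
For reference, here is roughly what carrying the stream through to the client can look like. This is a minimal sketch under assumptions, not the repo's actual code: it presumes the Next.js App Router and LangchainJS's `ChatOllama`, and the route path, model name, and `baseUrl` are placeholders.

```ts
// app/api/chat/route.ts (hypothetical path) — minimal streaming sketch.
import { ChatOllama } from "@langchain/community/chat_models/ollama";

// Defensive: opt the route out of any static optimization in a production
// build. (One plausible reason streaming behaves differently in dev vs. build.)
export const dynamic = "force-dynamic";

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const model = new ChatOllama({
    baseUrl: "http://localhost:11434", // assumed default Ollama endpoint
    model: "mistral", // placeholder model name
  });

  const encoder = new TextEncoder();

  // model.stream() yields message chunks as Ollama produces tokens;
  // re-expose them as a web ReadableStream so each chunk is flushed
  // to the client immediately instead of being buffered.
  const stream = new ReadableStream({
    async start(controller) {
      for await (const chunk of await model.stream(prompt)) {
        controller.enqueue(encoder.encode(chunk.content as string));
      }
      controller.close();
    },
  });

  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

The key point is that the handler returns the `ReadableStream` directly rather than awaiting the full completion, so the browser starts receiving text as soon as the first token arrives.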

@malteneuss
Contributor

@jakobhoeg Any idea when this will be worked on? It would probably bring the most benefit to users in the current state: if you host Ollama on a slow machine, and worse, one without a graphics card, generation can be very slow. With streaming you at least see that something is happening.
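
For context, the client side of such a setup can read the response body incrementally so the chat bubble fills in as tokens arrive. A minimal sketch, assuming a streaming route like the one above; the function name and endpoint are hypothetical:

```ts
// Hypothetical helper: stream a chat response and surface each chunk.
async function streamChat(prompt: string, onToken: (text: string) => void) {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.body) throw new Error("Response has no streamable body");

  const reader = res.body.getReader();
  const decoder = new TextDecoder();

  // Read chunks as the server flushes them and hand each to the UI,
  // so text appears word by word instead of all at once at the end.
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    onToken(decoder.decode(value, { stream: true }));
  }
}
```

In a React component, `onToken` would typically append to a piece of state that renders the in-progress message.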

@jakobhoeg
Owner

> @jakobhoeg Any idea when this will be worked on? It would probably bring the most benefit to users in the current state: if you host Ollama on a slow machine, and worse, one without a graphics card, generation can be very slow. With streaming you at least see that something is happening.

I've been very busy with other stuff these past few weeks, but I'll try to make it a priority and take a look at it this weekend.
