Describe the bug
With "use local ai if available" enabled, asking a Website Chatbot the same question multiple times in preview mode makes the output actor fail with llmstack.processors.providers.promptly.text_chat.TextChatOutput() argument after ** must be a mapping, not NoneType, and no answer is returned.
To Reproduce
Steps to reproduce the behavior:
1. Create a Website Chatbot from the template.
2. Add a datasource for the company website.
3. Check "use local ai if available".
4. Change the model to gpt-3.5-turbo.
5. Enter preview mode.
6. Ask "What is the VAT of Company?". The AI responds with: "I'm sorry, but I don't have access to the current VAT rates of Company. However, you can check the official website or contact their customer support for more information."
7. Ask "What is the VAT Number of Company?" multiple times.
8. See the error: llmstack.processors.providers.promptly.text_chat.TextChatOutput() argument after ** must be a mapping, not NoneType
Expected behavior
I expect LLMStack to scrape the website for a VAT Number and return it.
Version
V0.0.15
Environment
DISTRIB_DESCRIPTION="Ubuntu 22.04.3 LTS"
Docker version 24.0.5, build ced0996
Docker Compose version v2.20.3
Screenshots
Additional context
LocalAI: I use the .env option: PRELOAD_MODELS=[{"url": "github:go-skynet/model-gallery/gpt4all-j.yaml", "name": "gpt-3.5-turbo"}, { "url": "github:go-skynet/model-gallery/bert-embeddings.yaml", "name": "text-embedding-ada-002"}]
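Since the whole model list has to sit on one line in .env, a malformed value is an easy thing to rule out first. A quick sanity check (a sketch; python3 -m json.tool is used here only as a JSON validator):

```shell
# The PRELOAD_MODELS value from .env, aliasing LocalAI's gallery models to the
# OpenAI model ids that LLMStack requests (gpt-3.5-turbo, text-embedding-ada-002).
PRELOAD_MODELS='[{"url": "github:go-skynet/model-gallery/gpt4all-j.yaml", "name": "gpt-3.5-turbo"}, { "url": "github:go-skynet/model-gallery/bert-embeddings.yaml", "name": "text-embedding-ada-002"}]'

# Verify the value parses as JSON; a stray quote or comma would surface here.
echo "$PRELOAD_MODELS" | python3 -m json.tool >/dev/null && echo "PRELOAD_MODELS is valid JSON"
```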
age=0.000 s; distance=16 kB, estimate=16 kB
local-ai.example.com | [127.0.0.1]:38564 200 - GET /readyz
local-ai.example.com | [127.0.0.1]:44518 200 - GET /readyz
llmstack-0015-api-1 | INFO 2023-09-26 05:37:13,575 coordinator Spawned actor InputActor (urn:uuid:c14a0071-7373-4e57-900b-bc847e0f3977) for coordinator urn:uuid:c9288903-25c1-4998-a08f-62832159e5e9
llmstack-0015-api-1 | INFO 2023-09-26 05:37:13,575 coordinator Spawned actor OutputActor (urn:uuid:9227f669-8f57-4fca-8702-f038c7454268) for coordinator urn:uuid:c9288903-25c1-4998-a08f-62832159e5e9
llmstack-0015-api-1 | INFO 2023-09-26 05:37:13,576 coordinator Spawned actor TextChat (urn:uuid:cc6c9de3-878e-419b-9bd7-a4eafaf68268) for coordinator urn:uuid:c9288903-25c1-4998-a08f-62832159e5e9
llmstack-0015-api-1 | INFO 2023-09-26 05:37:13,576 coordinator Spawned actor BookKeepingActor (urn:uuid:b2f016da-e10b-483b-87ce-1ded941669d6) for coordinator urn:uuid:c9288903-25c1-4998-a08f-62832159e5e9
local-ai.example.com | 5:37AM DBG Request received:
local-ai.example.com | 5:37AM DBG Configuration read: &{PredictionOptions:{Model:ggml-gpt4all-j.bin Language: N:0 TopP:0.7 TopK:80 Temperature:0.7 Maxtokens:0 Echo:false Batch:0 F16:false IgnoreEOS:false RepeatPenalty:0 Keep:0 MirostatETA:0 MirostatTAU:0 Mirostat:0 FrequencyPenalty:0 TFZ:0 TypicalP:0 Seed:0 NegativePrompt: RopeFreqBase:0 RopeFreqScale:0 NegativePromptScale:0 UseFastTokenizer:false ClipSkip:0 Tokenizer:} Name:gpt-3.5-turbo F16:false Threads:10 Debug:true Roles:map[] Embeddings:false Backend:gpt4all-j TemplateConfig:{Chat:gpt4all-chat ChatMessage: Completion:gpt4all-completion Edit: Functions:} PromptStrings:[] InputStrings:[] InputToken:[] functionCallString: functionCallNameString: FunctionsConfig:{DisableNoAction:false NoActionFunctionName: NoActionDescriptionName:} FeatureFlag:map[] LLMConfig:{SystemPrompt: TensorSplit: MainGPU: RMSNormEps:0 NGQA:0 PromptCachePath: PromptCacheAll:false PromptCacheRO:false MirostatETA:0 MirostatTAU:0 Mirostat:0 NGPULayers:0 MMap:false MMlock:false LowVRAM:false Grammar: StopWords:[] Cutstrings:[] TrimSpace:[] ContextSize:1024 NUMA:false LoraAdapter: LoraBase: NoMulMatQ:false} AutoGPTQ:{ModelBaseName: Device: Triton:false UseFastTokenizer:false} Diffusers:{PipelineType: SchedulerType: CUDA:false EnableParameters: CFGScale:0 IMG2IMG:false ClipSkip:0 ClipModel: ClipSubFolder:} Step:0 GRPC:{Attempts:0 AttemptsSleepTime:0}}
local-ai.example.com | 5:37AM DBG Parameters: &{PredictionOptions:{Model:ggml-gpt4all-j.bin Language: N:0 TopP:0.7 TopK:80 Temperature:0.7 Maxtokens:0 Echo:false Batch:0 F16:false IgnoreEOS:false RepeatPenalty:0 Keep:0 MirostatETA:0 MirostatTAU:0 Mirostat:0 FrequencyPenalty:0 TFZ:0 TypicalP:0 Seed:0 NegativePrompt: RopeFreqBase:0 RopeFreqScale:0 NegativePromptScale:0 UseFastTokenizer:false ClipSkip:0 Tokenizer:} Name:gpt-3.5-turbo F16:false Threads:10 Debug:true Roles:map[] Embeddings:false Backend:gpt4all-j TemplateConfig:{Chat:gpt4all-chat ChatMessage: Completion:gpt4all-completion Edit: Functions:} PromptStrings:[] InputStrings:[] InputToken:[] functionCallString: functionCallNameString: FunctionsConfig:{DisableNoAction:false NoActionFunctionName: NoActionDescriptionName:} FeatureFlag:map[] LLMConfig:{SystemPrompt: TensorSplit: MainGPU: RMSNormEps:0 NGQA:0 PromptCachePath: PromptCacheAll:false PromptCacheRO:false MirostatETA:0 MirostatTAU:0 Mirostat:0 NGPULayers:0 MMap:false MMlock:false LowVRAM:false Grammar: StopWords:[] Cutstrings:[] TrimSpace:[] ContextSize:1024 NUMA:false LoraAdapter: LoraBase: NoMulMatQ:false} AutoGPTQ:{ModelBaseName: Device: Triton:false UseFastTokenizer:false} Diffusers:{PipelineType: SchedulerType: CUDA:false EnableParameters: CFGScale:0 IMG2IMG:false ClipSkip:0 ClipModel: ClipSubFolder:} Step:0 GRPC:{Attempts:0 AttemptsSleepTime:0}}
local-ai.example.com | 5:37AM DBG Prompt (before templating): You are a helpful chat assistant
local-ai.example.com | You are a chatbot that uses the provided context to answer the user's question.
local-ai.example.com | If you cannot answer the question based on the provided context, say you don't know the answer.
local-ai.example.com | No answer should go out of the provided input. If the provided input is empty, return saying you don't know the answer.
local-ai.example.com | Keep the answers terse.
local-ai.example.com | ----
local-ai.example.com | context:
local-ai.example.com | What is the VAT of Company?
local-ai.example.com | I'm sorry, but I don't have access to the current VAT rates of Company. However, you can check the official website or contact their customer support for more information.
local-ai.example.com | [172.23.0.8]:59670 200 - POST /chat/completions
local-ai.example.com | What is the VAT Number of Company?
local-ai.example.com | What is the VAT Number of Company?
local-ai.example.com | 5:37AM DBG Stream request received
local-ai.example.com | 5:37AM DBG Template found, input modified to: The prompt below is a question to answer, a task to complete, or a conversation to respond to; decide which and write an appropriate response.
local-ai.example.com | ### Prompt:
local-ai.example.com | You are a helpful chat assistant
local-ai.example.com | You are a chatbot that uses the provided context to answer the user's question.
local-ai.example.com | If you cannot answer the question based on the provided context, say you don't know the answer.
local-ai.example.com | No answer should go out of the provided input. If the provided input is empty, return saying you don't know the answer.
local-ai.example.com | Keep the answers terse.
local-ai.example.com | ----
local-ai.example.com | context:
local-ai.example.com | What is the VAT of Company?
local-ai.example.com | I'm sorry, but I don't have access to the current VAT rates of Company. However, you can check the official website or contact their customer support for more information.
local-ai.example.com | What is the VAT Number of Company?
local-ai.example.com | What is the VAT Number of Company?
local-ai.example.com | ### Response:
local-ai.example.com | 5:37AM DBG Prompt (after templating): The prompt below is a question to answer, a task to complete, or a conversation to respond to; decide which and write an appropriate response.
local-ai.example.com | ### Prompt:
local-ai.example.com | You are a helpful chat assistant
local-ai.example.com | You are a chatbot that uses the provided context to answer the user's question.
local-ai.example.com | If you cannot answer the question based on the provided context, say you don't know the answer.
local-ai.example.com | No answer should go out of the provided input. If the provided input is empty, return saying you don't know the answer.
local-ai.example.com | Keep the answers terse.
local-ai.example.com | ----
local-ai.example.com | context:
local-ai.example.com | What is the VAT of Company?
local-ai.example.com | I'm sorry, but I don't have access to the current VAT rates of Company. However, you can check the official website or contact their customer support for more information.
local-ai.example.com | What is the VAT Number of Company?
local-ai.example.com | What is the VAT Number of Company?
local-ai.example.com | ### Response:
local-ai.example.com | 5:37AM DBG Loading model gpt4all-j from ggml-gpt4all-j.bin
local-ai.example.com | 5:37AM DBG Sending chunk: {"object":"chat.completion.chunk","model":"gpt-3.5-turbo","choices":[{"index":0,"delta":{"role":"assistant","content":""}}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}
local-ai.example.com | 5:37AM DBG Model already loaded in memory: ggml-gpt4all-j.bin
local-ai.example.com | [127.0.0.1]:57536 200 - GET /readyz
llmstack-0015-api-1 | INFO 2023-09-26 05:37:42,059 output Error in output actor: {'_inputs1': 'llmstack.processors.providers.promptly.text_chat.TextChatOutput() argument after ** must be a mapping, not NoneType'}
llmstack-0015-api-1 | INFO 2023-09-26 05:37:42,067 coordinator Coordinator urn:uuid:c9288903-25c1-4998-a08f-62832159e5e9 stopping
llmstack-0015-api-1 | INFO 2023-09-26 05:37:42,069 bookkeeping Stopping BookKeepingActor
llmstack-0015-api-1 | sys:1: ResourceWarning: unclosed <socket.socket fd=29, family=2, type=1, proto=6, laddr=('172.23.0.8', 59670), raddr=('172.23.0.2', 8080)>
llmstack-0015-api-1 | ResourceWarning: Enable tracemalloc to get the object allocation traceback
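The TypeError in the output actor log can be reproduced in isolation: it is what Python raises when None is unpacked with **. TextChatOutput below is a hypothetical stand-in for the real class in LLMStack, used only to show the failure mode and a possible caller-side guard:

```python
# Hypothetical stand-in for llmstack...text_chat.TextChatOutput, just to
# demonstrate the error; the real class is a different implementation.
class TextChatOutput:
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

result = None  # e.g. the processor produced no output for this request

# Unpacking None with ** raises the TypeError seen in the log.
try:
    TextChatOutput(**result)
except TypeError as exc:
    print(exc)

# A defensive guard on the caller side (assumption: an empty output is an
# acceptable fallback when the processor returns nothing):
output = TextChatOutput(**(result or {}))
```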
Deployment: Docker Compose with a .env file.