
Configured providers but twinny not sending any requests to provider. #242

Open
Duoquote opened this issue May 9, 2024 · 3 comments
Labels
help wanted Extra attention is needed

Comments
Duoquote commented May 9, 2024

Describe the bug
I have set up the following providers and verified with curl that the /api/generate endpoint at http://duodesk.duo:11434 works, but the extension only shows a loading circle and never sends any requests. I also tried setting the Ollama Hostname setting to duodesk.duo, with no luck.
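The curl check I used can be sketched as follows (a minimal sketch; the host name and model are from my setup, and the health check against the root endpoint is an extra step I added, so adjust both for your own server):

```shell
#!/bin/sh
# Minimal sketch of the connectivity check (assumes curl is installed;
# replace the host with your own Ollama server).
HOST="${OLLAMA_HOST:-http://duodesk.duo:11434}"

# The Ollama root endpoint replies "Ollama is running" when the server is up.
if curl -sf "$HOST" >/dev/null 2>&1; then
  echo "reachable: $HOST"
  # Hit /api/generate directly, the same endpoint the extension uses.
  curl -s "$HOST/api/generate" \
    -d '{"model": "codellama:7b-code", "prompt": "def add(a, b):", "stream": false}'
else
  echo "NOT reachable: $HOST"
fi
```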

To Reproduce
Just added the providers I have attached.

Expected behavior
It should work with the providers I have configured, I think.

Screenshots
[screenshot: configured providers]

Logging
Logging is enabled, but I am not sure where I am supposed to see the logs; I checked the Output tab, but there is no entry for twinny.

API Provider
Ollama running at http://duodesk.duo:11434 in local network.

Chat or Auto Complete?
Both

Model Name
codellama:7b-code

Desktop (please complete the following information):

  • OS: Windows
  • Version: 11

Additional context


Duoquote commented May 9, 2024

I was looking at other issues and found where to see the logs. The log is as follows:
[screenshot: twinny log output]

It looks like the hostname is set incorrectly; how do I change it?


rjmacarthy commented May 9, 2024

Maybe try a restart? The settings look correct to me. Also check the Ollama options in the extension settings: click the cog in the extension header; there are some API settings for Ollama in there too.

rjmacarthy added the help wanted (Extra attention is needed) label May 15, 2024

randomeduc commented May 21, 2024

How can I check the logs? I have the same problem on Windows; I'm running VS Code from WSL: Ubuntu.

  1. Install Ollama and check that it is running at http://localhost:11434
     [screenshot]

  2. Install the models
     [screenshot]

  3. Install and configure the VS Code extension
     Settings
     [screenshot]
     Providers
     [screenshot]

  4. Test the chat
     [screenshot]

I got this message from the VS Code dev tools:
[screenshot]

But the loading spinner stays there; any help? I also tried 127.0.0.1 as the host, with the same results.
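One thing worth checking under WSL (an assumption on my part, valid for the default WSL2 NAT networking): if Ollama runs on the Windows side, localhost inside WSL may not reach it, because WSL2 has its own network namespace. A common workaround is to try the Windows host IP, which the nameserver entry in /etc/resolv.conf usually points at:

```shell
#!/bin/sh
# Sketch: from WSL2, try the Windows host IP instead of localhost.
# Assumption: default NAT networking, where /etc/resolv.conf's nameserver
# entry points at the Windows host.
WIN_HOST=$(awk '/^nameserver/ { print $2; exit }' /etc/resolv.conf)
echo "Windows host appears to be: $WIN_HOST"
curl -s "http://$WIN_HOST:11434" || echo "Ollama not reachable at $WIN_HOST:11434"
```

Note that Ollama binds to 127.0.0.1 by default, so even with the right IP you may also need to set OLLAMA_HOST=0.0.0.0 on the Windows side before it accepts connections from WSL.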
