
PR: Add Support for Groq's Hosted Models via Groq's OpenAI Compatibility API #683

Open · wants to merge 1 commit into master
Conversation

jackspirou

  • Add Groq model options via Groq's OpenAI API compatibility
  • Update documentation accordingly
  • Fix example

Fixes #682


codecov bot commented Mar 14, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 98.46%. Comparing base (699f397) to head (e85e425).

Additional details and impacted files
@@            Coverage Diff             @@
##           master     #683      +/-   ##
==========================================
+ Coverage   98.44%   98.46%   +0.01%     
==========================================
  Files          24       24              
  Lines        1353     1364      +11     
==========================================
+ Hits         1332     1343      +11     
  Misses         15       15              
  Partials        6        6              


- Add Groq model options via Groq's OpenAI API compatibility
- Update documentation accordingly
- Fix example
- Fix golangci-lint complaints
- Fix comment typo
- Fix usage example comments
- Remove silly test.mp3
- Add test for chat completions endpoint using groq client config
@sashabaranov
Owner

@jackspirou thank you for the PR!

I don't think Groq needs any special treatment, as there are multiple services with OpenAI-compatible API now.

It requires a minimal four-line change to the basic example we have in README to make it work with Groq (or any other service like Ollama for that matter). I'm totally open to expanding our README with such an example, though!

--- basic.go	2024-03-15 14:40:43
+++ groq.go	2024-03-15 14:40:29
@@ -7,11 +7,14 @@
 )

 func main() {
-	client := openai.NewClient("your token")
+	config := openai.DefaultConfig("token")
+	config.BaseURL = "https://api.groq.com/openai/v1"
+	client := openai.NewClientWithConfig(config)
+
 	resp, err := client.CreateChatCompletion(
 		context.Background(),
 		openai.ChatCompletionRequest{
-			Model: openai.GPT3Dot5Turbo,
+			Model: "mixtral-8x7b-32768",
 			Messages: []openai.ChatCompletionMessage{
 				{
 				Role:    openai.ChatMessageRoleUser,

@jackspirou
Author

jackspirou commented Mar 16, 2024

@sashabaranov - Thanks for the response and for the example! I couldn't see the forest for the trees here. Given the example you shared, I agree with your assessment above.

I'll update this PR to take a stab at a simple readme update per your advice.

One side note: it might be nice to update the constructor pattern from
openai.NewClient(authToken string) to something like
openai.NewClient(authToken string, options ...func(*Client)).

An options-based constructor might have given me the hint I needed to see the forest for the trees in this case. Maybe it's something you've already considered.
