
Add return_exceptions to batch_completion (retry) #3462

Merged
merged 2 commits into from
May 24, 2024

Conversation

ffreemt
Contributor

@ffreemt ffreemt commented May 5, 2024

Refer to #3397.

Sample code and a screenshot below show that it works locally both with and without return_exceptions set.

  1. Without return_exceptions (or with it set to False): an exception is thrown, exactly as litellm behaves without this modification.
  2. With return_exceptions set to True: batch_completion won't raise; any exceptions are returned in the results instead.
# litellm-rundown.py
import litellm
msg1 = [{"role": "user", "content": "hi 1"}]
msg2 = [{"role": "user", "content": "hi 2"}]

if "res" in globals():
  del res
try:
  res = litellm.batch_completion(
      model="gpt-3.5-turbo",
      messages=[msg1, msg2],
      api_key="sk_xyz",  # deliberately set invalid key
      # return_exceptions=False,
  )
except Exception as exc:
  # res not defined
  print("*** 1>>>", "res" in globals(), exc)

res = litellm.batch_completion(
    model="gpt-3.5-turbo",
    messages=[msg1, msg2],
    api_key="sk_xyz",  # deliberately set invalid key
    return_exceptions=True,
)

# res defined, exceptions returned to res
print("**** 2>>>", "res" in globals())

print(isinstance(res[0], litellm.exceptions.AuthenticationError), isinstance(res[1], litellm.exceptions.AuthenticationError))
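To make the semantics concrete, here is a minimal, self-contained sketch of the return_exceptions pattern this PR adds (modeled on asyncio.gather's flag of the same name). Everything here is hypothetical illustration, not litellm's actual implementation: `fake_completion` and `batch_completion_sketch` are stand-ins.

```python
import concurrent.futures


def fake_completion(prompt):
    # Stand-in for a single completion call; fails for the "bad" prompt,
    # mimicking an invalid API key.
    if prompt == "bad":
        raise ValueError("auth failed")
    return f"echo: {prompt}"


def batch_completion_sketch(prompts, return_exceptions=False):
    """Run completions concurrently; optionally return exceptions in-place."""
    results = []
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(fake_completion, p) for p in prompts]
        for fut in futures:
            try:
                results.append(fut.result())
            except Exception as exc:
                if not return_exceptions:
                    # Old behavior: the first failure aborts the whole batch.
                    raise
                # New behavior: the exception occupies that prompt's slot,
                # so results stay aligned with the input messages.
                results.append(exc)
    return results
```

With the flag set, the caller always gets a list back and can inspect each slot, just as the `isinstance(res[i], ...)` checks above do.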

Sample runs showing batch completions working locally with and without setting return_exceptions

[Screenshot: litellm-run-Screenshot 2024-05-05 130545]

I actually need this for a use case. Thanks a lot for merging if possible.

vercel bot commented May 5, 2024

litellm deployment: ✅ Ready; preview updated May 24, 2024 3:10am (UTC)

Copy link
Contributor

@ishaan-jaff ishaan-jaff left a comment


Can we actually make this the default behavior @ffreemt?

No need to hide this behind return_exceptions.

@ffreemt
Contributor Author

ffreemt commented May 24, 2024

I made return_exceptions the default behavior @ishaan-jaff.

I also removed a now-redundant test (test_batch_completion_return_exceptions_default()) in litellm/litellm/tests/test_batch_completion_return_exceptions.py, since batch_completion will no longer raise any exception.
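Since exceptions are now returned in-place by default, callers need to separate successful responses from exception objects themselves. A minimal helper sketch (`split_results` is a hypothetical name, not part of litellm's API):

```python
def split_results(results):
    """Partition a batch result list into successes and returned exceptions.

    With exceptions returned in-place by default, each slot is either a
    normal response or an Exception instance, so isinstance() is enough
    to tell them apart.
    """
    ok = [r for r in results if not isinstance(r, Exception)]
    errs = [r for r in results if isinstance(r, Exception)]
    return ok, errs
```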

@ffreemt ffreemt requested a review from ishaan-jaff May 24, 2024 16:11
@ishaan-jaff ishaan-jaff merged commit 466accd into BerriAI:main May 24, 2024
1 check passed