
Commit

Merge pull request #23 from MoritzLaurer/add-hugging-face-TGI
adding huggingface TGI
imaurer committed May 15, 2024
2 parents 4ca4cc4 + 3b09d17 commit e5335a2
Showing 1 changed file with 4 additions and 0 deletions: README.md
@@ -37,6 +37,7 @@ None of these names are great, that's why I named this list just "Awesome LLM JSON".
| Fireworks.ai | firefunction-v1 | [Function Calling](https://readme.fireworks.ai/docs/function-calling)<br>[JSON Mode](https://readme.fireworks.ai/docs/structured-response-formatting)<br>[Grammar mode](https://readme.fireworks.ai/docs/structured-output-grammar-based)<br>[Pricing](https://fireworks.ai/pricing)<br>[Announcement (2023-12-20)](https://blog.fireworks.ai/fireworks-raises-the-quality-bar-with-function-calling-model-and-api-release-e7f49d1e98e9) |
| Google | gemini-1.0-pro | [Function Calling](https://cloud.google.com/vertex-ai/docs/generative-ai/multimodal/function-calling#rest)<br>[Pricing](https://ai.google.dev/pricing?authuser=1) |
| Groq | llama2-70b<br>mixtral-8x7b<br>gemma-7b-it | [Function Calling](https://console.groq.com/docs/tool-use)<br>[Pricing](https://wow.groq.com/) |
| Hugging Face TGI | [many open-source models](https://huggingface.co/docs/text-generation-inference/supported_models) | [Grammars, JSON mode, Function Calling and Tools](https://huggingface.co/docs/text-generation-inference/conceptual/guidance#guidance)<br>Run it [for free locally](https://huggingface.co/docs/text-generation-inference/basic_tutorials/consuming_tgi), or via [dedicated](https://huggingface.co/docs/inference-endpoints/index) or [serverless](https://huggingface.co/docs/api-inference/index) endpoints. |
| Mistral | mistral-large-latest | [Function Calling](https://docs.mistral.ai/guides/function-calling/)<br>[Pricing](https://docs.mistral.ai/platform/pricing/) |
| OpenAI | gpt-4<br>gpt-4-turbo<br>gpt-3.5-turbo | [Function Calling](https://openai.com/blog/openai-api/)<br>[JSON Mode](https://platform.openai.com/docs/guides/text-generation/json-mode)<br>[Pricing](https://openai.com/pricing)<br>[Announcement (2023-06-13)](https://openai.com/blog/function-calling-and-other-api-updates) |
| Rysana | inversion-sm | [API Docs](https://rysana.com/docs/api)<br>[Pricing](https://rysana.com/pricing)<br>[Announcement (2024-03-18)](https://rysana.com/inversion) |
@@ -75,6 +76,9 @@ Below is a list of hosted API models that support multiple parallel function calling.

[Functionary](https://functionary.meetkai.com/) (2023-08-04, [MeetKai](https://meetkai.com/)) interprets and executes functions based on JSON Schema Objects, supporting various compute requirements and call types. Compatible with OpenAI-python and llama-cpp-python for efficient function execution in JSON generation tasks.
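Since Functionary exposes an OpenAI-compatible chat completions API, it can be called with the standard `openai` Python client. The sketch below is illustrative only: the base URL, API key placeholder, and model id are assumptions for a locally hosted server, and the `get_weather` tool is a made-up example.

```python
# Sketch: calling a locally hosted Functionary server through its
# OpenAI-compatible chat completions API with a tool (function) definition.
# Base URL, API key, and model name are placeholders for illustration.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="meetkai/functionary-small-v2.4",  # placeholder model id
    messages=[{"role": "user", "content": "What is the weather in Istanbul?"}],
    tools=tools,
    tool_choice="auto",
)

# If the model decided to call the tool, its arguments arrive as a JSON string.
print(response.choices[0].message.tool_calls)
```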

[Hugging Face TGI](https://huggingface.co/docs/text-generation-inference/conceptual/guidance) enables JSON outputs and function calling for a [variety of local models](https://huggingface.co/docs/text-generation-inference/supported_models).
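A minimal sketch of TGI's guidance feature, constraining generation to a JSON Schema via the `grammar` parameter of the `/generate` endpoint. It assumes a TGI server is already running at `http://localhost:8080` with any supported model; the schema and prompt are illustrative.

```python
# Minimal sketch: ask a local TGI server for JSON matching a schema by passing
# a "json"-type grammar in the /generate request parameters (TGI guidance).
# Assumes TGI is already running at http://localhost:8080.
import json

import requests

schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
}

payload = {
    "inputs": "Extract the person mentioned: David is 25 years old.",
    "parameters": {
        "max_new_tokens": 100,
        # The "json" grammar type constrains decoding to this JSON Schema.
        "grammar": {"type": "json", "value": schema},
    },
}

response = requests.post(
    "http://localhost:8080/generate",
    headers={"Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
response.raise_for_status()

# TGI returns {"generated_text": "..."}; the text should parse as schema-valid JSON.
print(json.loads(response.json()["generated_text"]))
```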


## Python Libraries


