A high-throughput and memory-efficient inference and serving engine for LLMs
Updated May 17, 2024 - Python
PerfectRep is a 3D pose estimation model tailored specifically for powerlifting analysis. It enables precise tracking and analysis of a lifter's movements to ensure perfect form and technique.
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
JetStream is a throughput- and memory-optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in the future -- PRs welcome).
Large Language Model Text Generation Inference
Implementation of the GPT architecture in Rust 🦀 + Burn 🔥
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
This is a JAX/Flax-based transformer language model trained on a Japanese dataset. It is based on the official Flax example code (lm1b).
Self-Created Tools to convert ONNX files (NCHW) to TensorFlow/TFLite/Keras format (NHWC). The purpose of this tool is to solve the massive Transpose extrapolation problem in onnx-tensorflow (onnx-tf). I don't need a Star, but give me a pull request.
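The layout problem this converter targets can be illustrated with a minimal NumPy sketch (illustrative only, not the tool's own code): ONNX models store tensors as NCHW, while TensorFlow/TFLite/Keras expect NHWC, so a naive conversion wraps every operator in Transpose nodes.

```python
import numpy as np

# A single NCHW tensor (batch, channels, height, width), as ONNX lays it out.
x_nchw = np.arange(2 * 3 * 4 * 5, dtype=np.float32).reshape(2, 3, 4, 5)

# TensorFlow/TFLite/Keras expect NHWC (batch, height, width, channels).
# A naive converter inserts this Transpose around every operator, which is
# the overhead a dedicated layout-aware converter tries to eliminate.
x_nhwc = np.transpose(x_nchw, (0, 2, 3, 1))

print(x_nhwc.shape)  # (2, 4, 5, 3)
```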
Translate manga/images (one-click translation of text in all kinds of images) https://cotrans.touhou.ai/
Neural machine translation English - Vietnamese with Transformer, trained from scratch
A powerful HTTP client for Dart and Flutter, which supports global settings, Interceptors, FormData, aborting and canceling a request, files uploading and downloading, requests timeout, custom adapters, etc.
A PyTorch implementation of the Transformer.
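The core building block such a Transformer implementation revolves around, scaled dot-product attention, can be sketched in plain NumPy (an illustrative sketch, not code from the repository above):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Transformer attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    # Similarity scores between queries and keys: (batch, seq_q, seq_k)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    # Numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values: (batch, seq_q, d_v)
    return weights @ v

rng = np.random.default_rng(0)
q = rng.normal(size=(1, 4, 8))
k = rng.normal(size=(1, 4, 8))
v = rng.normal(size=(1, 4, 8))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (1, 4, 8)
```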
Research and Materials on Hardware implementation of Transformer Model
Faster Whisper transcription with CTranslate2
paragraph2OWL is a software tool designed to help researchers identify and use relevant ontologies in their fields of study.
Implemented a Transformer NN block for machine translation, text classification, and natural language inference, as well as a machine reading comprehension model.
Implementations of Deep Learning Techniques