uni-openai/lechat-pro

An application developed on UniAI, inspired by KimiChat, integrated with multiple AI models.

中文说明 (Chinese documentation)

Introduction

We admire Moonshot and its product, KimiChat. Inspired by it, we have developed and open-sourced LeChat, a chat tool built on large language models. LeChat closely follows KimiChat and provides nearly identical functionality.

Unlike KimiChat, which is backed by its own large models and a $1 billion investment from Sequoia Capital, our project may appear modest. We have no external investment, only one full-time engineer (myself) from a little-known research institute and a former intern who has since joined a tech giant. The project is currently maintained by me alone, with limited personal resources and funding. In the spirit of open source, we have made all of the project's code, including the frontend, backend, and core libraries, publicly available. We hope you enjoy it and consider giving us a star on GitHub; it motivates us to keep maintaining the project.

Since we don't have our own large models, we are more flexible and can integrate with any model. If you wish to deploy this project, you will need to register for an API key with at least one of the following large-model providers:

  • Moonshot
  • OpenAI GPT
  • iFlyTek Spark
  • Baidu WenXin Workshop
  • Google Gemini
  • ZhiPu AI GLM

You can also integrate additional models by implementing the corresponding interfaces in our other open-source project, the UniAI core library.

If you are an open-source enthusiast who prefers not to use commercial models in your project, we also support the self-hosted ChatGLM-6B. To deploy it, please refer to https://github.com/uni-openai/GLM-API. After deployment, follow the backend's environment variable configuration section and add the GLM API address there.
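
As a rough sketch, the backend configuration comes down to a few entries in its .env file. The variable names below are placeholders for illustration only; the actual keys are defined in the UniAI-MaaS repository's .env.example:

# Hypothetical .env entries -- check .env.example in UniAI-MaaS for the real names
OPENAI_API_KEY=sk-xxxxxxxx          # key from your chosen commercial provider
MOONSHOT_API_KEY=sk-xxxxxxxx
GLM_API=http://localhost:8000       # address of your self-hosted GLM-API service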

Furthermore, you may integrate Grok-1 (from Elon Musk's xAI) on your own, which likewise requires extending the UniAI core library.

Preview

Experience it here:

👉 LeChat

Open-source repository:

👉 UniAI-MaaS

Core library:

👉 UniAI Core Library

Quick Start

Note: this project requires the UniAI backend framework: https://github.com/uni-openai/uniai-maas

Before you begin, make sure you have the Node.js runtime correctly installed. If you haven't installed Node.js yet, download it from https://nodejs.org.
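
To confirm the runtime is available, you can check the versions from a terminal:

node -v    # prints the installed Node.js version
npm -v     # prints the bundled npm version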

Once you're ready, navigate to the project root directory and run the following commands to start the project:

npm install
npm run dev

Or

yarn
yarn dev

Upon successful startup, you will see output similar to the following (the exact version and port may differ):

VITE v3.2.5 ready in 294 ms

➜ Local: http://localhost:5173/
➜ Network: use --host to expose

Hold down Ctrl or Command and click on the Local link to open the project in your browser. You can then log in via QR code or mobile verification code to start using the application.

Contributors

Weilong Yu

Youwei Huang

If you intend to package the project for local deployment, check here.
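
As a rough sketch, a Vite-based frontend is usually packaged with the standard build scripts, assuming they are defined in package.json as in a default Vite setup:

npm run build      # emits static production files, typically into dist/
npm run preview    # serves the production build locally for a quick check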

This project is licensed under the MIT License.