DriveLLM: The official code repository for DriveLLM

This repository contains the official implementation of the DriveLLM system, which leverages large language model (LLM) capabilities to enhance autonomous driving decision-making. The LLM service backend is implemented with FastAPI and LangChain. The LLM-autonomous-driving bridge is implemented for ROS 1 and designed for a modified version of the Autoware stack.

Other resources:

- Testing rosbags recorded on the all-weather autonomous shuttle bus (WATonoBus) project: NAS link
- Instruction-following fine-tuning dataset for general driving applications: a 20K-example dataset generated using self-instruct. The code for dataset generation, cleaning, and formatting is here.
- Fine-tuned LLaMA model weights

How to use/test this service

Note: change http://localhost:9000/query to http://localhost:8300/query if you are running the service in Docker.

Test the query endpoint:

Send requests to the /query endpoint. Here is an example using curl for service debugging (running locally):

curl -N -X POST 'http://localhost:9000/query' \
-H 'Content-Type: application/json' \
-d '{
    "messages": [
        {
            "role": "user",
            "perception":"perception", 
            "system_health":"system_health", 
            "weather":"weather", 
            "location":"location", 
            "vehicle_state":"vehicle_state",
            "control_command":"control_command",
            "command":"passenger_command"
        }
    ]
}'
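The same request can be sent from Python. Below is a minimal sketch using only the standard library; the field names mirror the curl example above, while the helper names (`build_query_payload`, `send_query`) are hypothetical and not part of this repository:

```python
import json
import urllib.request

# Hypothetical helper mirroring the curl example above. The field names
# come from the README's sample request; a running service may expect
# richer values for each field.
def build_query_payload(perception, system_health, weather, location,
                        vehicle_state, control_command, command):
    return {
        "messages": [
            {
                "role": "user",
                "perception": perception,
                "system_health": system_health,
                "weather": weather,
                "location": location,
                "vehicle_state": vehicle_state,
                "control_command": control_command,
                "command": command,
            }
        ]
    }

def send_query(payload, url="http://localhost:9000/query"):
    # Use http://localhost:8300/query when the service runs in Docker.
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

Note that the curl example uses `-N` (no buffering), which suggests the endpoint may stream its response; a streaming client would read the response incrementally instead of all at once.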

🚀 Running the app in Docker (recommended for deployment)

Build the Docker image and run the container:

docker-compose build
docker-compose up

If you need to completely shut down your environment or clean up your resources:

docker-compose down

This stops running containers and removes them, along with their associated networks, volumes, and images.

Note: when running in Docker, the service will be accessible at http://localhost:8300.

✅ Running locally (recommended for development)

  1. Clone this repository.
  2. Install dependencies:
pip install -r requirements.txt
  3. Create a .env file and add the following environment variables:
LOGGING_LEVEL=10 # 10-DEBUG, 20-INFO, 30-WARN, 40-ERROR
  4. Run the application using Uvicorn:
uvicorn main:app --host 0.0.0.0 --port 9000
  5. The service will now be accessible at http://localhost:9000.
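The LOGGING_LEVEL values are Python's standard numeric logging levels. A minimal sketch of how a service might read this variable at startup (hypothetical; the repository's actual main.py is not shown here):

```python
import logging
import os

# Read LOGGING_LEVEL from the environment (set via the .env file).
# 10=DEBUG, 20=INFO, 30=WARN, 40=ERROR, matching logging's numeric levels.
level = int(os.environ.get("LOGGING_LEVEL", "20"))
logging.basicConfig(level=level)

logger = logging.getLogger("drivellm")
logger.info("service starting with log level %s", logging.getLevelName(level))
```

With LOGGING_LEVEL=10, debug-level messages from the service would be emitted; the default of 20 (INFO) suppresses them.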
