Releases: cpacker/MemGPT

v0.3.17

05 Jun 06:18
e1cbe64

🦙 You can now use MemGPT with the Ollama embeddings endpoint!
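A minimal sketch of pointing MemGPT at a local Ollama embeddings endpoint. The model name and the exact wording of the configure prompts are illustrative assumptions; this assumes Ollama is serving on its default port 11434:

```shell
# Start Ollama locally and pull an embedding-capable model
# (model choice here is hypothetical)
ollama pull mxbai-embed-large

# Select the Ollama endpoint during configuration
memgpt configure
# ? Select embedding provider: ollama
# ? Enter the Ollama endpoint: http://localhost:11434
```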

What's Changed

New Contributors

Full Changelog: 0.3.16...0.3.17

v0.3.16

26 May 23:05
ec894cd

🧿 Milvus integration: you can now use Milvus to back the MemGPT vector database! For more information, see: https://memgpt.readme.io/docs/storage#milvus
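A rough sketch of selecting Milvus as the archival storage backend. The extra name and prompt wording are assumptions; see the docs link above for the authoritative steps:

```shell
# Install MemGPT with the Milvus extra (extra name is an assumption)
pip install 'pymemgpt[milvus]'

memgpt configure
# ? Select storage backend for archival data: milvus
# ? Enter the Milvus connection URI: ~/.memgpt/milvus.db
```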

What's Changed

New Contributors

Full Changelog: 0.3.15...0.3.16

v0.3.15

16 May 22:56
c6325fe

🦙 Llama 3 support and bugfixes

What's Changed

New Contributors

Full Changelog: 0.3.14...0.3.15

v0.3.14

03 May 22:52
0a4adcb

🐜 Bug-fix release

What's Changed

Full Changelog: 0.3.13...0.3.14

v0.3.13

01 May 20:43
dfb4224

🖥️ MemGPT Dev Portal (alpha build)

Please note the dev portal is in alpha and this is not an official release!

This adds support for viewing the dev portal while the MemGPT service is running. You can view it at http://memgpt.localhost (if running with Docker) or http://localhost:8283 (if running with memgpt server).

Make sure you install MemGPT with pip install pymemgpt and run memgpt quickstart [--backend openai] or memgpt configure before running the server.

There are two options to deploy the server:

Option 1: Run with docker compose

  1. Install and run docker
  2. Clone the repo: git clone git@github.com:cpacker/MemGPT.git
  3. Run docker compose up
  4. Go to memgpt.localhost in the browser to view the developer portal

Option 2: Run with the CLI

  1. Run memgpt server
  2. Go to localhost:8283 in the browser to view the developer portal
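Taken together, the CLI route from the setup note above can be sketched as:

```shell
pip install pymemgpt
memgpt quickstart --backend openai   # or: memgpt configure
memgpt server
# then open http://localhost:8283 in a browser to view the dev portal
```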

What's Changed

Full Changelog: 0.3.12...0.3.13

0.3.12

23 Apr 04:42
274596c

🐳 Cleaned up workflow for creating a MemGPT service with docker compose up:

  • Reverse proxy added so you can open the dev portal at http://memgpt.localhost
  • Docker development with docker compose -f dev-compose.yaml up --build (built from local code)
  • Postgres data mounted to .pgdata folder
  • OpenAI keys passed to server via environment variables (in compose.yaml)
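Given that keys are passed to the server via environment variables, a typical launch might look like the following (the variable name OPENAI_API_KEY is an assumption based on the standard OpenAI convention):

```shell
git clone git@github.com:cpacker/MemGPT.git
cd MemGPT
export OPENAI_API_KEY="sk-..."   # picked up via compose.yaml
docker compose up
# dev portal served behind the reverse proxy at http://memgpt.localhost
```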

🪲 Bugfixes for Groq API and server

What's Changed

New Contributors

Full Changelog: 0.3.11...0.3.12

0.3.11

19 Apr 03:48
aeb4a94

🚰 We now support streaming in the CLI when using OpenAI (+ OpenAI proxy) endpoints! You can turn on streaming mode with memgpt run --stream
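Assuming an OpenAI (or OpenAI-compatible proxy) backend is already configured, streaming is enabled with a single flag:

```shell
# token-by-token output in the CLI
memgpt run --stream
```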

What's Changed

  • fix: remove default persona/human from memgpt configure and add functionality for modifying humans/presets more clearly by @sarahwooders in #1253
  • fix: update ChatCompletionResponse to make model field optional by @sarahwooders in #1258
  • fix: Fixed NameError: name 'attach' is not defined by @taddeusb90 in #1255
  • fix: push/pull container from memgpt/memgpt-server:latest by @sarahwooders in #1267
  • fix: remove message UTC validation temporarily to fix dev portal + add -d flag to docker compose up for tests by @sarahwooders in #1268
  • chore: bump version by @sarahwooders in #1269
  • feat: add streaming support for OpenAI-compatible endpoints by @cpacker in #1262

New Contributors

Full Changelog: 0.3.10...0.3.11

0.3.10

13 Apr 05:35
b9f0eb3

We added support for Anthropic, Cohere, and Groq!
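A sketch of switching providers via the interactive configurator; the prompt wording mirrors the Gemini transcript shown in the 0.3.9 notes below, and the exact option names are assumptions:

```shell
memgpt configure
# ? Select LLM inference provider: anthropic   # or: cohere, groq
# ? Enter your API key: *********
```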

What's Changed

Full Changelog: 0.3.9...0.3.10

0.3.9

11 Apr 02:53
9ffa003

This release adds Google AI Gemini Pro support to MemGPT, as well as Python 3.12 support.

Using MemGPT with Gemini

Setting up Gemini with MemGPT configure:

> memgpt configure
Loading config from /Users/loaner/.memgpt/config
? Select LLM inference provider: google_ai
? Enter your Google AI (Gemini) API key (see https://aistudio.google.com/app/apikey): *********
? Enter your Google AI (Gemini) service endpoint (see https://ai.google.dev/api/rest): generativelanguage
? Select default model: gemini-pro
Got context window 30720 for model gemini-pro (from Google API)
? Select your model's context window (see https://cloud.google.com/vertex-ai/generative-ai/docs/learn/model-versioning#gemini-model-versions): 30720
? Select embedding provider: openai
? Select default preset: memgpt_chat
? Select default persona: sam_pov
? Select default human: basic
? Select storage backend for archival data: chroma
? Select chroma backend: persistent
? Select storage backend for recall data: sqlite
📖 Saving config to /Users/loaner/.memgpt/config

What's Changed

Full Changelog: 0.3.8...0.3.9

0.3.8

03 Apr 20:10
fb2d78f

This release introduces initial support for running a MemGPT server with Docker Compose, and bugfixes for storing embeddings and message timestamps.

What's Changed

New Contributors

Full Changelog: 0.3.7...0.3.8