The memory-mcp platform is currently in development. The Self-hosted Community Edition is available now!

Quickstart

This quickstart will get Memory MCP-CE running locally using Docker Compose.

The first run takes a few minutes while the container images and the embedding model are downloaded. Subsequent restarts are fast.

1. Prepare your environment #

Fetch the required files directly from GitHub:

curl -O https://raw.githubusercontent.com/symbiomind/memory-mcp-ce/main/docker-compose.yml
curl -O https://raw.githubusercontent.com/symbiomind/memory-mcp-ce/main/.env.example
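To confirm both files arrived (note that .env.example is a dotfile, so a bare ls will not show it):

ls -l docker-compose.yml .env.example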

Copy the example environment file and edit it:

cp .env.example .env

At minimum, change:

  • POSTGRES_PASSWORD

You may also adjust other settings later, but the defaults are fine to start.
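Assuming your .env follows the usual KEY=VALUE layout of the example file, the edited line would look something like this (the value shown is only a placeholder):

# placeholder value; substitute your own strong, unique password
POSTGRES_PASSWORD=replace-with-a-long-random-password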


2. Create the data directory #

mkdir -p data

Docker will create the required subdirectories automatically.

Important: The data directory stores your PostgreSQL database and Ollama models. Deleting it will permanently erase all stored memories and downloaded models.
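Once you have memories you care about, a simple way to back this directory up is a plain archive. Stop the stack first so the PostgreSQL files are consistent; the archive name below is arbitrary:

# stop the containers, then archive the data directory
docker compose down
tar -czf memory-mcp-data-backup.tar.gz data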


3. Pull the embedding model (one-time setup) #

The embedding model must be pulled once before starting the full stack.

docker compose up -d ollama
docker exec -it memory-ollama ollama pull granite-embedding:30m

This may take a few minutes depending on your network connection.
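To confirm the pull succeeded, list the models stored inside the Ollama container (same container name as in the command above):

docker exec -it memory-ollama ollama list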


4. Start the full stack #

docker compose up -d
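To check that all services started, and to follow the logs while the first run finishes (press Ctrl+C to stop following):

docker compose ps
docker compose logs -f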

5. Verify it is running #

Your MCP server should now be available at:

http://localhost:5005
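A quick way to confirm the server is answering is a plain HTTP request. The exact response depends on the MCP transport in use, but any HTTP response (as opposed to a connection error) means the server is up:

curl -i http://localhost:5005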

Next steps #
