By default, every Claude session, every Cursor chat, and every Windsurf task starts fresh. The assistant has no memory of what you told it yesterday, which libraries you prefer, or the context of your ongoing project. This is the agent amnesia problem.
Dakera solves it with a self-hosted memory server that connects to your AI tools via the Model Context Protocol (MCP). Once configured, your assistant can store what it learns and recall it in future sessions — without any code changes to your workflow.
This guide walks you through the complete setup: installing the server, installing the dakera-mcp binary, and configuring Claude Desktop, Claude Code, Cursor, and Windsurf.
By the end, you'll have a self-hosted memory server running on localhost:3300 that stores memories persistently to disk and exposes them to your AI clients via 83 MCP tools. All data stays on your machine.
The simplest install runs Dakera as a single container. For production, see the full Docker Compose setup below.
docker run -d \
  --name dakera \
  --restart unless-stopped \
  -p 3300:3300 \
  -v dakera-data:/data \
  -e DAKERA_PORT=3300 \
  -e DAKERA_ROOT_API_KEY=my-dev-key \
  ghcr.io/dakera-ai/dakera:latest
This starts Dakera on port 3300 with persistent storage in a named Docker volume.
curl http://localhost:3300/health
# {"service":"dakera","status":"healthy","version":"0.11.55"}
You should see "status":"healthy". If you see a connection error, check that Docker is running and port 3300 isn't in use.
The dakera-mcp binary is a small process that bridges your AI client to the Dakera server. It speaks the Model Context Protocol and exposes all 83 Dakera tools to your assistant.
Choose your platform:
Linux (x64):
curl -fsSL https://github.com/dakera-ai/dakera-mcp/releases/latest/download/dakera-mcp-linux-x64.tar.gz \
  | tar -xz -C /usr/local/bin
chmod +x /usr/local/bin/dakera-mcp
macOS (Apple Silicon):
curl -fsSL https://github.com/dakera-ai/dakera-mcp/releases/latest/download/dakera-mcp-darwin-arm64.tar.gz \
  | tar -xz -C /usr/local/bin
chmod +x /usr/local/bin/dakera-mcp
macOS (Intel):
curl -fsSL https://github.com/dakera-ai/dakera-mcp/releases/latest/download/dakera-mcp-darwin-x64.tar.gz \
  | tar -xz -C /usr/local/bin
chmod +x /usr/local/bin/dakera-mcp
Windows (PowerShell):
Invoke-WebRequest -Uri "https://github.com/dakera-ai/dakera-mcp/releases/latest/download/dakera-mcp-windows-x64.zip" -OutFile "dakera-mcp.zip"
Expand-Archive dakera-mcp.zip -DestinationPath "$env:LOCALAPPDATA\dakera-mcp"
# Add $env:LOCALAPPDATA\dakera-mcp to your PATH
dakera-mcp --version
# dakera-mcp 0.9.8
Choose your client below:
Open your Claude Desktop config file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
Add the Dakera MCP server:
{
  "mcpServers": {
    "dakera": {
      "command": "dakera-mcp",
      "env": {
        "DAKERA_API_URL": "http://localhost:3300",
        "DAKERA_API_KEY": "my-dev-key"
      }
    }
  }
}
Restart Claude Desktop. You should see "dakera" appear in the MCP servers list.
In your project directory, create or edit .mcp.json:
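A minimal .mcp.json, assuming it accepts the same mcpServers shape shown for Claude Desktop above, would look like:

```json
{
  "mcpServers": {
    "dakera": {
      "command": "dakera-mcp",
      "env": {
        "DAKERA_API_URL": "http://localhost:3300",
        "DAKERA_API_KEY": "my-dev-key"
      }
    }
  }
}
```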
Or add globally in ~/.claude/claude.json to have memory across all projects.
Open Cursor Settings → MCP Servers → Add new server:
{
  "dakera": {
    "command": "dakera-mcp",
    "env": {
      "DAKERA_API_URL": "http://localhost:3300",
      "DAKERA_API_KEY": "my-dev-key"
    }
  }
}
Create ~/.windsurf/mcp_config.json:
{
  "servers": {
    "dakera": {
      "command": "dakera-mcp",
      "env": {
        "DAKERA_API_URL": "http://localhost:3300",
        "DAKERA_API_KEY": "my-dev-key"
      }
    }
  }
}
In your AI client, type:
Store a memory that I prefer TypeScript over JavaScript for new projects, importance 0.8, for agent "me".
The assistant should call dakera_store and confirm the memory was saved.
In a new session, ask:
What programming language do I prefer?
The assistant should call dakera_recall and return the memory you stored. This confirms the full round-trip works: store → persist → recall across sessions.
You can also verify directly against the REST API:

curl -s -X POST http://localhost:3300/v1/recall \
  -H "Authorization: Bearer my-dev-key" \
  -H "Content-Type: application/json" \
  -d '{"agent_id":"me","query":"programming language preference","top_k":3}' \
  | jq '.memories[].content'
Once the plumbing is working, the question becomes: what should you actually tell your assistant to remember? Here are patterns that work well:
"Remember that I prefer functional React components over class components, use Zod for validation, and always write JSDoc for public functions."
"Remember that this project uses PostgreSQL 16 with Drizzle ORM, the API is deployed on Railway, and the frontend is at vercel.com/my-project."
"Remember that we chose Tailwind over CSS Modules because the team is distributed and Tailwind's utility classes make async code reviews easier."
"Remember: never use console.log in this codebase — use the Logger utility at src/lib/logger.ts instead."
Tell your assistant: "At the start of each session, recall my top 5 memories for this project." This creates a consistent warm-up ritual that surfaces relevant context before you've even asked a question.
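For Claude Code, one way to make this warm-up automatic is to put the instruction in your project's CLAUDE.md, which is read at the start of every session. A sketch (the exact wording and agent name are up to you):

```markdown
<!-- CLAUDE.md (project root) -->
At the start of each session, call dakera_recall for agent "me" with
this project's name as the query, and surface the top 5 memories
before doing anything else.
```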
For teams or production use, the single-container setup lacks persistent object storage for large memory volumes. The recommended stack uses Dakera with MinIO:
git clone https://github.com/dakera-ai/dakera-deploy
cd dakera-deploy
cp .env.example .env
Edit .env:
DAKERA_ROOT_API_KEY=change-me-to-a-strong-random-key
MINIO_ROOT_USER=minio-admin
MINIO_ROOT_PASSWORD=change-me-too
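When replacing the placeholders, a quick way to generate a strong random key (using openssl, available on most systems) is:

```shell
# Print a 64-character hex string suitable for DAKERA_ROOT_API_KEY
openssl rand -hex 32
```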
Start the full stack:
docker compose up -d
This brings up the Dakera server on port 3300 together with MinIO for object storage.
Update your MCP config to use your server's IP or hostname instead of localhost if running remotely:
{
  "mcpServers": {
    "dakera": {
      "command": "dakera-mcp",
      "env": {
        "DAKERA_API_URL": "http://your-server-ip:3300",
        "DAKERA_API_KEY": "your-api-key"
      }
    }
  }
}
Once connected, your assistant has access to Dakera's full API surface through MCP tools. The most commonly used:
dakera_batch_recall
dakera_hybrid_search
dakera_session_start / dakera_session_end
dakera_memory_update
dakera_forget
dakera_knowledge_graph
The assistant can discover all 83 tools via MCP's tool-listing capability. You don't need to specify which tools to use — the assistant calls them based on your request.
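This discovery step is part of the MCP spec itself: the client sends a standard tools/list JSON-RPC request to dakera-mcp, which replies with the tool catalog. A minimal request looks like:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```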
If the server doesn't appear in your client, restart the client after editing the config — the MCP server list is read at startup. Also verify that dakera-mcp is on your PATH with which dakera-mcp.
If recall comes back empty, first confirm the server is reachable: curl http://localhost:3300/health should show a healthy status. Then try recalling with a broader query or a higher top_k.
If the container won't start or keeps restarting, check the Docker logs with docker logs dakera. Common causes: port 3300 already in use, or permission issues on the Docker volume.
The single-container setup uses a named Docker volume (dakera-data) for local persistence. Without MinIO, very large memory stores may be truncated. Use the full Docker Compose stack for production.