Dakera AI
Home Docs Blog Integrations Benchmark GitHub Get Started
Engineering Blog

From the Dakera team

Product updates, engineering deep-dives, and developer guides from the people building Dakera.

8 posts
Latest Post
Analysis 2026-05-13 12 min read
Best AI Agent Memory Frameworks in 2026: Compared and Ranked
Compare the top AI agent memory frameworks — Dakera, Mem0, Letta, Zep, and Hindsight. Benchmarks, architecture, deployment model, and when to use each.
Read post
Tutorial 2026-05-13 7 min read
Dakera MCP Memory Server: Setup Guide for Claude, Cursor, and Windsurf
Step-by-step guide to installing and configuring Dakera as a persistent MCP memory server for Claude Desktop, Claude Code, Cursor, and Windsurf. Docker setup, MCP config, and first recall in under 10 minutes.
Read post
Engineering 2026-05-07 9 min read
Why Rust for AI Memory: Performance, Safety, and a Self-Hosted Server That Fits in 44 MB
Most agent memory systems are Python services that need Docker Compose, Redis, and an external embedding API before you can store a single memory. Dakera is a single Rust binary. Here is why the language choice matters for an always-on memory server.
Read post
Engineering 2026-05-07 5 min read
Dakera as an MCP Memory Server: 83 Tools for Persistent Agent Memory
Dakera ships a native MCP server with 83 tools — store, recall, search, and manage persistent memory from any MCP-compatible agent, host, or IDE without writing a single API call.
Read post
Benchmarks 2026-05-07 9 min read
How We Benchmark Memory: Dakera on LoCoMo
A complete breakdown of Dakera's 87.6% LoCoMo score — the four question categories, the methodology behind each, where temporal inference still has room to improve, and how to run the evaluation against your own instance.
Read post
Product 2026-05-07 7 min read
What's Open in Dakera's Open Core — SDKs, CLI, and MCP Are MIT. The Engine Is Not.
The exact breakdown: SDKs, CLI, and MCP server are MIT-licensed on GitHub. The memory engine and dashboard are proprietary. What "open at the edges, closed at the core" means in practice.
Read post
Launch 2026-05-06 6 min read
Introducing Dakera: Production Memory Infrastructure for AI Agents
We're launching Dakera: a single Rust binary that gives your AI agents persistent memory, hybrid retrieval, knowledge graphs, and built-in embeddings — with no external services required.
Read post
Engineering 2026-05-06 8 min read
How Agent Memory Actually Works: Hybrid Retrieval and Importance Decay
A technical look at the retrieval engine inside Dakera — how we combine HNSW vector search with BM25 full-text search, why naive cosine similarity fails for agent workloads, and how importance decay keeps recall sharp over time.
Read post


Get new posts in your inbox

Engineering deep-dives, benchmark updates, and developer guides — delivered when we ship something worth reading.

Subscribe
© 2026 DAKERA AI