This tutorial explains how to integrate Memobase with the Model Context Protocol (MCP) to give your AI agents persistent, long-term memory. By using the Memobase MCP server, your agents can store, retrieve, and search memories, making them stateful and context-aware across conversations.
## What is MCP?

The Model Context Protocol is an open standard that allows AI assistants to securely connect to external data sources and tools. This enables them to access real-time information, execute functions, and maintain persistent state, breaking free from the limitations of their training data.

## Why Memobase + MCP?
Traditional AI conversations are stateless. The Memobase MCP server changes this by providing:

- Persistent Memory: Store conversation history and user preferences across sessions.
- Semantic Search: Find relevant context using natural language queries.
- User Profiles: Build a comprehensive understanding of users over time.
- Cross-Platform Compatibility: Works with any MCP-compatible client, such as Claude Desktop, Cursor, or Windsurf.
## Setup
### Prerequisites

You will typically need:

- A recent Python installation and the `uv` package manager.
- A Memobase backend (Memobase Cloud or a self-hosted instance), along with its project URL and API key.
- An MCP-compatible client such as Claude Desktop, Cursor, or Windsurf.
### Installation
We recommend using `uv` for installation:
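As a sketch, assuming the server is distributed as source with a `requirements.txt` (the repository path and directory layout are assumptions — check the Memobase repository for the actual location):

```shell
# Install uv if you don't have it yet (see the uv docs for other installers)
pip install uv

# Fetch the Memobase source; the MCP server directory is an assumption
git clone https://github.com/memodb-io/memobase.git
cd memobase/src/mcp

# Create a virtual environment and install dependencies
uv venv
uv pip install -r requirements.txt
```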
### Environment Configuration

Configure your `.env` file:
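A minimal `.env` might look like the following. The variable names here are assumptions — check the Memobase MCP README for the exact keys your version expects; the port matches the address this tutorial uses for the running server:

```
# Connection to your Memobase backend (variable names are assumptions)
MEMOBASE_PROJECT_URL=https://api.memobase.io
MEMOBASE_API_KEY=your-api-key

# MCP server transport settings
TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
```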
### Running the MCP Server

Start the server using `uv`:
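Assuming a `main.py` entry point (the actual filename may differ in your checkout):

```shell
# Run from the MCP server directory, inside the uv-managed environment
uv run python main.py
```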
The server will start on `http://localhost:8050`, with an SSE endpoint at `/sse`.
## Client Integration

Configure your MCP client to connect to the Memobase server. For example, in Cursor, add this to your `.cursor/mcp.json`:
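A minimal configuration sketch, following Cursor's `mcp.json` format for SSE servers (the server name `memobase` is arbitrary; the URL matches the endpoint above):

```json
{
  "mcpServers": {
    "memobase": {
      "url": "http://localhost:8050/sse"
    }
  }
}
```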
## Available Tools

The Memobase MCP server exposes three powerful tools to your AI agent.

### 1. `save_memory`
Stores information in long-term memory with semantic indexing.
### 2. `search_memories`
Finds relevant context using natural language queries.
### 3. `get_user_profiles`
Retrieves a comprehensive, structured user profile.
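To make the division of labor between the three tools concrete, here is a toy in-memory sketch of their behavior. This is purely illustrative — the real server persists data in Memobase, uses semantic (not keyword) search, and returns richer structured profiles:

```python
from dataclasses import dataclass, field


@dataclass
class ToyMemoryStore:
    """Illustrative stand-in for the three Memobase MCP tools (not the real implementation)."""

    memories: list[str] = field(default_factory=list)

    def save_memory(self, text: str) -> str:
        """Store one piece of information, as `save_memory` would."""
        self.memories.append(text)
        return f"Saved: {text}"

    def search_memories(self, query: str, limit: int = 3) -> list[str]:
        """Keyword overlap stands in for Memobase's semantic search."""
        words = set(query.lower().split())
        scored = [(len(words & set(m.lower().split())), m) for m in self.memories]
        return [m for score, m in sorted(scored, reverse=True) if score > 0][:limit]

    def get_user_profiles(self) -> dict[str, list[str]]:
        """The real tool returns a structured profile Memobase builds over time."""
        return {"preferences": [m for m in self.memories if "prefer" in m.lower()]}


store = ToyMemoryStore()
store.save_memory("User prefers Python for backend development")
print(store.search_memories("best language for my new API"))
# → ['User prefers Python for backend development']
```

The agent never talks to such a class directly; it invokes the tools over MCP, and the client routes each call to the server.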
## Real-World Example

**Without Memory:**

User: “I prefer Python for backend development.”
AI: “That’s great! Python is excellent for backend work.”

*Later…*

User: “What’s the best language for my new API?”
AI: “There are many options, like Python, Node.js, or Go…”

**With Memobase MCP:**

User: “I prefer Python for backend development.”
AI: “Got it. I’ll remember your preference for Python.”
*(Memory saved: “User prefers Python for backend development”)*

*Later…*

User: “What’s the best language for my new API?”
AI: *(Searches memories)* “Based on your preference for Python, I’d recommend using FastAPI or Django.”