Contextual Observation & Recall Engine
C.O.R.E is a portable memory graph built from your LLM interactions and personal data, making all of your context and workflow history accessible to any AI tool, like a digital brain. This eliminates the need to re-share context with every tool. The aim is to provide:
- Unified, Portable Memory: Add and recall context seamlessly, and connect your memory across apps like Claude, Cursor, Windsurf, and more.
- Relational, Not Just Flat Facts: CORE organizes your knowledge by storing both facts and the relationships between them, for a deeper, richer memory, like a real brain.
- User Owned: You decide what to keep, update, or delete, and you can share your memory across the tools you choose, free from vendor lock-in.
- Memory Graph: Visualise how your facts and preferences link together.
- Chat with Memory: Ask questions about your memory for instant insights and understanding.
- Plug and Play: Instantly use CORE memory in apps like Cursor and Claude.
To get started with Core Cloud:

- Sign up to Core Cloud and start building your memory graph.
- Add the text you want to save to memory. Once you click the + Add button, your memory graph will be generated.
- Connect Core Memory MCP with Cursor (see the steps further below).
To self-host CORE instead, you will need:

- Docker
- OpenAI API Key
Note: We are actively working on improving support for Llama models. At the moment, C.O.R.E does not provide optimal results with Llama-based models, but we are making progress to ensure better compatibility and output in the near future.
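Before continuing, you can confirm that Docker and Docker Compose are available:

```bash
# Verify Docker and Docker Compose are installed and on the PATH.
docker --version
docker-compose --version
```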
1. Copy Environment Variables

   Copy the example environment file to `.env`:

   ```bash
   cp .env.example .env
   ```
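   Then open `.env` and fill in your keys. A minimal sketch, assuming an `OPENAI_API_KEY` variable (inferred from the OpenAI API Key prerequisite above; check `.env.example` for the exact variable names your copy defines):

   ```bash
   # Sketch of the .env contents; the exact variable names come from .env.example.
   # OPENAI_API_KEY is assumed from the OpenAI API Key prerequisite.
   OPENAI_API_KEY=sk-...
   ```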
2. Start the Application

   Use Docker Compose to start all required services:

   ```bash
   docker-compose up
   ```
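   If you'd rather not keep a terminal occupied, the standard Docker Compose flags below start the stack in the background and let you follow its logs:

   ```bash
   # Optional: run the services detached, then tail their logs.
   docker-compose up -d
   docker-compose logs -f
   ```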
3. Access the App

   Once the containers are running, open your browser and go to http://localhost:3000.
4. Create Account with Magic Link
5. Create Your Private Space & Add Data

   - In the dashboard, go to the top-right section, type a message (e.g., "I love playing badminton"), and click + Add.
   - Your memory is queued for processing; you can monitor its status in the Logs section.
   - Once processing is complete, nodes are added to your private knowledge graph and become visible in the dashboard.
   - You can later choose to connect this memory to other tools or keep it private.
6. Search Your Memory

   Use the dashboard's search feature to query your ingested data within your private space.
To connect Core Memory MCP with Cursor:

1. Open the CORE dashboard and navigate to the API section to generate a new API token.
2. In Cursor, go to: Settings → Tools & Integrations → New MCP Server.
3. Add the CORE MCP server using the configuration format below. Be sure to replace the API_TOKEN value with the token you generated in step 1.

   MCP configuration to add in Cursor:

   ```json
   {
     "mcpServers": {
       "memory": {
         "command": "npx",
         "args": ["-y", "@redplanethq/core-mcp"],
         "env": {
           "API_TOKEN": "YOUR_API_TOKEN_HERE",
           "API_BASE_URL": "https://core.heysol.ai",
           "SOURCE": "cursor"
         }
       }
     }
   }
   ```
4. Go to Settings → User Rules → New Rule, and add the rule below to ensure all your chat interactions are stored in CORE memory:

   ```
   After every interaction, update the memory with the user's query and the assistant's
   response to core-memory mcp. sessionId should be the uuid of the conversation
   ```
Explore our documentation to get the most out of CORE.
Have questions or feedback? We're here to help:
- Discord: Join core-support channel
- Documentation: docs.heysol.ai/core
- Email: manik@poozle.dev
A quick guide to what belongs in CORE memory:

Store:
- Conversation history
- User preferences
- Task context
- Reference materials
Don't Store:
- Sensitive data (PII)
- Credentials
- System logs
- Temporary data