---
## ❌ Without CORE
- ❌ Re-explain your project context to each AI tool
- ❌ Lost conversations and decisions across ChatGPT, Claude, Cursor
- ❌ No memory of past discussions, preferences, or project history
- ❌ Can't take actions in Linear, GitHub, Slack, Gmail from your AI tools
## ✅ With CORE
CORE gives your AI tools persistent memory and the ability to act in the apps you use.
- ✅ Remember everything across Claude, Cursor, ChatGPT, Claude Code
- ✅ Take actions in Linear, GitHub, Slack, Gmail, Google Sheets
- ✅ Connect once, works everywhere via MCP
- ✅ Your data, your control: open-source and self-hostable
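
As a rough sketch, "connect once via MCP" could look like the following for Claude Code; the server name and URL here are placeholders, not CORE's actual endpoint (use the value shown in your CORE dashboard or the docs):

```shell
# Register CORE as an MCP server in Claude Code.
# The URL below is a placeholder -- replace it with the MCP endpoint
# from your CORE dashboard (Cloud) or your self-hosted instance.
claude mcp add --transport sse core https://your-core-instance.example.com/mcp

# Confirm the server was registered.
claude mcp list
```

Once registered, the same memory and actions are available from any other MCP-compatible client you connect.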
---
## What You Can Do
### 1. Perform actions in your apps from Claude/Cursor
Connect your apps once, take actions from anywhere.
```txt
Create or read GitHub and Linear issues from Claude
```
```txt
Draft, send, or read an email and store relevant info in CORE from Claude
```
```txt
Manage your calendar and update spreadsheets from Claude
```

---
### 2. Get on-demand context from CORE so you never have to repeat yourself
CORE becomes your personal knowledge base. Ask any AI tool to pull relevant context.
```txt
Search CORE memory for architecture decisions on the payment service
```
```txt
Pull my content guidelines from CORE and use them to draft the blog post
```

---
### 3. Catch up on past context for any feature in Claude Code/Cursor
Switching back to a feature after a week? Get caught up instantly.
```txt
What did we discuss about the checkout flow? Summarize from memory.
```
```txt
Refer to past discussions and remind me where we left off on the API refactor
```

---
## 🚀 Quick Start
Choose your path:
| | **CORE Cloud** | **Self-Host** |
|---|---|---|
| Setup time | 5 minutes | 15 minutes |
| Best for | Try quickly, no infra | Full control, your servers |
| Requirements | Just an account | Docker, 4GB RAM |
### Cloud
1. **Sign up** at [app.getcore.me](https://app.getcore.me)
2. **Connect a source** (Claude, Cursor, or any MCP-compatible tool)
3. **Start using** CORE to perform actions or store information about you in memory
### Self-Host
**Quick Deploy**
[Deploy on Railway](https://railway.com/deploy/core)
**Or with Docker**
1. Clone the repository:
```shell
git clone https://github.com/RedPlanetHQ/core.git
cd core
```
2. Configure environment variables in `core/.env`:
```env
OPENAI_API_KEY=your_openai_api_key
```
3. Start the services:
```shell
docker-compose up -d
```
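
Once the containers are up, a quick sanity check might look like this (the port is an assumption; use whichever port your `.env` or compose file actually exposes):

```shell
# List the running services and confirm they report a healthy state.
docker-compose ps

# Probe the web UI -- adjust the port to match your compose configuration.
curl -I http://localhost:3000
```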
Once deployed, you can configure your AI providers (OpenAI, Anthropic) and start building your memory graph.
👉 [View complete self-hosting guide](https://docs.getcore.me/self-hosting/docker)
> Note: We tried open-source models (e.g., GPT-OSS via Ollama), but fact generation was not good enough. We are still figuring out how to improve that, and will also support OSS models once we do.
## 🛠️ Installation
### CLIs