
Getting Started

Recalium runs locally via Docker. This guide takes you from zero to your first memory-augmented AI interaction.


Prerequisites

Required:

  • Docker with the Compose plugin (the whole stack is started with docker compose)

Optional but recommended:

  • An OpenAI or Anthropic API key — enables AI-powered extraction and semantic search. Keyword search and manual ingestion work without one.

Install

Clone the repository and start the stack:

git clone https://github.com/recalium/recalium.git
cd recalium
docker compose up

Recalium starts on http://localhost:8080. The MCP server listens on http://localhost:3000.
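To confirm both services are answering, a quick generic check (this is not a documented health endpoint; 000 means the port is not reachable yet):

```shell
# Print the HTTP status each service returns (000 = not reachable yet).
# "|| true" keeps the check from aborting a script while containers start.
for port in 8080 3000; do
  curl -s -o /dev/null -w "port $port: %{http_code}\n" "http://localhost:$port" || true
done
```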

On first run, a setup wizard walks you through initial configuration.


First Run

1. Configure your AI provider (optional)

In the setup wizard, enter your OpenAI or Anthropic API key. Skip this step to use keyword search only.

2. Import your conversation history

Three import paths:

  • Paste — paste raw text directly into the import UI
  • Upload — upload a ChatGPT export ZIP or Claude export JSON
  • Watched folder — drop files into ~/.recalium/inbox and they’re ingested automatically
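For the watched-folder path, ingestion is just a file copy. A minimal sketch, where conversations.json stands in for your real ChatGPT ZIP or Claude JSON export:

```shell
# Ensure the inbox exists (harmless if Recalium already created it)
mkdir -p ~/.recalium/inbox

# Placeholder export file; substitute your actual export
touch conversations.json

# Anything dropped here is ingested automatically
cp conversations.json ~/.recalium/inbox/
```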

3. Run your first search

Once import completes, try a keyword or semantic search from the search bar. Each result shows the source it came from.


MCP Setup

Configure Recalium as an MCP server in your AI client so it can retrieve memory automatically.

Claude Code (~/.claude/settings.json):

{
  "mcpServers": {
    "recalium": {
      "url": "http://localhost:3000/mcp",
      "transport": "http"
    }
  }
}

Cursor (~/.cursor/mcp.json):

{
  "servers": {
    "recalium": {
      "url": "http://localhost:3000/mcp"
    }
  }
}

After configuring, your AI client will automatically call Recalium for relevant context on each request.


Configuration

Copy .env.sample to .env in the repo root and edit as needed:

Variable            Default    Description
RECALIUM_PORT       8080       Web UI port
MCP_PORT            3000       MCP server port
OPENAI_API_KEY      (unset)    OpenAI key for extraction and embeddings
ANTHROPIC_API_KEY   (unset)    Anthropic key (alternative to OpenAI)
RECALIUM_DATA_DIR   ./data     Where memory is stored on disk
SENSITIVITY_GATE    true       Enable/disable the sensitivity pre-classifier
LOG_LEVEL           info       debug, info, warn, error
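As a sketch, a minimal .env built from the defaults above (the commented key is a placeholder; set only the variables you want to override):

```shell
# Write a minimal .env; values mirror the defaults in the table above
cat > .env <<'EOF'
RECALIUM_PORT=8080
MCP_PORT=3000
RECALIUM_DATA_DIR=./data
LOG_LEVEL=debug
# OPENAI_API_KEY=sk-...   # uncomment and fill in to enable semantic search
EOF
```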

Restart the stack after changing .env:

docker compose down && docker compose up