Setup and integration guides for cowork-db, a BYODB memory layer for AI conversations.
cowork-db is a bring-your-own-database memory layer for LLM conversations, built around the Model Context Protocol (MCP).
It provides a structured, inspectable way to store, relate, and retrieve conversational memory.
At its core, cowork-db is a tool for thinking — not an autonomous agent and not a black box. All memory interaction happens through explicit MCP tools. There are no hidden background writes or implicit memory mutation.
┌─────────────────┐
│ Your LLM │ (Claude, GPT, etc.)
│ Application │
└────────┬────────┘
│ MCP Tools
▼
┌─────────────────┐
│ cowork-db API │ (Hosted service)
│ api.cowork-db.com
└────────┬────────┘
│ Tailscale (encrypted)
▼
┌─────────────────┐
│ Your Server │ (Your hardware)
│ Memgraph DB │
└─────────────────┘
"Bring Your Own Database" means your data stays on your hardware. We provide the API service that connects to your self-hosted Memgraph instance over a secure Tailscale network.
Sign up at cowork-db.com to get access to your dashboard. This creates your user account and generates your unique namespace.
From your dashboard, copy the install command for your platform:
macOS/Linux:
curl -fsSL https://cowork-db.com/install.sh | bash -s -- --key YOUR_KEY

Windows (PowerShell):
irm cowork-db.com/install.ps1 | iex
Or download and run with your key:
Invoke-WebRequest -Uri https://cowork-db.com/install.ps1 -OutFile install.ps1
.\install.ps1 -Key "YOUR_KEY"
This script sets up the local components and creates a ~/.cowork-db directory (or %USERPROFILE%\.cowork-db on Windows).

Once the install completes, return to your dashboard. You should see your database listed as Connected.
If the database shows "Not Connected," give it a minute — the first connection can take up to 30 seconds as Tailscale establishes the mesh.
From your dashboard, navigate to the API Keys section and create a new API key. This key will authenticate your AI model's requests to the cowork-db API.
Need one key per use case or workspace? See Create Multiple Databases below.
Choose the integration method that works best for your use case:
If you use Claude.ai, you can connect directly using Custom Connectors:
Connector URL: https://api.cowork-db.com/mcp/v1/

This uses the official Model Context Protocol specification with JSON-RPC, allowing Claude to discover your tools (via tools/list) and call them directly.
For other AI models or custom applications, you can use either the full JSON-RPC MCP endpoint or the simpler REST endpoints:
# List available tools
curl -X POST "https://api.cowork-db.com/mcp/v1/" \
-H "X-API-Key: YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
Protocol note: MCP clients usually call initialize first, then send
notifications/initialized. Notifications are acknowledged with HTTP 202.
JSON-RPC method errors are returned in the JSON error payload (typically with HTTP 200).
Resource probes (resources/list and resources/templates/list) are supported and return empty lists when no resources are exposed.
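The protocol notes above can be sketched in Python. This is a minimal client sketch using only the standard library; the endpoint and X-API-Key header come from the curl example above, while the initialize parameters follow the general MCP specification and the protocolVersion shown is an assumption you should confirm against your client's docs.

```python
import json
import urllib.request

API_URL = "https://api.cowork-db.com/mcp/v1/"
API_KEY = "YOUR_API_KEY"

def jsonrpc(method, params=None, msg_id=None):
    """Build a JSON-RPC 2.0 message; a message without an id is a notification."""
    msg = {"jsonrpc": "2.0", "method": method, "params": params or {}}
    if msg_id is not None:
        msg["id"] = msg_id
    return msg

def post(message):
    """POST one JSON-RPC message with the X-API-Key header."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(message).encode(),
        headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        # Notification acknowledgements (HTTP 202) carry no JSON body.
        return json.loads(body) if body else None

# Typical client flow (uncomment to run with a real key):
# post(jsonrpc("initialize", {"protocolVersion": "2024-11-05",
#                             "capabilities": {},
#                             "clientInfo": {"name": "demo", "version": "0.1"}}, msg_id=1))
# post(jsonrpc("notifications/initialized"))   # acknowledged with HTTP 202
# post(jsonrpc("tools/list", msg_id=2))
```

Note that JSON-RPC method errors arrive in the response payload's error field, typically with HTTP 200, so check the parsed body rather than relying on the status code alone.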
# List available tools (REST)
curl -X GET "https://api.cowork-db.com/mcp/tools" \
  -H "X-API-Key: YOUR_API_KEY"
Behavior note: /mcp/tools returns static tool definitions and does not require
a live Memgraph connection.
For complete integration examples with Python, JavaScript, and detailed API documentation, see the API Reference.
You can run multiple Memgraph databases on one BYODB machine and bind separate API keys to each one. This is useful for splitting work/personal contexts, environments, or teams.
From cowork-db.com/dashboard, click Open Local Dashboard, then choose a database_id, a display name, and an unused Bolt port (7687+). The local dashboard will create a new Memgraph container and append it to your local registry automatically.
If you prefer manual container management, keep the existing cowork-tailscale sidecar and run additional Memgraph containers in the same network namespace using unique ports.
# Example: second database on port 7688
docker run -d \
  --name cowork-memgraph-personal \
  --restart unless-stopped \
  --network container:cowork-tailscale \
  -v cowork-memgraph-personal-data:/var/lib/memgraph \
  -v cowork-memgraph-personal-log:/var/log/memgraph \
  memgraph/memgraph-mage:latest \
  --bolt-port=7688
Typical port pattern: 7687 (default), 7688, 7689, ...
Local registry file locations:
macOS/Linux: ~/.cowork-db/databases.json
Windows: %USERPROFILE%\.cowork-db\databases.json

If you use the Local Dashboard create flow, entries are added automatically.
{
"version": 1,
"databases": [
{
"database_id": "default",
"name": "Default",
"container_name": "cowork-memgraph",
"host": "127.0.0.1",
"port": 7687,
"profile_id": "memory-default",
"profile_version": 1
},
{
"database_id": "personal-db",
"name": "Personal",
"container_name": "cowork-memgraph-personal",
"host": "127.0.0.1",
"port": 7688,
"profile_id": "memory-default",
"profile_version": 1
}
]
}
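If you manage the registry by hand, the port pattern above is easy to automate. A minimal sketch, assuming the databases.json format shown; the helper names are illustrative, not part of cowork-db:

```python
import json
from pathlib import Path

REGISTRY = Path.home() / ".cowork-db" / "databases.json"

def load_registry(path=REGISTRY):
    """Load the local database registry (format shown above)."""
    return json.loads(Path(path).read_text())

def next_free_port(registry):
    """Follow the 7687, 7688, 7689, ... pattern: one past the highest port in use."""
    used = [db["port"] for db in registry["databases"]]
    return max(used, default=7686) + 1

# With the two-database registry above, next_free_port(...) returns 7689.
```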
database_id is the identifier you will reference when creating API keys.
Go to cowork-db.com/dashboard → Create API Key and enter the target database_id (e.g. personal-db). On key creation, cowork-db pulls the target route from your local dashboard over the private Tailnet and binds that key to the resolved database target.
The database_id in the dashboard must exactly match the ID in databases.json;
if not, key creation returns "database not found."
Create two keys with different database IDs, then issue the same MCP read call with each key. You should see results from different Memgraph instances.
Your AI model can discover and call these tools automatically. All tools are explicit and inspectable: every read or write is a visible tool call.
Create or update canonical entities (people, projects, tools, concepts).
→ JSON Schema with properties
Store a memory with automatic embedding generation for semantic search.
→ Requires text, confidence, source_ref
Create relationships between entities (e.g., "Alice WORKS_ON ProjectX").
→ Creates directed edges in graph
Delete entities by ID when memories are stale, incorrect, or test data.
→ Uses detach-delete semantics
Semantic search across all memories using vector similarity.
→ Returns top_k results with confidence
Get full context for an entity including related memories and relationships.
→ Graph traversal with configurable depth
Retrieve recently stored memories, optionally filtered by time.
→ Sorted by timestamp
Get database statistics: entity counts, memory counts, top entities.
→ Overview of knowledge graph
Note: When using Claude.ai Custom Connectors or the official MCP protocol,
these tools are automatically discovered via the tools/list method with full JSON Schema definitions.
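A tool invocation is wrapped in the MCP tools/call envelope. A hedged sketch: the tool name "store_memory" below is illustrative (discover the real names via tools/list), but the required arguments text, confidence, and source_ref come from the tool descriptions above.

```python
import json

def build_tool_call(name, arguments, msg_id=1):
    """Wrap a tool invocation in the MCP tools/call JSON-RPC envelope."""
    return {
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# "store_memory" is an assumed name; the required fields are from the docs above.
call = build_tool_call("store_memory", {
    "text": "Alice leads ProjectX as of Q3.",
    "confidence": 0.9,
    "source_ref": "conversation-2024-06-01",
})
print(json.dumps(call, indent=2))
```

POST this payload to https://api.cowork-db.com/mcp/v1/ with your X-API-Key header, exactly as in the tools/list example earlier.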
For complete tool schemas, examples, and integration code, see the API Reference →
When we release updates, you can update your local installation without losing any data:
curl -fsSL https://cowork-db.com/update.sh | bash
The update script upgrades the local components in place; your Docker data volumes are untouched.
Your installation includes Memgraph Lab, a visual interface for exploring your graph database.
Access it at http://localhost:3000 on your database host machine.
With Lab you can run Cypher queries and visually explore your knowledge graph.
What does my server need? Memgraph is lightweight: a machine with 2GB RAM and 10GB disk space is sufficient for most use cases. Docker must be installed and running.
Can I run it on a Raspberry Pi? Yes! Memgraph supports ARM64, and a Raspberry Pi 4 with 4GB+ RAM works well.
What happens if my server goes offline? MCP tool calls will fail gracefully. When your server comes back online, Tailscale reconnects automatically. No data is lost.
Can I use my own VPN? Currently, cowork-db uses a managed Headscale server for the mesh network. We're exploring options for BYOVPN in a future release.
How do I back up my data? Your data is stored in a Docker volume (cowork_memgraph_data). You can back it up using standard Docker volume backup methods, or use Memgraph's built-in snapshot functionality.
Can I inspect the database directly? Yes! Memgraph Lab is available at localhost:3000 on your host, and you can connect any Bolt-compatible client to localhost:7687.
Need assistance? Here are your options: