Documentation

Setup and integration guides for cowork-db, a BYODB memory layer for AI conversations.

Beta status: pricing is not published yet. Access is currently focused on onboarding and product feedback.

What is cowork-db?

cowork-db is a bring-your-own-database memory layer for LLM conversations, built around the Model Context Protocol (MCP).

It provides a structured, inspectable way to:

  • Store human-readable memories derived from LLM conversations
  • Link those memories to entities (people, projects, tools, concepts)
  • Retrieve relevant context to ground future conversations
  • Keep humans in control of what is remembered and why

At its core, cowork-db is a tool for thinking — not an autonomous agent and not a black box. All memory interaction happens through explicit MCP tools. There are no hidden background writes or implicit memory mutation.

How It Works

┌─────────────────┐
│   Your LLM      │  (Claude, GPT, etc.)
│   Application   │
└────────┬────────┘
         │ MCP Tools
         ▼
┌─────────────────┐
│  cowork-db API  │  (Hosted service)
│api.cowork-db.com│
└────────┬────────┘
         │ Tailscale (encrypted)
         ▼
┌─────────────────┐
│   Your Server   │  (Your hardware)
│   Memgraph DB   │
└─────────────────┘
                        

The BYODB Model

"Bring Your Own Database" means your data stays on your hardware. We provide the API service that connects to your self-hosted Memgraph instance over a secure Tailscale network.

  • Your data, your hardware — Memgraph runs on your machine
  • Secure connection — Tailscale creates an encrypted mesh network
  • Simple setup — One install script gets everything running
  • No vendor lock-in — Standard Memgraph, standard protocols

Getting Started

1

Create Your Account

Sign up at cowork-db.com to get access to your dashboard. This creates your user account and generates your unique namespace.

2

Run the Install Script

From your dashboard, copy the install command for your platform.

Linux/macOS:

curl -fsSL https://cowork-db.com/install.sh | bash -s -- --key YOUR_KEY

Windows (PowerShell):

irm cowork-db.com/install.ps1 | iex

Or, on Windows, download the script and run it with your key:

Invoke-WebRequest -Uri https://cowork-db.com/install.ps1 -OutFile install.ps1
.\install.ps1 -Key "YOUR_KEY"

This script:

  • Checks that Docker is installed
  • Creates a ~/.cowork-db directory (or %USERPROFILE%\.cowork-db on Windows)
  • Deploys Memgraph + Tailscale via Docker Compose
  • Connects your node to the secure Tailscale network
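Before running the installer, you can optionally confirm that the ports it needs are not already taken. A minimal stdlib sketch (the port numbers come from the requirements below; `port_free` is a local helper, not part of cowork-db):

```python
import socket

def port_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if nothing is currently listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 when something accepts the connection
        return s.connect_ex((host, port)) != 0

for port, service in [(7687, "Bolt"), (3000, "Memgraph Lab")]:
    status = "free" if port_free(port) else "IN USE"
    print(f"port {port} ({service}): {status}")
```

If a port is in use, stop the conflicting service (or plan to use a non-default Bolt port, as described under Create Multiple Databases) before installing.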
Requirements:

  • Docker and Docker Compose installed (Docker Desktop for Windows/Mac)
  • Ports 7687 (Bolt) and 3000 (Lab UI) available
  • Linux, macOS, or Windows 10/11

3

Verify Connection

Once the install completes, return to your dashboard. You should see:

  • Node Status: Connected
  • Database Status: Connected

If the database shows "Not Connected," wait a moment: the first connection can take up to 30 seconds while Tailscale establishes the mesh.

4

Create an API Key

From your dashboard, navigate to the API Keys section and create a new API key. This key will authenticate your AI model's requests to the cowork-db API.

Keep your API key secure! It provides full access to your memory database.

Need one key per use case or workspace? See Create Multiple Databases below.

5

Connect Your AI Model

Choose the integration method that works best for your use case:

Option 1: Claude.ai Custom Connector (Recommended)

If you use Claude.ai, you can connect directly using Custom Connectors:

  1. Go to Claude.ai → Settings → Connectors
  2. Click "Add custom connector"
  3. Enter the URL: https://api.cowork-db.com/mcp/v1/
  4. Authenticate with your API key when prompted
  5. Done! Claude can now discover and use all 8 cowork-db tools automatically

This uses the official Model Context Protocol specification with JSON-RPC, allowing Claude to:

  • Automatically discover available tools via tools/list
  • Call tools dynamically based on conversation context
  • Receive structured responses for further reasoning

Option 2: Custom AI Integration

For other AI models or custom applications, you can use either:

  • Official MCP Protocol (JSON-RPC 2.0 over HTTP) — Same as Claude.ai uses
  • REST API (Simple HTTP endpoints) — For easier integration

Quick Test - MCP Protocol:
# List available tools
curl -X POST "https://api.cowork-db.com/mcp/v1/" \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'

Protocol note: MCP clients usually call initialize first, then send notifications/initialized. Notifications are acknowledged with HTTP 202. JSON-RPC method errors are returned in the JSON error payload (typically with HTTP 200). Resource probes (resources/list and resources/templates/list) are supported and return empty lists when no resources are exposed.
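The handshake described in the protocol note can be sketched in plain Python with the stdlib. The endpoint and `X-API-Key` header are taken from the curl example above; the `protocolVersion` and `clientInfo` values are placeholder assumptions, so check the API Reference for the exact values your deployment expects:

```python
import json
import urllib.request

ENDPOINT = "https://api.cowork-db.com/mcp/v1/"
API_KEY = "YOUR_API_KEY"  # placeholder

def rpc_request(method: str, rpc_id=None, params=None) -> dict:
    """Build a JSON-RPC 2.0 message; omit "id" to make it a notification."""
    msg = {"jsonrpc": "2.0", "method": method, "params": params or {}}
    if rpc_id is not None:
        msg["id"] = rpc_id
    return msg

def post(msg: dict) -> bytes:
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(msg).encode(),
        headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

if __name__ == "__main__":
    # 1. initialize: a request (has an id), expects a result
    post(rpc_request("initialize", rpc_id=1, params={
        "protocolVersion": "2024-11-05",   # assumed version string
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    }))
    # 2. initialized: a notification (no id); server acknowledges with HTTP 202
    post(rpc_request("notifications/initialized"))
    # 3. discover the available tools
    print(post(rpc_request("tools/list", rpc_id=2)))
```

Note that step 2 deliberately has no `"id"` field: per JSON-RPC 2.0, that makes it a notification, which matches the HTTP 202 acknowledgment behavior described above.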

Quick Test - REST API:
# List available tools
curl -X GET "https://api.cowork-db.com/mcp/tools" \
  -H "X-API-Key: YOUR_API_KEY"

Behavior note: /mcp/tools returns static tool definitions and does not require a live Memgraph connection.

For complete integration examples in Python and JavaScript, and for detailed API documentation, see the API Reference.

Create Multiple Databases

You can run multiple Memgraph databases on one BYODB machine and bind separate API keys to each one. This is useful for splitting work/personal contexts, environments, or teams.

1) Open your Local Dashboard and create a database

  1. From cowork-db.com/dashboard, click Open Local Dashboard
  2. In Create Database, enter database_id and display name
  3. Leave port empty to auto-assign the next free Bolt port (7687+)
  4. Click Create Database

The local dashboard will create a new Memgraph container and append it to your local registry automatically.

2) Advanced/manual fallback (optional)

If you prefer manual container management, keep the existing cowork-tailscale sidecar and run additional Memgraph containers in the same network namespace using unique ports.

# Example: second database on port 7688
docker run -d \
  --name cowork-memgraph-personal \
  --restart unless-stopped \
  --network container:cowork-tailscale \
  -v cowork-memgraph-personal-data:/var/lib/memgraph \
  -v cowork-memgraph-personal-log:/var/log/memgraph \
  memgraph/memgraph-mage:latest \
  --bolt-port=7688

Typical port pattern: 7687 (default), 7688, 7689, ...
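The "leave port empty to auto-assign" behavior in step 1 can be approximated with a small helper. This is an illustrative sketch of the port pattern above, not the dashboard's actual implementation; `next_bolt_port` is a hypothetical name:

```python
def next_bolt_port(registry: dict, base: int = 7687) -> int:
    """Return the lowest port >= base not already claimed in databases.json."""
    used = {db["port"] for db in registry.get("databases", [])}
    port = base
    while port in used:
        port += 1
    return port

# With 7687 and 7688 taken, the next database gets 7689.
registry = {"databases": [{"port": 7687}, {"port": 7688}]}
print(next_bolt_port(registry))  # → 7689
```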

3) Verify local registry entries

Local registry file locations:

  • Linux/macOS: ~/.cowork-db/databases.json
  • Windows: %USERPROFILE%\.cowork-db\databases.json

If you use the Local Dashboard create flow, entries are added automatically.

{
  "version": 1,
  "databases": [
    {
      "database_id": "default",
      "name": "Default",
      "container_name": "cowork-memgraph",
      "host": "127.0.0.1",
      "port": 7687,
      "profile_id": "memory-default",
      "profile_version": 1
    },
    {
      "database_id": "personal-db",
      "name": "Personal",
      "container_name": "cowork-memgraph-personal",
      "host": "127.0.0.1",
      "port": 7688,
      "profile_id": "memory-default",
      "profile_version": 1
    }
  ]
}

database_id is the key identifier you will use in API key creation.
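To see how that lookup behaves, here is a sketch of an exact-match resolution against the registry file. This mirrors the binding described below (an unknown ID fails rather than falling back); `resolve_database` is a local helper, not a cowork-db API:

```python
import json
from pathlib import Path

def resolve_database(registry: dict, database_id: str) -> dict:
    """Exact-match lookup on database_id; raises if the ID is not registered."""
    for db in registry.get("databases", []):
        if db["database_id"] == database_id:
            return db
    raise KeyError(f"database not found: {database_id!r}")

# Registry path on Linux/macOS, as listed above.
registry_path = Path.home() / ".cowork-db" / "databases.json"
if registry_path.exists():
    registry = json.loads(registry_path.read_text())
    print(resolve_database(registry, "personal-db")["port"])
```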

4) Create one API key per database ID

  1. Go to cowork-db.com/dashboard → Create API Key
  2. Expand Database Target (Optional)
  3. Set Database ID (example: personal-db)
  4. Leave host/port/auth fields at defaults to use local registry sync

On key creation, cowork-db pulls the target route from your local dashboard over the private Tailnet and binds that key to the resolved database target.

Important: The database_id in the dashboard must exactly match the ID in databases.json. If not, key creation returns "database not found."

5) Verify routing

Create two keys with different database IDs, then issue the same MCP read call with each key. You should see results from different Memgraph instances.
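A routing check along those lines can be scripted. The sketch below issues the same `coworkdb_stats` call under each key; the endpoint and header come from the MCP quick test earlier, while the key values and `build_stats_request` helper are placeholders:

```python
import json
import urllib.request

ENDPOINT = "https://api.cowork-db.com/mcp/v1/"

def build_stats_request(api_key: str) -> urllib.request.Request:
    """JSON-RPC tools/call for coworkdb_stats, authenticated with one key."""
    body = json.dumps({
        "jsonrpc": "2.0", "id": 1, "method": "tools/call",
        "params": {"name": "coworkdb_stats", "arguments": {}},
    }).encode()
    return urllib.request.Request(
        ENDPOINT, data=body,
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
    )

if __name__ == "__main__":
    results = []
    for key in ("KEY_FOR_DEFAULT", "KEY_FOR_PERSONAL_DB"):  # placeholders
        with urllib.request.urlopen(build_stats_request(key)) as resp:
            results.append(json.loads(resp.read()))
    # Distinct databases should report different entity/memory counts.
    print("routed to different databases:", results[0] != results[1])
```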

Available Tools

Your AI model can discover and call these tools automatically. All tools are designed to be:

  • Explicit — No hidden behavior
  • Deterministic — Same input → same effect
  • Inspectable — Results can be audited
  • Composable — Do one thing well

coworkdb_upsert_entities

Create or update canonical entities (people, projects, tools, concepts).

→ JSON Schema with properties

coworkdb_write_memory

Store a memory with automatic embedding generation for semantic search.

→ Requires text, confidence, source_ref

coworkdb_link_entities

Create relationships between entities (e.g., "Alice WORKS_ON ProjectX").

→ Creates directed edges in graph

coworkdb_delete_entities

Delete entities by ID when memories are stale, incorrect, or test data.

→ Uses detach-delete semantics

coworkdb_search

Semantic search across all memories using vector similarity.

→ Returns top_k results with confidence

coworkdb_context

Get full context for an entity including related memories and relationships.

→ Graph traversal with configurable depth

coworkdb_recent

Retrieve recently stored memories, optionally filtered by time.

→ Sorted by timestamp

coworkdb_stats

Get database statistics: entity counts, memory counts, top entities.

→ Overview of knowledge graph

Note: When using Claude.ai Custom Connectors or the official MCP protocol, these tools are automatically discovered via the tools/list method with full JSON Schema definitions.
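Once discovered, a tool is invoked with the JSON-RPC `tools/call` method. Below is a hedged example payload for `coworkdb_write_memory`; the `text`, `confidence`, and `source_ref` fields come from the tool summary above, but the exact argument schema (types, additional fields) is defined in the API Reference, and the sample values are invented:

```python
# Example tools/call payload for coworkdb_write_memory.
call = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "coworkdb_write_memory",
        "arguments": {
            "text": "Alice leads ProjectX as of Q3.",       # sample memory
            "confidence": 0.9,                              # sample score
            "source_ref": "conversation-2024-06-01",        # sample reference
        },
    },
}
print(call["params"]["name"])  # → coworkdb_write_memory
```

The same envelope works for every tool in the list: only `params.name` and `params.arguments` change.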

For complete tool schemas, examples, and integration code, see the API Reference →

Updating Your Installation

When we release updates, you can update your local installation without losing any data:

curl -fsSL https://cowork-db.com/update.sh | bash

The update script:

  • Backs up your current docker-compose.yml
  • Pulls the latest container images
  • Restarts services with zero data loss
  • Preserves your Tailscale authentication

Memgraph Lab

Your installation includes Memgraph Lab, a visual interface for exploring your graph database.

Access it at http://localhost:3000 on your database host machine.

With Lab you can:

  • Run Cypher queries directly
  • Visualize entity relationships as a graph
  • Inspect memory nodes and their embeddings
  • Export query results

Security Model

  • Data stays local — Your Memgraph instance runs on your hardware
  • Encrypted transport — Tailscale provides end-to-end encryption
  • No public exposure — Database is only accessible via private Tailscale network
  • Auth0 authentication — API access requires valid JWT tokens
  • User isolation — Each user's data is namespaced and isolated

FAQ

What are the system requirements?

Memgraph is lightweight. A machine with 2GB RAM and 10GB disk space is sufficient for most use cases. Docker must be installed and running.

Can I run this on a Raspberry Pi?

Yes! Memgraph supports ARM64. A Raspberry Pi 4 with 4GB+ RAM works well.

What happens if my server goes offline?

MCP tool calls will fail gracefully. When your server comes back online, Tailscale automatically reconnects. No data is lost.

Can I use my own Tailscale network?

Currently, cowork-db uses a managed Headscale server for the mesh network. We're exploring options for BYOVPN in a future release.

How do I backup my data?

Your data is stored in a Docker volume (cowork_memgraph_data). You can back it up using standard Docker volume backup methods, or use Memgraph's built-in snapshot functionality.

Can I access my database directly?

Yes! Memgraph Lab is available at localhost:3000 on your host. You can also connect any Bolt-compatible client to localhost:7687.

Getting Help

Need assistance? Here are your options: