MCP Servers & Cloudflare Managers: FAQ

Get answers about our MCP servers (SQLite, PostgreSQL, MySQL, Memory Journal) and Cloudflare managers (D1, KV, R2, DO). Installation guides, configuration, and troubleshooting for AI-assisted development and cloud management.

General Questions

What is a Model Context Protocol (MCP) server?

A Model Context Protocol (MCP) server is a specialized application that extends AI assistants with additional capabilities. MCP servers provide tools, resources, and prompts that allow AI assistants to perform tasks they couldn't do natively, such as database operations, file management, or specialized data processing.

Our MCP servers integrate seamlessly with Claude Desktop, Cursor, ChatGPT, Gemini, and other MCP-compatible clients, giving you powerful capabilities for database management, project journaling, and development workflows.

What applications and servers does Adamic provide?

Adamic provides four production-ready MCP servers and four Cloudflare management applications:

MCP Servers:

  • SQLite MCP Server v2.6.4 - 73 specialized tools across 14 categories including JSON operations, statistical analysis, semantic vector search, and SpatiaLite geospatial operations
  • PostgreSQL MCP Server v1.2.0 - 63 tools, 10 intelligent resources, and 10 guided prompts. Features tool filtering, pgvector, PostGIS, and zero known vulnerabilities
  • MySQL MCP Server v2.1.0 - 191 specialized tools, 18 observability resources, and 19 AI-powered prompts. Features OAuth 2.1 authentication, smart tool filtering, MySQL Router/ProxySQL/InnoDB Cluster integrations, and strict TypeScript with 97% test coverage
  • Memory Journal MCP v3.0.0 - 27 tools, 14 prompts, 14 resources. TypeScript rewrite with Pure JS Stack, backup/restore tools, and semantic search

Cloudflare Managers:

  • D1 Manager v2.0.0 - D1 database management with Drizzle ORM, Time Travel, Read Replication, R2 backups, and FTS5 search
  • KV Manager v2.1.0 - Workers KV management with visual color tags, dual metadata system, bulk operations, and cross-namespace search
  • R2 Manager v2.0.0 - R2 bucket management with AI Search integration, job history, rate limiting, and multi-bucket downloads
  • DO Manager v1.2.0 - Durable Objects management with cross-namespace instance migration, freeze/unfreeze protection, admin hooks, SQL console, and batch operations

Which should I use: SQLite or PostgreSQL MCP Server?

Use SQLite MCP Server when:

  • You need a local, file-based database without setup
  • Working with embedded databases or mobile apps
  • You want 73 specialized tools including JSON helpers and SpatiaLite
  • Tool filtering is important (v2.6.4 feature for MCP client limits)

Use PostgreSQL MCP Server when:

  • You have an existing PostgreSQL database (versions 13-18)
  • Need enterprise features like pgvector embeddings or PostGIS
  • Want intelligent resources for database meta-awareness
  • Tool filtering for client limits and token savings (v1.2.0)

Both servers are production-ready, free, open source, and support Docker deployment.

Are these free to use?

Yes! All MCP servers and Cloudflare managers are completely free and open source under the MIT License. You can use them for personal projects, commercial applications, or modify them to suit your needs.

Available on GitHub and Docker Hub at no cost, with no usage limits or subscription fees.

MCP Servers: Installation & Setup

How do I install MCP servers? (SQLite, PostgreSQL, MySQL, Memory Journal)

All four MCP servers support multiple installation methods:

Method 1: Docker (Recommended)

SQLite (Docker):

docker pull writenotenow/sqlite-mcp-server:latest
docker run -i --rm -v $(pwd):/workspace writenotenow/sqlite-mcp-server:latest --db-path /workspace/database.db

PostgreSQL (Docker):

docker pull writenotenow/postgres-mcp-enhanced:latest
docker run -i --rm -e DATABASE_URI="postgresql://user:pass@localhost:5432/db" writenotenow/postgres-mcp-enhanced:latest --access-mode=restricted

MySQL MCP (npm - recommended)

npm install -g @neverinfamous/mysql-mcp

MySQL MCP (Docker)

docker pull writenotenow/mysql-mcp:latest
docker run -i --rm writenotenow/mysql-mcp:latest --transport stdio --mysql mysql://user:password@host.docker.internal:3306/database

Memory Journal (npm - recommended)

npm install -g memory-journal-mcp

Memory Journal (Docker)

docker pull writenotenow/memory-journal-mcp:latest
docker run -i --rm -v ./data:/app/data writenotenow/memory-journal-mcp:latest

Method 2: Python/PyPI

SQLite (PyPI):

pip install sqlite-mcp-server-enhanced

PostgreSQL (PyPI):

pip install postgres-mcp-enhanced

Memory Journal (PyPI):

pip install memory-journal-mcp

Method 3: From Source

Clone from GitHub and follow README instructions:

How do I configure my MCP client (Cursor, Claude Desktop)?

Add MCP servers to your client's configuration file:

Cursor IDE - Edit ~/.cursor/mcp.json or %USERPROFILE%\.cursor\mcp.json (Windows):

{
  "mcpServers": {
    "sqlite-mcp": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-v", "$(pwd):/workspace", 
               "writenotenow/sqlite-mcp-server:latest", 
               "--db-path", "/workspace/database.db"]
    },
    "postgres-mcp": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "DATABASE_URI", 
               "writenotenow/postgres-mcp-enhanced:latest", 
               "--access-mode=restricted"],
      "env": {
        "DATABASE_URI": "postgresql://user:pass@localhost:5432/db"
      }
    },
    "memory-journal": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-v", "./data:/app/data",
               "writenotenow/memory-journal-mcp:latest"]
    }
  }
}

Claude Desktop - Similar configuration in Claude's settings.
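As a sketch, Claude Desktop typically reads the same "mcpServers" shape from claude_desktop_config.json (under ~/Library/Application Support/Claude/ on macOS or %APPDATA%\Claude\ on Windows); the entry below mirrors the SQLite example above, with /absolute/path/to/project as a placeholder:

```json
{
  "mcpServers": {
    "sqlite-mcp": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-v", "/absolute/path/to/project:/workspace",
               "writenotenow/sqlite-mcp-server:latest",
               "--db-path", "/workspace/database.db"]
    }
  }
}
```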

Restart your MCP client after configuration changes. If your client does not invoke commands through a shell, $(pwd) will not be expanded; use an absolute path for the volume mount instead.

What are the system requirements for MCP servers?

All MCP Servers:

  • Python 3.12, 3.13, or 3.14 (for SQLite/PostgreSQL/Memory Journal)
  • Node.js 18+ (for MySQL MCP)
  • Docker (optional but recommended)
  • MCP-compatible client (Cursor, Claude Desktop, etc.)

PostgreSQL MCP Server:

  • PostgreSQL 13-18 installed and running
  • DATABASE_URI environment variable
  • Required extensions: pg_stat_statements (built-in)
  • Optional: pgvector v0.8.0, PostGIS v3.5.0

MySQL MCP Server:

  • MySQL 5.7+ or 8.0+ server
  • Connection string or environment variables (MYSQL_HOST, MYSQL_PORT, MYSQL_USER, MYSQL_PASSWORD, MYSQL_DATABASE)
  • Optional: MySQL Router, ProxySQL, or MySQL Shell for ecosystem integrations

SQLite MCP Server:

  • SQLite 3.50+ (for JSONB support)
  • No database setup required

Memory Journal MCP:

  • Optional: GITHUB_TOKEN for GitHub integration
  • Git installed for repository context

SQLite MCP Server

What are the 73 specialized tools in SQLite MCP Server?

SQLite MCP Server v2.6.4 provides 73 tools across 14 categories; highlights include:

  • Core Database (15 tools) - CRUD operations, schema management, transactions
  • JSON Helper Tools (6 tools) - Simplified JSON operations with auto-normalization
  • Text Processing (9 tools) - Regex, fuzzy matching, phonetic search
  • Statistical Analysis (8 tools) - Descriptive stats, percentiles, time series
  • Virtual Tables (8 tools) - CSV, R-Tree, series generation
  • Semantic Search (8 tools) - Embeddings, vector similarity, hybrid search
  • Geospatial (7 tools) - SpatiaLite spatial operations
  • Full-Text Search (3 tools) - FTS5 creation, indexing, BM25 ranking

See the complete documentation for details.

How do I use tool filtering in SQLite MCP Server v2.6.4?

Tool filtering helps address MCP client tool limits (like Windsurf's 100-tool limit). Use the SQLITE_MCP_TOOL_FILTER environment variable:

Reduce to ~50 tools (Windsurf-compatible)

SQLITE_MCP_TOOL_FILTER="-vector,-stats,-spatial,-text"

Core + JSON only

SQLITE_MCP_TOOL_FILTER="-fts,-vector,-virtual,-spatial,-text,-stats,-admin,-misc"

Disable admin but keep vacuum and backup

SQLITE_MCP_TOOL_FILTER="-admin,+vacuum_database,+backup_database"

Available groups: core, fts, vector, json, virtual, spatial, text, stats, admin, misc
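In an MCP client configuration, the filter is set as an environment variable. A minimal sketch for Cursor's mcp.json, assuming the Docker image passes the variable through via -e as in the PostgreSQL example above (the path is a placeholder):

```json
{
  "mcpServers": {
    "sqlite-mcp": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "SQLITE_MCP_TOOL_FILTER",
               "-v", "/absolute/path/to/project:/workspace",
               "writenotenow/sqlite-mcp-server:latest",
               "--db-path", "/workspace/database.db"],
      "env": {
        "SQLITE_MCP_TOOL_FILTER": "-vector,-stats,-spatial,-text"
      }
    }
  }
}
```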

What is JSON auto-normalization?

JSON auto-normalization (v2.6.0+) automatically fixes Python-style JSON for SQLite compatibility:

  • Single quotes → Double quotes
  • Python True/False → JSON true/false
  • Python None → JSON null
  • Trailing commas removed
  • Security validation prevents malicious input

Use JSON Helper Tools like json_insert, json_update, and json_query for simplified operations.
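To illustrate the idea (this is a sketch of the transformation using Python's standard library, not the server's implementation):

```python
import ast
import json

def normalize_pyjson(text: str) -> str:
    # Parse Python-literal syntax (single quotes, True/False/None,
    # trailing commas) and re-serialize it as strict JSON.
    return json.dumps(ast.literal_eval(text))

print(normalize_pyjson("{'name': 'Ada', 'active': True, 'score': None,}"))
# → {"name": "Ada", "active": true, "score": null}
```

The server additionally validates input for security before normalizing; this sketch covers only the syntax conversion.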

PostgreSQL MCP Server

What are the 63 tools, 10 resources, and 10 prompts?

63 Tools across 9 categories:

  • Core Database (9) - Schema management, SQL execution, health monitoring
  • JSON Operations (11) - JSONB operations, validation, security
  • Text Processing (5) - Similarity search, full-text, fuzzy matching
  • Statistical Analysis (8) - Stats, correlation, regression, time series
  • Performance Intelligence (6) - Query optimization, index tuning
  • Vector/Semantic Search (8) - pgvector embeddings, similarity, clustering
  • Geospatial (7) - PostGIS distance, spatial queries, GIS operations
  • Backup & Recovery (4) - Backup planning, restore validation
  • Monitoring & Alerting (5) - Real-time monitoring, capacity planning

10 Resources - Database meta-awareness covering schema, capabilities, performance, health, extensions, indexes, connections, replication, vacuum, locks, and statistics

10 Prompts - Guided workflows: optimize_query, index_tuning, database_health_check, setup_pgvector, json_operations, performance_baseline, backup_strategy, setup_postgis, explain_analyze_workflow, extension_setup

What are security modes (restricted vs unrestricted)?

Restricted Mode (Production):

  • Read-only operations
  • Query validation
  • Resource limits
  • Recommended for production use

Unrestricted Mode (Development):

  • Full database access
  • Write operations allowed
  • Parameter binding protection
  • Use only in development environments

Restricted mode

postgres-mcp --access-mode=restricted

Unrestricted mode

postgres-mcp --access-mode=unrestricted

How do I set up pgvector for embeddings?

Install pgvector extension and use the setup_pgvector prompt:

-- Install pgvector
CREATE EXTENSION IF NOT EXISTS vector;

-- Create table with vector column
CREATE TABLE embeddings (
  id SERIAL PRIMARY KEY,
  content TEXT,
  embedding vector(1536)  -- OpenAI embeddings dimension
);

-- Create HNSW index for fast similarity search
CREATE INDEX ON embeddings USING hnsw (embedding vector_cosine_ops);

Use the MCP server's vector search tools for similarity queries and clustering.

MySQL MCP Server

What are the 191 specialized tools in MySQL MCP Server?

MySQL MCP Server v2.1.0 provides 191 tools across 24 categories; key categories include:

  • Core Database (8 tools) - Read/write queries, tables, indexes
  • Transactions (7 tools) - BEGIN, COMMIT, ROLLBACK, savepoints
  • JSON Operations (17 tools) - MySQL 5.7+ native JSON functions, merge, diff, stats
  • Text Processing (6 tools) - REGEXP, LIKE, SOUNDEX
  • Full-Text Search (4 tools) - Natural language search with BM25
  • Performance (8 tools) - EXPLAIN, query analysis, slow query detection
  • Optimization (4 tools) - Index hints, recommendations
  • Admin (6 tools) - OPTIMIZE, ANALYZE, CHECK tables
  • Monitoring (7 tools) - PROCESSLIST, status variables
  • Backup (4 tools) - Export, import, mysqldump integration
  • Replication (5 tools) - Master/slave, binlog management
  • Partitioning (4 tools) - Partition management
  • Spatial/GIS (12 tools) - Geospatial operations
  • Security (9 tools) - Audit, SSL, encryption, data masking
  • Cluster (10 tools) - Group Replication, InnoDB Cluster
  • Roles (8 tools) - MySQL 8.0 role management
  • DocStore (9 tools) - Document Store / X DevAPI collections
  • Router (9 tools) - MySQL Router REST API integration
  • ProxySQL (12 tools) - ProxySQL management
  • Shell (10 tools) - MySQL Shell utilities

See the complete documentation for details.

How do I use tool filtering in MySQL MCP Server?

AI IDEs like Cursor have tool limits (typically 40-50 tools). With 191 tools available, you must use tool filtering via the --tool-filter argument:

Recommended Shortcuts:

  • starter (38 tools) - Core, JSON, transactions, text - best for most users
  • essential (15 tools) - Minimal footprint
  • dev-power (45 tools) - Core, schema, performance, stats
  • dba-monitor (35 tools) - Core, monitoring, performance, sysschema
  • ecosystem (31 tools) - MySQL Router, ProxySQL, Shell

Use starter shortcut (recommended)

mysql-mcp --transport stdio --mysql mysql://user:pass@localhost:3306/db --tool-filter starter

Add specific groups to a shortcut

--tool-filter "starter,spatial"

Remove specific tools

--tool-filter "starter,-mysql_drop_table"

How do I connect to MySQL Router, ProxySQL, or InnoDB Cluster?

MySQL MCP provides first-class integrations for the MySQL ecosystem:

MySQL Router (for InnoDB Cluster)

Connect to the Router REST API for cluster topology and routing status:

MYSQL_ROUTER_URL=https://localhost:8443
MYSQL_ROUTER_USER=rest_api
MYSQL_ROUTER_PASSWORD=router_password
MYSQL_ROUTER_INSECURE=true

ProxySQL Admin

Manage query rules, server pools, and caching:

PROXYSQL_HOST=localhost
PROXYSQL_PORT=6032
PROXYSQL_USER=radmin
PROXYSQL_PASSWORD=radmin

InnoDB Cluster

Connect directly to a cluster node for Group Replication status:

mysql-mcp --transport stdio --mysql mysql://cluster_admin:password@localhost:3307/mysql --tool-filter cluster

Use --tool-filter ecosystem to load Router, ProxySQL, and Shell tools together.

Memory Journal MCP Server

How does GitHub integration work?

Memory Journal MCP v3.0.0 provides comprehensive GitHub integration:

Automatic Context Capture:

  • Current repository, branch, and commit
  • GitHub Issues - Auto-detect from branch names
  • GitHub Pull Requests - Auto-detect current PR
  • GitHub Projects (user & org)
  • GitHub Actions workflow runs

v3.0.0 TypeScript Rewrite Features:

  • Pure JS Stack with no native dependencies
  • Backup/restore tools with auto-backup safety
  • Health diagnostics via memory://health resource
  • MCP 2025-11-25 compliance with behavioral annotations

Issues & PRs Features:

  • Auto-fetch and link entries
  • PR lifecycle tracking
  • 3 PR workflow prompts: pr-summary, code-review-prep, pr-retrospective
  • 3 new resources: Issue/PR entries, PR timelines

How do I set up GITHUB_TOKEN?

Configure GitHub tokens for Projects, Issues, PRs, and Actions integration:

User repositories and projects

export GITHUB_TOKEN="your_personal_access_token"

Organization projects (optional)

export GITHUB_ORG_TOKEN="your_org_token"
export DEFAULT_ORG="your-org-name"

Required Scopes:

  • repo - Access repositories, issues, and PRs
  • project - Access GitHub Projects
  • read:org - Read org projects (org token only)

Fallback: If no tokens are set, Memory Journal falls back to the gh CLI when available.

What are knowledge graph relationships?

Memory Journal builds knowledge graphs linking related work:

5 Relationship Types:

  • implements - Implementation of a spec or design
  • tests - Test coverage for an implementation
  • depends_on - Dependency relationships
  • related_to - General connections
  • blocks - Blocker relationships

Mermaid Visualization:

Use the visualize_relationships tool or the memory://graph/recent resource for live diagrams showing how different pieces of work connect.

Example: Link a spec → implementation → tests → PR for complete context visibility.
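A hypothetical chain like that renders as a Mermaid diagram along these lines (entry names are illustrative; edges use the relationship types listed above):

```mermaid
graph LR
  Impl[Implementation entry] -->|implements| Spec[Design spec entry]
  Tests[Test entry] -->|tests| Impl
  PR[Pull request entry] -->|related_to| Impl
```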

Cloudflare Managers: Setup & Authentication

How do I set up Cloudflare Access (Zero Trust) authentication?

All Cloudflare managers (D1, KV, R2, DO) use Cloudflare Access for enterprise authentication:

Step 1: Configure Zero Trust

  1. Go to Cloudflare Zero Trust
  2. Configure authentication provider (GitHub OAuth, Google, etc.)
  3. Create an Access Application for your domain
  4. Copy the Application Audience (AUD) tag

Step 2: Create API Token

  1. Go to Cloudflare API Tokens
  2. Create Custom Token with required permissions (see below)

Step 3: Set Secrets

npx wrangler secret put ACCOUNT_ID
npx wrangler secret put API_KEY
npx wrangler secret put TEAM_DOMAIN
npx wrangler secret put POLICY_AUD

Common Issues:

  • Authentication loop - Verify TEAM_DOMAIN includes https://
  • Access denied - Check POLICY_AUD matches your Access application

What API token permissions do I need?

Required permissions by manager:

| Manager | Required Permissions |
| --- | --- |
| D1 Manager | Account → D1 → Edit |
| KV Manager | Account → Workers KV Storage → Edit |
| R2 Manager | Account → R2 → Edit |
| DO Manager | Account → Workers Scripts → Read; Account → D1 → Edit (if managing D1-backed DOs) |

Note: Both API Tokens (Bearer auth) and Global API Keys (X-Auth-Key auth) are supported.

How do I deploy Cloudflare managers?

All managers follow a similar deployment process:

Prerequisites:

  • Node.js 18+ and npm
  • Cloudflare account
  • Wrangler CLI installed
  • Domain managed by Cloudflare (or use workers.dev)

Deployment Steps:

  1. Clone the repository
  2. Install dependencies: npm install
  3. Authenticate: npx wrangler login
  4. Create required resources (D1 database, R2 bucket, etc.)
  5. Configure wrangler.toml
  6. Set secrets (ACCOUNT_ID, API_KEY, TEAM_DOMAIN, POLICY_AUD)
  7. Build and deploy: npm run build && npx wrangler deploy

See individual manager documentation for specific requirements.

Can I use Docker for Cloudflare managers?

Yes! D1 Manager and KV Manager support Docker deployment for development and testing:

D1 Manager

docker pull writenotenow/d1-manager:latest
docker run -d -p 8080:8080 \
  -e ACCOUNT_ID=your_account_id \
  -e API_KEY=your_api_token \
  -e TEAM_DOMAIN=https://yourteam.cloudflareaccess.com \
  -e POLICY_AUD=your_aud_tag \
  writenotenow/d1-manager:latest

KV Manager

docker pull writenotenow/kv-manager:latest
docker run -d -p 8787:8787 \
  -e ACCOUNT_ID=your_account_id \
  -e API_KEY=your_api_token \
  -e TEAM_DOMAIN=https://yourteam.cloudflareaccess.com \
  -e POLICY_AUD=your_aud_tag \
  writenotenow/kv-manager:latest

Note: For production, deploy to Cloudflare Workers for best performance and edge distribution.

D1 Manager

What is D1 Manager and what are its key features?

D1 Manager v2.0.0 is a comprehensive web application for managing Cloudflare D1 databases:

Database Management:

  • Create, rename, clone, delete, download, and optimize databases
  • Grid/List view toggle with quick actions
  • Bulk operations (download, optimize, delete)
  • Import databases (SQL files or paste content)
  • Color picker for visual organization (27 colors)

Advanced Features:

  • Drizzle ORM Console - Introspect schemas, migration status/history, generate SQL, push changes with dry-run
  • Time Travel - Point-in-time recovery with bookmarks and checkpoint history
  • Read Replication - Enable/disable global read replicas
  • R2 Backup/Restore - Automated backups with scheduled daily/weekly/monthly options
  • FTS5 Full-Text Search - Manage virtual tables with quick actions

How do R2 backups work in D1 Manager?

D1 Manager provides comprehensive backup features via R2 integration:

Manual Backups:

  • Backup databases to R2 before delete, rename, STRICT mode, or FTS5 conversion
  • Create backups from database cards
  • Unified hub with undo history and R2 snapshots

Scheduled Backups:

  • Daily, weekly, or monthly schedules per database
  • Cron triggers with next-run tracking
  • Enable/disable controls
  • Job history integration

Setup Requirements:

Create R2 bucket

npx wrangler r2 bucket create d1-manager-backups

Add to wrangler.toml - R2 Bucket

[[r2_buckets]]
binding = "BACKUP_BUCKET"
bucket_name = "d1-manager-backups"

Add to wrangler.toml - Durable Objects

[[durable_objects.bindings]]
name = "BACKUP_DO"
class_name = "BackupDO"
script_name = "d1-manager"

KV Manager

What is the dual metadata system in KV Manager?

KV Manager v2.1.0 provides two metadata storage options:

1. KV Native Metadata (1024 bytes limit):

  • Stored directly in Cloudflare KV
  • Fast access with KV reads
  • Limited to 1024 bytes
  • Good for simple key-value pairs

2. D1 Custom Metadata (unlimited):

  • Stored in Cloudflare D1 database
  • Unlimited size for complex metadata
  • Supports tags and advanced search
  • Better for structured data

Use both systems together: KV Native for fast access, D1 Custom for rich metadata and search capabilities.
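The routing decision can be sketched in a few lines (illustrative only, not KV Manager's code; the function name is hypothetical):

```python
import json

KV_METADATA_LIMIT = 1024  # bytes, the KV native metadata cap described above

def choose_metadata_store(metadata: dict) -> str:
    # Route metadata to KV native storage when its serialized size fits
    # the 1024-byte cap, otherwise to the D1-backed custom store.
    size = len(json.dumps(metadata).encode("utf-8"))
    return "kv-native" if size <= KV_METADATA_LIMIT else "d1-custom"

print(choose_metadata_store({"color": "blue"}))      # → kv-native
print(choose_metadata_store({"notes": "x" * 2000}))  # → d1-custom
```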

What bulk operations does KV Manager support?

KV Manager provides comprehensive bulk operations:

  • Bulk Delete - Delete thousands of keys efficiently
  • Bulk Copy - Copy keys between namespaces
  • Bulk TTL Update - Set expiration on multiple keys
  • Bulk Tag Operations - Add/remove tags from multiple keys
  • Import/Export - JSON/NDJSON support with collision handling (replace/skip/fail)
  • R2 Backup & Restore - Cloud-native backup with batch operations

All bulk operations are tracked in Job History with event timelines and progress monitoring.
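The import collision policies (replace/skip/fail) can be sketched as follows (an illustration of the semantics, not KV Manager's code):

```python
def import_entries(store: dict, entries: dict, on_collision: str = "replace") -> dict:
    # Merge imported entries into an existing store using the collision
    # policies described above: "replace" overwrites, "skip" keeps the
    # existing value, and "fail" aborts on the first duplicate key.
    for key, value in entries.items():
        if key in store:
            if on_collision == "skip":
                continue
            if on_collision == "fail":
                raise KeyError(f"duplicate key: {key}")
        store[key] = value
    return store

existing = {"a": 1}
print(import_entries(dict(existing), {"a": 2, "b": 3}, on_collision="skip"))     # → {'a': 1, 'b': 3}
print(import_entries(dict(existing), {"a": 2, "b": 3}, on_collision="replace"))  # → {'a': 2, 'b': 3}
```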

R2 Manager

What are the file upload limits by Cloudflare plan?

R2 Manager supports chunked uploads up to 500MB, but Cloudflare enforces plan-based limits:

| Plan | Max File Size | Features |
| --- | --- | --- |
| Free | 100MB | Basic features |
| Pro | 100MB | Enhanced support |
| Business | 200MB | Priority support |
| Enterprise | 500MB | Full features + SLA |

Features:

  • Chunked uploads with automatic retry (10MB chunks)
  • MD5 checksum verification for integrity
  • Upload progress tracking
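The chunking-plus-checksum approach can be sketched as follows (an illustration of the technique, not R2 Manager's code):

```python
import hashlib

CHUNK_SIZE = 10 * 1024 * 1024  # 10 MB, matching the chunk size described above

def chunk_with_checksums(data: bytes, chunk_size: int = CHUNK_SIZE):
    # Split a payload into fixed-size chunks and compute an MD5 digest per
    # chunk, so each part can be verified independently after upload.
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        yield chunk, hashlib.md5(chunk).hexdigest()

payload = b"x" * (25 * 1024 * 1024)   # 25 MB example payload
parts = list(chunk_with_checksums(payload))
print(len(parts))                     # → 3 (10 MB + 10 MB + 5 MB)
```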

How do I set up AI Search integration?

R2 Manager v2.0.0 integrates with Cloudflare AI Search for semantic search:

Setup Steps:

  1. Add AI binding to wrangler.toml:
    [ai]
    binding = "AI"
  2. Check compatibility - View indexable file types (txt, md, json, yaml, etc.)
  3. Create AI Search instance via Cloudflare Dashboard or R2 Manager UI
  4. Sync bucket to index files (up to 4MB each)

Search Modes:

  • AI Search - Get AI-generated answers based on your data
  • Semantic Search - Retrieve relevant documents without AI generation

Features:

  • Compatibility analysis with visual reports
  • Instance management (list, sync, query)
  • Direct Cloudflare Dashboard link

DO (Durable Objects) Manager

How do I set up admin hooks for my Durable Objects?

DO Manager requires admin hooks to manage Durable Objects. Two options:

Option A: NPM Package (Recommended)

npm install do-manager-admin-hooks

import { withAdminHooks } from 'do-manager-admin-hooks';

export class MyDurableObject extends withAdminHooks() {
  async fetch(request: Request): Promise<Response> {
    // Handle admin requests first
    const adminResponse = await this.handleAdminRequest(request);
    if (adminResponse) return adminResponse;

    // Your custom logic here
    return new Response('Hello from my Durable Object!');
  }
}

Option B: Manual Copy-Paste

Click "Get Admin Hook Code" in the namespace view to generate copy-paste TypeScript code for your DO class.

Configuration Options:

export class SecureDO extends withAdminHooks({
  basePath: '/admin',      // Change endpoint path (default: '/admin')
  requireAuth: true,       // Require authentication
  adminKey: 'secret-key',  // Admin key for auth
}) {
  // ...
}

What features does DO Manager provide?

DO Manager v1.2.0 provides comprehensive Durable Objects management:

v1.2.0 Features:

  • Instance Migration - Move instances between namespaces with three cutover modes: Copy Only (non-destructive), Copy + Delete (removes source), Copy + Freeze (locks source)
  • Freeze/Unfreeze - Read-only protection for critical instances with snowflake indicator and API enforcement (423 Locked)

Namespace Management:

  • Auto-discover DO namespaces from Cloudflare API
  • Manual configuration for custom setups
  • Clone namespace configurations with deep clone support
  • System namespace filtering (prevents accidental deletion)
  • Support for SQLite and KV storage backends

Instance Management:

  • Track instances by name or hex ID
  • Create, clone, and delete instances
  • Download instance storage as JSON
  • Color tags and custom freeform tags for organization
  • Instance diff - Compare storage between instances

Advanced Features:

  • SQL Console V2 - Rich editor with syntax highlighting, auto-complete, and query templates
  • Batch Operations - Multi-select for bulk download, delete, and backup
  • Storage Management - View/edit storage with JSON support, import keys from JSON
  • Automated Database Migrations - In-app prompts for schema upgrades
  • R2 Backup & Restore - Snapshot DO storage to R2 with restore capability
  • Global Search - Cross-namespace key and value search

Quick Reference: Feature Comparisons

MCP Servers: Feature Comparison

| Feature | SQLite MCP | PostgreSQL MCP | MySQL MCP | Memory Journal |
| --- | --- | --- | --- | --- |
| Version | v2.6.4 | v1.2.0 | v2.1.0 | v3.0.0 |
| Tools | 73 tools | 63 tools | 191 tools | 27 tools |
| Resources | 7 resources | 10 resources | 18 resources | 14 resources |
| Prompts | 7 prompts | 10 prompts | 19 prompts | 14 prompts |
| Runtime | Python 3.12-3.14 | Python 3.12-3.14 | Node.js 18+ | Python 3.12-3.14 |
| Database | SQLite 3.50+ (local) | PostgreSQL 13-18 | MySQL 5.7+ / 8.0+ | SQLite (embedded) |
| Setup | Easy (no DB setup) | Medium (requires PG) | Medium (requires MySQL) | Easy (auto-creates DB) |
| Geospatial | ✓ SpatiaLite (7 tools) | ✓ PostGIS (7 tools) | ✓ Spatial (12 tools) | - |
| JSON Operations | ✓ 6 helpers + normalize | ✓ 11 JSONB tools | ✓ 17 JSON tools | - |
| Tool Filtering | ✓ v2.6.4 feature | ✓ v1.2.0 feature | ✓ 24 groups + shortcuts | - |
| Ecosystem | - | - | ✓ Router, ProxySQL, Shell | ✓ GitHub integration |
| Security | SQL injection protection | Zero vulns, security modes | OAuth 2.1, TLS/SSL | Local-first |
| Best For | Local DBs, embedded apps | Enterprise PostgreSQL | MySQL, InnoDB Cluster | Project journaling, AI dev |

Cloudflare Managers: Feature Comparison

| Feature | D1 Manager | KV Manager | R2 Manager | DO Manager |
| --- | --- | --- | --- | --- |
| Version | v2.0.0 | v2.1.0 | v2.0.0 | v1.2.0 |
| Manages | D1 Databases | KV Namespaces | R2 Buckets | Durable Objects |
| Authentication | ✓ Zero Trust | ✓ Zero Trust | ✓ Zero Trust | ✓ Zero Trust |
| Docker Support | ✓ (dev only) | ✓ (dev only) | - | - |
| Backup/Restore | ✓ R2 backups, scheduled | ✓ R2 backups | Native R2 storage | ✓ R2 backups |
| Bulk Operations | ✓ Download, optimize, delete | ✓ Delete, copy, TTL, tags | ✓ Multi-bucket download | ✓ Download, delete, backup |
| Search | ✓ FTS5 full-text | ✓ Cross-namespace | ✓ Cross-bucket + AI Search | ✓ Global key/value search |
| Job History | ✓ | ✓ | ✓ | - |
| Unique Features | Drizzle ORM, Time Travel, Read Replication | Color tags, dual metadata, unlimited tags | AI Search integration, rate limiting, chunked uploads | Admin hooks, SQL console, alarms, instance cloning |
| Best For | SQLite database management, ORM integration | Key-value storage, metadata management | Object storage, large files, semantic search | Stateful objects, real-time apps |

Troubleshooting

The MCP server isn't showing up in my MCP client

Common causes and solutions:

  1. Configuration file location incorrect
    • Cursor: ~/.cursor/mcp.json (Mac/Linux) or %USERPROFILE%\.cursor\mcp.json (Windows)
    • Claude Desktop: Check Claude settings for config location
  2. JSON syntax errors
    • Validate JSON syntax at jsonlint.com
    • Check for missing commas, quotes, or brackets
  3. Docker not running
    • Verify Docker is running: docker ps
    • Check Docker Desktop is started
  4. Restart required
    • Restart your MCP client after configuration changes
    • Some clients may need a full application restart

PostgreSQL connection errors - "Connection refused" or "Connection timeout"

Troubleshooting steps:

  1. Verify PostgreSQL is running
    pg_isready -h localhost -p 5432
  2. Check DATABASE_URI format
    • Correct: postgresql://user:pass@localhost:5432/dbname
    • Include port number (default: 5432)
    • URL-encode special characters in password
  3. Firewall/network issues
    • Check if port 5432 is accessible
    • Verify pg_hba.conf allows connections
    • For Docker: Use host.docker.internal instead of localhost
  4. Authentication
    • Verify username and password are correct
    • Check user has sufficient privileges

PostgreSQL extension errors - "extension not found"

Install required extensions:

-- Required (built-in)
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
CREATE EXTENSION IF NOT EXISTS pg_trgm;
CREATE EXTENSION IF NOT EXISTS fuzzystrmatch;

-- Optional - pgvector
CREATE EXTENSION IF NOT EXISTS vector;

-- Optional - PostGIS
CREATE EXTENSION IF NOT EXISTS postgis;

If extensions are missing, install them on the database server first (pg_trgm and fuzzystrmatch ship in the postgresql-contrib package; pgvector and PostGIS are separate installs), then re-run CREATE EXTENSION.

Cloudflare authentication issues - "Access denied" or authentication loop

Common fixes:

  1. TEAM_DOMAIN format
    • Must include https://
    • Correct: https://yourteam.cloudflareaccess.com
    • Wrong: yourteam.cloudflareaccess.com
  2. POLICY_AUD mismatch
    • Verify AUD tag matches your Access application exactly
    • Get AUD from Zero Trust dashboard → Access → Applications
  3. API token permissions
    • Ensure token has required permissions (see "What API token permissions do I need?" above)
    • Token must not be expired
  4. Clear browser cookies
    • Cloudflare Access stores authentication cookies
    • Clear cookies and try again

Docker permission errors - "permission denied" or "file not found"

Solutions by platform:

Windows:

  • Ensure Docker Desktop is running
  • Enable file sharing in Docker Desktop settings
  • Use absolute paths for volume mounts ($(pwd) is not expanded by cmd.exe)

Mac/Linux:

  • Check file permissions: chmod 755 /path/to/directory
  • Run Docker with correct user: docker run --user $(id -u):$(id -g)
  • Verify volume mount paths exist

General:

  • Use absolute paths for volume mounts
  • Ensure directories exist before mounting
  • Check Docker logs: docker logs [container-id]