An MCP (Model Context Protocol) server that provides AI agents with database operations, serverless code deployment, and file management capabilities on the Codehooks.io platform.
- Query and update collections (including metadata) with filters and sorting
- Create and manage collections
- Import/export data (JSON, JSONL, CSV)
- Add schemas and indexes, cap collections
- Deploy JavaScript serverless functions
- Upload files to cloud storage
- List and browse files
- Delete files
- Inspect file metadata
- Store key-value pairs
- Retrieve one or many key-value pairs
- Delete key-value pairs
- Set time-to-live (TTL) for key-value pairs
- View application logs
- Access API documentation (local documentation for the MCP agent)
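If you do not already have the Codehooks CLI (`coho`), it is distributed as an npm package (this assumes Node.js is installed):

```bash
npm install -g codehooks
```

Then log in and create an admin token for the MCP server to use: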
```bash
coho login
coho add-admintoken
```
Create a folder for your MCP server scripts:
```bash
mkdir ~/mcp-servers
cd ~/mcp-servers
```
For macOS/Linux - Create `codehooks.sh`:
```bash
#!/bin/bash
# Set PATH to include common Docker locations
export PATH="/usr/local/bin:/opt/homebrew/bin:/usr/bin:/bin:$PATH"
exec docker run --rm -i \
  -e CODEHOOKS_PROJECT_NAME=your_project_name \
  -e CODEHOOKS_ADMIN_TOKEN=your_admin_token \
  -e CODEHOOKS_SPACE=your_space_name \
  ghcr.io/restdb/codehooks-mcp:latest
```
Make it executable:
```bash
chmod +x ~/mcp-servers/codehooks.sh
```
For Windows - Create `codehooks.bat`:
```batch
@echo off
docker run --rm -i ^
  -e CODEHOOKS_PROJECT_NAME=your_project_name ^
  -e CODEHOOKS_ADMIN_TOKEN=your_admin_token ^
  -e CODEHOOKS_SPACE=your_space_name ^
  ghcr.io/restdb/codehooks-mcp:latest
```
Replace `your_project_name`, `your_admin_token`, and `your_space_name` with your actual values.
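Before wiring the script into an agent, you can confirm that Docker can reach the image (assuming Docker is installed and running):

```bash
docker pull ghcr.io/restdb/codehooks-mcp:latest
```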
Add to your `claude_desktop_config.json`:
macOS/Linux:
```json
{
  "mcpServers": {
    "codehooks": {
      "command": "/Users/username/mcp-servers/codehooks.sh"
    }
  }
}
```
Windows:
```json
{
  "mcpServers": {
    "codehooks": {
      "command": "C:\\Users\\username\\mcp-servers\\codehooks.bat"
    }
  }
}
```
Add to your `~/.cursor/mcp.json`:
macOS/Linux:
```json
{
  "mcpServers": {
    "codehooks": {
      "command": "/Users/username/mcp-servers/codehooks.sh"
    }
  }
}
```
Windows:
```json
{
  "mcpServers": {
    "codehooks": {
      "command": "C:\\Users\\username\\mcp-servers\\codehooks.bat"
    }
  }
}
```
Replace `username` with your actual username.
- "Build a complete survey system: create a database, deploy an API to collect responses, and add search/analytics endpoints"
- "Set up a real-time inventory tracker: import my product CSV, create stock update webhooks, and build low-stock alerts"
- "Build a webhook processing pipeline: receive webhooks from multiple sources, transform and validate data, then trigger automated actions"
- "Build a content management system: create file upload endpoints, set up a metadata database, and deploy content delivery APIs"
- "Set up automated data backups: export my collections to JSON files, store them with timestamps, and create restoration endpoints"
For the survey system, the AI agent would:
- Create collections (`surveys`, `responses`) for data storage
- Add schemas for data validation and structure
- Deploy JavaScript endpoints like `POST /surveys` and `GET /surveys/:id/analytics` (a minimal sketch follows this list)
- Create indexes on response fields for fast searching and analytics
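For illustration, a minimal version of such endpoints written with the codehooks-js library might look like the sketch below; the actual code the agent generates and deploys will differ:

```javascript
// Minimal sketch of survey endpoints (codehooks-js); route and collection
// names follow the example above, error handling omitted for brevity.
import { app, Datastore } from 'codehooks-js';

// Collect a survey response
app.post('/surveys', async (req, res) => {
  const db = await Datastore.open();
  const saved = await db.insertOne('surveys', req.body);
  res.json(saved);
});

// Fetch one survey by ID
app.get('/surveys/:id', async (req, res) => {
  const db = await Datastore.open();
  const survey = await db.getOne('surveys', req.params.id);
  res.json(survey);
});

// Bind the routes to the Codehooks serverless runtime
export default app.init();
```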
For the inventory tracker, the AI agent would:
- Import your CSV to populate the `products` collection
- Deploy webhook handlers for `POST /inventory/update` and `GET /inventory/low-stock` (see the sketch after this list)
- Set up key-value storage for alert thresholds and settings
- Create indexes on SKU and stock levels for real-time queries
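A rough sketch of the low-stock endpoint, combining a `products` query with a threshold kept in the key-value store; the key name `lowStockThreshold` and the fallback value are assumptions of this sketch:

```javascript
import { app, Datastore } from 'codehooks-js';

// List products whose stock has fallen below a configurable threshold
app.get('/inventory/low-stock', async (req, res) => {
  const db = await Datastore.open();
  // Threshold kept as a key-value pair; default to 10 if it has not been set
  const stored = await db.get('lowStockThreshold');
  const threshold = parseInt(stored || '10', 10);
  const cursor = db.getMany('products', { stock: { $lt: threshold } });
  res.json(await cursor.toArray());
});

export default app.init();
```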
For the webhook processing pipeline, the AI agent would:
- Deploy webhook receivers like `POST /webhooks/stripe` and `POST /webhooks/github`
- Create collections for `webhook_logs`, `processed_events`, and `failed_events`
- Set up data transformation rules and validation schemas for each webhook source
- Use the key-value store for rate limiting and duplicate detection with TTL (sketched below)
- Deploy action triggers that send emails, update databases, or call other APIs based on webhook data
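The duplicate-detection step can be sketched with the key-value store's TTL support: remember each event ID for a fixed window and skip anything already seen. The `id` field and the 24-hour window are assumptions here:

```javascript
import { app, Datastore } from 'codehooks-js';

// Receive a webhook, skip duplicates, and log the event
app.post('/webhooks/stripe', async (req, res) => {
  const db = await Datastore.open();
  const eventId = req.body.id; // assumes the sender includes a unique event id
  if (await db.get(`evt:${eventId}`)) {
    return res.json({ skipped: true, id: eventId }); // already processed
  }
  // Mark the event as seen; the key expires automatically after 24 hours
  await db.set(`evt:${eventId}`, '1', { ttl: 24 * 60 * 60 * 1000 });
  await db.insertOne('webhook_logs', req.body);
  res.json({ received: true, id: eventId });
});

export default app.init();
```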
For the content management system, the AI agent would:
- Create collections for `content`, `media`, and `users`
- Deploy file upload endpoints with `POST /upload` and `GET /content/:id`
- Upload and manage static files for content delivery
- Store metadata linking files to content records with search indexes (the metadata lookup is sketched below)
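Only the metadata lookup is sketched here; the files themselves would be uploaded and served through the platform's file storage tools, with each `content` record pointing at a stored file (the field names are illustrative):

```javascript
import { app, Datastore } from 'codehooks-js';

// Return the metadata record for one piece of content; the record links
// to a file stored separately, e.g. { title, filePath, contentType }
app.get('/content/:id', async (req, res) => {
  const db = await Datastore.open();
  const record = await db.getOne('content', req.params.id);
  res.json(record);
});

export default app.init();
```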
For the automated data backups, the AI agent would:
- Export collections to JSON format with timestamps
- Upload backup files to cloud storage automatically
- Deploy restoration APIs like `GET /backups` and `POST /restore/:backup-id`
- Store backup metadata in key-value store for tracking and management
Each example demonstrates how multiple MCP tools work together to create complete, production-ready systems through natural conversation with your AI agent.