Daniel Alejandro Figueroa Arias

Originally published at medium.alejofig.com

Deploying an MCP server on AWS

The goal of this post is to demonstrate how to build and deploy a Model Context Protocol (MCP) server on AWS, exposing a FastAPI-based API as MCP tools.

📂 Repo: https://github.com/alejofig/mcp-berghain


Solution Architecture

  • Data Extraction: Using Firecrawl MCP to scrape data from berghain.berlin.
  • Data Persistence: Storing events in AWS DynamoDB.
  • REST API: Exposing the data through FastAPI.
  • MCP Publication: Publishing endpoints as MCP tools using FastMCP.
  • Deployment: Using Docker and AWS App Runner to deploy the solution.
  • AI Agent: Querying the MCP through PydanticAI using natural language.

Data Extraction with Firecrawl MCP

Firecrawl MCP simplifies web scraping by providing structured JSON output through an API.

Configuration example:

"firecrawl-mcp": {
  "command": "npx",
  "args": ["-y", "firecrawl-mcp"],
  "env": { "FIRECRAWL_API_KEY": "fc-..." }
}

After running the script, JSON files containing the event data are generated and ready for processing.
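
For illustration, one of the generated files might look like this (the fields match the record schema used later in DynamoDB; the values here are made up):

{
  "id": "6f1c2a9e-...",
  "date": "2025-03-14",
  "title": "Klubnacht",
  "location": "Berghain",
  "artists": ["DJ A", "DJ B"],
  "url": "https://www.berghain.berlin/en/event/..."
}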


Storing Data in DynamoDB

Data is stored in a DynamoDB table using id as the partition key.

Creating the table:

python create_dynamodb_table.py --table berghain --region us-east-1

Table schema:

table = dynamodb.create_table(
    TableName=table_name,
    KeySchema=[{'AttributeName': 'id', 'KeyType': 'HASH'}],
    AttributeDefinitions=[{'AttributeName': 'id', 'AttributeType': 'S'}],
    ProvisionedThroughput={'ReadCapacityUnits': 5, 'WriteCapacityUnits': 5}
)

Loading JSON data into the table:

python import_events.py --path ./events --table berghain --region us-east-1
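
import_events.py itself isn't reproduced here; a minimal sketch of the loading step, assuming plain boto3 and one event dict per JSON file, could look like this:

import json
import pathlib

import boto3

table = boto3.resource("dynamodb", region_name="us-east-1").Table("berghain")

# batch_writer buffers put_item calls and flushes them in batches of up to 25
with table.batch_writer() as batch:
    for path in pathlib.Path("./events").glob("*.json"):
        event = json.loads(path.read_text())
        batch.put_item(Item=event)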

Each record contains:

  • id (UUID)
  • date
  • title
  • location
  • artists (array)
  • url

Building the API with FastAPI

The API is built using FastAPI to expose the event data with filtering and pagination.

Main Endpoints:

  • /{event_id} → Event by ID.
  • /year/{year}/month/{month} → Events by year and month.
  • /location/{location} → Events by location.
  • /artist/{artist} → Events by artist.

Example: Get Events by Month

@router.get("/year/{year}/month/{month}", response_model=List[Event])
async def get_events_by_month(year: int, month: int, repo: EventRepository = Depends()):
    events = await repo.get_by_year_month(year, month)
    return events

The repository layer abstracts DynamoDB access and handles pagination using limit and last_evaluated_key.
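
The repository isn't shown in full in this post; a minimal sketch of the month query, assuming plain boto3 and dates stored as ISO strings, could look like this (since id is the partition key, filtering by date is a scan rather than a query):

import asyncio

import boto3
from boto3.dynamodb.conditions import Attr

class EventRepository:
    def __init__(self, table_name: str = "berghain"):
        self.table = boto3.resource("dynamodb").Table(table_name)

    async def get_by_year_month(self, year: int, month: int,
                                limit: int = 50, last_evaluated_key: dict | None = None):
        kwargs = {
            "FilterExpression": Attr("date").begins_with(f"{year}-{month:02d}"),
            "Limit": limit,
        }
        if last_evaluated_key:
            kwargs["ExclusiveStartKey"] = last_evaluated_key
        # boto3 is synchronous, so run the call off the event loop
        response = await asyncio.to_thread(self.table.scan, **kwargs)
        # response.get("LastEvaluatedKey") is what the API hands back for the next page
        return response["Items"]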


Publishing the API as MCP Tools with FastMCP

Using FastMCP, the API endpoints are exposed as MCP tools, making them accessible by AI agents.

Configure endpoints as MCP tools:

import asyncio

from fastmcp import FastMCP
from fastmcp.server.openapi import RouteMap, RouteType

custom_maps = [
    RouteMap(methods=["GET"], pattern=r"^/api/v1/year/.*", route_type=RouteType.TOOL),
    RouteMap(methods=["GET"], pattern=r"^/api/v1/location/.*", route_type=RouteType.TOOL),
    RouteMap(methods=["GET"], pattern=r"^/api/v1/artist/.*", route_type=RouteType.TOOL),
]

if __name__ == "__main__":
    async def main():
        mcp = FastMCP.from_fastapi(app=app, route_maps=custom_maps)
        await check_mcp(mcp)  # small validation helper defined in the repo
        await mcp.run_async(transport="sse", host="0.0.0.0", port=8000)

    asyncio.run(main())

You can also use the operation_id parameter in the FastAPI router to explicitly name each tool.
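
For example, the month endpoint above could pin its tool name like this:

@router.get(
    "/year/{year}/month/{month}",
    response_model=List[Event],
    operation_id="get_events_by_month",  # FastMCP uses the operation ID as the tool name
)
async def get_events_by_month(year: int, month: int, repo: EventRepository = Depends()):
    return await repo.get_by_year_month(year, month)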


Deployment on AWS App Runner

The solution is containerized and deployed using AWS App Runner, a fully managed service for running containerized web applications.

Deployment Steps:

  1. Build and push the Docker image to Amazon ECR (example commands below).
  2. Deploy the image using AWS App Runner and configure the public endpoint.
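
The push to ECR typically looks like this (replace <account-id> with your AWS account ID; the repository name mcp-berghain is just an example):

aws ecr create-repository --repository-name mcp-berghain --region us-east-1
aws ecr get-login-password --region us-east-1 | \
  docker login --username AWS --password-stdin <account-id>.dkr.ecr.us-east-1.amazonaws.com
docker build -t mcp-berghain .
docker tag mcp-berghain:latest <account-id>.dkr.ecr.us-east-1.amazonaws.com/mcp-berghain:latest
docker push <account-id>.dkr.ecr.us-east-1.amazonaws.com/mcp-berghain:latest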

You can also use the main.tf provided in the repository to deploy App Runner with a role that grants DynamoDB access.


Consuming the MCP Using an AI Agent (PydanticAI)

import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerHTTP

agent = Agent(
    "openai:gpt-4o-mini",
    system_prompt="You are a helpful assistant that can answer questions and help with tasks.",
    mcp_servers=[MCPServerHTTP(url="https://mcp-berghain.alejofig.com/sse")],
)

async def main():
    # the context manager opens the SSE connection to the MCP server
    async with agent.run_mcp_servers():
        result = await agent.run("I want to know the events of Berghain for March 2025")
        print(result.output)  # (.data on older pydantic-ai versions)

asyncio.run(main())

Test the live solution here:

https://mcp-berghain.alejofig.com/sse
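
A quick sanity check from the terminal: curl with buffering disabled (-N) should start streaming SSE events from the endpoint.

curl -N https://mcp-berghain.alejofig.com/sse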


Conclusion

This solution demonstrates how to efficiently orchestrate data extraction, API development, and AI integration using a fully serverless and scalable approach on AWS.

Key outcomes:

  • Automated web data extraction with Firecrawl MCP.
  • Efficient data storage and querying with AWS DynamoDB.
  • MCP-compatible API publication with FastMCP.
  • Seamless deployment using Docker and AWS App Runner.
  • Natural language interaction with APIs through AI agents using PydanticAI.

This architecture can be reused for any use case requiring structured data exposure to AI models, enabling intelligent, conversational access to real-time information.
