The Flamechart MCP (Model Context Protocol) server enables AI assistants to analyze performance traces and generate flamegraph visualizations. It works with both local trace files and remote traces stored on FlameDeck, across all major trace formats.

Best Model Performance - For optimal flamegraph analysis, we recommend using AI models with strong image understanding capabilities such as OpenAI’s o3. These models can better interpret the visual flamegraph outputs and provide more detailed insights about performance bottlenecks.

Quick Start

1. Install the MCP server

Add the following to your ~/.cursor/mcp.json file (requires Cursor v1.0+):

{
  "mcpServers": {
    "flamechart-debug": {
      "command": "npx",
      "args": ["-y", "@flamedeck/flamechart-mcp"]
    }
  }
}

2. Start Analyzing

Try this prompt in your AI assistant (use the absolute local file path to your trace file):

Root cause the slowness in this trace: 
/path/to/your/profile.json

Working with Remote Traces

Setting Up FlameDeck Integration

1. Create an API Key

  • Go to FlameDeck Settings
  • Create a new key with trace:download permissions
  • Copy the generated key

2. Configure the MCP Server

Add the following to your MCP server configuration (requires Cursor v1.0+):

{
  "mcpServers": {
    "flamechart-debug": {
      "command": "npx",
      "args": ["-y", "@flamedeck/flamechart-mcp"],
      "env": {
        "FLAMEDECK_API_KEY": "your_api_key_here"
      }
    }
  }
}

Using Remote Traces

Once configured, you can analyze traces directly from FlameDeck URLs:

Analyze this production trace and find performance bottlenecks:
https://www.flamedeck.com/traces/98508d02-1f2a-4885-9607-ecadceb3d734

Focus on database operations and API endpoints.
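A FlameDeck URL identifies a trace by the UUID in its path. As a rough illustration of that mapping (the server resolves URLs internally, so the function name and validation here are hypothetical):

```typescript
// Hypothetical sketch: pull the trace ID out of a FlameDeck trace URL.
// The real server handles URL resolution itself; this only shows the shape.
function extractTraceId(traceUrl: string): string | null {
  const url = new URL(traceUrl);
  if (!url.hostname.endsWith("flamedeck.com")) return null;
  // Trace pages live at /traces/<uuid> (36-character UUID).
  const match = url.pathname.match(/^\/traces\/([0-9a-f-]{36})$/i);
  return match ? match[1] : null;
}
```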

Example Workflows

Debugging Slow React Rendering

I have a React app that's rendering slowly. Analyze this Chrome DevTools profile:
/Users/dev/profiles/react-slow-render.json

Please:
1. Show me the top functions taking the most time
2. Generate a flamegraph to visualize the call stack
3. Look for any React-specific bottlenecks like expensive re-renders
4. Focus on functions in my application code vs React internals

API Performance Investigation

Our API endpoints are slow. Help me analyze this production trace:
https://www.flamedeck.com/traces/5e538693-a0b6-40a0-a932-8a6b48c2f269

Investigation goals:
1. Identify the slowest endpoints
2. Find database vs application time breakdown
3. Look for N+1 query patterns
4. Generate a visual report I can share with the team

Supported Trace Formats

The MCP server supports all major profiling formats, including pprof, Chrome DevTools profiles, and pyinstrument.

Automatic Format Detection - The server automatically detects trace formats and handles gzipped files transparently.
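Gzip detection is cheap because every gzip stream starts with the magic bytes 0x1f 0x8b. A minimal sketch of transparent decompression, assuming Node's zlib (illustrative, not the server's actual code):

```typescript
import { gzipSync, gunzipSync } from "zlib";

// Sketch: decide from the first two bytes whether a trace file is gzipped,
// and decompress it before parsing if so.
function maybeGunzip(buf: Buffer): Buffer {
  const isGzipped = buf.length >= 2 && buf[0] === 0x1f && buf[1] === 0x8b;
  return isGzipped ? gunzipSync(buf) : buf;
}
```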

Available Tools

get_top_functions

Identifies the slowest functions in your trace, helping you find performance bottlenecks.

  • trace (string, required): File path or FlameDeck URL
  • sortBy (string, default: "total"): Sort by 'self' or 'total' time
  • offset (number, default: 0): Starting position for pagination
  • limit (number, default: 15): Number of functions to return

Example Usage:

Show me the top 20 functions consuming the most total time in this trace:
/Users/dev/profiles/api-slow.cpuprofile
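Conceptually, the tool ranks frames by self or total time and then applies offset/limit pagination. A sketch of that computation (the Frame shape is hypothetical, not the server's internal representation):

```typescript
// Illustrative sketch of what get_top_functions computes: frames ranked by
// self or total time, then paginated with offset/limit.
type Frame = { name: string; selfMs: number; totalMs: number };

function topFunctions(
  frames: Frame[],
  sortBy: "self" | "total" = "total",
  offset = 0,
  limit = 15
): Frame[] {
  const key = sortBy === "self" ? "selfMs" : "totalMs";
  // Copy before sorting so the caller's array is left untouched.
  return [...frames].sort((a, b) => b[key] - a[key]).slice(offset, offset + limit);
}
```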

generate_flamegraph_screenshot

Creates a visual flamegraph showing the timeline of function execution.

  • trace (string, required): File path or FlameDeck URL
  • width (number, default: 1200): Image width in pixels
  • height (number, default: 800): Image height in pixels
  • mode (string, default: "light"): 'light' or 'dark' theme
  • startTimeMs (number, optional): Start time in milliseconds for a zoomed view
  • endTimeMs (number, optional): End time in milliseconds for a zoomed view
  • startDepth (number, optional): Stack depth at which to start the visualization

Example Usage:

Generate a dark mode flamegraph of this trace for my presentation:
/path/to/production-profile.json
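The zoom parameters work by narrowing what gets rendered: an event appears only if it overlaps the [startTimeMs, endTimeMs] window and sits at or below the starting depth. A sketch of that filter (the Event shape here is hypothetical):

```typescript
// Illustrative sketch of the zoom parameters' effect: keep an event when it
// overlaps the requested time window and its depth is at least startDepth.
type Event = { name: string; startMs: number; endMs: number; depth: number };

function visibleEvents(
  events: Event[],
  startTimeMs = -Infinity,
  endTimeMs = Infinity,
  startDepth = 0
): Event[] {
  return events.filter(
    (e) => e.endMs > startTimeMs && e.startMs < endTimeMs && e.depth >= startDepth
  );
}
```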

generate_sandwich_flamegraph_screenshot

Creates a “sandwich view” focusing on a specific function, showing both its callers and callees.

  • trace (string, required): File path or FlameDeck URL
  • frameName (string, required): Exact function name to focus on

Example Usage:

Create a sandwich view for the "processData" function to understand what's calling it:
/path/to/trace.json
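The sandwich view's aggregation can be pictured like this: in every stack containing the focus frame, frames above it count as its callers and frames below it as its callees. A simplified sketch (representing stacks as string arrays is an assumption for illustration):

```typescript
// Illustrative sketch of the sandwich view's aggregation: split each stack
// containing the focus frame into callers (above it) and callees (below it).
function sandwich(stacks: string[][], frameName: string) {
  const callers = new Set<string>();
  const callees = new Set<string>();
  for (const stack of stacks) {
    const i = stack.indexOf(frameName);
    if (i === -1) continue; // focus frame not on this stack
    for (const f of stack.slice(0, i)) callers.add(f);
    for (const f of stack.slice(i + 1)) callees.add(f);
  }
  return { callers: [...callers], callees: [...callees] };
}
```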

Performance Features

Smart Caching

The MCP server includes intelligent caching to improve performance:

  • Remote URLs: Cached for 5 minutes to avoid repeated API calls
  • Local Files: Cached until a file modification is detected
  • Memory Management: Automatic cleanup with a 50-profile limit
  • Cache Invalidation: Smart invalidation based on file changes

Performance Boost - Subsequent tool calls on the same trace are nearly instant thanks to smart caching.
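The caching behavior described above can be sketched as an LRU map capped at 50 profiles, keyed by trace path and invalidated when the file's mtime changes. All names and shapes below are illustrative, not the server's actual code:

```typescript
// Hypothetical sketch of the caching strategy: an LRU cache keyed by trace
// path that invalidates an entry whenever the file's mtime changes.
type CacheEntry = { profile: unknown; mtimeMs: number };

class ProfileCache {
  private entries = new Map<string, CacheEntry>();
  constructor(private maxEntries = 50) {}

  get(path: string, currentMtimeMs: number): unknown {
    const entry = this.entries.get(path);
    if (!entry) return undefined;
    if (entry.mtimeMs !== currentMtimeMs) {
      // The file changed on disk: drop the stale entry.
      this.entries.delete(path);
      return undefined;
    }
    // Re-insert to mark this entry as most recently used.
    this.entries.delete(path);
    this.entries.set(path, entry);
    return entry.profile;
  }

  set(path: string, profile: unknown, mtimeMs: number): void {
    if (!this.entries.has(path) && this.entries.size >= this.maxEntries) {
      // Evict the least recently used entry (first in insertion order).
      const oldest = this.entries.keys().next().value;
      if (oldest !== undefined) this.entries.delete(oldest);
    }
    this.entries.set(path, { profile, mtimeMs });
  }
}
```

Map preserves insertion order in JavaScript, which is why deleting and re-inserting an entry is enough to track recency.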

Troubleshooting

Getting Help

For technical support or to report issues, please submit a ticket through our GitHub issue tracker.

Advanced Usage

Custom Flamegraph Views

Create focused visualizations by specifying time ranges:

Generate a flamegraph focusing on the 500ms-1500ms time range of this trace:
/path/to/long-running-profile.json

I want to see what happened during that specific slow period.

Integration with Code Analysis

Combine trace analysis with code review:

Analyze this performance trace and then look through my codebase to understand the bottlenecks:
/path/to/slow-api.cpuprofile

Focus on the slowest functions and help me understand:
1. What those functions actually do in the code
2. Why they might be slow
3. Potential optimization strategies