MCP Server Playground

What is MCP?

Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.

MCP helps you build agents and complex workflows on top of LLMs by providing:

  • A growing list of pre-built integrations that your LLM can directly plug into
  • The flexibility to switch between LLM providers and vendors
  • Best practices for securing your data within your infrastructure

Learn more about MCP

Adding the Byterover MCP Server

Most IDEs support JSON configuration for MCP servers, making it easy to get started. Once you update the MCP configuration file in your IDE, the MCP server will be automatically downloaded and enabled.

Configuration Template

{
  "mcpServers": {
    "Byterover Memory MCP": {
      "command": "npx",
      "args": [
        "-y", "byterover-mcp",
        "--byterover-public-api-key=YOUR_PUBLIC_KEY",
        "--user-id=YOUR-NAME",
        "--llm-key-name=YOUR-LLM-KEY-NAME",
        "--model=YOUR-LLM-MODEL-NAME"
      ]
    }
  }
}

On Windows, use this configuration instead, which launches npx through cmd /c:

{
  "mcpServers": {
    "Byterover Memory MCP": {
      "command": "cmd",
      "args": [
        "/c", "npx", "-y", "byterover-mcp",
        "--byterover-public-api-key=YOUR_PUBLIC_KEY",
        "--user-id=YOUR-NAME",
        "--llm-key-name=YOUR-LLM-KEY-NAME",
        "--model=YOUR-LLM-MODEL-NAME"
      ]
    }
  }
}

Supported Models

Byterover supports integration with both OpenAI and Anthropic models:

OpenAI Models

  • gpt-4o - optimized for performance and cost
  • gpt-4o-mini - optimized for cost
  • o3-mini - optimized for performance and cost
  • o1 - optimized for performance
  • gpt-4.5-preview - preview version of GPT-4.5

Anthropic Models

  • claude-3-7-sonnet-20250219 - latest model, optimized for coding performance
  • claude-3-5-sonnet-20241022 - optimized for performance and cost
  • claude-3-opus-20240229 - optimized for performance
  • claude-3-5-haiku-20241022 - optimized for performance and cost
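
For example, if you choose one of the Anthropic models listed above, the args array from the configuration template might be filled in as follows. The user ID and LLM key name shown here are purely illustrative placeholders; keep your real public API key where YOUR_PUBLIC_KEY appears:

"args": [
  "-y", "byterover-mcp",
  "--byterover-public-api-key=YOUR_PUBLIC_KEY",
  "--user-id=jane-doe",
  "--llm-key-name=my-anthropic-key",
  "--model=claude-3-7-sonnet-20250219"
]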

IDE-Specific Setup

Different IDEs have their own ways of configuring MCP servers. Here’s how to set it up in popular IDEs:

Cursor

Place the configuration in either:

  • Project-specific: .cursor/mcp.json in your project directory
  • Global: ~/.cursor/mcp.json in your home directory
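
For example, a project-scoped setup saves the same configuration shown earlier as .cursor/mcp.json in the project root (on Windows, use the cmd /c variant). The user ID, key name, and model values below are the same illustrative placeholders used above:

{
  "mcpServers": {
    "Byterover Memory MCP": {
      "command": "npx",
      "args": [
        "-y", "byterover-mcp",
        "--byterover-public-api-key=YOUR_PUBLIC_KEY",
        "--user-id=jane-doe",
        "--llm-key-name=my-anthropic-key",
        "--model=claude-3-7-sonnet-20250219"
      ]
    }
  }
}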

Learn more about Cursor MCP setup

Windsurf

Add the configuration to:

  • ~/.codeium/windsurf/mcp_config.json

Learn more about Windsurf MCP setup

Cline

Add the configuration to:

  • .clinerules file or your custom instructions

Learn more about Cline MCP setup

Next Steps

The Byterover MCP server is a general-purpose, powerful tool that you can customize to suit your specific needs. Here are some ways to get the most out of it:

  1. Customize Memory Categories: Organize your code memories into categories that make sense for your workflow.

  2. Define Trigger Keywords: Set up specific keywords that will trigger the use of certain memories or tools.

  3. Create Memory Rules: Establish rules for when and how your agent should store and retrieve memories (see the illustrative sketch after this list).

  4. Integrate with Your Workflow: Use the MCP server alongside your existing development tools and processes.

  5. Monitor and Refine: Keep track of how the MCP server is being used and adjust its configuration based on your needs.
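
As a concrete illustration of points 2 and 3, a memory rule added to your agent's custom instructions (for example, in a .clinerules file) could read like the sketch below. The keyword, category name, and wording are entirely illustrative; adapt them to your own workflow and to the memories your Byterover server actually holds.

When a prompt contains the keyword "auth", retrieve Byterover memories from the
"authentication" category before writing any code, and follow the conventions
they describe. After the task is complete, store a short memory summarizing any
new pattern or decision so it can be reused in future sessions.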

Remember that the power of the Byterover MCP server lies in its flexibility - take time to configure it in a way that enhances your specific development workflow.