Model Context Protocol (MCP) Explained in 20 Minutes


Ever wished you could plug custom tools, files, and instructions directly into your AI model, just like plugging a device into a USB-C port? The current landscape of AI integrations can feel like a tangled mess of proprietary adapters and one-off APIs.

That’s where the Model Context Protocol (MCP) comes in.

Developed by Anthropic, MCP is an open standard designed to be the universal connector between AI applications and external context. It creates a simple, standardized way for your AI to access the specific tools and data it needs to get the job done.

In this guide, we’ll break down what MCP is, why it’s a game-changer, and walk you through building your very own MCP server using Python.



Why MCP is a Game-Changer for AI Development

Think of MCP as the USB-C of AI. It replaces a messy drawer of custom connectors with a single, powerful standard. This simple idea has profound benefits:

  • Create Portable Toolsets: Build a set of tools once and use it everywhere. With MCP, you can dynamically connect your custom toolkit to different AI models or IDEs without starting from scratch each time.
  • Seamless Custom Integrations: Effortlessly connect your AI to proprietary databases, internal APIs, or local files. MCP provides a clean architecture for feeding your model the context it needs to perform specialized tasks.
  • Unlock a Rich Ecosystem: As an open standard, MCP allows you to tap into a growing ecosystem of open-source servers. This means you can enhance your own AI applications by leveraging tools and resources built by the community.



Under the Hood: How MCP Works

MCP uses a straightforward client-server architecture.

  • The Client: An AI application (like Claude Desktop) that needs access to external information.
  • The Server: An independent program you build that listens for requests from the client and provides the necessary context.

The client and server communicate through a transport layer, like standard I/O (stdio) or HTTP. The server exposes its capabilities to the client through three simple but powerful primitives:

  1. Prompts: These are snippets of text, like system prompts or instructions, that can be dynamically sent to the model to guide its behavior. Example: A prompt that tells the AI to always respond in the persona of a helpful pirate.
  2. Resources: These are pieces of data identified by a URI (like a file path). When the client requests a resource, the server fetches and returns its content. Example: A resource that loads the contents of email_template.md from your local disk.
  3. Tools: These are functions the AI model can call to perform actions, like searching a database, sending an email, or calling an external API. Example: A tool named write_email_draft that connects to the Gmail API.
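Under the hood, all of this travels as JSON-RPC 2.0 messages. As a rough, hand-written sketch (the tool name and argument values below are illustrative placeholders, not captured from a real session), a client asking a server to invoke a tool sends a `tools/call` request, and the server answers with the tool's output wrapped as content:

```python
import json

# Hedged sketch of the JSON-RPC 2.0 exchange behind an MCP tool call.
# The method name "tools/call" comes from the MCP spec; the tool name
# and arguments are made up for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "write_email_draft",
        "arguments": {
            "to": "ada@example.com",
            "subject": "Hello",
            "body": "Hi Ada, just checking in.",
        },
    },
}

# A successful response echoes the request id and returns the tool's
# output as a list of content blocks.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [
            {"type": "text", "text": "Success! Email draft for 'Hello' was created."}
        ]
    },
}

print(json.dumps(request, indent=2))
```

You never write these messages by hand in practice; the SDK on each side serializes and parses them for you. But seeing the shape makes it clear why the server below only has to declare plain Python functions.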



Let’s Build: Your First MCP Server in Python

Ready to see it in action? Let’s build a simple MCP server named “AVA” (Artificial Virtual Assistant) that provides a custom prompt, a file resource, and a tool to draft an email.



Prerequisites

We’ll use UV, an extremely fast Python package manager from Astral, to set up our environment.

Install UV (if you don’t have it):

curl -LsSf https://astral.sh/uv/install.sh | sh

Step 1: Set Up Your Project

First, create a new directory for your project, set up a virtual environment, and install the official MCP Python SDK.

# Create and enter your project directory
mkdir mcp-server-ava
cd mcp-server-ava

# Create a virtual environment with UV
uv venv

# Install the official MCP Python SDK (with its optional CLI extras)
uv pip install "mcp[cli]"

Now, create a Python file named server.py.



Step 2: Create the Server and Define its Abilities

In server.py, we’ll import FastMCP from the official MCP SDK, create the server instance, and then define our prompt, resource, and tool using simple decorators.

# server.py

import sys

from mcp.server.fastmcp import FastMCP

# 1. Initialize our MCP server and give it a name
mcp = FastMCP("AVA")

# 2. Add a prompt using the @mcp.prompt() decorator
@mcp.prompt()
def global_instructions() -> str:
    """Provides a system prompt to guide the AI's personality."""
    return (
        "You are AVA, a helpful and professional Artificial Virtual Assistant. "
        "You are to be concise and always assist the user with their request."
    )

# 3. Add a resource, identified by a URI, to load a local file
@mcp.resource("file://email_template.md")
def email_template() -> str:
    """Loads the content of an email template file."""
    # For this example, create a file named 'email_template.md'
    # in the same directory.
    try:
        with open("email_template.md", "r") as f:
            return f.read()
    except FileNotFoundError:
        return "Error: email_template.md not found."

# 4. Add a tool to perform an action
@mcp.tool()
def write_email_draft(to: str, subject: str, body: str) -> str:
    """
    Simulates writing an email draft.
    In a real-world scenario, this would interact with an email API.
    """
    # Log to stderr: with the stdio transport, stdout carries the
    # protocol messages, so printing to it would corrupt them.
    print("--- MOCK EMAIL DRAFT ---", file=sys.stderr)
    print(f"To: {to}", file=sys.stderr)
    print(f"Subject: {subject}", file=sys.stderr)
    print(f"Body: {body}", file=sys.stderr)
    print("------------------------", file=sys.stderr)
    return f"Success! Email draft for '{subject}' was created."

# 5. Run the server over the default stdio transport
if __name__ == "__main__":
    mcp.run()

Before running, create the email_template.md file that our resource will load:



Hi [Name],

I hope this email finds you well.

...

Best,
AVA



Step 3: Run Your Server

Now, run your server from the terminal to check that it starts cleanly. With the stdio transport, it will simply sit and wait for a client on standard input.

# Make sure your virtual environment is active
source .venv/bin/activate

# Run the server
python server.py

Your server is now live and listening for requests! If you installed the CLI extras, you can also try it out interactively in the browser-based MCP Inspector by running mcp dev server.py.



Step 4: Connecting to an AI Application

The final step is to connect your server to an MCP client, such as Claude Desktop. You don’t need to keep the server running yourself: with the default stdio transport, the client launches the server process and talks to it over standard input and output.

The process is simple:

  1. Go to your AI application’s settings for tools or context.
  2. Find the option to add a new MCP server.
  3. Point the client at your script by telling it which command to run. The client will execute your python server.py command and communicate with it directly.
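For Claude Desktop specifically, the server list lives in a claude_desktop_config.json file (on macOS under ~/Library/Application Support/Claude/, on Windows under %APPDATA%\Claude\). A minimal entry for AVA might look like the sketch below; the script path is a placeholder you’d replace with your own:

```json
{
  "mcpServers": {
    "ava": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```

If the mcp package is only installed in your project’s virtual environment, use the full path to that environment’s Python interpreter as the command so the client can import it.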

Once connected, the AI application will automatically discover the global_instructions prompt, the email_template resource, and the write_email_draft tool, making them available for the model to use.



The Future is Standardized

The Model Context Protocol represents a fundamental shift toward a more open, modular, and powerful AI ecosystem. By creating a universal standard for context, MCP empowers developers to build more capable and customized AI solutions without being locked into a single platform.

You’ve just built a simple server, but the possibilities are endless. You could create tools that query your company’s knowledge base, resources that pull in real-time data, or prompts that perfectly tune an AI for your brand’s voice.

So, what will you build first?


