Building Dynamic AI Systems with the Model Context Protocol (MCP) for Real-Time Resource and Tool Integration
In this tutorial, we explore the Model Context Protocol (MCP) at a high level and demonstrate how it addresses one of the defining challenges in modern AI systems: enabling real-time interaction between AI models and external data or tools. Traditional models run in isolation, limited to their training data, but with MCP we create a bridge that lets models access live resources, run specialized tools, and adapt dynamically to changing environments. We build the MCP server and client from the ground up to show how each component contributes to this ecosystem of intelligent collaboration. Check out the complete code here.
import json
import asyncio
from dataclasses import dataclass, asdict
from typing import Dict, List, Any, Optional, Callable
from datetime import datetime
import random

@dataclass
class Resource:
    uri: str
    name: str
    description: str
    mime_type: str
    content: Any = None

@dataclass
class Tool:
    name: str
    description: str
    parameters: Dict[str, Any]
    handler: Optional[Callable] = None

@dataclass
class Message:
    role: str
    content: str
    timestamp: Optional[str] = None

    def __post_init__(self):
        if not self.timestamp:
            self.timestamp = datetime.now().isoformat()
We first define the basic building blocks of MCP: resources, tools, and messages. We design these data structures to represent, in a clean and structured way, how information flows between an AI system and its external environment.
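As a quick sanity check on the dataclasses above, the sketch below (using a copy of the `Message` class so it runs standalone) shows that the timestamp is filled in automatically by `__post_init__` and that `asdict()` yields a JSON-ready dict:

```python
# Standalone copy of the Message dataclass from the tutorial: the timestamp
# is auto-populated, and asdict() flattens the object for serialization.
from dataclasses import dataclass, asdict
from typing import Optional
from datetime import datetime

@dataclass
class Message:
    role: str
    content: str
    timestamp: Optional[str] = None

    def __post_init__(self):
        if not self.timestamp:
            self.timestamp = datetime.now().isoformat()

msg = Message(role="user", content="Hello MCP")
record = asdict(msg)
print(record["role"], record["content"])  # timestamp is set automatically
```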
class MCPServer:
    def __init__(self, name: str):
        self.name = name
        self.resources: Dict[str, Resource] = {}
        self.tools: Dict[str, Tool] = {}
        self.capabilities = {"resources": True, "tools": True, "prompts": True, "logging": True}
        print(f"✓ MCP Server '{name}' initialized with capabilities: {list(self.capabilities.keys())}")

    def register_resource(self, resource: Resource) -> None:
        self.resources[resource.uri] = resource
        print(f"  → Resource registered: {resource.name} ({resource.uri})")

    def register_tool(self, tool: Tool) -> None:
        self.tools[tool.name] = tool
        print(f"  → Tool registered: {tool.name}")

    async def get_resource(self, uri: str) -> Optional[Resource]:
        await asyncio.sleep(0.1)
        return self.resources.get(uri)

    async def execute_tool(self, tool_name: str, arguments: Dict[str, Any]) -> Any:
        if tool_name not in self.tools:
            raise ValueError(f"Tool '{tool_name}' not found")
        tool = self.tools[tool_name]
        if tool.handler:
            return await tool.handler(**arguments)
        return {"status": "executed", "tool": tool_name, "args": arguments}

    def list_resources(self) -> List[Dict[str, str]]:
        return [{"uri": r.uri, "name": r.name, "description": r.description} for r in self.resources.values()]

    def list_tools(self) -> List[Dict[str, Any]]:
        return [{"name": t.name, "description": t.description, "parameters": t.parameters} for t in self.tools.values()]
We implement an MCP server that manages resources and tools while handling execution and retrieval operations. It supports asynchronous interactions throughout, making it efficient and scalable for real-world AI applications.
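The heart of the server is its tool-dispatch pattern: handlers are async callables keyed by name, looked up and awaited at call time. The condensed, self-contained sketch below isolates that pattern (the `MiniServer` class and `echo` tool are illustrative names, not part of the tutorial code):

```python
# Minimal sketch of the server's registry-and-dispatch pattern: handlers are
# async callables stored by name, looked up and awaited with keyword args.
import asyncio
from typing import Any, Callable, Dict

class MiniServer:
    def __init__(self):
        self.tools: Dict[str, Callable] = {}

    def register_tool(self, name: str, handler: Callable) -> None:
        self.tools[name] = handler

    async def execute_tool(self, name: str, arguments: Dict[str, Any]) -> Any:
        if name not in self.tools:
            raise ValueError(f"Tool '{name}' not found")
        return await self.tools[name](**arguments)

async def echo(text: str) -> str:
    # A trivial hypothetical tool: uppercase the input.
    return text.upper()

server = MiniServer()
server.register_tool("echo", echo)
result = asyncio.run(server.execute_tool("echo", {"text": "mcp"}))
print(result)  # MCP
```

Unpacking the arguments dict with `**arguments` is what lets one `execute_tool` signature serve tools with different parameter lists.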
class MCPClient:
    def __init__(self, client_id: str):
        self.client_id = client_id
        self.connected_servers: Dict[str, MCPServer] = {}
        self.context: List[Message] = []
        print(f"\n✓ MCP Client '{client_id}' initialized")

    def connect_server(self, server: MCPServer) -> None:
        self.connected_servers[server.name] = server
        print(f"  → Connected to server: {server.name}")

    async def query_resources(self, server_name: str) -> List[Dict[str, str]]:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        return self.connected_servers[server_name].list_resources()

    async def fetch_resource(self, server_name: str, uri: str) -> Optional[Resource]:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        server = self.connected_servers[server_name]
        resource = await server.get_resource(uri)
        if resource:
            self.add_to_context(Message(role="system", content=f"Fetched resource: {resource.name}"))
        return resource

    async def call_tool(self, server_name: str, tool_name: str, **kwargs) -> Any:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        server = self.connected_servers[server_name]
        result = await server.execute_tool(tool_name, kwargs)
        self.add_to_context(Message(role="system", content=f"Tool '{tool_name}' executed"))
        return result

    def add_to_context(self, message: Message) -> None:
        self.context.append(message)

    def get_context(self) -> List[Dict[str, Any]]:
        return [asdict(msg) for msg in self.context]
We create an MCP client that connects to the server, queries resources, and executes tools. It maintains a contextual memory of all interactions, enabling continuous, stateful communication with the server.
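The client's contextual memory is simply a chronological list of `Message` records, one appended per fetch or tool call, which `get_context()` serializes via `asdict`. A minimal standalone sketch of that bookkeeping (the message contents here are examples, not real server output):

```python
# Sketch of the client's context bookkeeping: every interaction appends a
# Message, so the context doubles as a chronological audit trail.
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class Message:
    role: str
    content: str

context: List[Message] = []
# Simulate what fetch_resource and call_tool record:
context.append(Message(role="system", content="Fetched resource: 2024 Sales Data"))
context.append(Message(role="system", content="Tool 'analyze_sentiment' executed"))

history = [asdict(m) for m in context]
print(len(history))  # 2
```

Because the trail is plain dicts, it can be fed straight into a model prompt or logged as JSON.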
async def analyze_sentiment(text: str) -> Dict[str, Any]:
    await asyncio.sleep(0.2)
    sentiments = ["positive", "negative", "neutral"]
    return {"text": text, "sentiment": random.choice(sentiments), "confidence": round(random.uniform(0.7, 0.99), 2)}

async def summarize_text(text: str, max_length: int = 100) -> Dict[str, Any]:
    await asyncio.sleep(0.15)
    summary = text[:max_length] + "..." if len(text) > max_length else text
    return {"original_length": len(text), "summary": summary, "compression_ratio": round(len(summary) / len(text), 2)}

async def search_knowledge(query: str, top_k: int = 3) -> List[Dict[str, Any]]:
    await asyncio.sleep(0.25)
    mock_results = [{"title": f"Result {i+1} for '{query}'", "score": round(random.uniform(0.5, 1.0), 2)} for i in range(top_k)]
    return sorted(mock_results, key=lambda x: x["score"], reverse=True)
We define a set of asynchronous tool handlers, including sentiment analysis, text summarization, and knowledge search. We use them to simulate how MCP systems perform various operations through modular, pluggable tools.
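Because the handlers are plain async functions, each can be exercised in isolation before wiring it into the server. The self-contained sketch below runs a copy of `summarize_text` on a deterministic input (the sample sentence is ours, not from the tutorial):

```python
# Exercising a copy of the summarize_text handler standalone with asyncio.run.
import asyncio
from typing import Any, Dict

async def summarize_text(text: str, max_length: int = 100) -> Dict[str, Any]:
    await asyncio.sleep(0)  # stands in for async I/O latency
    summary = text[:max_length] + "..." if len(text) > max_length else text
    return {"original_length": len(text), "summary": summary,
            "compression_ratio": round(len(summary) / len(text), 2)}

result = asyncio.run(summarize_text(
    "The Model Context Protocol bridges models and tools.", max_length=20))
print(result["summary"])  # The Model Context Pr...
```

Testing handlers this way keeps tool logic decoupled from server plumbing, which is exactly what makes them pluggable.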
async def run_mcp_demo():
    print("=" * 60)
    print("MODEL CONTEXT PROTOCOL (MCP) - ADVANCED TUTORIAL")
    print("=" * 60)

    print("\n[1] Setting up MCP Server...")
    server = MCPServer("knowledge-server")

    print("\n[2] Registering resources...")
    server.register_resource(Resource(
        uri="docs://python-guide",
        name="Python Programming Guide",
        description="Comprehensive Python documentation",
        mime_type="text/markdown",
        content="# Python Guide\nPython is a high-level programming language..."))
    server.register_resource(Resource(
        uri="data://sales-2024",
        name="2024 Sales Data",
        description="Annual sales metrics",
        mime_type="application/json",
        content={"q1": 125000, "q2": 142000, "q3": 138000, "q4": 165000}))

    print("\n[3] Registering tools...")
    server.register_tool(Tool(
        name="analyze_sentiment",
        description="Analyze sentiment of text",
        parameters={"text": {"type": "string", "required": True}},
        handler=analyze_sentiment))
    server.register_tool(Tool(
        name="summarize_text",
        description="Summarize long text",
        parameters={"text": {"type": "string", "required": True}, "max_length": {"type": "integer", "default": 100}},
        handler=summarize_text))
    server.register_tool(Tool(
        name="search_knowledge",
        description="Search knowledge base",
        parameters={"query": {"type": "string", "required": True}, "top_k": {"type": "integer", "default": 3}},
        handler=search_knowledge))

    client = MCPClient("demo-client")
    client.connect_server(server)

    print("\n" + "=" * 60)
    print("DEMONSTRATION: MCP IN ACTION")
    print("=" * 60)

    print("\n[Demo 1] Listing available resources...")
    resources = await client.query_resources("knowledge-server")
    for res in resources:
        print(f"  • {res['name']}: {res['description']}")

    print("\n[Demo 2] Fetching sales data resource...")
    sales_resource = await client.fetch_resource("knowledge-server", "data://sales-2024")
    if sales_resource:
        print(f"  Data: {json.dumps(sales_resource.content, indent=2)}")

    print("\n[Demo 3] Analyzing sentiment...")
    sentiment_result = await client.call_tool("knowledge-server", "analyze_sentiment", text="MCP is an amazing protocol for AI integration!")
    print(f"  Result: {json.dumps(sentiment_result, indent=2)}")

    print("\n[Demo 4] Summarizing text...")
    summary_result = await client.call_tool("knowledge-server", "summarize_text", text="The Model Context Protocol enables seamless integration between AI models and external data sources...", max_length=50)
    print(f"  Summary: {summary_result['summary']}")

    print("\n[Demo 5] Searching knowledge base...")
    search_result = await client.call_tool("knowledge-server", "search_knowledge", query="machine learning", top_k=3)
    print("  Top results:")
    for result in search_result:
        print(f"    - {result['title']} (score: {result['score']})")

    print("\n[Demo 6] Current context window...")
    context = client.get_context()
    print(f"  Context length: {len(context)} messages")
    for i, msg in enumerate(context[-3:], 1):
        print(f"    {i}. [{msg['role']}] {msg['content']}")

    print("\n" + "=" * 60)
    print("✓ MCP Tutorial Complete!")
    print("=" * 60)
    print("\nKey Takeaways:")
    print("• MCP enables modular AI-to-resource connections")
    print("• Resources provide context from external sources")
    print("• Tools enable dynamic operations and actions")
    print("• Async design supports efficient I/O operations")

if __name__ == "__main__":
    import sys
    if 'ipykernel' in sys.modules or 'google.colab' in sys.modules:
        # Notebooks already run an event loop, so schedule the coroutine on it
        # (in a notebook cell you can instead write `await run_mcp_demo()` directly).
        asyncio.get_event_loop().create_task(run_mcp_demo())
    else:
        asyncio.run(run_mcp_demo())
We put everything together into a complete demo in which the client interacts with the server, fetches data, runs tools, and maintains context. We see the full potential of MCP as it seamlessly integrates AI logic with external knowledge and computation.
All in all, the uniqueness of the problem we address here lies in breaking the boundaries of static AI systems. Rather than treating models as closed boxes, we design an architecture that enables them to query, reason about, and process real-world data in a structured, context-driven way. This dynamic interoperability, enabled by the MCP framework, represents a major shift toward modular, tool-augmented intelligence. By understanding and implementing MCP, we position ourselves to build the next generation of adaptive AI systems that can think, learn, and connect beyond their original limitations.
Asif Razzaq is the CEO of Marktechpost Media Inc. A visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for the benefit of society. His most recent endeavor is the launch of Marktechpost, an artificial intelligence media platform that stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and accessible to a broad audience. The platform draws more than 2 million monthly views, a testament to its popularity among readers.