Alliance DAO Researcher: A Deep Dive into MCP, the Hot New Concept in AI


For AI applications, MCP is like USB-C in hardware.

Author: Mohamed ElSeidy

Compiled by: Deep Tide TechFlow

Introduction

Yesterday, the AI-related token $Dark launched on Binance Alpha, reaching a market cap of around $40 million.

In the latest crypto AI narrative, $Dark is closely tied to "MCP" (Model Context Protocol), an area that Web2 tech companies such as Google have also recently been focusing on and exploring.

However, there are currently few articles that clearly explain the concept of MCP and its narrative impact.

The following is an insightful article about the MCP protocol by Alliance DAO researcher Mohamed ElSeidy, which explains the principles and positioning of MCP in very simple language, potentially helping us quickly understand the latest narrative.

Deep Tide TechFlow has compiled the full text.

In my years at Alliance, I have witnessed countless founders building their own dedicated tools and data integrations, all embedded into their AI agents and workflows. However, these algorithms, formalizations, and unique datasets are locked behind custom integrations, with very few people using them.

With the emergence of the Model Context Protocol (MCP), this situation is rapidly changing. MCP is defined as an open protocol that standardizes how applications communicate with large language models (LLMs) and provide context. One metaphor I particularly like is: “For AI applications, MCP is like USB-C in hardware”; it is standardized, plug-and-play, multifunctional, and transformative.

Why Choose MCP?

Large language models (like Claude, OpenAI's GPT models, and LLaMA) are very powerful, but they are limited to the information currently accessible to them. This means they often have knowledge cutoffs, cannot browse the web independently, and cannot directly access your personal files or dedicated tools without some form of integration.

Previously, developers faced three main challenges when connecting LLMs to external data and tools:

  1. Integration Complexity: Building separate integrations for each platform (like Claude, ChatGPT, etc.) requires repetitive effort and maintaining multiple codebases.

  2. Tool Fragmentation: Each tool's functionality (e.g., file access, API connections, etc.) requires its own dedicated integration code and permission model.

  3. Limited Distribution: Dedicated tools are confined to specific platforms, limiting their reach and impact.

MCP addresses these issues by providing a standardized way for any LLM to securely access external tools and data sources through a universal protocol. Now that we understand the role of MCP, let’s see what people are building with it.

What Are People Building with MCP?

The MCP ecosystem is currently in a phase of explosive innovation. Here are some recent examples of developers showcasing their work on Twitter:

  1. AI-Driven Storyboard: An MCP integration that allows Claude to control ChatGPT-4o, automatically generating complete storyboards in the style of Studio Ghibli without any human intervention.

  2. ElevenLabs Voice Integration: An MCP server that allows Claude and Cursor to access the entire AI audio platform through simple text prompts. This integration is powerful enough to create voice agents capable of making outbound calls. It demonstrates how MCP can extend current AI tools into the audio domain.

  3. Browser Automation with Playwright: An MCP server that enables AI agents to control web browsers without screenshots or visual models. This standardizes LLM control over browser interactions, creating new possibilities for web automation.

  4. Personal WhatsApp Integration: A server that connects to personal WhatsApp accounts, allowing Claude to search messages and contacts and send new messages.

  5. Airbnb Search Tool: An Airbnb apartment search tool that showcases the simplicity of MCP and its ability to create practical applications for interacting with web services.

  6. Robot Control System: An MCP controller for robots. This example bridges the gap between LLMs and physical hardware, demonstrating the potential of MCP in IoT applications and robotics.

  7. Google Maps and Local Search: Connecting Claude to Google Maps data to create a system that can find and recommend local businesses (like coffee shops). This extension enables AI assistants to provide location-based services.

  8. Blockchain Integration: The Lyra MCP project brings MCP capabilities to StoryProtocol and other web3 platforms. This allows interaction with blockchain data and smart contracts, opening new possibilities for AI-enhanced decentralized applications.

What is particularly striking about these examples is their diversity. In the short time since the launch of MCP, developers have created integrations covering creative media production, communication platforms, hardware control, location services, and blockchain technology. These various applications follow the same standardized protocol, showcasing the versatility of MCP and its potential to become a universal standard for AI tool integration.

If you want to see a comprehensive collection of MCP servers, you can visit the official MCP server repository on GitHub. Please read the disclaimers carefully and exercise caution regarding the content you run and authorize before using any MCP server.

Promises and Hype

When facing any new technology, it is worth asking: Does MCP truly have transformative potential, or is it just another overhyped tool that will eventually fade away?

After observing numerous startups, I believe that MCP represents a genuine turning point in AI development. Unlike many trends that promise revolution but only bring incremental changes, MCP is a productivity enhancer that addresses the infrastructure issues hindering the development of the entire ecosystem.

What makes it special is that it does not attempt to replace or compete with existing AI models but rather enhances their utility by connecting them to the external tools and data they need.

Nevertheless, reasonable concerns about security and standardization remain. As with any protocol in its early stages, we may see growing pains as the community works out best practices for auditing, permissions, authentication, and server validation. Developers need to trust the functionality of these MCP servers rather than rely on them blindly, especially as they proliferate; recent reports have already discussed vulnerabilities exposed by the blind use of unvetted MCP servers, even when they are run locally.

The Future of AI is Contextual

The most powerful AI applications will no longer be standalone models but rather specialized ecosystems of capabilities connected through standardized protocols like MCP. For startups, MCP represents an opportunity to build specialized components for these growing ecosystems. It is a chance to leverage your unique knowledge and capabilities while benefiting from the substantial investments in foundational models.

Looking ahead, we can expect MCP to become a fundamental component of AI infrastructure, much like HTTP is for the web. As the protocol matures and adoption grows, we are likely to see the emergence of a dedicated MCP server market, enabling AI systems to leverage almost any imaginable capability or data source.

Has your startup attempted to implement MCP? I would love to hear your experiences in the comments. If you have built something interesting in this space, please reach out to us through @alliancedao and apply.

Appendix

For those interested in understanding how MCP works in practice, the following appendix provides a technical breakdown of its architecture, workflows, and implementation.

Behind the Scenes of MCP

Similar to how HTTP standardized the way to access external data sources and information on the web, MCP does this for AI frameworks, creating a common language that allows different AI systems to communicate seamlessly. Let’s explore how it achieves this.

MCP Architecture and Workflow

The main architecture follows a client-server model, with four key components working together:

  1. MCP Host: Desktop AI applications like Claude or ChatGPT, IDEs like Cursor or VS Code, and other AI tools that need access to external data and functionality.

  2. MCP Client: A protocol handler embedded in the host that maintains a one-to-one connection with an MCP server.

  3. MCP Server: A lightweight program that exposes specific functionality through the standardized protocol.

  4. Data Sources: Including files, databases, APIs, and services that the MCP server can securely access.

Now that we have discussed these components, let’s look at how they interact in a typical workflow:

  1. User Interaction: The user asks a question or makes a request in the MCP host (e.g., Claude Desktop).

  2. LLM Analysis: The LLM analyzes the request and determines that external information or tools are needed to provide a complete response.

  3. Tool Discovery: The MCP client queries the connected MCP server to discover available tools.

  4. Tool Selection: The LLM decides which tools to use based on the request and available functionalities.

  5. Permission Request: The host requests permission from the user to execute the selected tools, ensuring transparency and security.

  6. Tool Execution: Upon approval, the MCP client sends the request to the appropriate MCP server, which utilizes its specialized access to data sources to perform the operation.

  7. Result Processing: The server returns the results to the client, which formats them for LLM use.

  8. Response Generation: The LLM integrates the external information into a comprehensive response.

  9. User Presentation: Finally, the response is presented to the end user.

The strength of this architecture lies in the fact that each MCP server focuses on a specific domain while using a standardized communication protocol. This way, developers do not need to rebuild integrations for each platform; they can develop tools once to serve the entire AI ecosystem.
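To make this workflow concrete, here is a minimal client-side sketch, assuming the official MCP Python SDK (the mcp package) and a hypothetical local server script server.py; the tool name and arguments are purely illustrative. In practice, a host like Claude Desktop performs these steps for you, with the LLM choosing the tool and the user approving the call.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical local MCP server launched over stdio.
server_params = StdioServerParameters(command="python", args=["server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Step 3: Tool Discovery - ask the server what it can do.
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # Steps 6-7: Tool Execution and Result Processing.
            result = await session.call_tool(
                "find_nearby_places",
                {"query": "coffee shops near Central Park"},
            )
            print(result.content)

asyncio.run(main())
```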

How to Build Your First MCP Server

Now let’s see how to implement a simple MCP server in just a few lines of code using the MCP SDK.

In this simple example, we want to extend the capabilities of Claude Desktop to answer questions like “What coffee shops are near Central Park?” with information sourced from Google Maps. You can easily extend this functionality to fetch reviews or ratings. But for now, we will focus on the MCP tool find_nearby_places, which will allow Claude to directly retrieve this information from Google Maps and present the results in a conversational manner.
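Below is a minimal sketch of what such a server might look like, assuming the MCP Python SDK's FastMCP helper, the Google Maps Places Text Search API, and a GOOGLE_MAPS_API_KEY environment variable; it is an illustration of the approach, not the author's original code.

```python
import os

import httpx
from mcp.server.fastmcp import FastMCP

# Name the server; the host will refer to it by this name.
mcp = FastMCP("google-maps-search")

@mcp.tool()
async def find_nearby_places(query: str) -> str:
    """Search Google Maps for places matching the query and return the top results."""
    url = "https://maps.googleapis.com/maps/api/place/textsearch/json"
    params = {"query": query, "key": os.environ["GOOGLE_MAPS_API_KEY"]}
    async with httpx.AsyncClient() as client:
        response = await client.get(url, params=params)
        response.raise_for_status()
        places = response.json().get("results", [])[:5]

    # Return a structured, LLM-friendly summary of the top results.
    lines = [
        f"{p['name']} - {p.get('formatted_address', 'address unknown')} (rating: {p.get('rating', 'n/a')})"
        for p in places
    ]
    return "\n".join(lines) if lines else "No places found."

if __name__ == "__main__":
    mcp.run(transport="stdio")
```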

As you can see, the code is very simple. First, it converts the query into a Google Maps API search and then returns the top results in a structured format. This way, the information is passed back to the LLM for further decision-making.

Now we need to let Claude Desktop know about this tool, so we register it in its configuration file as follows:

macOS path: ~/Library/Application Support/Claude/claude_desktop_config.json

Windows path: %APPDATA%\Claude\claude_desktop_config.json
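A minimal entry in that configuration file might look like the following, assuming the server sketched above is saved as server.py and launched with Python; the server name, path, and API key placeholder are illustrative.

```json
{
  "mcpServers": {
    "google-maps-search": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"],
      "env": {
        "GOOGLE_MAPS_API_KEY": "<your-api-key>"
      }
    }
  }
}
```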

That's it! You have successfully extended Claude's capabilities to look up locations in real-time from Google Maps.

