phh 3 days ago

I agree with the article, and I love how the author is (mis-)using MCP. I just want to rephrase what the accident actually is.

The accident isn't that somehow we got a protocol to do things we couldn't do before. As other comments point out, MCP (the specification) isn't anything new or interesting.

No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned.

I don't know how long it'll last, but I sure appreciate it.

  • sshine 3 days ago

    Hype, certainly.

    But the way I see it, AI agents created incentives for interoperability. Who needs an API when everyone is job secure via being a slow desktop user?

    Well, your new personal assistant who charges by the watt-hour NEEDS it. Just like the CEO who will personally drive to get pizzas for the hackathon because that's practically free labor, everyone now wants everything connected.

    For those of us who rode the API wave before integrating became hand-wavey, it sure feels like the world caught up.

    I hope it will last, but I don’t know either.

    • mh- 3 days ago

      Unfortunately, I think we're equally likely to see shortsighted lock-in attempts like this [0] one from Slack.

      I tried to find a rebuttal to this article from Slack, but couldn't. I'm on a flight with slow wifi though. If someone from Slack wants to chime in that'd be swell, too.

      I've made the argument to CFOs multiple times over the years why we should continue to pay for Slack instead of just using Teams, but y'all are really making that harder and harder.

      [0]: https://www.reuters.com/business/salesforce-blocks-ai-rivals...

      • jetsnoc 3 days ago

        I wasn’t aware of this; it’s extremely shortsighted. My employees’ chats are my company’s data, and I should be able to use them as I see fit. Restricting API access to our own data moves them quickly into the 'too difficult to continue doing business with' category.

        The reality is that Slack isn’t that sticky. The only reason I fended off the other business units who've demanded Microsoft Teams through the years is my software-engineering teams’ QoL. Slack has polish and is convenient, but now that it's becoming inconvenient and not letting me do what I want, I can't justify fending off the detractors. I’ll gladly invest the time to swap them out for a platform that respects our ownership and lets us use our data however we need to. We left some money on the table, but I am glad we didn’t bundle and upgrade to Slack Grid and lock ourselves into a three-year enterprise agreement...

        • diggan 2 days ago

          > I wasn’t aware of this; it’s extremely shortsighted. My employees’ chats are my company’s data, and I should be able to use them as I see fit.

          True, as long as you're the only one sitting on the data and using it.

          The moment you put it on another platform, accepting terms of service and more, it stops being "your and/or your company's data": Slack will do whatever they deem fit with it, including preventing you from getting all of the data out, since full access would make it easier for you to leave.

          Sucks, yeah, but it is the situation we're in, until lawmakers in your country catch up. Luckily, other jurisdictions are already better for things like this.

        • mh- 3 days ago

          Precisely the situation I'm in. I've fought off slack-to-teams migrations at multiple orgs for the same QoL reasons, but this will make that much (much) harder to justify.

        • KronisLV 3 days ago

        We migrated from Slack to Teams, and while it does work, it's also not very good (UI/UX-wise). We also tried out Rocket.Chat and Mattermost, and of all of those Mattermost was the closest to Slack and the most familiar to us.

          • ljm 2 days ago

            I’d go for Discord if it had a business version without all the gaming stuff.

            The dedicated voice/video channels are great for ad-hoc conversations when remote and a lot better than Slack’s huddles. They’re like dedicated remote meeting rooms except you’re not limited by office space.

            • diggan 2 days ago

              > I’d go for Discord if it had a business version without all the gaming stuff.

              Granted, my Discord usage has been relatively limited, but what "gaming stuff"? In the servers unrelated to gaming I don't think I see anything gaming related, but maybe I'm missing something obvious.

          • shadowtree 2 days ago

            We migrated a 1000+ person product team to Mattermost 2 years ago.

            Super happy with it. No bullshit upgrades that break your way of working. Utilitarian approach to everything, the basics just work. Still has some rough edges, but in a workhorse kind of way.

            Endorse.

      • SoftTalker 3 days ago

        Sounds sort of like an innovator's dilemma response. New technology appears and the response is gatekeeping and building walls rather than adaptation.

        • brabel 3 days ago

          Slack was never an innovator. By the time they showed up there were lots of chat apps. They just managed to go beyond the others by basically embedding a browser engine into their app at a time when most thought of that as heresy; a chat app that requires 1 GB to run was a laughable proposition to us techies. But here we are… MS Teams is even heavier, but users don't seem to care about that anyway.

          • Valodim 3 days ago

            They were never an innovator, they just did this thing nobody else did, that some years later became the norm?

            • secos 2 days ago

              Blitzscaled? Yep.

      • dgacmu 3 days ago

        I'm happier we went with Zulip each day.

        • diggan 2 days ago

          (Still) such an undervalued alternative to all these "ephemeral but permanent" chat apps. For folks who like a bit more structure and organization, but still want "live communication" like what Slack et al. offers, do yourself a favor and look into Zulip.

      • msgodel a day ago

        A big part of my short thesis on Apple is that they'll try to do this sort of thing, and it will mean real AI integration like what their customers want will simply never be available, driving those customers to more open platforms.

        I think you'll see this everywhere. LLMs mean "normal" people will suddenly see computers the way we do and a lot of corporate leadership just isn't intuitively prepared for that.

      • kimjune01 3 days ago

        If you are interested in scraping slack for personal use, I made a local-only slack scraper mcp: https://github.com/kimjune01/slunk-mcp

        • tra3 2 days ago

          Thanks for building this, but it's also ridiculous that you had to. I miss IRC, even though Slack is objectively better.

          • tomrod 2 days ago

            Hard disagree, IRC is still the best chat application out there.

          • ajuc 2 days ago

            Jabber & XMPP was the peak of instant messaging. Since then it's been downhill.

      • ebiester 3 days ago

        It's going to take more people willing to move away from Slack for those purposes.

        As it is, I'm going to propose that we move more key conversations outside of Slack so that we can take advantage of feeding them into AI. It's a small jump from that to looking for alternatives.

        • sshine 3 days ago

          The argument used to be “Let’s move FOSS conversation out of {Slack, Discord} because they prevent conversations from being globally searchable, and they force individuals into subscription to access history backlog.”

          Getting indexed by AI crawlers appears to be the new equivalent to getting indexed by search engines.

      • solumunus 2 days ago

        I use both daily and Teams absolutely sucks.

        • mh- a day ago

          Very aware; zero desire to use Teams. That's why I've fought to keep Slack despite the cost.

          But now they're actively making it more difficult for people like me to say "engineers like it more" and have that be a compelling-enough argument.

    • troupo 3 days ago

      > But the way I see it, AI agents created incentives for interoperability.

      There are no new incentives for interoperability. Companies that were already providing API access added MCP servers of varying quality.

      The rest couldn't care less, unless they can smell an opportunity to monetize hype.

    • citizenpaul 17 hours ago

      MCP: it's like an API, but with 100,000x the operating cost!

    • iechoz6H 3 days ago

      Well, interoperability requires competition and if there's one thing we've learnt it's that the tech industry loves a private monopoly.

  • alexpotato 3 days ago

    Reminds me of the days of Winsock.

    For those that don't remember/don't know: everything network-related in Windows used to use each vendor's own proprietary setup.

    Then one day, a bunch of vendors got together and decided to have a shared standard to the benefit of basically everyone.

    https://en.wikipedia.org/wiki/Winsock

    • ggambetta 3 days ago

      Trumpet Winsock! Brings back memories :)

      • pyman 3 days ago

        I think we're seeing a wave of hype marketing on YouTube, Twitter and LinkedIn, where people with big followings create posts or videos full of buzzwords (MCP, vibe coding, AI, models, agentic) with the sole purpose of promoting a product like Cursor, Claude Code or Gemini Code, or of getting people to use Anthropic's MCP instead of Google's A2A.

        It feels like 2 or 3 companies have paid people to flood the internet with content that looks educational but is really just a sales pitch riding the hype wave.

        Honestly, I just saw a project manager on LinkedIn telling his followers how MCP, LLMs and Claude Code changed his life. The comments were full of people asking how they can learn Claude Code, like it's the next Python.

        Feels less like genuine users and more like a coordinated push to build hype and sell subscriptions.

        • zarathustreal 2 days ago

          They’re not being paid, at least not directly. They don’t need to be. “Educational” “content” is a play to raise one's personal profile as a “thought leader.” This turns into invitations to conferences and ultimately funnels into sales of courses and other financial opportunities.

          • pyman 2 days ago

            Hype marketing can look spontaneous, but it's almost always planned. And once the momentum starts, others jump in: influencers and opportunists ride the wave to promote themselves.

        • fennecbutt 2 days ago

          Nah, it's the same motivation as all the Gen Z TikTok kids: it's all for clout.

          People write those Medium articles wanting engagement/clout/making it big/creating a brand.

      • mycall 2 days ago

        Trumpet + PPP on a university library mainframe was my first experience of the internet.

  • visarga 3 days ago

    The main benefit is not that it made interoperability fashionable, or that it makes things easy to interconnect. It is the LLM itself, if it knows how to wield tools. It's like you build a backend and the front-end is not your job anymore; AI does it.

    In my experience Claude and Gemini can take over tool use, and all we need to do is tell them the goal. This is huge: we always had to specify the steps to achieve anything on a computer before. Writing a fixed program to deal with a dynamic process is hard, while an LLM can adapt on the fly.

    • freeone3000 3 days ago

      The issue holding us back was never that we had to write a frontend — it was the data locked behind proprietary databases and interfaces. Gated behind API keys and bot checks and captchas and scraper protection. And now we can have an MCP integrator for IFTTT and have back the web we were promised, at least for a while.

      • TeMPOraL 3 days ago

        Indeed, the frontend itself is usually the problem. If not for data lock-in, we wouldn't need that many frontends in the first place - most of the web would be better operated through a few standardized widgets plus spreadsheet and database interfaces - and non-tech people would be using it and be more empowered for it.

        (And we know that because there was a brief period in time when the basics of spreadsheets and databases were part of the curriculum in the West, and people had no problem with that.)

      • troupo 3 days ago

        So... How do MCPs magically unlock data behind proprietary databases and interfaces?

        • gillesjacobs 3 days ago

          It doesn't do it magically. The "tools" an LLM agent calls to create responses are typically REST APIs for these services.

          Previously, many companies gated these APIs but with the MCP AI hype they are incentivized to expose what you can achieve with APIs through an agent service.

          Incentives align here: user wants automations on data and actions on a service they are already using; company wants AI marketing and a USP in automation features, and still gets to control the output of the agent.
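
          To make this concrete, here's a rough sketch of both halves (the tool name, endpoint and token are made up): what a client sends over MCP is plain JSON-RPC, and a typical server handler just forwards it to the same REST API the company always had.

              # What the agent (or any client) sends over the MCP transport: plain JSON-RPC.
              call = {"jsonrpc": "2.0", "id": 7, "method": "tools/call",
                      "params": {"name": "send_message",
                                 "arguments": {"channel": "#eng", "text": "hi"}}}

              # What a typical MCP server does with it: the same REST call it always gated.
              import requests

              def send_message(channel: str, text: str) -> dict:
                  # Hypothetical chat endpoint standing in for any real service.
                  return requests.post("https://chat.example.com/api/messages",
                                       json={"channel": channel, "text": text},
                                       headers={"Authorization": "Bearer <token>"}).json()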

          • troupo 2 days ago

            > Previously, many companies gated these APIs but with the MCP AI hype they are incentivized to expose what you can achieve with APIs through an agent service.

            Why would they be incentivized to do that if they survived all the previous hype waves and still have access gated?

            > user wants automations on data and actions on a service they are already using,

            How many users want that? Why didn't companies do all this before, since the need for automation has always been there?

            • diggan 2 days ago

              > Why would they be incentivized to do that if they survived all the previous hype waves and still have access gated?

              Because they suddenly now don't want to be left out of the whole AI hype/wave.

              Is it stupid? Yes. Can we still reap the benefits of these choices driven by stupid motivations? Also yes.

      • whitten 3 days ago

        What is IFTTT?

        • lazyasciiart 3 days ago

          If This Then That - a Zapier-type glue provider.

          • Mtinie 3 days ago

            Minor chronological point but Zapier is an IFTTT-type glue provider.

            IFTTT was announced Dec. 14, 2010 and launched on Sept. 7, 2011.

            Zapier was first pitched Sept. 30, 2011 and their public beta launched May 2012.

            • TeMPOraL 3 days ago

              Minor point, but arguably both are Yahoo Pipes-type glue providers (Yahoo Pipes itself being a basic no-code glue thing). The difference is that IFTTT erred on the side of dumbing the product down too much, and Zapier erred on the side of being too B2B-focused - so they both missed the mark on becoming the universal glue.

              • diggan 2 days ago

                Then of course we have Node-RED too, which is probably too developer-focused (and, strangely, generally just lacking any sort of focus at the same time), so it also doesn't sit anywhere close to the middle.

                Do we really not have anything coming close to the usefulness of Yahoo Pipes yet? What would a modern alternative look like, and how would it work? Someone has to be thinking about this.

                • TeMPOraL 2 days ago

                  If we're considering Node-RED, it would not be amiss to also mention n8n - which, mirroring the IFTTT/Zapier split, is basically the opposite of Node-RED on the "let's turn this into an enterprise product" scale.

                  > Do we really not have anything coming close to the usefulness of Yahoo Pipes yet?

                  I don't know of anything. There's some new products I saw heavily promoted on LinkedIn, but at first glance they feel like IFTTT with a shiny coat of paint.

                  > What would a modern alternative look like and how would it work?

                  At this point I think ComfyUI or the node editor in Blender would be the best; they're oriented for different kinds of workflows, but UIs of both are excellent in their own ways, and the rest is a matter of implementing the right blocks.

            • lazyasciiart 2 days ago

              But Zapier is easily Google-able and therefore useful as a reference name even if the commenter hasn't heard of it.

    • HumanOstrich 3 days ago

      I don't understand what you mean.

      > It (the main benefit?) is the LLM itself, if it knows how to wield tools.

      LLMs and their ability to use tools are not a benefit or feature that arose from MCP. There has been tool usage/support with various protocols and conventions way before MCP.

      MCP doesn't have any novel aspects that are making it successful. It's relatively simple and easy to understand (for humans), and luck was on Anthropic's side. So people were able to quickly write many kinds of MCP servers and it exploded in popularity.

      Interoperability and interconnecting tools, APIs, and models across providers are the main benefits of MCP, driven by its wide-scale adoption.

    • secos 2 days ago

      To me it feels like an awkward API that creates opportunities to work around the limitations of a normal API... which to me is not a great thing. Potentially useful, sure, but not great.

  • mellosouls 3 days ago

    > the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned

    Perhaps, but we see current hypes like Cursor only using MCP one way: you can feed into Cursor (e.g. browser tools), but not out (e.g. conversation history, context, etc).

    I love Cursor, but this "not giving back" mentality, originally reflected in its closed-source forking of VS Code, leaves an unpleasant taste in the mouth, and I believe it will ultimately see Cursor lose developer credibility.

    Lock-in still seems to be locked in.

    • talos_ 3 days ago

      The VSCode extension Continue provides similar capabilities and gives you full access to your interaction traces (local database and JSON traces).

  • bitwize 3 days ago

    Remember Web 2.0? Remember the semantic web? Remember folksonomies? Mash-ups? The end of information silos? The democratizing power of HTTP APIs? Anyone? Anyone?

    • apgwoz 3 days ago

      I think we found a new backronym for MCP: Mashup Context Protocol.

      (The mashup hype was incredible, btw. Some of the most ridiculous web contraptions ever.)

      • bigiain 3 days ago

        And some of the most ridiculous songs. I remember (vaguely) a Bootie show at DNA Lounge back in the early 2000s that was entirely mashups. It was hilarious: Lady Gaga mashed up with Eurythmics, Coldplay with Metallica, Robert Palmer with Radiohead.

        (I hereby claim the name "DJ MCP"...)

    • kasey_junk 3 days ago

      Yes. Pieces of all of those things surround us now. And where we are on lock-in and interop is far beyond where we were when each of those fads happened.

      MCP is a fad; it's not long-term tech. But I'm betting shoveling data at LLM agents isn't. The benefits are too high for companies to allow vendors to lock the data away from them.

      • bigiain 3 days ago

        > MCP is a fad; it's not long-term tech. But I'm betting shoveling data at LLM agents isn't.

        I'd bet that while "shoveling data at llm agents" might not be a fad, sometime fairly soon doing so for free while someone else's VC money picks up the planet destroying data center costs will stop being a thing. Imagine if every PHP or Ruby on Rails, or Python/Django site had started out and locked themselves into a free tier Oracle database, then one day Oracle's licensing lawyers started showing up to charge people for their WordPress blog.

      • fragmede 3 days ago

        I know we're all just soaked by a wave of hype right now but I think MCP will go the way of other "but it works" tech, like zip files, RSS and shell scripts.

        • baobun 3 days ago

          Remember when GraphQL was making REST obsolete? This rhymes.

      • overfeed 3 days ago

        > MCP is a fad; it's not long-term tech

        One that won't be supported by any of the big names except to suck data into their walled gardens and lock it up. We all know the playbook.

    • overfeed 3 days ago

      Yahoo Pipes, XMPP, self-hosted blogs, RSS-based social networks, pingbacks; the democratized p2p web that briefly was. I bet capitalism will go 2 for 2 against the naive idealism that gatekeepers will stop gatekeeping.

      • baq 3 days ago

        Giving value away is unacceptable… for MBAs and VCs, anyway.

    • karaterobot 3 days ago

      I don't understand your point. Some of those things were buzzwords, some were impossible dreams, some changed the way the web works completely. Are you just saying that the future is unknown?

      • klabb3 3 days ago

        No. What they are saying is best said with a quote from Battlestar Galactica:

        > All of this has happened before, and all of this will happen again.

        ”It” here being the boom and inevitable bust of interop and open API access between products, vendors and so on. As a millennial, my flame of hope was lit during the API explosion of Web 2.0. If you’re older, your dreams were probably crushed already by something earlier. If you’re younger, and you’re genuinely excited about MCP for the potential explosion in interop, hit me up for a bulk discount on napkins.

        • bwfan123 3 days ago

          And then there are "architecture astronauts" dreaming of an entire internet of MCP-speaking devices - an "internet of agents" if you will - which would then require its own separate DNS, SMTP, BGP, etc.

        • danielrico 3 days ago

          I think Battlestar Galactica must be quoting one of the Eddas. I've only read it via Borges in Spanish, but it carries the same meaning: "Estas cosas han pasado. Estas cosas también pasarán." ("These things have happened. These things will also pass.")

        • fragmede 3 days ago

          I'm older and would like a discount, please. The "this time it's different" energy comes from the assumption that, since a human can interact with the system and vision models can drive a GUI, who cares if there's an actual API - just have the AI interact with the system as if it were coming in as a human.

        • Art9681 3 days ago

          What's the point of stating the obvious if the obvious won't change anything? Things evolve. Winners win and losers lose. Change is constant. And? Does that somehow mean there's nothing to see here and we should move on?

          • klabb3 3 days ago

            No, things can change, but we programmers tend to see everything as a technical problem, and assume that if only we can find a good technical solution we can fix it. But the problem isn't technical - the APIs were shut down because consumer tech is governed by ads, which are not part of APIs (or would be trivial to remove). You have surely noticed that APIs are alive and well in enterprise. Why? Because they have customers who pay money, and API access does not generally break their revenue stream (although even there some are skittish). As mere "users", our economic function in consumer tech is to provide data and impressions to advertisers. Thou shalt not bypass their sidebar, where the ads must be seen.

            Nowadays it's not just APIs being shut down, but also anti-scraping measures, login walls, paywalls, fingerprinting, and so on and so forth. It's a much more adversarial landscape today than during Web 2.0. When they threw copyright in the trash with the fair-use loophole for AI, that obviously caused even more content-lockdown panic. And in the midst of this giant data Mexican standoff, people are going to put down their guns and see the light because of a new meta-API protocol?

      • potatolicious 3 days ago

        I take their point to be that the underlying incentives haven't changed. The same forces and incentives that scuttled those things are likely to scuttle this as well.

        I actually disagree with the OP in this sub-thread:

        > "No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned."

        I don't think that's happened at all. I think some interoperability is here to stay - but overwhelmingly in the products where interoperability was already the norm. The enterprise SaaS that your company is paying for will support an MCP server, but it probably already supports various other plugin interfaces too.

        And they're not doing this because of hype or new-fangledness, but because their incentives are aligned with interoperability. If their SaaS plugs into [some other thing], it increases their sales. In fact, the lowering of integration effort is all upside for them.

        Where this is going to run into a brick wall (and I'd argue: already has to some degree) is that closed platforms that aren't incentivized to be interoperable still won't be. I don't think we've really moved the needle on that yet. Uber Eats is not champing at the bit to build the MCP server that orders your dinner.

        And there are a lot of really good reasons for this. In a previous job I worked on a popular voice assistant that integrated with numerous third-party services. There has always been vehement pushback to voice assistant integration (the ur-agent and to some degree still the holy grail) because it necessarily entails the service declaring near-total surrender about the user experience. An "Uber Eats MCP" is one that Uber has comparatively little control over the UX of, and has poor ability to constrain poor customer experiences. They are right to doubt this stuff.

        I also take some minor issue with the blog: the problem with MCP as the "everything API" is that you can't really take the "AI" part out of it. MCP tools are not guaranteed to communicate in structured formats! Instead of getting an HTTP 401 you will get a natural language string like "You cannot access this content because the author hasn't shared it with you."

        That's not useful without the presence of an NL-capable component in your system. It's not parseable!

        Also importantly, MCP inputs and outputs are intentionally not versioned nor encouraged to be stable. Devs are encouraged to alter their input and output formats to make them more accessible to LLMs. So your MCP interface can and likely will change without notice. None of this makes for good API for systems that aren't self-adaptive to that sort of thing (i.e., LLMs).

    • aorloff 3 days ago

      Nobody remembers the semantic web anymore

      • 1dom 3 days ago

        It was supposed to be Web 3.0, but then Web3 happened.

        In all seriousness though, I think HN has a larger-than-average amount of readers who've worked or studied around semantic web stuff.

        • overfeed 2 days ago

          > It was supposed to be Web 3.0, but then Web3 happened

          There's a shocking overlap between the GPU-slinging tech-bros hyping both. Crypto-bros turned LLM experts out of necessity when mining crypto on their rigs became unprofitable.

  • stavros 3 days ago

    I haven't seen an app that didn't have an API create one via MCP. The only MCP servers I've seen were for things that I could already access programmatically.

    • bigiain 3 days ago

      I am pondering if I should do this.

      If I have an app whose backend needs to connect to, say, a CRM platform, I wonder if, instead of writing APIs to connect to Dynamics or Salesforce or Hubspot specifically, there's benefit in abstracting a CRM interface with an MCP server so that switching CRM providers later (or adding additional CRMs) becomes easier.

      • stavros 3 days ago

        One of us doesn't understand MCP well enough, and it might very well be me, but how can MCP be used without an LLM? Most of the structure is in human language.

        • asteroidburger 3 days ago

          MCP itself doesn't require the use of the LLM. There are other concepts, but for this use, Tools are key. A Tool is an operation, like a search.

          Have a look at the Filesystem example MCP server - https://github.com/modelcontextprotocol/servers/blob/main/sr.... It has a collection of Tools - read_file, read_multiple_files, write_file, etc.

          The LLM uses MCP to learn what tools the server makes available; the LLM decides when to use them. (The process is a little more complicated than that, but if you're just trying to call tools without an LLM, those parts aren't really important.) If you ask the LLM, "Find all files with an asterisk," it might deduce that it can use the search_files tool provided by that MCP to gain the needed context, invoke that tool, then process the results it returns. As an engineer, you can just call search_files if you know that's what you want.
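
          In fact you can drive an MCP server from a short script with no model in the loop at all. Here's a rough sketch against the filesystem server (the protocol version string and the exact tool/argument names may vary by server release) - it's just newline-delimited JSON-RPC over stdio:

              import json, subprocess

              # Start the reference filesystem server over stdio (requires Node/npx).
              srv = subprocess.Popen(
                  ["npx", "-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
                  stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

              def send(msg):
                  srv.stdin.write(json.dumps(msg) + "\n")
                  srv.stdin.flush()
                  if "id" in msg:  # requests get a reply; notifications don't
                      return json.loads(srv.stdout.readline())

              # The standard MCP handshake, then plain tool calls.
              send({"jsonrpc": "2.0", "id": 1, "method": "initialize",
                    "params": {"protocolVersion": "2024-11-05", "capabilities": {},
                               "clientInfo": {"name": "no-llm-client", "version": "0.1"}}})
              send({"jsonrpc": "2.0", "method": "notifications/initialized"})

              tools = send({"jsonrpc": "2.0", "id": 2, "method": "tools/list"})
              print([t["name"] for t in tools["result"]["tools"]])

              # Call search_files directly, exactly as an LLM client would.
              hits = send({"jsonrpc": "2.0", "id": 3, "method": "tools/call",
                           "params": {"name": "search_files",
                                      "arguments": {"path": "/tmp", "pattern": "log"}}})
              print(hits["result"])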

          • fdye 2 days ago

            Yeah, MCP isn't really doing a whole lot. You can give an LLM a generic HTTP extension, list a series of GET/POST/PUT endpoints, and ask it to form the calls and parse the responses. The problem is that this isn't ideal: the calls aren't natural language, and it's common for the model to misguess the next token and mess up things like the route, body, or headers with a hallucination. So people started shortening these calls to simple things like read_file, etc. Prior to MCP there were a ton of playgrounds doing this with simple Playwright functions.

            The thing that surprises me is that with MCP we have shirked all of the existing tooling around OpenAPI specs, OIDC, etc. We could have created a system where all "services" expose an mcp.slack.com/definition endpoint or something that spits back a list of shortcut terms like send_message and a translation function that composes them into the correct HTTP API call (which is what most MCP servers do). For security we could have had the LLM establish its identity via all our existing systems like OIDC, which combine authentication and authorization.

            In the system above you would not "install an MCP package" as in a code repo or server. Instead you would allow your LLM to access Slack; it would then prompt you to log in via OIDC to establish your identity and access level. Then it would grab the OpenAPI spec (machine-readable) and the LLM-focused shortcuts ('send_message', 'read_message', etc.). The LLM composes 'send_message Hello World' -> translates to HTTP POST slack.com/message or whatever, and Bob's your uncle.

            If you wanted to do fancy stuff with local systems, you could still build your own server the same way we have all built HTTP servers for decades and just expose the mcp.whatever.com subdomain for discovery. Then skip OIDC, or allow all, or something to simplify, if you want.

          • skeeter2020 a day ago

            Getting the LLM to use your tool is actually tougher than it should be; you don't get to decide that deterministically. I don't get what benefit there would be to building an MCP server without an LLM-based agent. You might as well build an API and get the value of a strict, predictable interface and results.

          • stavros 3 days ago

            That's my understanding as well, but then you'll be missing the discovery part. You'll have to hardcode the API, at which point you may as well just use the conventional API the MCP server also uses under the hood.

        • HarHarVeryFunny 2 days ago

          My thought too when I read TFA about MCP as a universal interface, but I suppose one can distinguish between the interface and automated discovery/usage.

          The MCP exposed API is there for anyone/anything to use, as long as you are able to understand how it works - by reading the natural language description of the exposed methods and their JSON input schemas. You could read this yourself and then use these methods in any way you choose.

          Where LLMs come in is that they understand natural language, so in the case where the MCP client is an LLM, then there is automatic discovery and integration - you don't need to grok the MCP interface and integrate it into your application, but instead the LLM can automatically use it based on the interface description.
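
          For a sense of what that interface description looks like, here's the rough shape of one entry in a tools/list response (field names per the MCP spec; the description and schema here are illustrative):

              tool = {
                  "name": "search_files",
                  "description": "Recursively search a directory for files matching a pattern.",
                  "inputSchema": {  # ordinary JSON Schema
                      "type": "object",
                      "properties": {"path": {"type": "string"},
                                     "pattern": {"type": "string"}},
                      "required": ["path", "pattern"],
                  },
              }

          The description is for whoever (or whatever) reads it; the schema is machine-checkable with any JSON Schema validator, no LLM required.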

        • baq 3 days ago

          You write a script which pretends to be an LLM to get the data you want reliably.

          But… you don’t really need to pretend you’re an LLM, you can just get the data using the same interface as a model would.

        • bigiain 3 days ago

          I will also freely admit to not understanding MCP much, but using it without an LLM was (at least to my reading) pretty much the main thesis of the linked article.

          "Okay but. But. What if you just... removed the AI part?

          What if it's just "a standardized way to connect literally anything to different data sources and tools"?"

          (which had struck out text 'AI models' in between 'connect' and 'literally')

          • brabel 3 days ago

            You’re right. It’s unclear, however, how your application will know when to call a tool/MCP server. That’s the part LLMs are so good at: understanding that to do a certain job this or that tool would be useful, and then knowing how to provide the necessary parameters for the tool call. (I say tool here because MCP is just a convenient way to package a normal tool; in other words, it’s a plugin system for tools.)

  • adregan 3 days ago

    How ironic, given the number of APIs that were locking down access in response to AI training!

    Though the general API lockdown started long before that, and like you, I’m skeptical that this new wave of open access will last if the promise doesn’t live up to the hype.

    • miki123211 3 days ago

      MCP seems to be about giving you access to your own data. Your Slack conversations, your Jira tickets, your calendar appointments. Those wouldn't go into AI training datasets anyway, locked down APIs or not.

      The APIs of old were about giving you programmatic access to publicly available information. Public tweets, public Reddit posts, that sort of thing. That's the kind of data AI companies want for training, and you aren't getting it through MCP.

      • seunosewa 3 days ago

        Interesting perspective because MCPs are safer when they give you access to your own content from trusted providers or local apps on your computer than when they give you access to public data which may have prompt injection booby traps.

    • TimTheTinker 3 days ago

      MCP is supposed to grant "agency" (whatever that means), not merely expose curated data and functionality.

      In practice, the distinction is little more than the difference between different HTTP verbs, but I think there is a real difference in what people are intending to enable when creating an MCP server vs. standard APIs.

      • adregan 3 days ago

        Might be another reflection of McLuhan‘s “the medium is the message” in that APIs are built with the intended interface in mind.

        To this point, GUIs; going forward, AI agents. While the intention rhymes, the meaning of these systems diverge.

    • notatoad 3 days ago

      i don't think it's ironic at all. the AI boom exposed the value of data. there's two inevitable consequences when the value of something goes up: the people who were previously giving it away for free start charging for it, and the people who weren't previously selling it at all start selling it.

      the APIs that used to be free and now aren't were just slightly ahead of the game, all these new MCP servers aren't going to be free either.

  • Animats 3 days ago

    > Want spell check? MCP server.

    > Want it to order coffee when you complete 10 tasks? MCP server.

    With a trip through an LLM for each trivial request? A paid trip? With high overhead and costs?

    • notatoad 3 days ago

      the whole point of the article is that it doesn't need to be an LLM, MCP is just a standard way to expose tools to things that use tools. LLMs can use tools, but so can humans.

      • therein 3 days ago

        So the whole point of the article is that an API is an API and anything can call an API?

        • tveita 3 days ago

          There is a long tail of applications that aren't currently scriptable and don't have a public API. The kind that every so often make you think "if only I could automate this instead of clicking through this exact same dialog 25 times".

          Before, "add a public API to this comic reader/music player/home accounting software/CD archive manager/etc." would be a niche feature to benefit 1% of users. Now more people will expect to hook up their AI assistant of choice, so the feature can be prioritized.

          The early MCP implementations will be for things that already have an API, which by itself is underwhelming.

          You would think Apple would have a leg up here with AppleScript already being a sanctioned way to add scriptable actions across the whole of macOS, but as far as I can tell they don't hook it up to Siri or Apple Intelligence in any way.

          • Animats 3 days ago

            That's not where the money is. It's in adding a toll charge for tokens to talk to widely used APIs.

          • troupo 3 days ago

            > There is a long tail of applications that aren't currently scriptable and don't have a public API.

            So how does MCP help with this?

            • mike_hearn 3 days ago

              The theory is, I guess, that creating an MCP API is a lot easier than creating a regular API. A regular API is a very costly thing to develop and it has on-going costs too because it's so hard to change. You have to think about data structures, method names, how to expose errors, you have to document it, make a website to teach devs how to use it, probably make some SDKs if you want to do a good job, there's authentication involved probably, and then worst of all: if you need to change the direction of your product you can't because it'd break all the connected apps.

              An MCP API dodges all of that. You still need some data structures but beyond that you don't think too hard, just write some docs - no fancy HTML or SDKs needed. MCP is a desktop-first API so auth mostly stops being an issue. Most importantly, if you need to change anything you can, because the LLM will just figure it out, so you're way less product constrained.
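
              To illustrate how low the bar is, here's a sketch using the official Python SDK's FastMCP helper (the tool itself is made up) - the type hints and docstring become the schema and description, so "writing the docs" mostly is writing the function:

                  from mcp.server.fastmcp import FastMCP

                  mcp = FastMCP("books")

                  @mcp.tool()
                  def monthly_totals(year: int, month: int) -> dict:
                      """Return income and expense totals for the given month."""
                      # ...query whatever internal store you already have...
                      return {"income": 0, "expenses": 0}

                  if __name__ == "__main__":
                      mcp.run()  # serves the tool over stdio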

        • lazyasciiart 3 days ago

          Maybe phrasing it this way will be the lightbulb moment for everyone who hasn’t got that yet.

        • dstanko 3 days ago

          I see the point as: "Let's not overcomplicate the API with complex schemas and such. Let's not use GraphQL for everything. Just create a simple API and call it to extend stuff." Am I wrong?

      • OkGoDoIt 3 days ago

        Part of the reason AI agents and MCP work is that AI can programmatically determine at runtime which plug-ins to use. Without the AI part, how does the host app know when to call an MCP server function?

        • baq 3 days ago

          Same way it would call any other API: exactly when it was programmed to.

        • notatoad 3 days ago

          same as any other api function call in an app - because an app developer programmed it to call that function.

          • brabel 3 days ago

            That only works for the MCP servers your app knows about, which is not that great. The usefulness of a plugin system like MCP is that an app can automatically use it. But MCP tools are literally just a function, with some metadata about what it does and how to invoke it. The only thing generic enough to figure out how to use a function given only this metadata seems to be an LLM. And not even all of them; only some support "tool calling".

  • zackify 3 days ago

    It is new and exciting if you just learned to vibe code and you don’t even know what a REST API is.

  • ljm 2 days ago

    I genuinely believe that low-code workflow orchestrators like Zapier or IFTTT will be the first major victims of agentic LLM workflows. Maybe not right now but already it’s easier to write a prompt describing a workflow than it is to join a bunch of actions and triggers on a graph.

    The whole hype around AI replacing entire job functions does not have as much traction as the concept of using agents to handle all of the administrative stuff that connects a workflow together.

    Any open source model that supports MCP can do it, so there’s no vendor lock in, no need to learn the setup for different workflow tools, and a lot of money saved on seats for expensive SaaS tools.

  • exe34 3 days ago

    > made interoperability hype, and vendor lock-in old-fashioned

    I always imagined software could be written with a core that does the work and a UI that would be interchangeable. I like that the current LLM hype is causing that to happen.

  • iLoveOncall 3 days ago

    > I don't know how long it'll last

    I'm just baffled no software vendor has already come up with a subscription to access the API via MCP.

    I mean, obviously paid API access is nothing new, but "paid MCP access for our enterprise users" is surely in the pipeline everywhere, after which the openness will die down.

    • adamesque 3 days ago

      I think for enterprise it’s going to become part of the subscription you’re already paying for, not a new line item. And then prices will simply rise.

      Optionality will kill adoption, and these things are absolutely things you HAVE to be able to play with to discover the value (because it’s a new and very weird kind of tool that doesn’t work like existing tools)

    • Bjartr 3 days ago

      And I expect there'll eventually be a way for an AI to pay for MCP use, microtransaction style.

      Heck, if AIs are at some point given enough autonomy to simply be given a task and a budget, there'll be efforts to try to trick AIs into thinking paying is the best way to get their work done! Ads (and scams) for AIs to fall for!

      • iLoveOncall 3 days ago

        > Heck, if AIs are at some point given enough autonomy to simply be given a task and a budget, there'll be efforts to try to trick AIs into thinking paying is the best way to get their work done!

        We're already there, just take a look at the people spending $500 a day on Claude Code.

    • pininja 3 days ago

      Mapbox is just a small step away from that with their MCP server wrapping their pay-by-use API. I wouldn’t be surprised to see a subscription offering with usage limits if that somehow appealed to them. MapTiler already offers their service as a subscription so they’re even closer if they hosted a server like this on their own.

      https://github.com/mapbox/mcp-server

  • tempodox 3 days ago

    The pressure to monetize after all those humungous investments into AI will surely move some things that have been stuck in their ways and stagnant. It looks like this time the IT industry itself will be among those that are being disrupted.

  • conradev 3 days ago

    AI agents didn't only make adversarial interoperability hype, they've also made it inevitable! From here all the way until they're probing hardware to port Linux and write drivers.

  • qudat 3 days ago

    I joked with people in IRC that it took LLMs to bring answer engines into the terminal

  • MomsAVoxell 3 days ago

    All vendor lock in is being transmuted to model access.

  • bjornsing 3 days ago

    Exactly. It’s a crack in the MBA’s anti-commoditization wall. Right now it’s like USB-C. If we’re lucky it will turn into TCP/IP and transform the whole economy.

jadar 3 days ago

I don’t want to undermine the author’s enthusiasm for the universality of the MCP. But part of me can’t help wondering: isn’t this the idea of APIs in general? Replace MCP with REST and does that really change anything in the article? Or even an Operating System API? POSIX, anyone? Programs? Unix pipes? Yes, MCP is far simpler/universal than any of those things ended up being — but maybe the solution is to build simpler software on good fundamental abstractions rather than rebuilding the abstractions every time we want to do something new.

  • Jonovono 3 days ago

    MCP is not REST. In your comparison, it's more that MCP is a protocol for discovering REST endpoints at runtime and letting users configure which endpoints should be used at runtime.

    Say I'm building an app and I want my users to be able to play Spotify songs. Yeah, I'll hit the Spotify API. But now say I've launched my app and I want my users to be able to play a song from sonofm when they hit play. Alright, now I gotta open up the code, add some if statements, hard-code the sonofm API, ship a new version, show some update messages.

    MCP is literally just a way to make this extensible, so that instead of hardcoding this, it can be configured at runtime.

    • Too 3 days ago

      That only works if you let the LLM do the interpretation of the MCP descriptions. In the case of TFA, the idea was to use MCP without an LLM, which is essentially the same as any old API.

      • Jonovono 2 days ago

        You can use MCP to dynamically call different services without ever having to use an LLM to decide.

        With an LLM, it would go: list MCP tools -> get user prompt -> feed both into the LLM -> the LLM tells you which tools to call.

        You could skip the LLM aspect completely, get all the tools, and let the user pick the tool that "playsSong", for example, at runtime.
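
        Sketching that flow with the official Python SDK's client API (the music server command and its play_song tool are made-up stand-ins for whatever the user configured):

            import asyncio
            from mcp import ClientSession, StdioServerParameters
            from mcp.client.stdio import stdio_client

            async def main():
                # Hypothetical music-provider server the user plugged in at runtime.
                params = StdioServerParameters(command="npx", args=["-y", "some-music-mcp"])
                async with stdio_client(params) as (read, write):
                    async with ClientSession(read, write) as session:
                        await session.initialize()
                        tools = await session.list_tools()   # runtime discovery
                        print([t.name for t in tools.tools])
                        # No LLM: the app (or the user, via a menu) picks the tool by name.
                        await session.call_tool("play_song", {"query": "So What"})

            asyncio.run(main())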

    • layer8 3 days ago

      HATEOAS was supposed to be that.

      https://en.wikipedia.org/wiki/HATEOAS

      • mort96 3 days ago

        Wait, was it? HATEOAS is all about hypermedia, which means there must be a human in the loop being presented with the rendered hypermedia. MCP seems like it's meant for machine<->machine communication, not human<->machine.

        • layer8 3 days ago

          I agree that HATEOAS never made sense without a human in the loop, although I also have never seen it be described as such. IMO that’s an important reason why it never gained useful traction.

          There is a confused history where Roy Fielding described REST, then people applied some of that to JSON HTTP APIs, designating those as REST APIs, then Roy Fielding said “no you have to do HATEOAS to achieve what I meant by REST”, then some people tried to make their REST APIs conform to HATEOAS, all the while that change was of no use to REST clients.

          But now with AI it actually can make sense, because the AI is able to dynamically interpret the hypermedia content similar to a human.

        • chriswarbo 3 days ago

          Hypermedia isn't just for human consumption. Back in the 90s, the Web was going to be crawled by "User Agents": software performing tasks on behalf of people (say, finding good deals on certain items; or whatever). Web browsers (human-driven interfaces) were the most common User Agent, but ended up being a lowest-common-denominator; the only other User Agents to get any widespread support were Google crawlers.

        • NomDePlum 3 days ago

          My understanding was that the discoverable part of HATEOAS was meant for machine-to-machine use. Actually, all of REST is machine-to-machine except in very trivial situations.

          I'm not sure I understand your point that hypermedia means there is a human in the loop. Can you expand?

          • renerick 3 days ago

            The H in HATEOAS stands for "hypermedia". Hypermedia is a type of document that includes hypermedia controls, which are presented by the hypermedia client to a user for interaction. It's the user who decides which controls to interact with. For example, when I'm writing this comment, the HN server gave me a hypermedia document, which contains your comment, a textarea input and a button to submit my reply, and I, the human in the loop, decide what to put in the input and when to press the button. A machine can't do that on its own (though LLMs potentially can), so a user is required. That also means that JSON APIs meant for purely machine-to-machine interactions, commonly referred to as REST, can't be considered HATEOAS (or REST) due to the absence of hypermedia controls.

            Further reading:

            - https://htmx.org/essays/how-did-rest-come-to-mean-the-opposi...

            - https://htmx.org/essays/hateoas/

            • NomDePlum 3 days ago

              So that's not my understanding. Hypermedia, as I understand it, consists of embedded links in responses that present possible forward actions.

              They are structured in a way that a machine program could parse and use.

              I don't believe it requires human-in-the-loop, although that is of course possible.

              • recursivedoubts 3 days ago

                HTML is a hypermedia format, the most widely used, and it's designed mainly for human consumption. Machines parsing and using something is too broad an idea to engage with meaningfully: browsers parse HTML and do something with it: they present it to humans to select actions (i.e. hypermedia controls) to perform.

                Your understanding is incorrect, the links above will explain it. HATEOAS (and REST, which is a superset of HATEOAS) requires a consumer to have agency to make any sense (see https://intercoolerjs.org/2016/05/08/hatoeas-is-for-humans.h...)

                MCP could profitably explore adding hypermedia controls to the system; it would be interesting to see if agentic MCP APIs are able to self-organize:

                https://x.com/htmx_org/status/1938250320817361063

                • NomDePlum 3 days ago

                  I've programmed machines to use those links, so I'm pretty certain machines can use them. I've never heard of the HTML variation, but I will have a look at those links.

                  • renerick 3 days ago

                    > I've programmed machines to use those links, so I'm pretty certain machines can use them

                    I'm curious to learn how it worked.

                    The way I see it, the key word here is "programmed". Sure, you read the links from responses and eliminated the need to hardcode API routes in the system, but what would happen if a new link is created or old link is unexpectedly removed? Unless an app somehow presents to the user all available actions generated from those links, it would have to be modified every time to take advantage of newly added links. It would also need a rigorous existence checking for every used link, otherwise the system would break if a link is suddenly removed. You could argue that it would not happen, but now it's just regular old API coupling with backward compatibility concerns.

                    Building on my previous example of hn comments, if hn decides to add another action, for example "preview", the browser would present it to the user just fine, and the user would be able to immediately use it. They could also remove the "reply" button, and again, nothing would break. That would render the form somewhat useless of course, but that's the product question at this point

                    • NomDePlum 3 days ago

                      Yes, most of your observations about limitations are true. That doesn't mean it's not a useful technique.

                      This is a reasonable summary of how I understand it: https://martinfowler.com/articles/richardsonMaturityModel.ht...

                      This aligns with how I've seen it used. It helps identify forward actions, many of which will be standard to the protocol or to the domain. These can be programmed for and traversed or called, and the aggregated data presented using general or specific logic.

                      So new actions can be catered for, but novel actions cannot; custom logic would need to be added. That logic then becomes part of the domain, and the machine can potentially handle it.

                      Hope that helps illustrate how it can be used programmatically.

                  • recursivedoubts 3 days ago

                    How did the machines you programmed react to new and novel links/actions in the response?

                    • NomDePlum 3 days ago

                      New ones are fine; novel ones need to be catered for. I have left a fuller explanation in a sibling comment.

                      • recursivedoubts 2 days ago

                        Right, so that means that the primary novel aspect of REST (according to the coiner of the term, Roy Fielding), the uniform interface, is largely wasted. Effectively you get a level of indirection on top of hard-coded API endpoints. Maybe better, but I don't think by much, and a lot of work for the payoff.

                        To take advantage of the uniform interface you need to have a consumer with agency who can respond to new and novel interactions as presented in the form of hypermedia controls. The links above will explain more in depth, the section on REST in our book is a good overview:

                        https://hypermedia.systems/components-of-a-hypermedia-system...

          • mort96 3 days ago

            If you have machine <-> machine interaction, why would you use HTML with forms and buttons and text inputs etc? Wouldn't JSON or something else (even XML) make more sense?

    • gavinray 3 days ago

      MCP is a JSON-RPC implementation of OpenAPI, or, get this, XML and WSDL/SOAP.

      • causal 2 days ago

        WSDL triggered me, ha. I'm afraid you're right.

    • emporas 3 days ago

      > Alright, now I gotta open up the code, add some if statements, hard-code the sonofm API, ship a new version, show some update messages.

      You will need to do that anyway. Easier discovery of the API doesn't buy much.

      The user might want complicated functionality which combines several API calls, plus more code for filtering/sorting/searching that information locally. If you let the LLM write the code by itself, it might take 20 minutes and millions of wasted tokens going back and forth in the code to implement the functionality. No user is going to find that acceptable.

    • nikolayasdf123 3 days ago

      so... is this OpenAPI then?

      • lobsterthief 3 days ago

        Basically, yes. But with much more enthusiasm!

      • doug_durham 3 days ago

        OpenAPI doesn't have a baked-in discoverability mechanism, and it isn't compatible with LLMs out of the box; it's a lower-level abstraction. I don't want to write a blob of code that talks to an OpenAPI service every time I want to do something with an LLM.

        • falcor84 3 days ago

          > OpenAPI doesn't have a baked-in discoverability mechanism.

          Well, Swagger was there from the start, and there's nothing stopping an LLM from connecting to an openapi.json/swagger.yaml endpoint, perhaps mediated by a small XSLT-like filter that would make it more concise.
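
          As a rough sketch of that idea (the host is made up, and real specs need more massaging than this):

              import requests

              # Fetch the machine-readable spec the service already publishes.
              spec = requests.get("https://api.example.com/openapi.json").json()

              # Flatten it into MCP-ish tool entries an LLM (or anything else) can read.
              tools = [{"name": op.get("operationId", f"{method} {path}"),
                        "description": op.get("summary", ""),
                        "inputSchema": op.get("requestBody", {})}
                       for path, methods in spec["paths"].items()
                       for method, op in methods.items()
                       if method in ("get", "post", "put", "delete")]
              print(tools[:3])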

    • navigate8310 3 days ago

      Can't you just build a simple REST API that abstracts away plugging in different song providers?

    • jaredsohn 3 days ago

      Feels like segment.com but for calling APIs rather than adding libraries to the frontend.

      • Jonovono 3 days ago

        Now make the segment for MCPs ;p

  • kvdveer 3 days ago

    The main difference between MCP and REST is that MCP is self-described from the very start. REST may have OpenAPI, but it is a later add-on, and we haven't quite standardized on using it. The first step of exposing an MCP server is describing it; for REST that is an optional step that's often omitted.

    • Szpadel 3 days ago

      Isn't SOAP also self-described?

      • kerng 3 days ago

        When I read about MCP the first time and saw that it requires a "tools/list" API, it reminded me of COM/DCOM/ActiveX from Microsoft, which had things like QueryInterface and IDispatch. And I'm sure that wasn't the first time someone came up with dynamic runtime discovery of the APIs a server offers.

        Interestingly, ActiveX was quite the security nightmare for very similar reasons actually, and we had to deal with infamous "DLL Hell". So, history repeats itself.

    • xg15 3 days ago

      Is it "self-described" in the sense I can get a list of endpoints or methods, with a human- (or LLM-) readable description for each - or does it supply actual schemata that I could also use with non-AI clients?

      (Even if only the former, it would of course be a huge step forward, as I could have the LLM generate schemata. Also, at least, everyone is standardizing on a base protocol now, and a way to pass command names, arguments, results, etc. That's already a huge step forward in contrast to arbitrary Rest+JSON or even HTTP APIs)

      • Spivak 3 days ago

        For each tool you get the human description as well as a JSON schema for the parameters needed to call the function.

        • talos_ 3 days ago

            You're getting an arbitrary string back though...

          • jcelerier 3 days ago

            how else would you describe an arbitrary tool?

    • light_hue_1 3 days ago

      But you're describing it in a way that is useless to anything but an LLM. It would have been much better if the description language had been more formalized.

      • Majromax 3 days ago

        > It would have been much better if the description language had been more formalized.

        To speculate about this, perhaps the informality is the point. A full formal specification of something is somewhere between daunting and Sisyphean, and we're more likely to see supposedly formal documentation that nonetheless is incomplete or contains gaps to be filled with background knowledge or common sense.

        A mandatory but informal specification in plain language might be just the trick, particularly since vibe-APIing encourages rapid iteration and experimentation.

  • caust1c 3 days ago

    In my mind the only thing novel about MCP is requiring that the schema be provided as part of the protocol. Sure, it's convenient that the shape of the request/response wrappers is always the same; that helps libraries wrap dynamic types in static types. But everyone was already doing that with their own APIs, we just didn't agree on what the envelope's shape should be. With the requirement that the schema ship with the protocol, and the carrot of AI models seamlessly consuming it, there was finally enough of an impetus.

    • marcosdumay 3 days ago

      > requiring that the schema be provided as part of the protocol

      You mean, like OpenAPI, gRPC, SOAP, and CORBA?

  • gdecaso 3 days ago

    The main difference between MCP and REST is `list-tools`.

    REST APIs have 5 or 6 ways of doing that, including "read it from our docs site", HATEOAS, OAS running on an endpoint as part of the API.

    MCP has a single way of listing endpoints.

    • OJFord 3 days ago

      > The main difference between MCP and REST is `list-tools`.

      > REST APIs have 5 or 6 ways of doing that

      You think nobody's ever going to publish a slightly different standard to Anthropic's MCP that is also primarily intended for LLMs?

      • doug_durham 3 days ago

        Why would they? I'm sure the "Enterprise" folks are putting together some working group to develop ANSI-xyzzy standard for Enterprise operability which will never see the light of day.

        • OJFord 3 days ago

          Because they genuinely think it'll work better, because they think it will build brand awareness/moat, because they're upset MCP comes from a competitor

    • gavinray 3 days ago

      WSDL + XML APIs have been around since 1998.

      OpenAPI, OData, gRPC, GraphQL

      I'm sure I'm missing a few...

      • anon7000 3 days ago

        In other words, there’s no commonly-used, agreed upon standard for creating APIs. The closest is REST-like APIs, which are really no more specific than “hit a URL and get some data back”.

        So why are we all bitching about it? Programmatically communicating with an ML model is a new thing, and it makes sense it might need some new concepts. It’s basically just a wrapper with a couple of opinions. Who cares. It’s probably better to be more opinionated about what exactly you put into MCP, rather than just exposing your hundreds of existing endpoints.

        • jjfoooo4 3 days ago

          I don’t think the comments here are complaining, they are pointing out that what’s being claimed as being new is not actually new

      • doug_durham 3 days ago

        Where is "list-tools" in any of those low level protocols?

        • opliko 3 days ago

          I don't know enough about OData, but:

          - Introspection (__schema queries) for every GraphQL server. You can even see what a service exposes, because most services expose a web playground for testing GraphQL APIs, e.g. GitHub: https://docs.github.com/en/graphql/overview/explorer

          - Server Reflection for gRPC, though here it's optional and I'm not aware of any hosted web clients, so you'll need a tool like grpcurl if you want to see how it looks in real services yourself.

          - OpenAPI is not a protocol, but a standard for describing APIs. It is the list-tools for REST APIs.
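
          To make the GraphQL one concrete, introspection is itself just a query, so discovery needs no extra machinery (a Python sketch; the endpoint and auth header are placeholders):

              import requests

              # standard GraphQL introspection: ask the schema to describe itself
              query = "{ __schema { queryType { name } types { name kind } } }"

              response = requests.post(
                  "https://example.com/graphql",                # placeholder endpoint
                  json={"query": query},
                  headers={"Authorization": "Bearer <token>"},  # if the API needs auth
              )
              print(response.json())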

        • debugnik 3 days ago

          All of them already provide an IDL with text descriptions and a way to query a server's current interface, what else do we need? Just force those two optional features to be required for LLM tool calls and done.

          Is there anything stopping generic MCP servers from bridging those protocols as-is? If not, we might as well keep using them.

  • adverbly 2 days ago

    APIs do not necessarily tell you everything about themselves. Anyone who has used poorly documented or fully undocumented APIs knows exactly what I'm talking about here.

    Obviously, for HTTP APIs you might often see something like an OpenAPI specification or GraphQL, which both typically allow an API to describe itself. But this is not commonly a thing for non-HTTP, which is something that MCP supports.

    MCP might be the first standard for self-described APIs across all protocols (I might be misusing "protocols" here, but I'm not sure what the word technically should be; I think the MCP spec calls it "transport", but I might be wrong there), making it slightly more universal.

    I think the author is wrong to discount the importance of an LLM as an interface here though. I do think the majority of MCP clients will be LLMs. An API might get you 90% of the way there, but if the LLM gets you to 99.9% by handling that last bit of plumbing, it's going to go mainstream.

  • TZubiri 3 days ago

    Damn, I just read this and it's comforting to see how similar it is to my own response.

    To elaborate on this: I don't know much about MCP, but usually when people speak about it, it is in a buzzword-seeking kind of way, and the people that are interested in it make these kinds of conceptual snafus.

    Second, and this applies not just to MCP but also to things like JSON, Rust, MongoDB: there's this phenomenon where people learn the complex stuff before learning the basics. It's not the first time I've cited this video of Homer studying marketing where he reads the books out of order: https://www.youtube.com/watch?v=2BT7_owW2sU . It makes sense that this mistake is so common; the amount of literature and resources is like an inverted pyramid, with so few classical foundations and A LOT of new stuff, most of which will not stand the test of time. Typically you have universities to lead the way and establish a classical corpus and path, but this is such a young discipline; 70 years in, we are still not finding much stability. Universities have gone from teaching C, to teaching Java, to teaching Python (at least in intro to CS); maybe they will teach Rust next. But this buzzwording seems more in line with trying to predict the future, and there will be way more losers than winners in that realm. The winners will have learned the classics in addition to the new technology; learning the new stuff without the classics is a recipe for disaster.

  • int_19h 2 days ago

    This is exactly what the author is saying. It is "the idea of APIs in general" that has suddenly become a fad under the guise of MCP, riding the AI wave. And it may well be a very imperfect way to build APIs, but if it eventually becomes the standard to the point where every app has to offer it, it's still "good enough" and would massively improve interoperability all around as a side effect.

  • spenczar5 3 days ago

    honestly, yes - but MCP includes a really simple 'reflection' endpoint to list the capabilities of an API, with human readable docs on methods and types. That is something that gRPC and OpenAPI and friends have supported as an optional extension for ages, but it has largely been a toy. MCP makes it central and maybe that makes all the difference.

    • spudlyo 3 days ago

      At a previous job most of our services supported gRPC reflection, and exploring and tinkering with these APIs using the grpc_cli tool was some of the most fun I had while working there. Building and using gRPC services in golang left a strong positive impression on me.

      • lobsterthief 3 days ago

        I had the same experience working with GQL :)

  • rco8786 3 days ago

    One major difference is that MCP has discovery built into the protocol. There’s nothing in REST that informs clients what the API can do, what resources are available, etc.

  • bayesianbot 3 days ago

    My first thought as well. But maybe people wanting to plug their apps into their AI will at least force developers to actually implement the interface, unlike plain APIs, which are mostly unheard of among the general population and thus often not offered?

jampa 3 days ago

I don't want to sound like a skeptic, but I see way more people talking about how awesome MCP is rather than people building cool things with it. Reminds me of blockchain hype.

MCP seems like a more "in-between" step until the AI models get better. I imagine in 2 years, instead of using an MCP, we will point to the tool's documentation or OpenAPI, and the AI can ingest the whole context without the middle layer.

  • qsort 3 days ago

    Regardless of how good a model gets, it can't do much if it doesn't have access to deterministic tools and information about the state of the world. And that's before you take into account security: you can't have a model running arbitrary requests against production, that's psychotic.

    I don't have a high opinion of MCP and the hype it's generating is ridiculous, but the problem it supposedly solves is real. If it can work as an excuse to have providers expose an API for their functionality like the article hopes, that's exciting for developers.

    • ramoz 3 days ago

      > Regardless of how good a model gets

      I don't think this is true.

      My Claude Code can:

      - open a browser, debug a ui, or navigate to any website

      - write a script to interact with any type of accessible api

      All without MCP.

      Within a year I expect there to be legitimate "computer use" agents. I expect agent SDKs to take over from LLM APIs as the de facto abstractions for models, and MCP to have limited use isolated to certain platforms, with the caveat that an MCP-equipped agent performs worse than a native computer-use agent.

      • codybontecou 2 days ago

        They are kind of the same thing...

        These are just tools Anthropic provides for you. Just like the tools a non-Anthropic service provides through their MCP server.

        A community-led effort of tool creation via MCP will surely be faster and more powerful than waiting for in-house implementations.

      • anon7000 3 days ago

        > open a browser, debug a ui, or navigate to any website

        I mean, that's just saying the same thing. At the end of the day, there are underlying deterministic systems that it uses.

        • ramoz 3 days ago

          Yes, my response was poorly oriented toward the parent comment.

  • mtkd 3 days ago

    It's very different to blockchain hype

    I had similar skepticism initially, but I would recommend you dip a toe in the water before passing judgement.

    The conversational/voice AI tech now dropping + the current LLMs + MCP/tools/functions to mix in vendor APIs and private data/services etc. really feels like a new frontier

    It's not 100%, but it's close enough for a lot of use cases now, and it is going to change a lot about how we build apps going forward.

    • moooo99 3 days ago

      Probably my judgement is a bit fogged. But if I get asked about building AI into our apps just one more time I am absolutely going to drop my job and switch careers

      • mtkd 3 days ago

        That's likely because OG devs have been seeing the hallucination stuff, the unpredictability etc. and questioning how that fits with their carefully curated, perfect systems.

        What blocked me initially was watching NDA'd demos a year or two back from a couple of big software vendors on how Agents were going to transform enterprise ... what they were showing was a complete non-starter to anyone who had worked in a corporate because of security, compliance, HR, silos etc. so I dismissed it

        This MCP stuff solves that, it gives you (the enterprise) control in your own walled garden, whilst getting the gains from LLMs, voice etc. ... the sum of the parts is massive

        It more likely wraps existing apps than integrates directly with them, the legacy systems becoming data or function providers (I know you've heard that before ... but so far this feels different when you work with it)

        • bwfan123 3 days ago

          There are 2 kinds of use cases that software automates: 1) those that require accuracy, and 2) those that don't (social media, ads, recommendations).

          Further, there are 2 kinds of users that consume the output of software: a) humans, and b) machines.

          Where LLMs shine is in the 2a use cases, i.e. use cases where accuracy does not matter and humans are the end users. There are plenty of these.

          The problem is that LLMs are being applied to the 1a and 1b use cases, where there is going to be a lot of frustration.

        • ptx 3 days ago

          How does MCP solve any of the problems you mentioned? The LLM still has to access your data, still doesn't know the difference between instructions and data, and still gives you hallucinated nonsense back – unless there's some truly magical component to this protocol that I'm missing.

          • doug_durham 3 days ago

            The information returned by the MCP server is what makes it not hallucinate. That's one of the primary use cases.

        • moooo99 3 days ago

          > That's likely because OG devs have been seeing the hallucination stuff, the unpredictability etc. and questioning how that fits with their carefully curated, perfect systems

          That is the odd part. I am far from being part of that group of people. I'm only 25; I joined the industry in 2018 as part of a training program in a large enterprise.

          The odd part is, many of the promises are a bit déjà vu even for me. "Agents are going to transform the enterprise" and other promises do not seem that far off the promises that were made during the low-code hype cycle.

          Cynically, the more I look at the AI projects as an outsider, the more I think AI could fail in enterprises largely for the same reason low code did. Organizations are made of people, and people are messy; as a result, the data is often equally messy.

      • mindwok 3 days ago

        Rule of thumb: the companies building the models are not selling hype. Or at least the hype is mostly justified. Everyone else, treat with extreme skepticism.

    • djhn 2 days ago

      Is there anything new that's come out in conversational/voice? Sesame's Maya and Miles were kind of impressive demos, but that's still in 'research preview'. Kyutai presented a really cool low-latency open model, but I feel like we're still closer to Siri than actually usable voice interfaces.

  • ashwinsundar 3 days ago

    I had a use case - I wanted to know what the congresspeople from my state have done this week. This information is surprisingly hard to just get from the news. I learned about MCP a few months ago and thought that it might be a cool way to interact with the congress.gov API.

    I made this MCP server so that you could chat with real-time data coming from the API - https://github.com/AshwinSundar/congress_gov_mcp. I’ve actually started using it more to find out, well, what the US Congress is actually up to!

  • bryancoxwell 3 days ago

    But this whole post is about using MCP sans AI

    • iLoveOncall 3 days ago

      MCP without AI is just APIs.

      MCP is already a useless layer between AIs and APIs, using it when you don't even have GenAI is simply idiotic.

      The only redeeming quality of MCP is actually that it has pushed software vendors to expose APIs to users, but just use those directly...

      • ricardobeat 3 days ago

        And that’s the whole point - it’s APIs we did not have. Now app developers are encouraged to have a public, user friendly, fully functional API made for individual use, instead of locking them behind enterprise contracts and crippling usage limits.

        • candiddevmike 3 days ago

          Do you have an example of a company who previously had an undiscoverable API now offering a MCP-based alternative?

          • ricardobeat 2 days ago

            I do have one: Atlassian now allows connecting their MCP server (Jira et al) for personal use with a simple OAuth redirect, where before you needed to request API keys via your org, which is something no admin would approve unless you were working specifically on internal tooling/integrations.

            Another way to phrase it is that MCP normalizes individual users having access to APIs via their clients, vs the usual act of connecting two backend apps where the BE owns a service key.

        • iLoveOncall 3 days ago

          Right, but we would have had them even if MCP did not exist. The need to access those APIs via LLM-based "agents" would have existed without MCP.

          At work I built an LLM-based system that invokes tools. We started before MCP existed, and just used APIs (and continue to do so).

          Its engineering value is nil, it only has marketing value (at best).

          • Maxious 3 days ago

            As https://www.stainless.com/blog/mcp-is-eating-the-world--and-... recaps, tool calling existed before MCP, some vague standards existed, and nothing took off; and no, really, normal users don't want to just download the OpenAPI spec.

            Anthropic wants to define another standard now btw https://www.anthropic.com/engineering/desktop-extensions

            • iLoveOncall 3 days ago

              Normal users don't know what MCP is and will never use an MCP server (knowingly or unknowingly) in their life. They use ChatGPT through the web UI or the mobile app, that's it.

              MCP is for technical users.

              (Maybe read the link you sent, it has nothing to do with defining a new standard)

              • int_19h a day ago

                Normal users will increasingly use MCP servers without even knowing they do so - it will be their apps. And having e.g. your music player or your email client light up in the ChatGPT app as something that you can tell it to automate is not just for technical users.

        • drivers99 3 days ago

          > it’s APIs we did not have

          Isn't that what we had about 20 years ago (web 2.0) until they locked it all up (the APIs and feeds) again? ref: this video posted 18 years ago: https://www.youtube.com/watch?v=6gmP4nk0EOE

          (Rewatching it in 2025, the part about "teaching the Machine" has a different connotation now.)

          Maybe it's that the protocol is more universal than before, and they're opening things up more due to the current trends (AI/LLM vs web 2.0 i.e. creating site mashups for users)? If it follows the same trend then after a while it will become enshittified as well.

  • beefnugs 3 days ago

    I can't believe there isn't a universal "API firewall" by now. You know, like a middle program that can convert any input API to any output API, with middleware features like logging, firewalling, and stateful denial and control.

    Once cryptocurrency was a thing, this absolutely needed to exist to protect your accounts from being depleted by a hack (like via a monthly-limits firewall).

    Now we need a universal MCP <-> API bridge to allow both programmatic and LLM access to the same thing. (Because apparently these AGI precursors aren't smart enough to be trained on generic API calling and need yet another standard: MCP?)

  • dghlsakjg 3 days ago

    > we will point to the tool's documentation or OpenAPI

    You can already do this, as long as your client has access to an MCP that can make HTTP requests.

    You can give the current generation of models an openAPI spec and it will know exactly what to do with it.

    • nikolayasdf123 3 days ago

      you don't even need MCP for that, just access to a hosted swagger file.

      • dghlsakjg 3 days ago

        That's what I mean. Give an LLM the swagger file, and it can make those calls itself given the ability to make an HTTP request (which is what the MCP is for)

  • 3abiton 3 days ago

    > MCP seems like a more "in-between" step until the AI models get better. I imagine in 2 years, instead of using an MCP, we will point to the tool's documentation or OpenAPI, and the AI can ingest the whole context without the middle layer.

    I doubt the middleware will disappear; it's needed to accommodate the evolving architecture of LLMs.

  • doug_durham 3 days ago

    My colleagues and I are building cool stuff with it. I see many examples of truly useful things being built today.

  • TZubiri 3 days ago

    I wasn't able to find a good source on it, but I read a couple of times that Anthropic (builders of MCP) do astroturfing/shilling/growth hacking/SEO/organic advertisement. Everything I've read so far with MCP and Claude and the hype I see on social media is consistent with that, hype and no value.

  • jcelerier 3 days ago

    > I imagine in 2 years, instead of using an MCP, we will point to the tool's documentation or OpenAPI, and the AI can ingest the whole context without the middle layer.

    how would ingesting Ableton Live's documentation help Claude create tunes in it for instance?

  • arbuge 3 days ago

    I could see that happening... perhaps instead of plugging in the URL of the MCP server you'd like to use, you'd just put in the URL of their online documentation and trust your AI assistant of choice to go through all of it.

alex-moon 3 days ago

I always say this whenever anyone asks about whether something is "just hype". One day I will write a blog post on it. Long story short: every piece of new tech is "just hype" until the surrounding ecosystem is built for it. Trains are just hype until you cover the country in railway lines. Telephony is just hype until everyone has a telephone. Email is just hype until everyone has a personal computer (and a reason to sit in front of it every day).

Typically, in these kinds of developments there are two key things that need to appear at the same time: 1. Ubiquitous hardware, so e.g. everyone buys a car, or a TV, or a toaster. 2. An "interface" (whether that's a protocol or a UI or an API or a design standard) which is hyper low cognitive load for the user e.g. the flush button on a toilet is probably the best example I've ever seen, but the same can be said for the accelerator + brake + steering wheel combo, or indeed in digital/online it's CSV for me, and you can also say the same about HTTP like this article does.

Obviously these two factors feed into each other in a kind of feedback loop. That is basically what the role of "hype" is, to catalyse that loop.

  • purerandomness a day ago

    The differentiating factor is whether the hype is eventually justified by an actual implementation. Then it's "hype", followed by adoption and commoditization.

    With "just hype", however, there is no such step. It's hype without the following implementation: NFTs, the "Metaverse", blockchain and "smart contracts", and of course their ancestors (tulip bulbs, 3D VRML worlds, ...) weren't simply new technology. They were solutions in search of a problem, a Ponzi scheme where it was clear that no actual implementation would follow, because it wouldn't make sense.

    • alex-moon 12 hours ago

      I'm inclined to agree on Blockchain myself. I'd happily be proven wrong, I just think there was no conceivable "ecosystem" that would make it useful.

alangpierce 3 days ago

> What if you just... removed the AI part?

Maybe I'm not fully understanding the approach, but it seems like if you started relying on third-party MCP servers without the AI layer in the middle, you'd quickly run into backcompat issues. Since MCP servers assume they're being called by an AI, they have the right to make breaking changes to the tools, input schemas, and output formats without notice.

  • tomqueue 2 days ago

    Exactly my thoughts after reading the article. I am surprised that so few have pointed this out, because it entirely invalidates the article's conclusion for any serious usage. To stay with the USB-C example: it's like plugging a toaster into a monitor, but the toaster changes its communication protocol every time it gets reconnected.

  • mkagenius 3 days ago

    Yes! Once the first integration is done, it will be static unless someone manually changes it.

    Maybe the author is okay with that and just wants new APIs (for his toaster).

thiht 3 days ago

Can someone help me find an actual explanation of what MCP does? The official MCP documentation completely fails at explaining how it works and what it does. For example the quick start for server developers[1] doesn’t actually explain anything. Sure in the Python examples they add @mcp annotations but WHAT DOES IT DO? I feel like I’m going crazy reading their docs because there’s nothing of substance in there.

Let’s assume I want to write an MCP HTTP server without a library, just an HTTP handler, how do I do it? What’s its schema? If I want to call an MCP server from curl what endpoint do I call? Can someone help me find where this is documented?

[1]: https://modelcontextprotocol.io/quickstart/server

  • troupo 3 days ago

    MCP is a server that exposes API endpoints (new vibe term is "tools")

    MCP clients can query these endpoints (new vibe term is "invoke tools")

    That is almost the entirety of it.

    The difference with traditional API endpoints is: they are geared towards LLMs, so LLMs can ask servers to list "tools" and can call these tools at will during execution.

    It's a vibe-coded spec for an extremely hype-based space.

    • thiht 3 days ago

      Yes I understand that, but how do I write these endpoints myself without using magic @mcp annotations?

      After like an hour of searching I finally found the Lifecycle page: https://modelcontextprotocol.io/specification/2025-06-18/bas... and I think it contains the answers I’m looking for. But I feel this should be roughly explained in the first introduction.

      Agree that most of the pages feel LLM-generated, and borderline unreadable.
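
      For anyone else who gets stuck: under the hood it's JSON-RPC 2.0 over stdio or "streamable HTTP". Against an HTTP server the exchange looks roughly like this (a sketch; the /mcp path, port and tool name are made up, real servers may also require a session header returned by initialize, and responses can come back as SSE):

          import requests

          URL = "http://localhost:8000/mcp"  # hypothetical endpoint; the path varies
          HEADERS = {
              "Content-Type": "application/json",
              "Accept": "application/json, text/event-stream",
          }

          def rpc(method, params=None, id=1):
              # every MCP message is a JSON-RPC 2.0 request
              msg = {"jsonrpc": "2.0", "id": id, "method": method}
              if params is not None:
                  msg["params"] = params
              return requests.post(URL, headers=HEADERS, json=msg)

          # 1. handshake (a real client also sends a notifications/initialized
          #    notification before making further calls)
          rpc("initialize", {"protocolVersion": "2025-06-18", "capabilities": {},
                             "clientInfo": {"name": "curl-ish", "version": "0"}}, id=1)

          # 2. discover the tools and their JSON schemas
          print(rpc("tools/list", id=2).json())

          # 3. call one, using whatever tools/list advertised
          print(rpc("tools/call", {"name": "get_forecast",
                                   "arguments": {"city": "Paris"}}, id=3).json())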

      • edgolub 3 days ago

        The reason they do not expose the underlying server schema is that you aren't supposed to write your own MCP server from scratch, in the same way you aren't supposed to write your own GraphQL server from scratch.

        Yes, technically you could, but you are "supposed" to just use a library that builds the actual endpoints based on the schema for the version of MCP you are using, and only worry about building your tools and exposing them to an LLM so they can be consumed (LLM function calling, but with lots of abstractions to make it more developer-friendly).

        • drewnoakes 3 days ago

          I agree with this, and my preference is generally to use a nice library for such things, but understanding the low level protocol and its capabilities helps me conceptualise the interactions more concretely, and understand more of what a given library is doing for me when I use it. In that way, a clear explanation of a protocol has a lot of value for me.

        • thiht 3 days ago

          I can get behind that, but if you’re not supposed to write your MCP server yourself, it makes even more sense to explain how it works so people understand why

        • orliesaurus 2 days ago

          Sadly you are right; everything is made easy and stupidly convenient these days... Or maybe I am just a grumpy guy today! But I do remember when you had to struggle to get software to work; there was no "developer experience" back then :)

neoden 3 days ago

So much scepticism in the comments. I spent last week implementing an MCP server, and I must say that "well-designed" is probably an overstatement. One of the principles behind MCP is that "an MCP server should be very easy to implement". I don't know, maybe it's a skill issue, but it's not that easy at all. What is important, imo, is that so many eyes are looking in one direction right now. That means it has a good chance of having all its problems solved very quickly. And second, it's often so hard to gather a critical mass of attention around something to create an ecosystem, but that is happening right now. I wish all the participants patience and luck)

  • newtwilly 3 days ago

    It's pretty easy if you just use the MCP Python library. You just put an annotation on a function and there's your tool. I was able to do it and it works great without me knowing anything about MCP. Maybe it's a different story if you actually need to know the protocol and implement more for yourself
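
    For reference, the whole thing can be this small (a sketch using the official Python SDK's FastMCP helper; the server name and tool are made up):

        from mcp.server.fastmcp import FastMCP

        mcp = FastMCP("weather")  # server name shown to clients

        @mcp.tool()  # the annotation: signature + docstring become the tool schema
        def get_forecast(city: str) -> str:
            """Get a short weather forecast for a city."""
            return f"Sunny in {city}"  # stub; a real tool would call an API

        if __name__ == "__main__":
            mcp.run()  # speaks the protocol over stdio by default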

    • neoden 3 days ago

      Yes, I am using their Python SDK. But you can't just add MCP to your existing API server if it isn't ready for async Python. Probably you would need to deploy it as a separate server and make server-to-server calls to your API. Making authentication work with your corporate IAM provider is a path of trial and error; not all MCP hosts implement it the same way, so you need to compare the behaviour of multiple apps to decide whether it's your setup that fails or bugs in VS Code or something like that. I haven't even started to think about the ability of a server to message back to the client to communicate with the LLM; AFAIK modern clients don't support such a scenario yet, or at least don't support it well.

      So yes, adding a tool is trivial; adding an MCP server to your existing application might require some non-trivial work of probably unnecessary complexity.

  • mattmanser 3 days ago

    We've done it before, it hasn't worked before, and it's only a matter of years if not months before apps start locking down the endpoints so ONLY chatgpt/claude/etc. servers can use them.

    Interoperability means user portability. And no tech bro firm wants user portability, they want lock in and monopoly.

  • klabb3 3 days ago

    > One of the principles behind MCP is that "an MCP server should be very easy to implement".

    I’m not familiar with the details but I would imagine that it’s more like:

    ”An MCP server which re-exposes an existing public/semi-public API should be easy to implement, with as few changes as possible to the original endpoint”

    At least that’s the only way I can imagine getting traction.

sureglymop 3 days ago

I've thought of this as well but in reality, aren't MCP servers mostly just clients for pre existing APIs?

For example, the Kagi MCP server interacts with the Kagi API. Wouldn't you have a better experience just using that API directly then?

On another note, as the number of python interpreters running on your system increases with the number of MCP servers, does anyone think there will be "hosted" offerings that just provide a sort of "bridge" running all your MCP servers?

  • mkagenius 3 days ago

    My understanding is MCP = original APIs + 1 more API

    The additional API is /list-tools

    And all the clients consume the /list-tools first and then rest of the APIs depending on which tool they want to call.

    • sureglymop 3 days ago

      Yes. But in order to do that you run the MCP server for that API locally. Is it really worth doing that just to have the additional /list-tools, when it is otherwise basically just a bridge/proxy?

      • falcor84 3 days ago

        From my perspective, remote MCP servers are gradually becoming the norm for external services.

      • mkagenius 3 days ago

        Not quite sure I get what you mean by 'MCP server for that API locally'.

        Locally you just need a consumer/client, isn't it?

        • sureglymop 2 days ago

          Check out the overview in the MCP spec. Locally you run the "host application" (e.g. ollama or Claude Desktop). Then you have clients which are inside the host application and maintain 1:1 connections with servers.

          Then you have servers, which are separate processes running on your machine that the clients connect to. For example, you program a server in Python to "manipulate your local filesystem" and then run it locally.

          Most MCP servers are written in Python or Node, and to install and use them you run them locally. They then act as a "bridge" to whichever API they abstract.

  • graerg 3 days ago

    This has been my take, and maybe I'm missing something, but my thinking has been that in the ideal case there's an existing API with an OpenAPI spec that you can just wrap with your FastMCP instantiation. This seemed neat, but while I was trying to do authenticated requests and tinkering with Goose, I ended up just having Goose run curl commands against the existing API routes, and I suspect that with a sufficiently well-documented OpenAPI spec, MCP is kinda moot.

    On the other hand, in the absence of an existing API, you can implement your MCP server to just [do the thing] itself, and maybe that's where the author sees things trending.

inheritedwisdom 3 days ago

Lowering the bar to integrate and communicate is what has historically allowed technology to reach critical mass and enabled adoption. MCP is an evolution in that respect and shouldn’t be disregarded.

We had a non-technical team member write an agent to clean up a file share. There are hundreds of programming languages, libraries, and APIs that enabled that before MCP, but now people don't even have to think about it. Is it performant? No. Is it the "best" implementation? Absolutely not. Did it create enormous value in a novel way that was not possible with the resources, time, and technology we had before? 100%. And that's the point.

  • citizenpaul 3 days ago

    >non technical team member write an agent to clean up a file share

    This has to be BS (or you think it's true) unless it was like 1000 files. In my entire career I've seen countless crazy file shares that are barely functional chaos. In nearly every single "cleanup" attempt, I've tried to get literally ANYONE from the relevant department to help, with little success. And that is just for ME to do the work FOR THEM; I just need context from them. I've on countless occasions had to go to senior management to force someone to simply sit with me for an hour to go over the schema they want to implement, SO I CAN DO IT FOR THEM, and they don't want to do it and literally seem incapable of doing so when forced to. COUNTLESS times. This is how I know AI is being shilled HARD.

    If this is true then I bet you anything in about 3-6 months you guys are going to be recovering this file system from backups. There is absolutely no way it was done correctly and no one has bothered to notice yet. I'll accept your downvote for now.

    Cleaning up a file share is 50% politics, 20% updating procedures, 20% training and 10% technical. I've seen companies go code red and practically grind to a halt over a months-long planned file share change. I've seen changes rolled back after months of work. I've seen this fracture file shares into insane duplication (or worse) because, despite the fact it was coordinated, senior managers did not so much as inform their departments (but attended meetings and signed off on things), and by then it was too late to go back because some departments had converted and some had not. I've seen helpdesk staff go home "sick" because they could not take the volume of calls and abuse from angry staff afterwards.

    Yes I have trauma on this subject. I will walk out of a job before ever doing a file share reorg again.

    You'll roll it out in phases? LOL

    You'll run it in parallel? LOL

    You'll do some <SUPER SMART> thing? LOL.

afro88 3 days ago

The author is missing the bit that the LLM provides: automatically mapping input parameters to things the user wants to do, and responses to the way the UI displays them.

Take out the LLM and you're not that far away from existing protocols and standards. It's not plugging your app into any old MCP and it just works (like the USB-C example).

But, it is a good point that the hype is getting a lot of apps and services to offer APIs in a universal protocol. That helps.

quotemstr 3 days ago

It's articles like this that tell you we're close to peak hype. There's nothing revolutionary about a text encoding plus a schema. SOAP could do this 20 years ago.

  • rikafurude21 3 days ago

    This reminded me of that HN comment on the Dropbox announcement post where the user says there's nothing new about it since FTP and USB sticks exist. Also, anyone who has ever had the misfortune of using SOAP knows how horrendous it is. Truth is, sometimes the "new thing" does it better and wins out. Applications have standardized APIs now because of AI hype. This is a step in the right direction.

    • data-ottawa 3 days ago

      Being able to plug MCPs into my desktop Claude app has been awesome. I gave it a K/V store and access to my project folder and it uses the tools very well with minimal guidance.

      Today there's no way I can talk an average person into getting MCP working without them having to modify some config files hidden away somewhere.

      I would bet big money that as soon as Claude and ChatGPT add 1 click "app store" experiences everyone will be using them in a week.

      It is not easy to "just" use an API as a human; a lot of APIs force you to deal with tokens and with storing and executing code. In some cases it's easier for the LLM to simply fetch or curl basic APIs directly than to waste context tokens on the overhead of an MCP tool call (e.g. all these weather tool examples), but with MCP, consistency is much better, so depending on the use case both MCP and plain APIs have advantages.

      Since my comment is already pretty long: LLM+RSS+Fetch is a killer combination for me, and it's almost all off the shelf these days. Once I add an RSS merge tool I think it will be an excellent way to consume content.

  • AnotherGoodName 3 days ago

    SOAP was worse than horrendous though. I'm sure I'm not the only one hit by "well, Java SOAP and .NET SOAP encode differently, so they don't work together well" (let alone all the other implementations, each with their own similar differences).

    Or how about ‘oh it looks like your client is using SOAP 1.2 but the server is 1.1 and they are incompatible’. That was seriously a thing. Good luck talking to many different servers with different versions.

    SOAP wasn't just bad. It was essentially only usable between the same languages and versions. Which is an interesting failing for a layer whose entire purpose was interoperability.

vinkelhake 3 days ago

While reading this, the old ARexx (Amiga Rexx) popped into my head. It was a scripting language that in itself wasn't very noteworthy. However, it also made it easy for applications to expose functionality through an ARexx port. And again, offering up an API itself isn't noteworthy either. But it shipped by default in the system and if an application wanted to open itself up for scripting, ARexx was the natural choice. As a result, a ton of applications did have ARexx ports and there was a level of universality that was way ahead of its time.

Come to think of it - I don't know what the modern equivalent would be. AppleScript?

  • Hilift 3 days ago

    IBM was big on Rexx when OS/2 2.x was released in ~1993.

    "IBM also once engaged in a technology transfer with Commodore, licensing Amiga technology for OS/2 2.0 and above, in exchange for the REXX scripting language. This means that OS/2 may have some code that was not written by IBM, which can therefore prevent the OS from being re-announced as open-sourced in the future. On the other hand, IBM donated Object REXX for Windows and OS/2 to the Open Object REXX project maintained by the REXX Language Association on SourceForge."

    https://en.wikipedia.org/wiki/Rexx

    https://en.wikipedia.org/wiki/OS/2#Petitions_for_open_source

    https://www.oorexx.org/status/index.rsp

    • vaxman 3 days ago

      I'm not reading thru all that, but believe me when I say the ground truth is that IBM developed REXX on its mainframes, and a genius guy from that world (poor bstrd) recreated it on the Amiga as a third-party product called ARexx, which was in turn adopted and promoted by CBM Dev Relations. One of the things the fake frenching innovation team in charge of Apple for a few years did was go down in the basement, where they had an Amiga in various forms of dissection, transplant ARexx out of it, mix it with HyperCard's vocab (which they also didn't invent), and announce to the world their great invention: AppleScript. But I digress. Here in the (dystopian) future, some Apple operating systems are just now gaining the kind of power that ARexx had, because these types of systems require the cooperation of the developers, and they had little incentive to "donate" functions to the system just to gain integration with Siri/Shortcuts, but they can get fired/bankrupted for not doing it to integrate with Apple Intelligence (I know, I hate that identifier too). ARexx could not be ported to MacOS back in the day because it would have had to be championed by Apple Developer Relations (Guy Kawasaki & Co; did I just wreck the "Hackers" movie reference?) and, even/especially in the 80s, they wouldn't have approved a tech that "competed" with AppleScript. Microsoft didn't jump on this because it descends from DEC tech (under the direction of the Cutler), which had nothing like REXX.

  • billmcneale 3 days ago

    Microsoft introduced this in Windows in 1993. It's called COM and is still in (heavy) use today.

    It basically powers all interprocess communication in Windows.

    • vaxman 3 days ago

      Not really. COM/OLE is a different paradigm, their answer to an infamous piece of vaporware called Taligent/OpenDoc that bankrupted many developers. Microsoft was sort of stuck with that security nightmare, though.

      • billmcneale 3 days ago

        COM is exactly what OP was talking about.

        Apps can expose endpoints that can be listed, and external processes can call these endpoints.

        • int_19h a day ago

          "COM" by itself is a rather broad umbrella. What you're describing seems to be OLE Automation. That's the one that has type libraries (which you need for discoverability).

          And then Active Scripting was supposed to be how you'd script those endpoints...

  • layer8 3 days ago

    PowerShell with COM interfaces.

gavinray 3 days ago

> "The author discovers API's/JSON RPC"

I'm too young to be posting old_man_yells_at_cloud.jpg comments...

  • orliesaurus 2 days ago

    That's honestly the best tl;dr in this whole thread.

bravesoul2 3 days ago

This is well written and fun. Thanks OP!

Now I am excited by MCP and would be all in except security.

Security is a huge issue.

Forget AI and imagine a system where you call APIs and you get both data and JS. And that JS executes at global scope with full access to other APIs. And so do all the other MCP servers. Furthermore, the MCP server may go to arbitrary Web pages and download JS. And that JS, e.g. from a stranger's GitHub issue or Web search, gets executed with full API privileges.

    <cute animal interject> This isn't something MCP can fix. It is built into the dice rolling nature of LLMs. Turning predictions into privileged executions. And those dice can be loaded by any MCP server.
Or imagine surfing the Web using a 2001 browser with no protections against cross-domain scripting, then having a page where you choose which init scripts to run, and it cascades from there. You are logged into your bank at the time!

This is what worries me. It's not USB-C. Or rather, it's sort of USB-C, but where you are ordering all your peripherals from Amazon, AliExpress and Temu, and the house is made of tinder.

taytus 3 days ago

The author glosses over some practical realities. Just because something can be repurposed doesn't mean it should be. MCP was designed with specific assumptions about how AI models consume and process information. Using it as a general plugin system might work, but you'd likely hit limitations around things like authentication, real-time communication, or complex data flows that weren't priorities for the AI use case.

moron4hire 3 days ago

This isn't a snide comment, I am legitimately asking. I don't understand the difference between MCP and REST. I know there are differences because I've used it a little. I mean, like, on an existential level. Why isn't it just REST? What parts do MCP give us that REST doesn't?

  • gunalx 3 days ago

    I cannot really answer, but it seems you can just wrap MCP in a REST wrapper, as that is how Open WebUI seems to integrate MCP into its tooling.

    • moron4hire 2 days ago

      Anthropic's example of creating an MCP server is just wrapping MCP around a REST weather forecasting service.

      Maybe it's just that agentic LLMs have created a lot of interest in being interoperable, whereas efforts like OpenAPI just didn't have any carrot to warrant the stick other than "wouldn't it be nice".

  • kimjune01 2 days ago

    MCP forces standard documentation, whereas documentation is optional for REST.

thegreatwhale8 2 days ago

Offtopic: The article reads very much like it's ChatGPT-generated. But that's not surprising, given the subject matter. I just dislike how a computer tries to be entertaining and uses this default "voice" when writing anything. I hope there will be some way to personalize the output text, so it will be correct but not soulless.

  • ls-a 2 days ago

    I like how all the jokes/memes about what customers ask for vs. what developers produce, now apply to what developers ask for vs. what AI produces.

superluserdo 3 days ago

Seems like we're just currently in the top-right of this comic https://xkcd.com/2044/

  • sgt 2 days ago

    Makes me think of Kafka as well.

  • readthenotes1 3 days ago

    Someone should write an AI tool that evaluates every top article on Hacker News and provides the appropriate XKCD comic as a comment.

    • dexterdog 3 days ago

      And then a few steps later it's just bots talking to bots. Then what do we read when we're on the loo?

cpard 2 days ago

I wonder how this interoperability hype in the industry, induced by MCPs, will affect companies where the lack of interoperability is almost a moat.

I remember when I first interacted with Marketo and wondered why people even bother with this tool, only to learn that Marketo has the best integration with Salesforce, and thus it's almost a certainty that as you scale you'll end up using it.

Salesforce in particular, relies a lot on the vendor ecosystem built on a platform that is so painful to inter operate with.

I'm very curious to see what effect this will have on them.

belter 3 days ago

This is all starting to look like autonomous driving. We are nowhere near solving it, but everybody acts like it's here.

neonbrain 3 days ago

I believe Microsoft's usual "Embrace, Extend, Extinguish" strategy is at work here. For system stability and security reasons, you wouldn't actually want agents to dynamically discover tools without proper governance. Alternatives like PydanticAI are lost in this steady MCP noise maintained by Microsoft - their "Embrace" phase for MCP, declared during the Build 2025 event. Anthropic released this open standard with weak tooling and no governance for the specs, making it easy for Microsoft to dominate.

The next step would be Microsoft attempting to make their registry the de facto choice for developers and extending with Windows-specific verbs.

Then, by controlling what's considered "secure", they can marginalize competitors.

bigmattystyles 3 days ago

I thought MCPs just ‘figured out’ using docs how to call a program’s API. Won’t it matter that many APIs just suck?

rubatuga 3 days ago

Can someone link to this supposed toaster with DP alt mode that supposedly runs on 240W (max PD power)?

brap 3 days ago

I guess I’m finally old enough to become old-man-yelling-at-cloud.

I’m convinced that the only reason why MCP became a thing is because newcomers weren’t that familiar with OpenAPI and other existing standards, and because a protocol that is somehow tied to AI (even though it’s not, as this article shows) generates a lot of hype these days.

There’s absolutely nothing novel about MCP.

metalrain 3 days ago

MCP works for small, well-defined actions (like the examples in the article), but enterprise APIs can have hundreds or thousands of endpoints/schemas for different concepts and variations of operations.

For example, how would you MCP the Google Ads RPC API https://developers.google.com/google-ads/api/reference/rpc/v... so that both the LLM and the user will understand it? Seems like we can't escape complexity.

dgrabla 2 days ago

If MCP gets used this way, I see big trouble when people hardcode stuff and the provider then updates the endpoints. MCP does not have versions; the list-tools response is a living document, and you are supposed to fetch and read the current version. An AI would be totally fine with that, because it can reason about the change and adapt, but the hardcoded app is going to break badly.


furyofantares 3 days ago

Agents (presumably) increase the demand for APIs and if those APIs as well as already existing APIs get exposed as MCPs then I can see it.

It is dependent on agents actually creating new demand for APIs and MCP being successful as a way to expose them.

Workaccount2 3 days ago

I know this is nit-picky and not really relevant to the actual meat of the story, but a toaster (outside of a gag gift or gimmick) cannot run on USB-C since your typical toaster draws ~1kW and USB-C power spec tops out at 240W.

  • jcul 3 days ago

    I assumed they were controlling the toaster over usb c or getting some data from it, interfacing with it, rather than actually powering it!

    > But it worked, and now Rex's toast has HDMI output.

    > Toaster control protocols? Rex says absolutely.

  • hnlmorg 3 days ago

    A car lighter also cannot run a pizza oven for the same reason.

    But you’re right, it does kind of miss the point.

mudkipdev 3 days ago

Anyone else feel like this article was written with ChatGPT

  • neuronic 3 days ago

    Not in this particular case. At this point I am starting to wonder if the

    > Anyone else feel like this article was written with ChatGPT

    comments are actually written by ChatGPT.

  • orliesaurus 2 days ago

    I do, especially the weird thing at the end that says something about MCP bread

    Here

    > P.S. If you build an MCP server that makes your computer emit the smell of fresh bread, we need to talk.

tankenmate 3 days ago

"Want your AI agents to respond like peons from Warcraft 3 when you assign them a task?" -- I'd rather be sailing.

  • falcor84 3 days ago

    That's from Warcraft 2, right? I think in Warcraft 3 it's "I'd rather be flying"

zzo38computer 3 days ago

Although it has the benefit that it is possible to use it like that (e.g. in case you do not have a better system), I think it isn't the best way to do it. USB, HTTP, MCP, etc. have many problems, despite whatever benefits they may have (including unintentional ones).

vaxman 3 days ago

> The protocol doesn't judge your life choices. This brings me to something I discovered about MCP (Model Context Protocol) while trying to make my calendar app order takeout. Stay with me here.

What was that character in “South Park” that has a hand puppet? (White noise, flatline sound)

spauldo 3 days ago

There's a reason no one uses cigarette lighters in cars anymore. If you actually try to use a cigarette lighter in a modern car, you'll likely melt your dashboard. They don't make them for that purpose anymore.

iambateman 3 days ago

Where do I get started with MCP? I’m all in, but kinda…confused?

A REST API makes sense to me…but this is apparently significantly different and more useful. What’s the best way to think about MCP compared to a traditional API? Where do I get started building one? Are there good examples to look at?

  • randomcatuser 3 days ago

    Yeah, one way to think about it is like... protocols restrict things, so that people can expect the same stuff.

    With a traditional API, people can build it any way they want, which means you (the client) need API docs.

    With MCP, you literally restrict it to 2 things: get the list of tools, and call a tool (using the schema you got above). The key insight is just: let's add 1 more endpoint that lists the APIs you have, so that robots can find them.

    Example time:

    - Build an MCP server (the equivalent of "intro to Flask 101"): https://developers.cloudflare.com/agents/guides/remote-mcp-s...

    - Now you can add it to Claude Desktop/Cursor and see what it does

    - That's as far as I got lol
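
    And the client side of that two-step dance, using the official Python SDK (a sketch; assumes a local stdio server script named server.py exposing a get_forecast tool):

        import asyncio
        from mcp import ClientSession, StdioServerParameters
        from mcp.client.stdio import stdio_client

        async def main():
            # spawn the server as a subprocess and talk JSON-RPC over stdio
            params = StdioServerParameters(command="python", args=["server.py"])
            async with stdio_client(params) as (read, write):
                async with ClientSession(read, write) as session:
                    await session.initialize()           # handshake
                    tools = await session.list_tools()   # step 1: discover
                    print([tool.name for tool in tools.tools])
                    result = await session.call_tool(    # step 2: invoke
                        "get_forecast", {"city": "Paris"})
                    print(result.content)

        asyncio.run(main())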

  • stefan_ 3 days ago

    I think I'm living in a parallel universe. You can tell an LLM in a million ways what "tools" it can "call". Anthropic & co standardized a shitty variant so they have a uniform way of letting others play in their sandbox, until they invariably decide which of these things make sense and then usurp them, in a desperate way out of the commodity rat race to the bottom.

  • Rury 3 days ago

    Do you know WSDL? If you do, it's kind of the same concept as consuming WSDL, just for AI applications...

  • airstrike 3 days ago

    It's kinda like a REST API in which the schema tags along with the requests.

    The use case in AI is sort of reversed such that the code runs on your computer

nimish 3 days ago

Interoperability is, and always was, the hardest part of programming systems together. It's telling that the AI tooling needed sustained non-AI effort to expose the interfaces via MCP (or WS-*, or REST, or an enterprise service bus, or XML, or CORBA, or EJB, or...).

chopete3 3 days ago

The real accident is that the prompts became a programming language. I don't think the ML engineers set out to create a general-purpose programming language.

The A2A (agent-to-agent) mechanism is another accidental discovery, for interoperability across agent boundaries.

  • namtab00 3 days ago

    "The hottest new programming language is English" - Karpathy

    I call bullshit, mainly because any natural language is ambiguous at best, and incomplete at worst.

tamersalama 3 days ago

I love the article and the protocol. However, MCP reminds me (somewhat) of microservices and SOA. Are we creating a failure-vector nightmare? Or, because of agents, can we gracefully increase reliability?

roenxi 3 days ago

... MCP is almost literally just a JSON schema and a "yo, this stuff exists" for AI. It is great to have it standardised, and we're all very thankful not to be using XML, but there just isn't that much there.

MCP is fulfilling the promise of AI agents being able to do their own thing. None of this is unintended, unforeseen, or particularly dependent on the existence of MCP. It is exciting; the fact that AI has this capability marks the dawn of a new era. But the important thing in the picture isn't MCP, it is the power of the models themselves.

  • layer8 3 days ago

    XML actually works better with LLMs than JSON.

    • zahlman 3 days ago

      Why?

      • layer8 3 days ago

        Presumably because XML tags give better context. You have closing tags, and each array element has its own tags. The tag syntax is different from the value syntax, whereas in JSON both labels and string values use the same syntax. JSON strings are delimited by the same character ("), whereas XML uses two different characters (>…<). Non-string values in JSON have more variations in their delimitation than values in XML.
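
        Concretely, compare the same record in both encodings (a toy example):

            {"name": "ada", "tags": ["math", "code"]}

            <person>
              <name>ada</name>
              <tags>
                <tag>math</tag>
                <tag>code</tag>
              </tags>
            </person>

        Every value in the XML version is wrapped in tags that repeat its label, which is redundant for a parser but gives an LLM more context to latch onto.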

bovermyer 3 days ago

> emotional support portable fan

I can't be the only person that non-ironically has this.

  • orliesaurus 2 days ago

    Me too, I turn it on when things get annoying

MontagFTB 3 days ago

Bret Victor had an old video where he talked about a world in which computers very organically figured out how to interoperate. MCP feels like the first realization of that idea.

mehulashah 2 days ago

Yup. Makes sense. The biggest value of MCP is that everyone is using MCP. This too is not a new insight.

TZubiri 3 days ago

>What if it's just "a standardized way to connect ~~AI models~~ literally anything to different data sources and tools"?

Then you aren't exploring a novel concept, and you are better served learning about the historical ways this challenge has been attempted rather than thinking it's the first time.

Unix pipes? APIs? POSIX? JSON? The list is endless; this is one of those requirements you can recognize as fundamental to computing. Another example is anything about storing and remembering information. If something is that foundational, there have been tools and protocols for it since the 70s.

For the love of god, before diving into the trendy new thing, learn about the boring old things.

phreeza 3 days ago

Is this basically the XML/RSS/semantic web of this tech wave?

fariszr 3 days ago

APM looks interesting, would love to try it out.

fennecbutt 2 days ago

Omg I wish people would just shut up about mcp already.

It's not magic, it's just smashing tool descriptions into your prompt, and the implementation sucks for various reasons, including there being no standard for tool-use tags and the MCP spec not including a common way of handling function-call responses.

It's literally just "JSON into the prompt & then it's all you, buddy!" Batteries NOT included, lmao
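
Roughly what that "smashing into the prompt" ends up looking like (purely illustrative; the wrapper text and the tool-call tags differ per vendor, which is exactly the problem):

    You have access to the following tools:

    get_weather: Get the current weather for a city.
    Arguments (JSON Schema): {"type": "object", "properties": {"city": {"type": "string"}}}

    To use a tool, reply with <tool_call>{"name": "...", "arguments": {...}}</tool_call>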

shalev123 3 days ago

Oh boy, if only our personal AI assistants could be as reliable as a good old-fashioned pizza run by the CEO. The irony is delicious - we're moving towards universal plugins not because of some grand vision, but because everyone's desperate to make sure their AI doesn't go on an energy-saving nap mid-task.

It's almost poetic how we're now incentivizing interoperability simply because our digital buddies have to eat (or rather, drink) to stay awake. Who would've thought that the quest for connectivity would be driven by the humble Watt hour?

I guess when it comes down to it, even AI needs a good power-up - and hopefully, this time around, the plugins will stick. But hey, I'll believe it when my assistant doesn't crash while trying to order takeout.

mseepgood 2 days ago

Is it some kind of CORBA?

spiritplumber 3 days ago

Yes, I'm old. Old enough to remember the MCP when he was just a chess program! He started small, and he'll end small!

  • rlboston 3 days ago

    Sounds like programming! What happened to low-code/no-code? I AM old, retired in fact. IT has more "middleware" than the Library of Congress, and mainframes still exist. But I will dig around, because I'm still curious. Carry on. LOL

    • vaxman 3 days ago

      Erm, hey Mac, he's quoting "TRON" - and getting downvoted by peeps who don't remember it or are from alien cultures that don't incorporate it.

simulacra8 3 days ago

What did Anthropic do right with MCP that Google didn't with A2A?

croes 3 days ago

Universal but insecure

  • gabriel_export 3 days ago

    Easy, just add an S to the end (for secure).

OJFord 3 days ago

HTTP: A (Deliberately) Universal Plugin System

ummadi 3 days ago

So does that mean MCP is good to integrate with agentic AI?

DonHopkins 2 days ago

>P.S. If you build an MCP server that makes your computer emit the smell of fresh bread, we need to talk.

https://news.ycombinator.com/item?id=29225777

DonHopkins on Nov 15, 2021, on: Xerox scanners/photocopiers randomly alter numbers...

The iSmell developers were hoping to make money the same way, by selling big smell combination pack cartridges that you have to entirely replace after any one of the smells ran out.

https://en.wikipedia.org/wiki/ISmell

>The iSmell Personal Scent Synthesizer developed by DigiScents Inc. is a small device that can be connected to a computer through a Universal serial bus (USB) port and powered using any ordinary electrical outlet. The appearance of the device is similar to that of a shark’s fin, with many holes lining the “fin” to release the various scents. Using a cartridge similar to a printer’s, it can synthesize and even create new smells from certain combinations of other scents. These newly created odors can be used to closely replicate common natural and manmade odors. The cartridges used also need to be swapped every so often once the scents inside are used up. Once partnered with websites and interactive media, the scents can be activated either automatically once a website is opened or manually. However, the product is no longer on the market and never generated substantial sales. Digiscent had plans for the iSmell to have several versions but did not progress past the prototype stage. The company did not last long and filed for bankruptcy a short time after.

This Wired Magazine article is a classic Marc Canter interview. I'm surprised they could smell the output of the iSmell USB device over the pungent bouquet from all the joints he was smoking:

You've Got Smell!

https://web.archive.org/web/20160303130915/https://www.wired...

>DigiScent is here. If this technology takes off, it's gonna launch the next Web revolution. Joel Lloyd Bellenson places a little ceramic bowl in front of me and lifts its lid. "Before we begin," he says, "you need to clear your nasal palate." I peer into the bowl. "Coffee beans," explains Bellenson's partner, Dexster Smith. […]

>"You know, I don't think the transition from wood smoke to bananas worked very well." -Marc Canter

The failed quest to bring smells to the internet (thehustle.co)

https://thehustle.co/digiscents-ismell-fail

https://news.ycombinator.com/item?id=17476460

DigiScent had a booth at the 1999 Game Developers Conference, with scantily dressed young women in skunk costumes.

I told them about a game called "The Sims" I had been working on for a long time, and was hoping to finish and release some time soon.

They unsuccessfully tried to convince me to make The Sims support the iSmell, and even gave me a copy of the SDK documentation, because they thought it would enrich the player's experience of all those sweaty unwashed sims, blue puddles of piss on the floor, stopped up toilets in the bathroom, and plates of rotting food with flies buzzing around on the dining room table.

orliesaurus 2 days ago

MCP over stdio

MCP over SSE

MCP over streamable HTTP

Authorization? Hahaha nah just put a token in your config file

Actually never mind.

Let's add OAuth...

Oh wait, why are 90% of the servers still using stdio

Oh no

Let's deprecate SSE now
