ondrsh · 4 months ago
Exactly. An AI-web based on the principles of HATEOAS is the next step, where instead of links, we would have function calls.

As you said, HATEOAS requires a generic client that can understand anything at runtime: a client with general intelligence. Until recently, humans were the only ones fulfilling that requirement. And because we suck at reading JSON, HATEOAS had to use HTML. Now that we have strong AI, we can drop the Hypermedia (the 'H' in HATEOAS) and use JSON instead.
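To make that concrete, here's a rough sketch of what such a response could look like. Everything below is invented for illustration (the field names, function names, and shapes are not any real spec): the server returns its state as JSON and, instead of hyperlinks, advertises the functions the client may call next.

    // Hypothetical "AI-web" response: HATEOAS affordances exposed as
    // function calls instead of <a href> links. All names illustrative.
    const response = {
      state: { orderId: "1234", status: "unpaid" },
      // what the client is allowed to do next, discovered at runtime
      functions: [
        { name: "pay_order", args: { orderId: "string" } },
        { name: "cancel_order", args: { orderId: "string" } },
      ],
    };

A generally intelligent client reads the state, picks a function, and calls it. That's the same loop a human runs when scanning a page and clicking a link.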

I wrote about that exact thing in Part 2: https://www.ondr.sh/blog/ai-web

thierrydamiba · 4 months ago
Both blog posts were excellent. Thanks for the breakdown.

I’m bullish on MCP. What are some non-obvious things I should consider that might dampen my fire?

ondrsh → thierrydamiba · 4 months ago
TL;DR: IMHO, the MCP enforces too much structure, which makes it vulnerable to disruption by less structured protocols that can evolve according to user needs.

The key reason the web won out over Gopher and similar protocols was that the early web was stupidly simple. It had virtually no structure. In fact, the web might have been the greatest MVP of all time: it handed server developers a blank canvas with as few rules as possible, leading to huge variance in outputs. Early websites differed far more from each other than, for example, Gopher sites, which had strict rules on how they had to work and look.

Yet in a server-client "ping-pong" system, higher variance almost always wins. Why? Because clients consume more of what they like and less of what they don't. This creates an evolutionary selection process: bad ideas die off, good ideas propagate. Developers appear to build what people want, but not by deliberate choice; the selection process just makes it look that way.

The key insight is that the effectiveness of this process stems from a lack of structure. A lack of structure leads to high variance, which lets the protocol escape local optima and evolve according to user needs.
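A toy simulation shows the effect. This is a deliberately crude hill-climbing sketch, not a model of real protocol evolution; the fitness landscape, numbers, and selection rule are all made up:

    // Two fitness peaks: a small local optimum at x = 0 and a better
    // one at x = 5. "Clients" keep whichever design scores higher.
    function fitness(x: number): number {
      return Math.exp(-x * x) + 2 * Math.exp(-(x - 5) * (x - 5));
    }
    function evolve(variance: number, generations = 200): number {
      let best = 0; // start at the small local optimum
      for (let g = 0; g < generations; g++) {
        const mutant = best + (Math.random() * 2 - 1) * variance;
        if (fitness(mutant) > fitness(best)) best = mutant;
      }
      return best;
    }
    console.log(evolve(0.5)); // low variance: stuck near 0
    console.log(evolve(6));   // high variance: usually ends near 5

Low-variance mutations can never cross the valley between the two peaks; high-variance mutations occasionally land on the far slope and get selected.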

The bear case for MCP is that it's going the exact opposite route. It comes with tons of features, each adding layers of abstraction and structure. That might work in well-understood fields, but it's much harder to pull off in novel domains where user preferences aren't clear yet; knowing what users want is hard. MCP's rigid structure inherently limits variance in server styles (a trend already observable, IMHO), making it vulnerable to competition from newer, less structured protocols, similar to how the web steamrolled Gopher even though Gopher initially seemed too far ahead to catch. And since almost all MCP servers are self-contained (they don't link to other MCP servers), the current lead matters less than it looks: without cross-server links, the lock-in effect is weak.
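For contrast, here's what a "web-like" MCP could look like. To be clear, MCP defines no such cross-server link type today; the links field below is pure invention:

    // Hypothetical tool result that links to another MCP server the way
    // a web page links to another site. The "links" field and the
    // mcp:// scheme do not exist in MCP; they're invented to illustrate.
    const toolResult = {
      content: [{ type: "text", text: "Flight booked. Want a hotel nearby?" }],
      links: [
        { rel: "related", server: "mcp://hotels.example.com", tool: "search_hotels" },
      ],
    };

With links like that, one server's users become another server's users, which is exactly where the web's lock-in and network effects came from.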

yourapostasy → ondrsh · 4 months ago
Under this thesis, SLOP should win, except I don't yet see how it can be composed by the user, which is where MCP is supposed to have moved the composability?

https://i-love-slop.com/

ondrsh → yourapostasy · 4 months ago
Seems nice because it's stateless and thus simpler. But it still enforces lots of structure (static entry points, memory, etc.). So if MCP reminds me of FTP/Telnet (bi-directional, stateful), SLOP reminds me of Gopher.
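Roughly, the difference looks like this. The endpoint and payload names are my assumptions from skimming the SLOP page, not verified against either spec:

    // SLOP-style: plain stateless HTTP, every request self-contained
    // (endpoint and body shape assumed, not confirmed)
    const res = await fetch("http://localhost:3000/chat", {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ messages: [{ role: "user", content: "hi" }] }),
    });

    // MCP-style: a stateful session, sketched as steps
    // 1. open a connection (stdio or HTTP+SSE transport)
    // 2. initialize: negotiate protocol version and capabilities
    // 3. then tools/list, tools/call, etc., all tied to that session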

In any case, protocols need killer applications to take off; for the web, that killer app was Mosaic. Right now I don't see any application supporting SLOP. If one comes along that outperforms the MCP-based LLM applications, SLOP will have a chance.

My personal belief is that the winning protocol will be web-like. Right now there is no such protocol. Maybe I'm wrong, let's see.

thierrydamiba → ondrsh · 4 months ago
Thanks again for the thorough response.