Just trying to explain it to you made me think of a very good reason why an MCP is preferable to just telling it to fetch a page. When you tell ChatGPT or Sonnet, or even Cursor/Windsurf/whatever, to fetch a website, do you know exactly what it is fetching? Does it load the raw HTML into the context? Does it parse the page and return just the text? What about the navigation elements, footer, and other “noise”? Or does it make the LLM itself waste precious context window trying to figure the page out? Is it loading the entire page into context or truncating it? If it is truncated, how is the truncation being done?

With an MCP there is no question about what gets fed to the model. It’s exactly what you programmed it to receive.

I’d argue that right there is one of the key reasons you’d want to use MCP over prompting it to fetch a page.
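To make that concrete, here’s a minimal sketch of what a fetch tool behind an MCP server might do. All the names here (`fetch_page_tool`, `MAX_CHARS`, the skip list) are hypothetical, and it uses only the stdlib HTML parser; the point is that what gets stripped and how truncation happens are decided by your code, not by whatever black box the chat client uses.

```python
from html.parser import HTMLParser

SKIP_TAGS = {"script", "style", "nav", "footer", "header"}  # the "noise"
MAX_CHARS = 4000  # explicit, documented truncation limit

class TextExtractor(HTMLParser):
    """Collect visible text, skipping navigation/footer/script noise."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in SKIP_TAGS:
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in SKIP_TAGS and self.skip_depth > 0:
            self.skip_depth -= 1

    def handle_data(self, data):
        if self.skip_depth == 0 and data.strip():
            self.chunks.append(data.strip())

def fetch_page_tool(html: str) -> str:
    """What the model receives: clean text, truncated at a known boundary."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks)
    if len(text) > MAX_CHARS:
        text = text[:MAX_CHARS] + "\n[truncated]"
    return text

html = "<html><nav>Home | About</nav><p>The actual article.</p><footer>(c) 2025</footer></html>"
print(fetch_page_tool(html))  # -> The actual article.
```

Every answer to the questions above (raw HTML or text? what counts as noise? where does truncation happen?) is right there in the handler, versioned with the rest of your code.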

There are many other reasons too, like exposing your database via MCP rather than having it run random `psql` commands and then parsing whatever the command returns. Another is letting it paw through Splunk logs using an MCP, which gives the LLM a structured way to both write queries and handle the results. Note that even calling out to your shell is done via an MCP.
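A rough sketch of the database case, using sqlite3 as a stand-in for Postgres (the tool name, the SELECT-only policy, and the row cap are all illustrative choices, not part of MCP itself): the tool returns structured rows instead of making the model parse `psql`'s text output, and it, not the model, enforces the safety policy.

```python
import sqlite3

def query_db_tool(sql: str, max_rows: int = 50) -> dict:
    """Hypothetical MCP tool handler: run a read-only query, return structured rows."""
    # In-memory demo database standing in for your real one.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ada"), (2, "lin")])

    # The tool decides what's allowed -- the model can't run arbitrary commands.
    if not sql.lstrip().lower().startswith("select"):
        return {"error": "only SELECT statements are allowed"}

    cur = conn.execute(sql)
    cols = [d[0] for d in cur.description]
    rows = cur.fetchmany(max_rows)  # explicit, documented row cap
    return {"columns": cols, "rows": rows, "truncated": len(rows) == max_rows}

print(query_db_tool("SELECT name FROM users"))
# -> {'columns': ['name'], 'rows': [('ada',), ('lin',)], 'truncated': False}
```

The model gets column names and typed rows back, and an explicit `truncated` flag instead of silently clipped output.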

It’s also a stateful protocol, though I haven’t really explored that aspect.

It’s one of those things where once you play with it you’ll go “oh yeah, I see how this fits into the puzzle.” And once you see it, it’s pretty cool.

I don’t mind schemas and repositories, but I feel it’s a bit backwards. That’s the kind of work I would hope we can avoid with AI.

MCP is written for the AI we’ve got, not the AI the people doing all the hyping want us to believe exists.

With a long enough context window the difference wouldn’t matter. But “long enough” here means a length you view as big enough that size no longer matters. Kind of like modern hard drives that are “big enough that I don’t care about a 1 GB file” (I was thinking megabyte files, but that might be off by an order of magnitude).