Yes, but then you have to add that yourself to every prompt. It would be nicer to tell your LLM provider just once, "here is a tool you can use," along with a description of its API, so that you could use it across a bunch of different chats without reminding it every single time. That way, when you want the tool, you can just ask for it without providing that detail again and again.

Also, it would be kind of cool if you could tell a desktop LLM client how to connect to a program running on your machine. It is a similar kind of thing to want, but the details of spawning a process differ depending on which OS you are running on, when all you really want is for it to end up running a Python script or something like that.

MCP addresses those two problems.
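
To make that concrete, here is a minimal sketch of a local tool server using the FastMCP helper from the official MCP Python SDK. The server name, the get_forecast tool, and its body are made up for illustration; the point is that you declare the tool and its description once, and a desktop client can launch the script as a local process and talk to it over stdio.

```python
# Minimal sketch assuming the official MCP Python SDK is installed
# (pip install "mcp[cli]"). Tool name and behavior are hypothetical.
from mcp.server.fastmcp import FastMCP

# Declare the server once; connected clients discover its tools from here.
mcp = FastMCP("demo-weather")

@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a short (fake) forecast for the given city."""
    # A real tool would call an actual weather API here.
    return f"It is probably sunny in {city}."

if __name__ == "__main__":
    # stdio transport: the desktop client spawns this script as a subprocess
    # and speaks to it over stdin/stdout, so the OS-specific exec details
    # live in the client configuration rather than in your prompts.
    mcp.run(transport="stdio")
```

The client side then just needs to be told what command to run to start this script; it never needs the tool's API documentation pasted into a prompt again.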