> Simply change the domain from github.com or github.io to gitmcp.io and get instant AI context for any GitHub repository.

What does this mean? How does it work? How can I understand how it works? The requirements, limitations, constraints? The landing page tells me nothing! Worse, it doesn't have any links or suggestions as to how I could possibly learn how it works.

> Congratulations! The chosen GitHub project is now fully accessible to your AI.

What does this mean??

> GitMCP serves as a bridge between your GitHub repository's documentation and AI assistants by implementing the Model Context Protocol (MCP). When an AI assistant requires information from your repository, it sends a request to GitMCP. GitMCP retrieves the relevant content and provides semantic search capabilities, ensuring efficient and accurate information delivery.

MCP is a protocol that defines a number of concrete resource types (tools, prompts, etc.) -- each of which has very specific behaviors, semantics, etc. -- and none of which this project's documentation identifies as what it actually implements!
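For concreteness: MCP messages are JSON-RPC 2.0, and "tools" are one of those concrete resource types -- a client invokes one via a `tools/call` request. Here's a rough sketch of what that looks like on the wire; the tool name `fetch_documentation` and its arguments are hypothetical examples, not anything this project documents.

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP tool-invocation request (JSON-RPC 2.0, method "tools/call").
    The tool name and arguments below are illustrative, not from GitMCP's docs."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# A client asking a hypothetical docs-lookup tool a question:
msg = make_tool_call(1, "fetch_documentation", {"query": "how does auth work?"})
print(msg)
```

So the unanswered question stands: which tools (or prompts, or resources) does GitMCP actually expose, and what do their schemas look like?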

Specifically, which aspects of MCP are you proxying here? Specifically, how do you parse a repo's data and transform it into whatever MCP resources you're supporting? I looked for this information and found it nowhere.

As someone who is obviously not the target audience, I feel like literally anything on this page that could point me to an explanation of what MCP is would be nice, while we're talking about what the landing page doesn't tell you. Even just one of the MCP mentions being a link to modelcontextprotocol.io would be fine.

Or maybe I'm so out of the loop it's as obvious as "git" is, I dunno.

It’s fair to be curious, but at some point it’s also reasonable to expect people are capable of using Google to look up unfamiliar terms. I'm not gatekeeping—just, like, put in a bit of effort?

Threads like this work better when they can go deeper without rehashing the basics every time.

Having a link to the MCP website wouldn't be "rehashing" -- it's how the web was once supposed to work.

I took a brief look at the MCP documentation today and left confused. At a high level, the protocol looks like a massive Swiss Army knife that could potentially do everything, and the use case in TFA looks like it implements one very specific tool within that large Swiss Army knife. Both need better explanation.


I appreciate that! Now maybe they could update the readme accordingly! ;)

Is this the new LMGTFY?

Not really. I had to do the following:

- Identify the files that should be put into context, since tokens cost money and I wanted to use a capable model like Sonnet, which is expensive.

- There were 35 messages (minus 2, based on how my system works), so I wrote and read quite a bit. I was actually curious how it worked, since I have domain knowledge in this area.

- Once I knew I had enough context in the messages, I switched to Gemini, since it was MUCH cheaper and could use the output from Sonnet to guide it. I was also confident the output was accurate, since I know what's required to put a Git repo into context, and it isn't easy if cost, time, and accuracy are important.

Once I went through all of that I figured posting the parent questions would be a good way to summarize the tool, since it was very specific.

So I guess if that is the new LMGTFY, then what I did was surely more expensive and time-consuming.