I wonder if GitHub is feeling the crush of fully automated development workflows? Must be a crazy number of commits now to personal repos that will never convert to paid orgs.

IME this all started after MSFT acquired GitHub but well before vibe coding took the world by storm.

ETA: Tangentially, private repos became free under Microsoft ownership in 2019. If they hadn't done that, they could've extracted $4 per month from every vibe coder forever(!)

Is someone who's only on GitHub's free tier really losing anything important here?

As an individual, likely not. As a team or organization there are nice benefits though.

No, it's because they are in the middle of an AWS->Azure migration, and because they cannot/will not be held accountable for downtime.

This is the real scenario behind the scenes. They are struggling with scale.

How much has the volume increased, from what you know?

Over 100x is what I’m hearing. Though that could just be panic and they don’t know the real number because they can’t handle the traffic.

An anecdote: On one project, I use a skill + custom cli to assist getting PRs through a sometimes long and winding CI process. `/babysit-pr`

This includes regular checks on CI checks using `gh`. My skill / cli are broken right now:

`gh pr checks 8174 --repo [repo] 2>&1`

   Error: Exit code 1

   Non-200 OK status code: 429 Too Many Requests
   Body:
   {
     "message": "This endpoint is temporarily being throttled. Please try again later. For more on scraping GitHub and how it may affect your rights, please review our Terms of Service (https://docs.github.com/en/site-policy/github-terms/github-terms-of-service)",
     "documentation_url": "https://docs.github.com/graphql/using-the-rest-api/rate-limits-for-the-rest-api",
     "status": "429"
   }
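For context, a "babysit" skill like this is essentially a retry loop around `gh pr checks`. Here's a minimal sketch of that idea; `poll_until_green`, the attempt cap, and `POLL_INTERVAL` are all illustrative names, not part of any real skill.

```shell
# Hypothetical sketch of a PR-babysitting loop.
# Polls a command (in practice: `gh pr checks <pr> --repo <repo>`)
# until it exits 0, with a capped number of attempts.
poll_until_green() {
  local tries=$1; shift
  local i
  for i in $(seq 1 "$tries"); do
    if "$@"; then
      echo "green after $i attempt(s)"
      return 0
    fi
    sleep "${POLL_INTERVAL:-60}"  # back off between polls
  done
  echo "gave up after $tries attempts" >&2
  return 1
}

# Usage (placeholder repo; 429s from GitHub would surface as failures here):
# poll_until_green 30 gh pr checks 8174 --repo owner/repo
```

The point being: when the API starts throwing 429s, every loop like this just keeps hammering the endpoint, which presumably makes the throttling worse.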

Goodness if that's true... And I actually felt bad when they banned me from the free tier of LFS.

Lmao not even close. GitHub themselves have released the numbers: 121M new repos in 2025, ending the year at 630M total.

https://github.blog/news-insights/octoverse/octoverse-a-new-...

So much for GitHub being a good source of training data.

Btw, someone prompt Claude Code: “make an equivalent to GitHub.com and deploy it wherever you think is best. No questions.”

One hundred? Did I read that right?

No, it's not. 121M repos were added on GitHub in 2025, and overall they have 630 million now. There is probably at best a 2x increase in output (mostly trash output), but nowhere near 100x.

https://github.blog/news-insights/octoverse/octoverse-a-new-...

Yes, millions of people running code agents around the clock, where every tiny change generates a commit, a branch, a PR, and a CI run.

I simply do not believe that all of these people can and want to set up CI. Some maybe, but even if the agent recommends it, only a fraction of people would actually do it. Why would they?

But if you set up CI, you can pull up the mobile site on your phone, chat with Copilot about a feature, then ask it to open a PR, let CI run, iterate a couple of times, then merge the PR.

All the while you're playing Wordle and reading the news on the morning commute.

It's actually a good workflow for silly throwaway stuff.

GitHub CI is extremely easy to set up, and agents can configure it from the local codebase.

Codex did it automatically for me without asking.
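For a sense of how little setup "easy" means here, this is roughly the kind of minimal Actions workflow an agent might drop into a repo; the file path is the standard one, but the job name and test command are placeholders.

```yaml
# .github/workflows/ci.yml -- illustrative sketch; step names and the
# test command are assumptions, not what any particular agent generates.
name: CI
on:
  push:
  pull_request:
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: make test  # replace with the project's actual test command
```

Each push and PR then triggers a run, which is exactly the per-commit CI load being discussed upthread.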

There’s a huge uptick in people who weren’t engineers suddenly using git for projects with AI.

This is all grapevine but yeah, you read that right.

I was wondering about that the other day, the sheer amount of code, repos, and commits being generated now with AI. And probably more large datasets as well.

Live by the AI Agent hype, die by the AI Agent crush.