It seems like the rent-seeking class is just moving to selling you access to LLMs in data centers by the token. In the past, the "rent-seeking class" being described here was at least partly the middle class. Now a few billionaires are going to capture all of the value, but the rent-seeking isn't going away.

Since the article is largely about open weights models, I think the argument is that this is the "last gasp" and soon doing inference at home will be common.

The trend over the last few decades has been toward more centralization, and I don't see that changing. Unless we radically change our economic system, the rent seekers will always win. There will probably be fewer of them, but they will be even bigger.

The small models that I can run at home are becoming more capable, and I have replaced some API-based tasks with local inference as they improve, but large open weights models are still a lot stronger. The nice thing about larger open weights models is that competing providers serve them at modest margins and prices. I don't have the hardware to run the largest Qwen models, but I can get API access at low cost. Since there are only modest barriers to entry for new commercial inference providers for these models, I'm not worried that API access to them will become drastically more expensive at some future time.

And since there are only modest barriers to new commercial inference providers for these models...

Congress: "Hold my beer and watch this"

Running on what devices (and additionally, purchased with what money)?