Not sure if this is satire.

Edit: What we have built is a natural-language interface to existing, textually recorded information. Transformers cannot learn the whole universe because the universe has not yet been recorded into text.

Transformers operate on images and a variety of sensor data; they can also run entirely on non-textual inputs and outputs. I don't know what the ceiling on their capabilities is, but the complaint that they only operate on text seems just obviously wrong. There are numerous examples; one is meteorological forecasting, which takes a variety of time-series sensor data as input and outputs, e.g., time-series temperature maps. https://www.nature.com/articles/s41598-025-07897-4
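
Roughly what that looks like in code, as a minimal sketch only (PyTorch; every class name and dimension below is made up, and this is not the linked paper's architecture):

    import torch
    import torch.nn as nn

    class SensorForecaster(nn.Module):
        # Toy transformer mapping a window of multi-channel sensor
        # readings to a temperature forecast -- no text anywhere.
        def __init__(self, n_channels=8, d_model=64, horizon=24):
            super().__init__()
            self.embed = nn.Linear(n_channels, d_model)  # one sensor frame -> one token
            layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)
            self.head = nn.Linear(d_model, horizon)      # token -> forecast values

        def forward(self, x):              # x: (batch, time, n_channels)
            h = self.encoder(self.embed(x))
            return self.head(h[:, -1])     # forecast from the last timestep

    model = SensorForecaster()
    readings = torch.randn(2, 96, 8)       # 2 samples, 96 timesteps, 8 sensors
    print(model(readings).shape)           # torch.Size([2, 24])

The point is just that nothing in the pipeline is a word token; a "token" here is a projected sensor reading.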

Based on a glance at their other comments: not satire.

AFAIK the data does not need to be text.

Well, diffusion models are trained unsupervised on raw pictures. I don't know how multi-modal LLMs are trained on images, but yes, obviously models are consuming media other than just text. I don't think, though I'd be happy to be corrected, that models glean much of their "knowledge" from non-textual training data.
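
For the diffusion case, the unsupervised objective really is just raw pixels in, noise prediction out. A toy DDPM-style training step, assuming `model` is any network that predicts noise given a noisy image and a timestep (the schedule constants are the usual illustrative choices):

    import torch
    import torch.nn.functional as F

    def diffusion_loss(model, images, n_steps=1000):
        # DDPM-style objective: corrupt raw images with Gaussian noise,
        # train the network to predict that noise. No labels, no text.
        betas = torch.linspace(1e-4, 0.02, n_steps)
        alpha_bar = torch.cumprod(1.0 - betas, dim=0)

        t = torch.randint(0, n_steps, (images.shape[0],))   # random timestep per image
        noise = torch.randn_like(images)
        a = alpha_bar[t].view(-1, 1, 1, 1)
        noisy = a.sqrt() * images + (1 - a).sqrt() * noise  # forward diffusion
        return F.mse_loss(model(noisy, t), noise)           # predict the added noise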

You couldn't be more wrong.

Please tell me more. When I ask an LLM a question, and get a text response, can that response incorporate non-textual information from visual training data?

More than likely it can’t.

Poe's (c)law?

Poe’s (C)law: The more absurd AI-generated content becomes, the more likely people are to believe it is real.

100% agreed. Sadly, there are lots of people out there with the "trust me bro, we just need more compute" attitude. Hopefully we don't consume all the planet's resources trying.

I reevaluated my priors long ago when I saw that scaling laws show no sign of stopping, no sign of plateau.

Strangely some people on HN seem to desperately cling to the notion that it's all going to come to a halt. This is unscientific. What evidence do you have - any evidence - that the scaling laws are due to come to an end?

All the curves have been levelling off as expected. Not really sure what you're talking about.

They have not; every successful pretraining run of late has shown performance gains greater than what the scaling laws predict.

Those gains are architecture-based, data-quality-based, etc. Scaling laws only relate to data volume and compute, holding other factors constant.
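
For reference, the laws in question have roughly the Chinchilla form; the constants below are the approximate Hoffmann et al. (2022) fits, quoted from memory:

    def chinchilla_loss(N, D):
        # Predicted pretraining loss for N parameters and D training tokens,
        # holding architecture and data quality fixed.
        E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28
        return E + A / N**alpha + B / D**beta

    print(chinchilla_loss(70e9, 1.4e12))  # roughly Chinchilla-scale

Architecture and data-quality improvements shift E, A, and B; the law itself only describes what happens as N and D grow with everything else held fixed.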

I suspect it's not that people don't see the progress; it's that they don't fully trust laws that, unlike the transistor scaling laws, aren't truly backed by physics. Empirically, we see that scaling works and continues to work.

[deleted]

Why should we have strong priors in either direction? Maybe it will keep scaling for decades like Moore's law. Maybe not.

Bro, the planet is literally experiencing a climate disaster, and you think the solution is to create more systems that are misaligned with the ecosystem humans depend on?

I guess the great filter is a real thing and not just a thought experiment.

I assure you that voluntary meat consumption because "taste buds go brr" is a much bigger problem than AI that yields actual productivity gains (and could potentially help solve the very climate crisis you complain about).

Completely agree. Meat should be priced to include externalities. People can get used to beans. Beans are great!

I’d like to see something that indicates models are getting better without the need for more training data. I would expect most gains come from more and better-labeled data. We’re racing towards a complete encyclopedia of human knowledge, and even if we get there, that’s only a drop in the bucket of all knowable things.

The issue people have isn’t some interpretation of scaling laws; it’s whether the planet’s ecology is going to be able to sustain this endeavour.

I shouldn’t have to say this out loud, but if the environment collapses, we will die, and no amount of “just a bit more scaling bro, just think of the gains” will matter.

People's voluntary dietary choices cause far more suffering and ecological damage than AI, and for much less return or economic output. But tell people to switch to plant-based foods and they lose their shit.

Yes. There's more than one thing that needs to change if we're going to make it through this.