Which would also mean the accelerationists are potentially putting everyone at risk. I'd think a soft takeoff decades in the future would give us a much better chance of building the necessary safeguards and reorganizing society accordingly.

This is a soft takeoff

We, the people actually building it, have been discussing it for decades

I started reading Kurzweil in the early 90s

If you’re not up to speed, that’s your fault

Decades from now. Society is nowhere near ready for a singularity. The AI we have now, as far as it has come, is still a tool for humans to use. It's more Augmented Intelligence than AGI.

A hard takeoff would be the tool bootstrapping itself into an autonomous self-improving ASI in a short amount of time.

And I read Kurzweil years ago too. He predicted that reverse engineering the human brain, once the hardware was powerful enough, would give us the singularity by 2045, and that the Turing Test would be passed by 2029. It seems LLMs have already accomplished the latter.