Yudkowsky is a clown; the local crackhead on your street is probably more accurate and less insane than he is.

If he's a clown what part of his theory is the circus?

Are you saying that superintelligence is impossible?

Are you saying that the alignment problem will certainly be solved before superintelligence emerges?

Are you saying that a superintelligent being connected to the internet would be unable to gain resources such as GPU time, money, and social influence?

Are you saying that a superintelligent being would for some reason be incapable of deception and cunning?

Are you saying that a superintelligent being would necessarily regard human flourishing as a prime objective to be prized above its own goals and ambitions?

If it's really just doomerism, we should be able to point to the flaws in his argument instead of making ad hominem attacks.

At this point we should have had an AI-induced apocalypse a few times over, according to him.

Being an insane clown (posse optional) with less accuracy than the town crackhead doesn't seem to be a barrier to success in tech anymore.

It certainly makes you qualified to be a CEO or spokesperson.