There's at least a decent argument that the limiting factor is the physical silicon itself (at least at cutting-edge nodes), not power. That actually gives AI labs an incentive to run those specific chips somewhat cooler: high device temperatures and high input voltages (which you need to push frequencies higher) can severely degrade a modern chip's reliability over time.
Power is the limiting layer above the physical chips. You can add more chips and run them at lower clocks, or add more efficient chips later on, but you can't easily change a data center's power capacity.
It will nonetheless be vastly cheaper to build a new datacenter and arrange power for it than to fab the volume of leading-edge chips and compute systems that will ultimately consume that power. So the chips themselves remain the meaningful constraint.