This looks like one of many ideas for more efficient compute chips for machine learning. I'm waiting for the day such a chip gets mass produced, runs a large model at scale, and does so with sufficient reliability; until then, I don't think there's anything particularly newsworthy here. I do think it will happen eventually, maybe within a decade: surely some alternative computing paradigm to the GPU will succeed at some point. The analog chip in the article only seems to be a research prototype for now.