Why it matters: As chipmakers embark on a widespread transition to locally processed generative AI, some users still question whether the technology is necessary. NPUs have become the latest buzzword as hardware vendors push the concept of the "AI PC," yet their arrival raises the question of whether the valuable die space they occupy could have been put to better use.
WTF?! Arm's CEO has sounded the alarm over the energy demands of advancing AI algorithms, cautioning that within a few years, "AI data centers" could consume so much electricity that they jeopardize the US power grid.