Dystopian Kurzweil: As Big Tech frantically pushes AI development and funding, many users have grown concerned about the dangers and ultimate outcome of the latest AI advancements. One man, however, is more than sold on AI's ability to carry humanity to its next evolutionary level.
Why it matters: There is a growing consensus that generative AI has the potential to make the open web much worse than it was before. Currently, Big Tech corporations and AI startups alike rely on scraping all the original content they can off the web to train their AI models. The problem is that an overwhelming majority of websites aren't cool with that, nor have they given permission. But hey, just ask Microsoft's AI CEO, who believes content on the open web is "freeware."
Cutting corners: Researchers from the University of California, Santa Cruz, have devised a way to run a billion-parameter-scale large language model using just 13 watts of power – about as much as a modern LED light bulb. For comparison, a data center-grade GPU used for LLM tasks requires around 700 watts.
Why it matters: Advanced AI capabilities generally require massive cloud-hosted models with billions or even trillions of parameters. But Microsoft is challenging that with the Phi-3 Mini, a pint-sized AI powerhouse that can run on your phone or laptop while delivering performance rivaling some of the biggest language models out there.
A hot potato: GPT-4 is the newest multimodal large language model (LLM) crafted by OpenAI. The foundational model, currently available to customers as part of the paid ChatGPT Plus tier, exhibits notable prowess in identifying security vulnerabilities without requiring human assistance.
WTF?! Arm's CEO has sounded a warning bell about the energy required to advance AI algorithms, cautioning that in a few years, "AI data centers" could consume so much electricity that they jeopardize the US power grid.
AI PCs should include an NPU, Copilot, and the Copilot key
Why it matters: Microsoft and multiple chipmakers have spent months heralding the arrival of the "AI PC," which utilizes generative AI and large language models to facilitate various tasks in new ways. However, the definition of an AI PC has remained somewhat unclear. At a recent event in Taipei, Intel began spelling out the specifics it has agreed upon with Microsoft.
In the future, you won't buy a car without an AI chip inside
A hot potato: Nvidia has been attempting to license its GPU technology to third-party chip manufacturers for quite some time. Taiwanese fabless chipmaker MediaTek has now announced a new partnership with the GPU giant to bring new "experiences" and AI edge capabilities to cars.
In context: Most, if not all, large language models censor responses when users ask for things considered dangerous, unethical, or illegal. Good luck getting Bing to tell you how to cook your company's books or crystal meth. Developers block the chatbot from fulfilling these queries, but that hasn't stopped people from figuring out workarounds.