PowerColor claims it can improve GPU performance by leveraging an external NPU

Alfonso Maruccia

AI Everywhere: Taiwanese manufacturer PowerColor has been making graphics cards since the late '90s. The company's latest proposal relates to AI, of course, or rather to how NPUs can be used to optimize GPU utilization in modern games.

During this year's Computex, PowerColor showed a prototype technology that could theoretically bring NPUs and GPUs together for better performance and power efficiency. Edge AI, as the technology is known, uses an NPU's specialized silicon to optimize how a discrete GPU draws power while rendering complex 3D graphics on-screen. The results, the company claims, can be significant.

For its Computex demonstration, PowerColor wired an external NPU to an AMD graphics card. Edge AI was programmed to manage the GPU's power consumption through the NPU, a sort of "Eco mode" that the company claims can bring wattage down and push frame rates up. According to data provided by the graphics card maker, the NPU-based AI engine was able to reduce power consumption by 22 percent while gaming.

Edge AI was employed to manage GPU utilization in Cyberpunk 2077 and Final Fantasy XV, which were using 263W and 338W of power by default, respectively. The external NPU brought those numbers down to 205W and 262W, achieving better results than AMD's own Power Saving mode.
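
For what it's worth, those figures are internally consistent with the 22 percent claim. A quick check, using only the numbers quoted above:

```python
# Sanity-check the claimed ~22% power reduction from the quoted figures.
default_watts = {"Cyberpunk 2077": 263, "Final Fantasy XV": 338}
edge_ai_watts = {"Cyberpunk 2077": 205, "Final Fantasy XV": 262}

for game, before in default_watts.items():
    after = edge_ai_watts[game]
    saving = (before - after) / before * 100
    print(f"{game}: {before} W -> {after} W ({saving:.1f}% lower)")
# Cyberpunk 2077: 263 W -> 205 W (22.1% lower)
# Final Fantasy XV: 338 W -> 262 W (22.5% lower)
```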

PowerColor has provided no further details about how Edge AI performs its tricks, though it likely runs specially trained machine-learning models on the NPU to manage the GPU's load, voltage, and temperature for better energy efficiency. The company also claims that Edge AI can increase gaming performance, but the Computex demo showed a 9 percent drop in frame rates instead.
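
Since PowerColor hasn't published any technical documentation, the following is nothing more than an illustrative sketch of what an NPU-hosted power-management loop could look like. Every function name here is hypothetical; the real Edge AI mechanism is undisclosed.

```python
import time

def npu_policy(telemetry: dict) -> int:
    """Hypothetical stand-in for an NPU-hosted model that maps GPU
    telemetry to a new power limit in watts. How Edge AI actually
    makes this decision is not public."""
    # Toy heuristic: if the GPU is not fully loaded (e.g. CPU-bound
    # or frame-capped), trim the power budget; otherwise leave it.
    if telemetry["gpu_util_pct"] < 90:
        return max(150, telemetry["power_limit_w"] - 10)
    return telemetry["power_limit_w"]

def read_gpu_telemetry() -> dict:
    """Placeholder: a real version would poll the driver for
    utilization, clocks, temperature, and the current power limit."""
    return {"gpu_util_pct": 72, "power_limit_w": 300}

def apply_power_limit(watts: int) -> None:
    """Placeholder: a real version would write the new limit through
    the driver's power-management interface."""
    print(f"setting GPU power limit to {watts} W")

# Re-evaluate once per second (bounded here so the sketch terminates).
for _ in range(5):
    apply_power_limit(npu_policy(read_gpu_telemetry()))
    time.sleep(1)
```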

Neural processing units are coming to x86 and Arm PCs with the latest processors from Intel, AMD, and Qualcomm. The idea is that Edge AI could run its power optimization routines on these integrated NPUs, rather than using a convoluted external hardware setup as shown in the Computex demo.

PowerColor could even integrate Edge AI into its upcoming discrete graphics cards, though doing so would likely add to the cost, a premium gamers won't necessarily be willing to pay unless the results are really there. Despite being just a demo, the PowerColor tech does open the door to some potentially interesting ideas. GPU makers could take note and look into similar integrations in the not-so-distant future, turning the NPUs embedded in the latest CPUs into something useful even on powerful x86 gaming systems.


 
Seems like a waste of hardware and an excuse for extra expense for something that's perfectly possible already (lowering the power target by 10%, limiting max frequency and undervolting to some extent), unless this is going to be able to go past the limits AMD set, or to optimise the results better. The power drop makes sense: I've been able to reduce power draw by around 100-150 W on my 7900 XTX while only losing about 5% in performance, because these cards are pushed so hard for every last MHz. However, going by the drops they stated, it seems to be quite conservative and not really pushing the envelope.
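
For reference, the manual route is trivially scriptable too. A minimal sketch, assuming a Linux system with the amdgpu driver (the hwmon index varies per machine, and writing the cap needs root):

```python
# Lower the GPU power cap by 10% via the Linux amdgpu hwmon
# interface. power1_cap is expressed in microwatts.
from glob import glob

cap_files = glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_cap")
if cap_files:
    with open(cap_files[0]) as f:
        current_uw = int(f.read())
    new_uw = int(current_uw * 0.9)  # 10% lower power target
    with open(cap_files[0], "w") as f:  # needs root privileges
        f.write(str(new_uw))
    print(f"power cap: {current_uw / 1e6:.0f} W -> {new_uw / 1e6:.0f} W")
else:
    print("no amdgpu hwmon power cap found at card0")
```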
 
Not buying it.

They claim significant power savings AND an FPS increase.

Sounds like Cold Fusion or Carnot's Heat Engine.

I wonder what BS the Chinese are going to come up with next.
 
I don't know about what Powercolor is selling here (Guess they have to show something thanks to RDNA4 being AWOL) but I have wondered if "AI" drivers are in the pipe to deliver better day 0 compatibility and performance with games and software. Essentially use an algo to identify and recompile game shaders to something that is more friendly to the hardware.

I'm sure we'll see them from Nvidia sooner rather than later.
 
Not buying it.

They claim significant power savings AND an FPS increase.

Sounds like Cold Fusion or Carnot's Heat Engine.

I wonder what BS the Chinese are going to come up with next.

No, it makes perfect sense. Power delivery and performance can be monitored and managed in real time according to the workload.

And lastly, PowerColor is Taiwanese... call them Chinese if you want, but that's like comparing North and South Korea.
 
Seems too good to be true.

We have nothing to lose but our patience

No idea how AMD/Nvidia drivers work. Is it that they're handling a pack of hungry wolves all clamouring for resources willy-nilly, or do they have really good workload optimisers? Of course this is hardware related, and I'm not sure how the average user would like AMD or Nvidia playing around with voltages, timings etc. But surely if you're doing this you want to start as far upstream (software) as possible.

Anyway, if PowerColor are successful then it would be best to work with AMD and Nvidia and ultimately incorporate it into all GPUs, especially as there was an article about Nvidia boosting NPUs on their GPUs for Microsoft to use.
We will have NPUs here, there and everywhere soon.
 
All this for something an app can do. Hook into Afterburner's API and set values from there...
Or just manually undervolt.
 