Game developer says Intel should recall its defective, crash-prone CPUs

The media must be so proud of their attacks on AMD and their biased reviews to ensure Intel wins, wins, wins. The media applauded Intel even knowing full well it had pushed the silicon past its limits in order to take those wins.

cRaptor Lake is an entirely appropriate name.

Haven't touched Intel since my 3570K days. Already planning a new 9900X3D build for around November.
 
Godzilla hasn't quite gotten the message.

At this point every buyer needs to buy AMD CPUs, even the fanbois.

Then maybe we will see some respect for the end customer, whom Intel has been disrespecting since Haswell times.
I'll stick with Intel myself; my 13700KF still runs with no issues... as do my i7 10700KF, i9 10920X, i7 6850K, i7 5930K, i7 4770K, i5 3570K, i5 2500K, i7 980X, i7 960, i7 930, i7 920, Q9650, Q6600, QX6700, P4 2.8GHz, and PIII 1GHz.
 
Funny how most reviewers sang the praises of the 13th and 14th gen CPUs on launch day, casually ignoring that in almost every possible way they are worse than AMD's offerings.

I can hear the fanboys now: "But it's the fastest CPU for productivity." Great, so how's that 280-300W TDP working out for you when it's at best 10% faster than a Ryzen 9 7950X (or 7950X3D) that's sipping a mere 120-150W and isn't running at a temperature even the Sun would raise an eyebrow at?

Or, if gaming is your focus, when for just £340 the Ryzen 7 7800X3D uses around 85-90W and beats the entire Intel stack, and even most of AMD's own stack, hands down?

Yet they still outsell AMD and hold significant market share. I don't get it. Brand loyalty on either side aside, why do people pay more for the objectively worse of the two CPU makers' products?
Intel can produce many more CPUs than AMD because Intel has in-house foundries,
while AMD has to share TSMC capacity with Qualcomm, Apple, etc.
 
TBH, 20 years ago AMD didn't suck; they had the lead in mainstream and high-end CPUs. I'm not sure which era you are talking about. Around 2004, the Athlon 3500 and FX CPUs were clear favorites.
Yes, they were ahead of Intel in terms of speed... but their stability sucked. Intel was more stable. Those first-gen Athlon CPUs didn't work well with a lot of mobos.
 
My first question is: is it possible to achieve stability by running the chips at a lower voltage and lower clock speed? If so, at least there's a workaround.

Perhaps, if the CPU hasn't already degraded too badly. Because Intel stays silent, all we know is that there is a problem; what exactly the problem is remains mostly speculation.
 
The fact that it worked at the start and the crashes only appeared after some time points to the chip failing or degrading over time. Still speculation, but it's a clue to the problem. Hence, once the issue starts, even reducing the power draw can't solve the problems.
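On the preventive side of that question, capping package power is at least scriptable before any degradation sets in. A minimal sketch, assuming Linux with the intel_rapl powercap driver and root privileges; the sysfs path and the 125W figure are illustrative, not a recommendation:

```python
# Rough sketch: read and lower the package PL1 (long-term) power limit
# via Linux's intel_rapl powercap interface. Needs root; assumes a
# single-package system. Note this caps sustained power, not voltage.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package-0 domain

def read_uw(name: str) -> int:
    """Read a microwatt value from the RAPL sysfs directory."""
    return int((RAPL / name).read_text())

print(f"PL1 is currently {read_uw('constraint_0_power_limit_uw') / 1e6:.0f} W")

new_limit_uw = 125_000_000  # illustrative: cap the package at 125 W
(RAPL / "constraint_0_power_limit_uw").write_text(str(new_limit_uw))
print(f"PL1 set to {new_limit_uw / 1e6:.0f} W")
```

The same limits can usually be set in the BIOS; the sysfs route just makes it easy to experiment without rebooting.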
 
Just a game developer? Is that really the level of expertise we want to trust?
Now, from a personal perspective: when I had a 9600K + Win10, my PC went on for years without a blue screen.
I am getting one at least once a month with a 13700K + Win11.
It is a solid fact that the 13th and 14th generations came with a variety of issues.
But I am not ready to accept that this is a widespread issue disrupting the work and gaming of millions of Intel users.

It's not just one game developer pointing out the stability issues with these CPUs.
Nvidia has pointed out that a lot of the driver errors people were complaining about on its forums were in fact due to Intel CPUs.
Several server companies have talked about how they are constantly replacing Intel CPUs.
And RAD Game Tools also pointed to Intel CPUs as the cause of crashes when using Oodle decompression.
The Warframe devs have now said they are going to filter their crash logs by CPU, because they too consider there is probable cause.
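For illustration, that kind of triage boils down to bucketing crash reports by the CPU brand string each one carries. A minimal sketch; the report format and field names here are invented, only the idea comes from the devs' comment:

```python
# Tally hypothetical crash reports by CPU model to spot an outlier.
from collections import Counter

# Invented examples; real reports would come from a crash-dump pipeline.
crash_reports = [
    {"cpu": "13th Gen Intel(R) Core(TM) i9-13900K", "error": "ACCESS_VIOLATION"},
    {"cpu": "AMD Ryzen 7 7800X3D 8-Core Processor", "error": "ACCESS_VIOLATION"},
    {"cpu": "13th Gen Intel(R) Core(TM) i9-13900K", "error": "ILLEGAL_INSTRUCTION"},
]

by_cpu = Counter(report["cpu"] for report in crash_reports)
for cpu, count in by_cpu.most_common():
    print(f"{count:4d}  {cpu}")
```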
 
My first question is: is it possible to achieve stability by running the chips at a lower voltage and lower clock speed? If so, at least there's a workaround.

I was thinking the same. I have an Alienware M18 with a Core i9 13900HX; I don't know if this is supposed to be one of the problem CPUs, but I have had no trouble at all in the past year... Being a laptop, it is running at lower power pretty much all the time.

Apparently it has a base power of 55W and a max of 157W.
 
I was thinking the same. I have an Alienware M18 with a Core i9 13900HX; I don't know if this is supposed to be one of the problem CPUs, but I have had no trouble at all in the past year...
Only the desktop parts, not laptops. That's where Intel's listed power rating stops having any meaning; as others have said, not even 360mm radiators can handle the heat.

Alderon talks about a 100% failure rate on their servers. These won't be Xeon web servers, but rather specialty speed demons built around Intel's hot-running 13900K/14900K parts.
 
My first question is: is it possible to achieve stability by running the chips at a lower voltage and lower clock speed? If so, at least there's a workaround.

As Watzupken said in plain speech, the Intel CPUs that are failing do so because of degradation through electromigration, owing to the CPU being fed too much voltage in order to increase its performance.

Feeding the CPU way too much voltage (not just VCORE but other rails too), as the people at Overclock.net will tell you, works fine for a time, but then one day the CPU becomes unstable and throws a BSOD at settings previously confirmed stable and verified with Prime95 and other stress tests.
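One way to catch that creeping instability early on Windows is to watch the System event log for WHEA hardware-error entries. A minimal sketch using Python's subprocess module to call the built-in wevtutil tool; the event count is arbitrary:

```python
# Pull the most recent WHEA-Logger entries from the Windows System log;
# machine-check hardware errors from a degrading CPU tend to land here.
import subprocess

QUERY = "*[System[Provider[@Name='Microsoft-Windows-WHEA-Logger']]]"
result = subprocess.run(
    ["wevtutil", "qe", "System", f"/q:{QUERY}", "/c:10", "/f:text", "/rd:true"],
    capture_output=True, text=True, check=True,
)
print(result.stdout or "No WHEA-Logger events found.")
```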

Godzilla's 13th/14th gen high-end CPUs are dying a slow death, and a class-action lawsuit is only a matter of time.
 
I honestly don't understand why anyone still buys Intel.
For years, prior to Ryzen, AMD was considered the "budget" option.

I regret buying my 14th gen. My next build will be a Ryzen system.


My first question is: is it possible to achieve stability by running the chips at a lower voltage and lower clock speed? If so, at least there's a workaround.
That isn't what consumers pay for. I paid for the maximum speed allowable sans overclocking. If I have to lower the voltage to keep it stable, which hampers performance, then that's a problem.

When you have the fastest CPU, power consumption doesn't matter. Even if AMD has a faster CPU, power consumption still doesn't matter, because IntelWatts are less than AMDWatts anyway.
It matters when it comes to the lifespan of the CPU... more power means more heat, and more heat means faster degradation. Where do you get the idea that Intel's wattage is lower than AMD's?
 
My first question is: is it possible to achieve stability by running the chips at a lower voltage and lower clock speed? If so, at least there's a workaround.
Wendell was able to show that even servers running these chips on conservative efficiency settings had significant crashes, and that they were charged $1,000 more than for AMD's offering... wait for it... for the support needed due to the crashes.

Yeah, when the rumored 65W TDP for the 9700X dropped, I ran a poll on the Videocardz website asking whether enthusiasts would rather have a higher TDP and the ability to clock significantly higher, or even some middle ground. Mostly everyone liked the 65W TDP and got triggered at me for even raising the question. Their minds melted when the 120W TDP rumor dropped two days later.
Yet they still outsell AMD and hold significant market share. I don't get it. Brand loyalty on either side aside, why do people pay more for the objectively worse of the two CPU makers' products?
For me, as an AM5 user since 2022, this probably means that AMD will have an incentive to keep innovating to grab more market share. I see this as a win for early adopters.
 
Actually, AMD way outsells Intel in the assemble-your-own desktop market, where you get to choose a motherboard, which is the market we're talking about here.
 
It matters when it comes to the lifespan of the CPU... more power means more heat, and more heat means faster degradation. Where do you get the idea that Intel's wattage is lower than AMD's?

Yes, indeed, I forgot to add this.

Voltage, power, and heat accelerate the process of degradation and electromigration.

The following comes from an Intel employee called "Ragland." He is quite clear, as you will see.

QUOTE

"the amount of time a processor stays in elevated temperature and voltage states has the biggest impact on lifespan"

"Assuming voltage remains constant, each successive drop in temperature results in a non-linear increase in life expectancy, so the 'first drop' in temps from 90C to 80C yields a huge increase in chip longevity"

"higher voltages definitely reduce the lifespan of a processor"

"It's the person that sets their system up at elevated voltages and just leaves it there 24/7 [static overclock], that's the person that is going to burn that system out faster than someone who uses the normal turbo algorithms to do their overclocking "

UNQUOTE

The combo of feeding Intel chips ridiculous voltages and having them operate at ridiculously high TDPs and temperatures over 80C will cause maximum CPU degradation through electromigration.
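To put rough numbers on Ragland's "non-linear" point, here's a back-of-envelope sketch using an Arrhenius-style acceleration factor. The 0.7 eV activation energy is an assumed, ballpark figure for electromigration, not anything Intel has published:

```python
# Back-of-envelope Arrhenius-style lifetime scaling with temperature.
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def lifetime_gain(t_hot_c: float, t_cool_c: float, ea_ev: float = 0.7) -> float:
    """Relative lifetime multiplier from running at t_cool_c instead of t_hot_c."""
    t_hot_k, t_cool_k = t_hot_c + 273.15, t_cool_c + 273.15
    return math.exp(ea_ev / K_B_EV * (1 / t_cool_k - 1 / t_hot_k))

print(f"90C -> 80C: ~{lifetime_gain(90, 80):.1f}x lifetime")  # roughly 1.9x
print(f"80C -> 70C: ~{lifetime_gain(80, 70):.1f}x lifetime")  # roughly 2.0x
```

Under those assumptions, each 10C drop roughly doubles expected lifetime, which lines up with the quote's point that temperature reductions pay off disproportionately.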

Intel knowingly turned a blind eye to motherboard makers ignoring its "safe limits" while at the same time raising TDPs to the max, which caused the chips to routinely run at temps over 80C, all so that its CPUs could beat AMD's on paper.

They knew full well that doing so would cause those chips to degrade rapidly.

It's the good ol' Godzilla that we know and love pulling an Intel.
 
Godzilla hasn't quite gotten the message.

At this point every buyer needs to buy AMD CPUs, even the fanbois.

Then maybe we will see some respect for the end customer, whom Intel has been disrespecting since Haswell times.

I feel this is huge. Intel got lazy and complacent. They pushed this monolithic arch too far; even with its chiplet design they only extended its life. They need to do what Apple and AMD did, and that's start from scratch on a new arch designed for, you know... now.

But that costs money. R&D. And you've got to keep shareholders happy, so why not buy back some stock instead? Make the value increase rather than actually doing ****.
 
I feel this is huge. Intel got lazy and complacent. They pushed this monolithic arch too far; even with its chiplet design they only extended its life. They need to do what Apple and AMD did, and that's start from scratch on a new arch designed for, you know... now.

But that costs money. R&D. And you've got to keep shareholders happy, so why not buy back some stock instead? Make the value increase rather than actually doing ****.
Agreed, they need to go back to the drawing board and design a whole new microarchitecture.
 
Agreed, they need to go back to the drawing board and design a whole new microarchitecture.

As a reminder, the Pentium Pro is the last totally new high-performance x86 architecture from Intel. Another one is probably not coming soon.

There have of course been some low-performance ones and non-x86 architectures since then. Some also say Sandy Bridge is a totally new architecture, but it's not, because SB shares way too many similarities with Nehalem (L1 cache: same size; fetch and decode: almost the same at a high level; decoders: almost the same at a high level; execution units: same number of units, nearly identical apart from AVX differences...). At a high level, even Sunny Cove (vs. Skylake) is "more" of a new architecture than Sandy Bridge is.
 
This is the culmination of over a decade of stagnation at Intel. They got really lazy, and efficiency went out the window. I have to manually set the power/TDP limits in the BIOS for my 13700KF, and it still spikes to 80-95C, averaging 80C. Intel calls this completely normal and within the thermal limits.
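If you want to sanity-check whether those BIOS limits are actually holding under load, here's a minimal logging sketch. It assumes Linux and the third-party psutil package; "coretemp" is the usual sensor name for Intel chips, but it may differ per system:

```python
# Log the CPU package temperature every few seconds under load.
import time
import psutil

for _ in range(10):
    temps = psutil.sensors_temperatures()
    for sensor in temps.get("coretemp", []):
        if "Package" in (sensor.label or ""):
            print(f"{time.strftime('%H:%M:%S')}  {sensor.label}: {sensor.current:.0f}C")
    time.sleep(5)
```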

They really let themselves go with this generation.
 