
Developers of Outpost Infinity Siege Recommend Underclocking i9-13900K and i9-14900K for Stability on Machines with RTX 4090

Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
This is just a case of rumor and FUD being blown out of proportion. I suspect the developers are either trying to defend their game's quality or deflect a problem of their own onto something else.

Any Raptor Lake system that is crashing at stock is doing so due to user error, such as unlocking the power limits without suitable cooling or with an insufficient power supply. A lot of the time people also just enable XMP, which is not a wise thing to do unless you have a kit with a very conservative XMP profile, especially if you're running in a closed case with low fan speeds, i.e. a hotbox. I have never experienced any instability in any game or software with my i9-13900KS.



I mean, this is an enthusiast product. Removing the power limit is an awesome ability to have, but it's unwise to do so on a conventional machine. With 360-420 mm AIOs or large tower heatsinks such as the NH-D15, you won't want to run a Raptor chip above 300 W.
Your claim is that no CPU could be bad or leaking voltage. There is also the fact that some board vendors like to turn the voltage up regardless of the processor.
 
Joined
Feb 1, 2019
Messages
2,742 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Doubtful, as the vast majority of CPUs simply work with the game, including the i9s. They clearly say it's a small number of i9s, so it could be a batch, or specific motherboards pushing those CPUs too hard.
You're trying to make it sound like it's some kind of inherent fault, a bad manufacturing batch or perhaps poor silicon quality that isn't stable at spec settings. This is where it gets misleading and the rumour mill heads into conspiracy-theory territory.

We already know UE is a crappy engine, so it's not impossible that it has inherent problems in its own code. As for Intel advising people to make changes, it's almost certainly telling them to put things back to spec, so if it is hardware instability it will come down to motherboards running out of spec.

For reference, it's entirely possible to have buggy code that only gets exposed on fast enough hardware (timing issues). That's one reason console manufacturers make such an effort to throttle newer hardware refreshes to match the original timings.
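To illustrate the kind of timing bug that only fast hardware exposes, here is a minimal, hypothetical Python sketch (nothing to do with the game's actual code, just the general pattern): the missing synchronisation is always a bug, but it only bites once the timing balance shifts, which is exactly what faster hardware does.

import threading
import time

shared = {"asset": None}

def load_asset():
    time.sleep(0.05)              # stands in for disk/decompression work
    shared["asset"] = "texture_data"

loader = threading.Thread(target=load_asset)
loader.start()

time.sleep(0.10)                  # "setup" work that happens to hide the race today;
                                  # shrink it relative to the load and the race appears
# Correct code would call loader.join() before touching the shared state.
print(len(shared["asset"]))       # TypeError if the loader lost the race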
 
Joined
Jul 1, 2011
Messages
344 (0.07/day)
System Name Matar Extreme PC.
Processor Intel Core i9-10900KF @5.1GHZ All cores Ring@4.6GHZ @1.280v , 24/7
Motherboard Gigabyte Z590 UD , With PCIe X1 Card intel killer 1650x card
Cooling CoolerMaster ML240L V2 AIO with MX6
Memory 4x16 64GB DDR4 3600MHZ CL16-19-19-39 G.SKILL Trident Z NEO
Video Card(s) Nvidia ZOTAC RTX 3080 Ti Trinity + overclocked 100 core & 1000 mem
Storage WD black 512GB Nvme OS + 1TB 970 Nvme Samsung & 4TB WD Blk 256MB cache 7200RPM
Display(s) Lenovo 34" Ultra Wide 3440x1440 144hz 1ms G-Snyc
Case NZXT H510 Black with Cooler Master RGB Fans
Audio Device(s) Internal , EIFER speakers & EasySMX Wireless Gaming Headset
Power Supply Aurora R9 850Watts 80+ Gold, I Modded cables for it.
Mouse Onn RGB Gaming Mouse & Logitech G923 & shifter & E-Break Sim setup.
Keyboard GOFREETECH RGB Gaming Keyboard, & Xbox 1 X Controller
VR HMD Oculus Rift S
Software Windows 10 Home 22H2
Benchmark Scores https://www.youtube.com/user/matttttar/videos
lol, this is why I am still using my i9-10900KF and RTX 3080 Ti
 
Joined
Aug 25, 2021
Messages
1,068 (1.06/day)
You're trying to make it sound like it's some kind of inherent fault, a bad manufacturing batch or perhaps poor silicon quality that isn't stable at spec settings. This is where it gets misleading and the rumour mill heads into conspiracy-theory territory.

We already know UE is a crappy engine, so it's not impossible that it has inherent problems in its own code. As for Intel advising people to make changes, it's almost certainly telling them to put things back to spec, so if it is hardware instability it will come down to motherboards running out of spec.

For reference, it's entirely possible to have buggy code that only gets exposed on fast enough hardware (timing issues). That's one reason console manufacturers make such an effort to throttle newer hardware refreshes to match the original timings.
Had you read the linked article (more carefully?), you would have found that they explicitly observed the same behaviour, i.e. the crashes were replicated in other applications OUTSIDE of the game.

Not really, no. Three models causing issues is not a "poor batch." It's simply game devs being lazy.
Did you actually read the linked article?
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,359 (2.52/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
Did you actually read the linked article?
Yes, and even running a stock 14900k on another machine had zero issues. Still points to a dev issue.
 
Joined
Feb 1, 2019
Messages
2,742 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Had you read the linked article (more carefully?), you would have found that they explicitly observed the same behaviour, i.e. the crashes were replicated in other applications OUTSIDE of the game.


Did you actually read the linked article?
How does that back up the point you're trying to make?

There is nothing I can find on the net stating that this hardware, running at spec, is inherently unstable.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Your claim is that no CPU could be bad or leaking voltage. There is also the fact that some board vendors like to turn the voltage up regardless of the processor.

Bad CPUs are the exception, not the norm. If a CPU doesn't operate at stock voltages in an otherwise well-equipped system with adequate cooling, a sufficient power supply and a compatible motherboard at stock settings, then you need to request an RMA.
 
Joined
Aug 25, 2021
Messages
1,068 (1.06/day)
There is nothing I can find on the net stating that this hardware, running at spec, is inherently unstable.
Did you read in the article that the issue concerns a small number of CPUs that also crashed in other workloads? It's not a pandemic. Some owners returned those and got replacements.

Yes, and even running a stock 14900k on another machine had zero issues. Still points to a dev issue.
No, it does not.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Bad CPUs are the exception, not the norm. If a CPU doesn't operate at stock voltages in an otherwise well-equipped system with adequate cooling, a sufficient power supply and a compatible motherboard at stock settings, then you need to request an RMA.
I am not saying they are the norm. I can't find it right now, but I know MSI Insider did a live stream showing that a 14900K and a 4090 will draw up to 1100 W in some scenarios. We don't know what wattage some of these failed tests could be running at. It could be as simple as PSU spikes. Would anyone use a PSU of less than 1000 W for that combo? I am sure there are those who have.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
I am not saying they are the norm. I can't find it right now, but I know MSI Insider did a live stream showing that a 14900K and a 4090 will draw up to 1100 W in some scenarios. We don't know what wattage some of these failed tests could be running at. It could be as simple as PSU spikes. Would anyone use a PSU of less than 1000 W for that combo? I am sure there are those who have.

If you let both of them run wild, overclocked to the max, yes, that seems plausible. The 4090 can pull up to 600 W and the i9 another 400 W; add 100 W for the rest of the components and fans, and even 1200 W+ seems plausible. Therein lies the problem: people run the effectively unlimited 4096 W PL1 setting on conventional cooling, with a 750-850 W PSU, and expect it to be stable. It's user error. Where is all that heat going? How is the system being fed?
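As a rough back-of-the-envelope check (a hypothetical sketch with illustrative numbers, not measurements), you can sum the worst-case component draws and compare them against the PSU rating with some transient headroom:

peak_draw_w = {
    "RTX 4090 (stock, with transient spikes)": 600,
    "i9 with power limits removed": 400,
    "board, RAM, drives, fans": 100,
}

total = sum(peak_draw_w.values())           # 1100 W with these assumptions
recommended = round(total * 1.15)           # ~15% margin for transients and aging
print(f"Estimated peak draw: {total} W, recommended PSU: >= {recommended} W")
for psu_w in (750, 850, 1000, 1300):
    verdict = "fine" if psu_w >= recommended else "undersized"
    print(f"  {psu_w} W unit: {verdict}")
# With these numbers, 750-1000 W units come out undersized for this combo.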
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,359 (2.52/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
Did you read in the article that the issue concerns a small number of CPUs that also crashed in other workloads? It's not a pandemic. Some owners returned those and got replacements.


No, it does not.
Okay so show me other games that suffer from a processor running too fast. Other games that suffer the same issues as this one where the devs tell people to slow their hardware down.
 
Joined
Aug 25, 2021
Messages
1,068 (1.06/day)
Okay so show me other games that suffer from a processor running too fast.
I can only tell you to read the linked article again, as you do not seem to be willing to take information in.
- it's not about other games, but other workloads with similar behaviour observed
- it's not about any i9, but a small number of select CPUs
- and it's not about CPUs running "too fast", but getting into instability territory in several workloads outside of this game too
The article is pretty simple. In order to isolate the issue, more specific investigation would be needed.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Games that malfunction on fast hardware exist and are nothing new, but I don't think it applies to this game or any modern engine, unless physics, scripting and/or animation are tied to frame rate.

The original 2011 Skyrim would break hard above 60 fps; Fallout 4 showed similar issues to a lesser extent.


Even then limiting the frame rate should fix it.
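For anyone curious why that happens, here is a minimal, hypothetical Python sketch (not engine code) of the difference between stepping physics once per rendered frame and using a fixed timestep with an accumulator; the former makes the simulation speed depend on the frame rate, the latter behaves the same at 60 fps and 600 fps:

def simulate(frames, frame_dt, per_frame_step=False):
    pos, vel = 0.0, 1.0             # object moving at 1 unit per second
    fixed_dt, acc = 1.0 / 60.0, 0.0
    for _ in range(frames):
        if per_frame_step:
            pos += vel              # naive: one "tick" per rendered frame
        else:
            acc += frame_dt         # decoupled: run N fixed ticks per frame
            while acc >= fixed_dt:
                pos += vel * fixed_dt
                acc -= fixed_dt
    return pos

print(simulate(60, 1 / 60), simulate(600, 1 / 600))              # ~1.0 and ~1.0
print(simulate(60, 1 / 60, True), simulate(600, 1 / 600, True))  # 60.0 vs 600.0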
 
Joined
May 24, 2023
Messages
732 (1.95/day)
Bad CPUs are the exception, not the norm.
Speaking about what is normal: have you heard Mr. Gelsinger talk about the frequencies? I think it was when Intel CPUs first reached 6 GHz. He commented on how engineers had been convinced such frequencies were impossible to reach.

Remember what the 12900K runs at? The best silicon they could make could only do 5.2 GHz (the absolute maximum, on selected cores).

How do you think such a big improvement on the same process could have happened in such a short period of time?

I can tell you that:
  • Some improvements in the process might have helped, but with the same process and the same underlying technology there is only limited scope for improvement.
  • They pushed frequencies hard. And then they pushed hard AGAIN. They are at the absolute edge and breaking point of what the chips can handle.
  • They must have abandoned reliability safety margins for these chips to make this happen.
So, in this situation, can anybody really be surprised that some chips are unstable at these breakneck frequencies?

BTW, the 12900 was 8+8, the 14700 is now 8+12 and the 14900 8+16 cores. There is more heat and more power drawn in these new chips, which does not help either.

And one more thing: these chips can do AVX-512. It is disabled because it stressed the chips too much and was breaking them. I guess if we knew the frequencies the chips would need to run at to handle AVX-512, we would learn the true stable frequency for Alder and Raptor Lake.
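Whatever the reason for the fuse-off, it is easy to check whether a given chip actually exposes AVX-512 to software. A quick, hypothetical sketch (Linux only, reading /proc/cpuinfo; Windows would need a different check):

def avx512_flags(cpuinfo_path="/proc/cpuinfo"):
    # Return any avx512* feature flags the kernel reports for the CPU.
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                return sorted(flag for flag in flags if flag.startswith("avx512"))
    return []

print(avx512_flags() or "no AVX-512 flags reported")
# Retail Alder/Raptor Lake parts report nothing here, since the feature is disabled.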
 
Last edited:
Joined
Mar 28, 2020
Messages
1,660 (1.09/day)
Recommendations to run Intel CPUs at 5 GHz are no surprise to me; I have stated multiple times that these CPUs are not up to the insane speeds Intel pushes them to. I am running my 14900K at 5.2 GHz and I have always felt adventurous for doing so.

I was just a little surprised that they recommend a power limit of just 125 W in the Oodle document. I had a feeling that 160 W is a perfectly comfortable power draw for these CPUs, but apparently feelings are not always a reliable source of information.
My guess is that not all cores are active in games, so they don't really need that much power to begin with. My 12700K was typically pulling about 80 to 90 W in the games I play.

Speaking about what is normal: have you heard Mr. Gelsinger talk about the frequencies? I think it was when Intel CPUs first reached 6 GHz. He commented on how engineers had been convinced such frequencies were impossible to reach.

Remember what the 12900K runs at? The best silicon they could make could only do 5.2 GHz (the absolute maximum, on selected cores).

How do you think such a big improvement on the same process could have happened in such a short period of time?

I can tell you that:
  • Some improvements in the process might have helped, but with the same process and the same underlying technology there is only limited scope for improvement.
  • They pushed frequencies hard. And then they pushed hard AGAIN. They are at the absolute edge and breaking point of what the chips can handle.
  • They must have abandoned reliability safety margins for these chips to make this happen.
So, in this situation, can anybody really be surprised that some chips are unstable at these breakneck frequencies?

BTW, the 12900 was 8+8, the 14700 is now 8+12 and the 14900 8+16 cores. There is more heat and more power drawn in these new chips, which does not help either.

And one more thing: these chips can do AVX-512. It is disabled because it stressed the chips too much and was breaking them. I guess if we knew the frequencies the chips would need to run at to handle AVX-512, we would learn the true stable frequency for Alder and Raptor Lake.
Intel's 10 nm is being pushed very hard, just like its 14 nm before it. The main culprit for the insane power draw is the P-cores, which are pushed very hard to reach this prized 6+ GHz, since power scales roughly with voltage squared times frequency and rises steeply once the chip is pushed past its "comfort zone". The increase in E-cores and cache contributed to the higher power draw as well, but less so, since the 13900K and 14900K are essentially the same chip.
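To put rough numbers on that, here is a minimal sketch using the standard dynamic-power relation P ≈ C·V²·f, with made-up voltage/frequency pairs that are only illustrative, not measured Raptor Lake data:

points = [  # (frequency in GHz, core voltage in V) - purely illustrative
    (5.0, 1.10),
    (5.5, 1.25),
    (6.0, 1.45),
]

base_f, base_v = points[0]
for f, v in points:
    rel_power = (v / base_v) ** 2 * (f / base_f)   # relative to the 5.0 GHz point
    print(f"{f:.1f} GHz @ {v:.2f} V -> ~{rel_power:.2f}x the power of 5.0 GHz")
# ~1.00x, ~1.42x, ~2.09x: the last 20% of clock can roughly double the power.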

As you mentioned, I do worry about the longevity of these i9 processors, because they are clearly running at frequencies, power and heat that may cause them to degrade or fail prematurely. The number of reported cases may sound low only because far fewer i9s are sold than more popular models like the i5/i7. I am happy with the i7-12700K because it is fast enough and I don't need that many E-cores. 16 E-cores is just ridiculous and clearly not aimed at efficiency.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Speaking about what is normal: have you heard Mr. Gelsinger talk about the frequencies? I think it was when Intel CPUs first reached 6 GHz. He commented on how engineers had been convinced such frequencies were impossible to reach.

Remember what the 12900K runs at? The best silicon they could make could only do 5.2 GHz (the absolute maximum, on selected cores).

How do you think such a big improvement on the same process could have happened in such a short period of time?

I can tell you that:
  • Some improvements in the process might have helped, but with the same process and the same underlying technology there is only limited scope for improvement.
  • They pushed frequencies hard. And then they pushed hard AGAIN. They are at the absolute edge and breaking point of what the chips can handle.
  • They must have abandoned reliability safety margins for these chips to make this happen.
So, in this situation, can anybody really be surprised that some chips are unstable at these breakneck frequencies?

BTW, the 12900 was 8+8, the 14700 is now 8+12 and the 14900 8+16 cores. There is more heat and more power drawn in these new chips, which does not help either.

And one more thing: these chips can do AVX-512. It is disabled because it stressed the chips too much and was breaking them. I guess if we knew the frequencies the chips would need to run at to handle AVX-512, we would learn the true stable frequency for Alder and Raptor Lake.

Yes, they can be surprised; you're just extremely skeptical of it. Intel has released a series of CPUs offering increasingly higher clocks since the 12900K. Even Alder Lake topped out at 5.5 GHz with its KS, and now they've just released a validated 6.2 GHz CPU. I never believed they'd do a 14900KS myself.

There is no problem with Raptor Lake's boost frequencies. The processors are fully stable and capable of handling them. They will not malfunction as long as their cooling and power requirements are met.

No reliability margins were abandoned. Since Intel doesn't rely on TSMC and its production is fully vertically integrated from sand to packaged silicon, they've simply binned each processor stringently for its quality grade. Remember, every CPU since the 13900K is exactly the same; they just differ in clocks, with the 13900KS and now the 14900KS being the highest-quality chips they offer.

AVX-512 isn't a factor, and even if it had been enabled, the chips wouldn't clock that high while executing that kind of vectorized code. The current requirements would be insane.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Yes, they can be surprised; you're just extremely skeptical of it. Intel has released a series of CPUs offering increasingly higher clocks since the 12900K. Even Alder Lake topped out at 5.5 GHz with its KS, and now they've just released a validated 6.2 GHz CPU. I never believed they'd do a 14900KS myself.

There is no problem with Raptor Lake's boost frequencies. The processors are fully stable and capable of handling them. They will not malfunction as long as their cooling and power requirements are met.

No reliability margins were abandoned. Since Intel doesn't rely on TSMC and its production is fully vertically integrated from sand to packaged silicon, they've simply binned each processor stringently for its quality grade. Remember, every CPU since the 13900K is exactly the same; they just differ in clocks, with the 13900KS and now the 14900KS being the highest-quality chips they offer.

AVX-512 isn't a factor, and even if it had been enabled, the chips wouldn't clock that high while executing that kind of vectorized code. The current requirements would be insane.
I agree with you, but specific to this thread, you can give Intel some of the blame for releasing a chip that is so power hungry. The 4090 is also a no-holds-barred GPU. Putting those together needs serious thought about the PSU, as we have already agreed.

You don't know whether all Intel fabs produce the same quality of chips, so there is that. Having said that, just like with AMD, the best chips become i9s and the worst become i3s. They are not bulletproof.

If you enabled AVX-512 on these chips, the PSU would definitely trip. You'd probably need a 1600 W behemoth with a 4090.
 
Joined
Feb 1, 2019
Messages
2,742 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Did you read in the article that the issue concerns a small number of CPUs that also crashed in other workloads? It's not a pandemic. Some owners returned those and got replacements.


No, it does not.
You asked me the same question twice; you're not going to get a different answer. I read it, and I checked the rest of the net for anything suggesting Intel 14th-gen CPUs are unstable when running at spec, and found nothing.

Here are some examples for reference.

I am part of the FF7 modding community, and we have had timing issues when loading different modules into the game, as well as hext code (both when the system is too slow and when it's too fast); these were difficult to fix with changes to the hooking mechanism.

I currently use the Flagrum mod manager with FF15, and on my game the autobuild.earc file doesn't get patched on launch. It's almost certainly a timing issue, since, like the FF7 mod manager, patching is done in memory on the fly. I even see occasional issues with workshop mods, an official patching mechanism supported by Square Enix.
 
Joined
May 24, 2023
Messages
732 (1.95/day)
BTW, a motherboard knows a lot less about the chip than Intel does (I believe it just reads the voltage table). And even Intel has limited time to test the chip while binning it. Even they do not know everything about how the chip behaves right now, or how it will develop in the future.

If somebody pushes chips so hard that there are (almost) no safety margins for reliability left, bad things happen.

In the old days, the 14900K would have been released with 4800/4000 MHz stock frequencies: a very nice, efficient and very easy to cool product. I am going to test how it runs set this way now.
 
Last edited:
Joined
Feb 1, 2019
Messages
2,742 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
BTW, motherboard manufacturers know a lot less about the chip than Intel does. And even Intel has limited time to test the chip while binning it. Even they do not know everything about how the chip behaves right now, or how it will develop over time.

If somebody pushes chips so hard that there are (almost) no safety margins for reliability left, bad things happen.

In the old days, the 14900K would have been released with 4800/4000 MHz stock frequencies: a very nice, efficient and very easy to cool product. I am going to test how it runs set this way now.

But that isn't the issue, though.

I mean, look at this document, which some are claiming proves faulty CPUs are the cause.


We can see in the article that errors are being mislabelled; e.g. the devs acknowledge that a bad hash of data can cause their engine to report "out of video memory", which is a code problem. They also report that they are mislabelling verification errors as a generic "unable to load shader".

The article starts off by citing hardware problems, but then goes on to confirm it's BIOS-related, and the BIOS is software, not hardware.

It also confirms they are not sure what the problem is, other than generic system instability.

Some solutions reported by their customers include disabling XMP, setting SVID back to spec, reducing power limits back to spec, and downclocking the CPU. The latter can work because the lower clock speeds in effect force the CPU back into its normal operating range; it's masking a bad BIOS configuration.

So yes, there is nothing even in this UE document that confirms any kind of faulty CPU.

This story should be putting pressure on board vendors to stop what they are doing, and on UE developers to improve their code, but it's a let-off for both if people start blaming the chips instead.
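On the error-labelling point, here is a minimal, hypothetical sketch (not the engine's actual code) of what honest reporting looks like: if a decompressed block fails its integrity check, report the corruption rather than a generic "out of video memory" or "unable to load shader".

import hashlib
import zlib

def load_block(compressed: bytes, expected_sha256: str) -> bytes:
    data = zlib.decompress(compressed)          # stand-in for the real codec
    digest = hashlib.sha256(data).hexdigest()
    if digest != expected_sha256:
        # The honest error: data corruption (unstable CPU/RAM or a bad file),
        # not a GPU memory problem and not a missing shader.
        raise ValueError(f"integrity check failed: expected {expected_sha256[:12]}..., got {digest[:12]}...")
    return data

payload = zlib.compress(b"shader bytecode goes here")
good_hash = hashlib.sha256(zlib.decompress(payload)).hexdigest()
print(len(load_block(payload, good_hash)), "bytes verified")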
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
But that isn't the issue, though.

I mean, look at this document, which some are claiming proves faulty CPUs are the cause.


We can see in the article that errors are being mislabelled; e.g. the devs acknowledge that a bad hash of data can cause their engine to report "out of video memory", which is a code problem. They also report that they are mislabelling verification errors as a generic "unable to load shader".

The article starts off by citing hardware problems, but then goes on to confirm it's BIOS-related, and the BIOS is software, not hardware.

It also confirms they are not sure what the problem is, other than generic system instability.

Some solutions reported by their customers include disabling XMP, setting SVID back to spec, reducing power limits back to spec, and downclocking the CPU. The latter can work because the lower clock speeds in effect force the CPU back into its normal operating range; it's masking a bad BIOS configuration.

So yes, there is nothing even in this UE document that confirms any kind of faulty CPU.

This story should be putting pressure on board vendors to stop what they are doing, and on UE developers to improve their code, but it's a let-off for both if people start blaming the chips instead.

Your argument could have some merit. Let's think back to the days of burning DVDs (modern CPUs would be sweet for that): the burn speed would be kept lower than what was possible in order to maintain the picture and audio. Even though benchmarks will show an SSD running at 10 GB/s, the most I have seen Windows actually do is 2.9 GB/s. This could hold true for processors as well. It is what makes the PC so unique as a product; there could be a myriad of reasons why using the most power-hungry parts does this. Regardless, all it does is highlight how much more efficient AMD is, especially with the chip everyone loves, the 7800X3D. It could even be something as stupid as the CPU robbing the GPU of power, or vice versa, with that single 12VHPWR connection on the 4090 and the way some PSUs may be wired.
 
Joined
Feb 1, 2019
Messages
2,742 (1.41/day)
Location
UK, Leicester
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 3080 RTX FE 10G
Storage 1TB 980 PRO (OS, games), 2TB SN850X (games), 2TB DC P4600 (work), 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Asus Xonar D2X
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Your argument could have some merit. Let's think back to the days of burning DVDs (modern CPUs would be sweet for that): the burn speed would be kept lower than what was possible in order to maintain the picture and audio. Even though benchmarks will show an SSD running at 10 GB/s, the most I have seen Windows actually do is 2.9 GB/s. This could hold true for processors as well. It is what makes the PC so unique as a product; there could be a myriad of reasons why using the most power-hungry parts does this. Regardless, all it does is highlight how much more efficient AMD is, especially with the chip everyone loves, the 7800X3D. It could even be something as stupid as the CPU robbing the GPU of power, or vice versa, with that single 12VHPWR connection on the 4090 and the way some PSUs may be wired.

Well yeah, but this is a separate discussion. We know these chips, when they are allowed to, can use ridiculous amounts of power, and the same goes for Nvidia GPUs. But again, this is out-of-spec behaviour.

The AMD platform also hasn't been plain sailing: burnt-out chips due to (guess who) misconfigured motherboard BIOSes (out-of-spec behaviour). Both vendors push to the limits, just in different ways; AMD simply does it differently to Intel. This is a reason board vendors have started to get caught out: they took liberties for a long time with their baseline overclocks, overvolting and so on, and largely got away with it because the chip vendors left more tolerance in their products. Those days are gone for the foreseeable future.
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
I agree with you, but specific to this thread, you can give Intel some of the blame for releasing a chip that is so power hungry. The 4090 is also a no-holds-barred GPU. Putting those together needs serious thought about the PSU, as we have already agreed.

You don't know whether all Intel fabs produce the same quality of chips, so there is that. Having said that, just like with AMD, the best chips become i9s and the worst become i3s. They are not bulletproof.

If you enabled AVX-512 on these chips, the PSU would definitely trip. You'd probably need a 1600 W behemoth with a 4090.

Then again, an i9-KS is also a no-holds-barred CPU. It's the spiritual successor to the Core 2 Extreme processors that Intel offered before the lines were split from a unified LGA 775 into LGA 1366 (HEDT) and LGA 1156 (mainstream), after all. Yes yes, I know, Skulltrail and all that... but the QX9775 was a one-off Xeon rebrand available for a single motherboard anyway.

The 13th and "14th generation" Core i9 processors are identical, with no silicon-level changes between them. After a year of manufacturing the same chips on an already mature node, Intel managed, through manufacturing-level improvements, to mass-produce chips that reach the i9-13900KS's clock targets in the form of the i9-14900K, and honestly this is remarkable. Even more remarkable is that they managed to stretch this further and make a batch or two of i9-14900KS chips that go above and beyond, even if the last 200 MHz comes with an extreme increase in power consumption over the already hungry 13900KS.

It doesn't warrant a generational leap (which is why "14th Gen is a scam" is a thing), but it's a welcome improvement in chip quality regardless.
 
Joined
May 24, 2023
Messages
732 (1.95/day)
After a year of manufacturing the same chips on an already mature node, Intel managed, through manufacturing-level improvements, to mass-produce chips that reach the i9-13900KS's clock targets in the form of the i9-14900K, and honestly this is remarkable. Even more remarkable is that they managed to stretch this further and make a batch or two of i9-14900KS chips that go above and beyond, even if the last 200 MHz comes with an extreme increase in power consumption over the already hungry 13900KS.
Sorry, but abandoning the good industry practice of providing safety margins for product reliability and selling products stretched to breaking point is not remarkable; that is just SAD. Or even TRAGIC.

And even sadder is that these are actually very good products, being LITERALLY DESTROYED by their manufacturer through insane out-of-the-box settings and a lack of control over what motherboard manufacturers do with these chips, all of it done only to improve how Intel looks, at the expense of the end customers. Because they are the ones who will have to deal with all those baked, failing and unstable chips.

Tragedy.
 
Last edited:
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Then again, an i9-KS is also a no-holds-barred CPU. It's the spiritual successor to the Core 2 Extreme processors that Intel offered before the lines were split from a unified LGA 775 into LGA 1366 (HEDT) and LGA 1156 (mainstream), after all. Yes yes, I know, Skulltrail and all that... but the QX9775 was a one-off Xeon rebrand available for a single motherboard anyway.

The 13th and "14th generation" Core i9 processors are identical, with no silicon-level changes between them. After a year of manufacturing the same chips on an already mature node, Intel managed, through manufacturing-level improvements, to mass-produce chips that reach the i9-13900KS's clock targets in the form of the i9-14900K, and honestly this is remarkable. Even more remarkable is that they managed to stretch this further and make a batch or two of i9-14900KS chips that go above and beyond, even if the last 200 MHz comes with an extreme increase in power consumption over the already hungry 13900KS.

It doesn't warrant a generational leap (which is why "14th Gen is a scam" is a thing), but it's a welcome improvement in chip quality regardless.
There is nothing new in that. Even with the KS, the 14900K is already turned up; all CPUs today are. We are in the middle of a CPU war. Intel has indeed refined the node, but that is what happens with every CPU that stays on the same process. Look at the fact that we are still getting GT processors on AM4. Unfortunately for them, the other side has been doing exactly that in other sectors, though, and they can only respond with a refresh at the moment. They will have to change to the same process as TSMC, or a variant of it, to keep up. I cannot see the community being keen on a 500 W 15900K that can do 6.2 GHz, for example.
 