News Intel 9th Gen Coffee Lake CPU Pricing Plummets


Phaaze88

Titan
Ambassador
Actually, I think it's the opposite. The 9900K is $50 more, but you get 8 cores/16 threads vs the 9700K's 8c/8t. If I personally used applications requiring more than six cores/threads, the 9900K would probably be the better choice for only $50 more.

However, using my 9600K for internet, iHeartRadio, internet TV, gaming, and MS Office applications, the 5GHz all-core OC 9600K performs all of those at the top of the charts.

From what TH has indicated, pairing a 5GHz all-core OC 9600K with a 3080 would be pretty much golden for gaming.
A) Don't look at just the CPU alone.
I can see many people who aren't enthusiasts looking to pair a 9900K with their cheap H and B boards - even some of the cheapest Z boards are bad with this CPU - along with a Hyper 212-level air cooler, or some 120mm hybrid...
It's just gonna be a bad time for those not informed. The 9700K has far more flexibility with low-to-mid-range mobos and cheap coolers.
[Maybe enthusiast wasn't the right word, but I was referring to those who are well-informed on PC hardware - who already know the ups and downs of the 9900K. The average Joe won't know jack about this CPU, beyond it being the best of its class at the time.]


B) The article was about the 9900K and 9700K price drops.
I'm not EVEN getting involved in the above 9600K debacle. Nope :censored:
 
  • Like
Reactions: Shadowclash10

Endymio

Reputable
BANNED
Aug 3, 2020
725
264
5,270
From the article:

>> "A few months ago when the Core i7-9700K costed $339.99 ...."

I don't believe that's the word you want there. The past tense of "cost" is "cost"; "costed" is correct only when the verb means "to estimate the cost of" something.
 

tiggers97

Distinguished
Apr 28, 2013
60
19
18,545
"The Core i9-9900K, on the other hand, is on the dying LGA1151 platform. "

Having gone through a couple of builds (and planning a new one), always with an eye toward future-proofing, I now roll my eyes at statements like this. If you keep your computer for more than two years, LGA1151 and LGA1200 are both "dead end" sockets. By the time people who buy into these sockets are ready to upgrade, we will have gone through at least two other "new" sockets.

And if you do have socket 1151/1200 in 3-4 years and want to upgrade to a faster CPU, the only inventory on the market for your LGA 1151 or 1200 board will be old used stock. And you will be competing with every other builder looking to "upgrade" to the top of the line, keeping prices high to the point that you might as well have invested in the top of the line (9900K or 10900K) from the start (OK, maybe you could save $20).
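To make the upgrade-path math concrete, here is a minimal sketch of the two options; every dollar figure is a hypothetical placeholder, not an actual quote:

```python
# Hypothetical sketch of the two upgrade paths described above.
# None of these prices are real; they only illustrate the argument.

def total_cpu_spend(initial: float, later_upgrade: float = 0.0) -> float:
    """Total spent on CPUs over the platform's lifetime."""
    return initial + later_upgrade

# Path A: buy the top-of-the-line chip (9900K/10900K) from the start.
path_a = total_cpu_spend(initial=500.0)

# Path B: buy mid-range now, then a *used* top chip in 3-4 years.
# Used prices stay high because every other builder wants the same part.
path_b = total_cpu_spend(initial=260.0, later_upgrade=220.0)

print(f"Path A: ${path_a:.0f}, Path B: ${path_b:.0f}, "
      f"savings: ${path_a - path_b:.0f}")
```

With placeholder numbers like these, the "savings" of waiting for used stock shrink to roughly the $20 mentioned above.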
 
The 3900X is AMD's fastest mainstream gaming CPU, with only the $700 3950X faster, and it still wouldn't have caught the 9600K. My point was, if AMD's fastest can't beat a 9600K, why would you recommend any AMD CPU for gaming? If you drop to your 3600 recommendation, the 9600K is now over 17% faster in Project Cars 3, and they both cost $200 at Newegg. There has not been any recent market shift toward games meaningfully benefiting from more than 6 cores. Project Cars 3 and Flight Simulator are both brand-new titles.

Easy,

If you absolutely have to have the fastest, Intel is still on top. But if you count fps/dollar, AMD gives you more bang for your buck.

It's rare for someone to buy a top-end chip to play at 1080p. Do you honestly need over 144Hz @ 1080p? Most will play at 4K. The CPU really isn't the bottleneck here; the GPU is. So differences at 4K are minimal between brands. But AMD gives you the benefit of more cores, which is useful if you have a multi-purpose machine.

Coding, ripping, streaming, encoding/video editing: AMD is just a better all-around architecture that doesn't require a massive 360mm AIO to keep it cool.

Only in a few rare exceptions at the low end does Intel deliver a better gaming value.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
Easy,

If you absolutely have to have the fastest, Intel is still on top. But if you count fps/dollar, AMD gives you more bang for your buck.

It's rare for someone to buy a top-end chip to play at 1080p. Do you honestly need over 144Hz @ 1080p? Most will play at 4K. The CPU really isn't the bottleneck here; the GPU is. So differences at 4K are minimal between brands. But AMD gives you the benefit of more cores, which is useful if you have a multi-purpose machine.

Have you read through this thread or just skipped to the last post? The 9600k and 3600 cost the exact same $199.99 on Newegg and the 9600k at 5GHz is on average 17% faster than the 3600 according to THG's comparison. That's a clear bang for buck advantage to the 9600k. Anyone using a $200 CPU isn't gaming at 4k. Why is it every time this topic comes up, everyone pushing for AMD jumps straight to 4k like everyone games at 4k? I'd be willing to bet less than half of 2080Ti owners have a 4k monitor. According to Steam, 2.27% of gamers are using 4k. If you add 1440p to that, you're still below 10% of gamers. Tired of rehashing the same stupid fallacies.
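Since the two chips carry the same price, the fps/dollar comparison reduces to the raw fps gap. A quick sketch - the fps figures are hypothetical, chosen only so the ratio matches the ~17% advantage cited above:

```python
# fps-per-dollar value comparison. The fps numbers are placeholders;
# only the ~17% ratio comes from the comparison cited in this thread.

def value_advantage_pct(fps_a: float, price_a: float,
                        fps_b: float, price_b: float) -> float:
    """Percent fps-per-dollar advantage of chip A over chip B."""
    return ((fps_a / price_a) / (fps_b / price_b) - 1.0) * 100.0

# Both CPUs at $199.99: the price terms cancel, leaving the fps ratio.
adv = value_advantage_pct(117.0, 199.99, 100.0, 199.99)
print(f"fps/$ advantage: {adv:.1f}%")
```

The takeaway: at equal price, "bang for buck" is decided entirely by the fps numbers, which is the point being argued here.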
 
  • Like
Reactions: Gurg

Shadowclash10

Prominent
May 3, 2020
184
46
610
It's rare for someone to buy a top-end chip to play at 1080p. Do you honestly need over 144Hz @ 1080p? Most will play at 4K. The CPU really isn't the bottleneck here; the GPU is. So differences at 4K are minimal between brands. But AMD gives you the benefit of more cores, which is useful if you have a multi-purpose machine.
:(. Why does no one ever mention 1440p lol? I mean, the Steam Hardware Survey says 1440p is at 6.89% of users, while 2160p is at 2.27%. Plus, 1440p is growing at a rate of 0.3% (monthly?) while 2160p is at 0.03%. I.e., more users are at 1440p compared to 4K, but people care waaay more about 4K than 1440p. New GPU: "OMG can this do real high-refresh-rate 4K gaming??" Whaaaat about 1440p?

EDIT: I just read through the rest of the stats, and 1440p is easily above any other resolution in terms of growth per month.
 

Gurg

Distinguished
Mar 13, 2013
515
61
19,070
Have you read through this thread or just skipped to the last post? The 9600k and 3600 cost the exact same $199.99 on Newegg and the 9600k at 5GHz is on average 17% faster than the 3600 according to THG's comparison. That's a clear bang for buck advantage to the 9600k. Anyone using a $200 CPU isn't gaming at 4k. Why is it every time this topic comes up, everyone pushing for AMD jumps straight to 4k like everyone games at 4k? I'd be willing to bet less than half of 2080Ti owners have a 4k monitor. According to Steam, 2.27% of gamers are using 4k. If you add 1440p to that, you're still below 10% of gamers. Tired of rehashing the same stupid fallacies.
Of Steam gamers (9/2020) only 4.44% even have a GPU that would be credible to run 4K and game. ie 2080ti, 2080, 2080s or 1080ti.
 

Gurg

Distinguished
Mar 13, 2013
515
61
19,070
:(. Why does no one ever mention 1440p lol? I mean, the Steam Hardware Survey says 1440p is at 6.89% of users, while 2160p is at 2.27%. Plus, 1440p is growing at a rate of 0.3% (monthly?) while 2160p is at 0.03%. I.e., more users are at 1440p compared to 4K, but people care waaay more about 4K than 1440p. New GPU: "OMG can this do real high-refresh-rate 4K gaming??" Whaaaat about 1440p?

EDIT: I just read through the rest of the stats, and 1440p is easily above any other resolution in terms of growth per month.
1440p never gets mentioned by AMD fans because at 1080p and 1440p, all AMD CPUs lag significantly behind the 9600K and above Intel CPUs in gaming performance. It isn't until 4K that the deficit for AMD's best CPUs narrows to a few percent, as the performance bottleneck for a 2080 Ti shifts from the CPU to the GPU. That is why the discussions in the comments quickly move to 4K even though the percentage of 4K monitors in use for gaming is relatively small.

My estimate would be that, with the performance and $500 price of a new Nvidia 3070, the rates of both 1440p and 4K monitor adoption will increase rapidly.

On the graphics side, AMD's best current GPU struggles to match the three-year-old, two-generation-old 1080 Ti. The sad part is that even if you gave AMD Nvidia's technology, its GPUs' performance would still lag Nvidia's, due to Nvidia doing a superior job of updating and optimizing its drivers for new games.

Please keep this a secret, but it's good sport to poke the bear (ie AMD fans) with the facts/results from the reviews and listen to them squeal. Just mentioning a 9600K sets them off.
 
Have you read through this thread or just skipped to the last post? The 9600k and 3600 cost the exact same $199.99 on Newegg and the 9600k at 5GHz is on average 17% faster than the 3600 according to THG's comparison. That's a clear bang for buck advantage to the 9600k. Anyone using a $200 CPU isn't gaming at 4k. Why is it every time this topic comes up, everyone pushing for AMD jumps straight to 4k like everyone games at 4k? I'd be willing to bet less than half of 2080Ti owners have a 4k monitor. According to Steam, 2.27% of gamers are using 4k. If you add 1440p to that, you're still below 10% of gamers. Tired of rehashing the same stupid fallacies.

I said with rare exceptions at the low end. And the 3600 you picked is in low supply right now, causing price hikes. You are also comparing a stock 3600 to an overclocked 9600K limited to 6 threads. There are more than enough cases where the 3600 would smash the 9600K where thread count matters.

It's a matter of personal preference. Personally, I would spend an extra $20, get the 3600X, overclock it to 4.4GHz, and call it a day.

Intel wins some. AMD wins some. Overall I think the AMD is a better-rounded chip.
 

Phaaze88

Titan
Ambassador
Everyone, please!


It's not that serious.
 
As someone who has had Intel and AMD systems over the years, here is MY take and experience.
For multitasking you cannot beat AMD.
Until several months back I was still rocking a Phenom II 960T unlocked to 6 cores @ 3.6GHz: GTX 1070, 16 gigs of RAM, 256-gig M.2 SSD on a PCIe 4x card.
Wife is still using a 960T unlocked @ 3.4 with a 256-gig SSD, 8 gigs of RAM, and a 650 Ti Boost.
Had a Q6600 @ 3.0; gave it to my nephew with a GTX 460.
Had an i5-2400 with 16GB; preferred the Phenom. Gave it to my grandson, with a GTX 1060 6GB installed later.
Still have an i5-6600, 16 gigs of 2400, GTX 1070 combo in the basement folding. Still used the Phenom for smoother multitasking even though the 6600 was the faster gamer.

Still have much older stuff dating back to a P2 350 dual server board, an Athlon 1000 system, and a P3 1.26 @ 1587MHz, to name a few, that still boot and run.
Have a Dell G5: 8750, 16 gigs, 256-gig drive, GTX 1050 Ti 4 gig......


My new daily driver/folder/email/movies/YouTube/photo editor etc... etc..... etc.... is a Ryzen 3600 @ 4.4 all-core boost, 16 gigs of 3600, 970 EVO Plus, GTX 1070.

It was $159 then, not the $200 we have now.
Now, to be fair, I only game occasionally unless my son or grandkids are here.
Most of it is older games @ 1080p, so the laptop on my monitor and speakers is fine.

So I am more inclined to buy AMD for multitasking and more even frame rates, although a few less @ 1080p.
I also buy Nvidia video cards for folding. My monitor is 60Hz 1080p; who cares if my CPU can do 102 (AMD) or 120 (Intel)?
An SSD is a must.
These are my experiences over many years of computing.
 
  • Like
Reactions: Phaaze88

shady28

Distinguished
Jan 29, 2007
427
298
19,090
...

Please keep this a secret, but it's good sport to poke the bear (ie AMD fans) with the facts/results from the reviews and listen to them squeal. Just mentioning a 9600K sets them off.

I know, right? It's that squealing that seems to have removed all reason from "Enthusiast" site reviews as well, as they cater to the loudest group it seems.

As GPUs get better, the delta in performance between Intel and AMD is getting larger. That's an empirical fact at this point with the 3080 having been released. All that gibberish about how Zen 2 would do better in the future has run into the brick wall of reality.

With a 2080 Ti, the best Zen 2 3900X was about 7% behind the 9900K.
With a 3080, the best Zen 2 3900XT (faster than 3900X) is 10% - 11% behind the 9900K @ 1080.
With a 3080, the best Zen 2 3900XT is 7-8% behind the 9900K @ 1440p
The 9900K and 10900K performed identically to each other with a 3080, so it's unlikely games are going CPU-bound on either of them - but they are with AMD.
The above is despite the Intel platforms using PCIe 3.0 vs AMD's PCIe 4.0.
^^^^^ Facts

Link: https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-amd-3900-xt-vs-intel-10900k/27.html

This delta in performance will only get worse with faster GPUs.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
I said with rare exceptions at the low end. And the 3600 you picked is in low supply right now, causing price hikes. You are also comparing a stock 3600 to an overclocked 9600K limited to 6 threads. There are more than enough cases where the 3600 would smash the 9600K where thread count matters.
You're arguing with blatantly false statements. If a $200 9600K is faster than any AMD CPU at gaming, then Intel is a better value at every price point from $200 on up, which includes every worthwhile CPU, since no one should be buying a quad-core CPU at this point except for budget systems, which aren't relevant to this conversation. It's not Intel's fault that AMD CPUs can't consistently overclock well. That's why major sites don't include overclocked results beyond PBO for AMD in comparison reviews like they do for Intel chips, which pretty much all overclock to 5GHz all-core: there are no guaranteed worthwhile overclocks for AMD CPUs.
 
You shouldn't be buying 4c/8t or 6c/6t for gaming today, either - only for very low budgets, say $100-130.

Again, some games already have deal-breaking stutter, and this will get worse over time.

I don't care if you get 400,000 FPS with a 9600K and the 3600 only gets 60. If it has stutter and feels choppy, it's an inferior experience.
 
  • Like
Reactions: logainofhades
You seem to think highest peak FPS is all that matters in gaming.
If all you do is game @ 1080p, then go Intel. You will get higher peak frame rates, but you also get more lowest-frame-rate stutters.
If you do anything besides game, AMD offers more performance at a lower price.
It also has more consistent frame rates, with higher minimum frame rates, which is a more enjoyable experience.
It does not matter if its max is 15% better if its minimum is also 15% worse. The hiccups and stutters detract from the overall experience.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
It also has more consistent frame rates, with higher minimum frame rates, which is a more enjoyable experience.
It does not matter if its max is 15% better if its minimum is also 15% worse. The hiccups and stutters detract from the overall experience.
Just stop with the false information already. If you don't have a real counterpoint, just say nothing; don't make up things to deceive other people reading these threads. I already posted the average fps comparison on page 1 of this thread. Nobody records peak FPS. In that comparison, the 5GHz 9600K had a 16.9% advantage. Here is the 99th-percentile comparison from the same testing:
[Chart: 99th-percentile FPS comparison, 5GHz 9600K vs R5 3600]

In this comparison, the 5GHz 9600K has a 16.1% advantage over the 3600 - a difference of only 0.74% between the average and 1% low results. Just like the 4K argument, you're trying to argue fringe cases as if they are the norm. Stuttering in games is absolutely not the typical experience for people gaming on a 9600K.
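The point being argued - that the lead holds for both average and 1% low fps - can be sketched like this (the fps values are hypothetical placeholders; only the ratios matter):

```python
# If a CPU's advantage in 99th-percentile (1% low) fps tracks its
# advantage in average fps, its frame delivery is consistent, not stuttery.

def advantage_pct(a: float, b: float) -> float:
    """Percent advantage of measurement a over measurement b."""
    return (a / b - 1.0) * 100.0

avg_lead = advantage_pct(116.9, 100.0)   # average-fps lead, ~16.9%
low_lead = advantage_pct(116.1, 100.0)   # 1%-low lead, ~16.1%

print(f"average: {avg_lead:.1f}%, 1% low: {low_lead:.1f}%")
# A near-identical lead in both metrics is the opposite of a stutter profile.
```

A CPU that stuttered would show a much smaller (or negative) 1%-low lead relative to its average-fps lead.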
 
  • Like
Reactions: Gurg

logainofhades

Titan
Moderator
The 3900X is AMD's fastest mainstream gaming CPU, with only the $700 3950X faster, and it still wouldn't have caught the 9600K. My point was, if AMD's fastest can't beat a 9600K, why would you recommend any AMD CPU for gaming? If you drop to your 3600 recommendation, the 9600K is now over 17% faster in Project Cars 3, and they both cost $200 at Newegg. There has not been any recent market shift toward games meaningfully benefiting from more than 6 cores. Project Cars 3 and Flight Simulator are both brand-new titles.

The 3900X doesn't game much differently than a 3600, because 99.99999% of games cannot make use of that many cores/threads. Anything over an 8c/16t CPU is a waste if gaming is your only goal. Project Cars 3 and Flight Simulator are new, yes, but they are not very well threaded, and rely more on single-core performance.

Using less than 20% of a 3950x.
 
  • Like
Reactions: digitalgriffin
You're arguing with blatantly false statements. If a $200 9600K is faster than any AMD CPU at gaming, then Intel is a better value at every price point from $200 on up, which includes every worthwhile CPU, since no one should be buying a quad-core CPU at this point except for budget systems, which aren't relevant to this conversation. It's not Intel's fault that AMD CPUs can't consistently overclock well. That's why major sites don't include overclocked results beyond PBO for AMD in comparison reviews like they do for Intel chips, which pretty much all overclock to 5GHz all-core: there are no guaranteed worthwhile overclocks for AMD CPUs.

If your SOLE metric is the FPS of shooter games that are around today, then the overclocked 9600K wins versus a stock 3600 - that is, until you hit games like Civ or AotS. And there are more metrics than raw FPS, and you know it: encoding/decoding/file archiving/rendering/streaming/downloading while playing. The 3600 wins at all of these, and you know it. In fact, it would smash the 9600K into the ground, overclocked or not. You're being disingenuous in your arguments. I wouldn't build a system around 6 threads any more; 8 threads is what I consider the minimum for future-proofing for a couple of years.
 
  • Like
Reactions: Phaaze88
The 3900X doesn't game much differently than a 3600, because 99.99999% of games cannot make use of that many cores/threads. Anything over an 8c/16t CPU is a waste if gaming is your only goal. Project Cars 3 and Flight Simulator are new, yes, but they are not very well threaded, and rely more on single-core performance.

Using less than 20% of a 3950x.
^^^^ DX11 is an Achilles' heel to performance these days. I was shocked to learn FS2020 was DX11. A lot of performance parameters are "pre-baked" into a multi-dimensional array, but there's still tons you can thread, like the cloud/weather generation, the building generation (all dynamic), structural stability analysis, etc... I'm baffled why MS went with DX11 over 12. It's basically this gen's Crysis, and it may never run well.
 
  • Like
Reactions: Phaaze88

logainofhades

Titan
Moderator
Just stop with the false information already. If you don't have a real counterpoint, just say nothing; don't make up things to deceive other people reading these threads. I already posted the average fps comparison on page 1 of this thread. Nobody records peak FPS. In that comparison, the 5GHz 9600K had a 16.9% advantage. Here is the 99th-percentile comparison from the same testing:

In this comparison, the 5GHz 9600K has a 16.1% advantage over the 3600 - a difference of only 0.74% between the average and 1% low results. Just like the 4K argument, you're trying to argue fringe cases as if they are the norm. Stuttering in games is absolutely not the typical experience for people gaming on a 9600K.

So you are calling Steve, at GN, a liar then? He literally found in his testing that the 9600K had frametime variance issues, making its performance inconsistent. The R5 3600 did not have said issues. Why do you keep defending a CPU that simply isn't superior in anything needing more than 6c/6t? Many games do not need such resources. I almost bought a 9600K myself, as I really didn't need a 3700X for my gaming needs. I literally only play WoW. I bought my 3700X for F@H performance more than anything. My previous chip was a 6700K. The 3700X crushes either chip for what I wanted it for, without breaking the bank.
 

Shadowclash10

Prominent
May 3, 2020
184
46
610
1440p never gets mentioned by AMD fans because at 1080p and 1440p, all AMD CPUs lag significantly behind the 9600K and above Intel CPUs in gaming performance. It isn't until 4K that the deficit for AMD's best CPUs narrows to a few percent, as the performance bottleneck for a 2080 Ti shifts from the CPU to the GPU. That is why the discussions in the comments quickly move to 4K even though the percentage of 4K monitors in use for gaming is relatively small.

My estimate would be that, with the performance and $500 price of a new Nvidia 3070, the rates of both 1440p and 4K monitor adoption will increase rapidly.

On the graphics side, AMD's best current GPU struggles to match the three-year-old, two-generation-old 1080 Ti. The sad part is that even if you gave AMD Nvidia's technology, its GPUs' performance would still lag Nvidia's, due to Nvidia doing a superior job of updating and optimizing its drivers for new games.

Please keep this a secret, but it's good sport to poke the bear (ie AMD fans) with the facts/results from the reviews and listen to them squeal. Just mentioning a 9600K sets them off.
I'm not even talking about CPU performance :p. Just in general, people go, "Oh, look at that 4K GPU, that 1080p GPU" - but not really 1440p.
 

Shadowclash10

Prominent
May 3, 2020
184
46
610
I know, right? It's that squealing that seems to have removed all reason from "Enthusiast" site reviews as well, as they cater to the loudest group it seems.

As GPUs get better, the delta in performance between Intel and AMD is getting larger. That's an empirical fact at this point with the 3080 having been released. All that gibberish about how Zen 2 would do better in the future has run into the brick wall of reality.

With a 2080 Ti, the best Zen 2 3900X was about 7% behind the 9900K.
With a 3080, the best Zen 2 3900XT (faster than 3900X) is 10% - 11% behind the 9900K @ 1080.
With a 3080, the best Zen 2 3900XT is 7-8% behind the 9900K @ 1440p
The 9900K and 10900K performed identically to each other with a 3080, so it's unlikely games are going CPU-bound on either of them - but they are with AMD.
The above is despite the Intel platforms using PCIe 3.0 vs AMD's PCIe 4.0.
^^^^^ Facts

Link: https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-amd-3900-xt-vs-intel-10900k/27.html

This delta in performance will only get worse with faster GPUs.
Wait a sec. I thought there were very marginal benefits to having a 3080 vs 2080 Ti and other cards at 1080p?