Intel’s Core i7-9700K: What We Know (And How It Could Beat AMD)


zodiacfml

Distinguished


Intel never was in a bad position, unless you call letting the competition catch up a bad position. The rest of the semiconductor industry has never been this close to Chipzilla; you could call that a bad position. Intel has been milking the industry with recycled 14nm parts, since there's little competition and people buy them anyway.

AMD's 7nm just puts them on par with Intel on the manufacturing node.
 

barryv88

Distinguished


Oh come on! How is any Ryzen 5 or 7 "slower" than an i5-8400? You're already over 100 fps in most games! At higher resolutions such as 2560x1440 and 4K, you're not even going to see a difference between the platforms. You're not going to outrun your opponents on an Intel system, nor are you going to snipe better or have better reflexes.
Stop talking crap, please.
 


Hell no. For the first time, it puts AMD on track to be ahead of Intel and Nvidia. IBM is pushing the technology at GlobalFoundries, big time. For them it is not only about AMD, but about providing a leading-edge node that isn't available outside of Samsung's fabs today. Intel has been struggling with its 10nm node for almost two years.

With ARM chips aimed at servers and AMD in the picture, Intel has nothing to gain and much to lose.
 

bbertram99

Prominent
How about everyone just stops upgrading their good equipment? You can still run a 2nd-gen i7 with today's games just fine. It's been proven many times over. The video card is the only thing you need to upgrade every 3-4 years; the rest can last 10+ years.

If you are buying brand new, get as much as you can afford. It will last you 10+ years.
 

Eximo

Titan
Ambassador


As I said, I don't want to get into the debate about process node naming. The names are demonstrably misleading from all parties; usually only a single metric meets the stated figure, and even that is a measure of how accurately they can pattern features, not necessarily the size of anything in particular. The whole CPU isn't built to the same scale, either. The only proper way to compare nodes is basically impossible (i.e., building the same chip on different nodes; you can see a little of this with ARM chips). What we have to look at is essentially die area, transistor count, power consumption, and performance.

AMD is pushing a low-power node to run at high frequencies, which is not ideal. When they move on to the next node (not the rebranded 12nm they are using, which is only a little better), you should see an increase in clock frequency potential akin to Intel's. But by then Intel will have a new architecture on a new node. And I stress again that Intel's 10nm is likely superior for CPUs compared to the up-and-coming 7nm nodes.

Remember, it is easier to play catch-up than it is to innovate. I think optimizing is a smart move while getting an effective node working at acceptable production levels.

I want to stress I am not picking sides. I find all of the marketing and positioning quite troubling, but that is what good marketing and strategy are for: increasing the bottom line for shareholders.

You can get a lot of direct comparisons here:
https://en.wikichip.org/wiki/14_nm_lithography_process
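To make that concrete, here's a minimal sketch of the kind of comparison the WikiChip pages allow. The MTr/mm² figures below are approximate peak densities as published there around this time, quoted from memory, so treat them as illustrative rather than authoritative:

```python
# Rough transistor-density comparison (million transistors per mm^2).
# Figures are approximate peak densities as listed on WikiChip; real
# chips mix dense and relaxed cells, so shipping products land lower.
densities_mtr_mm2 = {
    "Intel 14nm":   37.5,
    "Samsung 10nm": 51.8,
    "TSMC 7nm":     91.2,
    "Intel 10nm":  100.8,
}

baseline = densities_mtr_mm2["Intel 14nm"]
for node, d in sorted(densities_mtr_mm2.items(), key=lambda kv: kv[1]):
    print(f"{node:13s} {d:6.1f} MTr/mm^2  ({d / baseline:.2f}x Intel 14nm)")
```

Note how poorly the marketing names line up with the actual numbers: Intel's "10nm" is nominally denser than the foundries' "7nm" class.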

 

Giroro

Splendid
"i9 is just a rebranding(amongst many). I would more point fingers at the terrible idea of making Kabylake-X at all."

I might be getting those two product lines mixed up.
Hopefully Intel starts bundling old processors with a niche memory product under a new brand name. That will somehow simplify things and ease my customer confusion.
 

Eximo

Titan
Ambassador


Kaby Lake-X was the i5-7600K and i7-7700K transplanted onto X299 as the i5-7640X and i7-7740X, without quad-channel memory support or the extra PCIe lanes. (Might have just been an excuse to do something with fully functional i5s and i7s that had broken iGPUs.) The only real differences were a slightly higher TDP and being a little easier to overclock. It just made no sense given the cost of the motherboards. Then, only a few months later, Coffee Lake.

Were I a motherboard manufacturer, I would try to make a Kaby Lake-X-only X299 board with all the expensive features trimmed off. The i7-7700K is still a decent option; it would be nice to see the i7-7740X compete price-wise.

 

DGurney

Prominent
Why buy ANY CPU that doesn't have a hardware fix for Spectre & Meltdown? Why even discuss a processor roadmap until these vulnerabilities are eliminated?

Otherwise, you're buying a chip that has to be permanently gimped with software workarounds.
 


What you're suggesting would probably require a ground-up redesign by Intel, and that's not happening at present. Perhaps in the near future they'll implement what you mentioned. Until then, mitigation is going to be software-driven.
 

mischon123

Prominent
The GTX 1080 is a current bottleneck. It does 4K gaming, just not very well. I get 60 fps at 60 Hz in 4K, and 125 fps at 120 Hz in 2.5K, in WoT. I prefer gaming at 4K: there is no difference in my scores, but it renders nicer, and 60 fps is fine at 4K. That's flat rendering. Immersive 3D will be an altogether different game and will need at least 8K or 16K, plus massively parallel dedicated ASICs, not what is on a 1080 board.
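For what it's worth, those fps figures are consistent with a GPU-side fill bottleneck. A quick back-of-the-envelope check (my arithmetic, and assuming "2.5K" means 2560x1440) shows the card pushing roughly the same pixels per second at both settings:

```python
# Back-of-the-envelope pixel throughput from the fps figures above.
# "2.5K" is assumed to mean 2560x1440.
modes = {
    "4K @ 60 fps":    (3840, 2160, 60),
    "2.5K @ 125 fps": (2560, 1440, 125),
}
for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpixels/s")
# Both land near 460-500 Mpixels/s, i.e. the GPU rather than the
# CPU is setting the frame rate in this scenario.
```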
 

MSC123

Distinguished
AMD runs much too hot, and games poorly. I'll spend the extra $$ on an Intel chip that has decent temps and games better.
 

bit_user

Polypheme
Ambassador

Some people don't have the luxury of waiting.

Now, as it pertains to this chip, Intel has said it'll have chips out by the end of the year with the mitigation in hardware. Given when the i7-9000 series would ship, perhaps it'll have the mitigation.

Personally, I will wait.
 

aTomek

Prominent
I haven't trusted Intel since it became clear that they knew about the Spectre/Meltdown bugs when they released their latest architecture. Meltdown is the easiest to exploit; it can even be exploited through JavaScript. I feel cheated by them. My next build is AMD, which is not affected by Meltdown. I'm waiting for the 2800X.
 
Raja Koduri went to Intel to work on Project Canis (as in VY Canis Majoris, the second-biggest known star) for AMD:
https://wccftech.com/amd-project-canis-flagship-intel-joint-venture/

Also, AMD's upcoming Ryzen 2k series pricing is known, as well as some benchmark leaks:
https://wccftech.com/amd-ryzen-7-2700x-cpu-review-overclock-benchmarks-leak/
 

bit_user

Polypheme
Ambassador

I trust you saw the part where it's an April Fool's joke?

But thanks for directing me to read about the star. Quite interesting - it stretches the definition of a star in many respects ("the star is a hundred thousand times less dense than the atmosphere of Earth at sea level").

https://en.wikipedia.org/wiki/VY_Canis_Majoris


Isn't this the same process as their new APUs?
 

MxMatrix

Distinguished
Any heavy PCIe user shuns Intel these days. By bringing 64 lanes into the game on a fairly basic 8-core Threadripper, AMD is my choice today. Another point is chipset features: AMD just provides whatever you need, while Intel lacks them and lets lower-quality manufacturers fill the gap. It's not only per-core performance I want; I take a holistic view of my system. Every part must be able to perform at its peak.
 

msroadkill612

Distinguished


Yes, Intel has used its reserves, and AMD has parried well, relentlessly lowering prices too.

Now, as you say, after a few months with a single-product high-end saviour in the 8700K, which reversed AMD's newfound 60/40 desktop dominance, AMD can counterstrike with Zen+ in a few days from now, and Intel's cupboard is bare.

They have no reserves other than a 10nm process, maybe. AMD has options galore, including better AMD CPU/AMD GPU teaming.
 

logainofhades

Titan
Moderator


Actually, this is wrong for current-gen CPUs. Overclocking on AMD can be done with a decent air cooler, as the heat spreader is soldered on for the non-APU Ryzen chips. The Wraith cooler is even capable of handling some overclocking. When was the last time you could overclock an Intel CPU with a stock cooler? The only CPU that comes to mind is the Pentium G3258, and that is only because it was a dual core using the same cooler that the locked quad cores used.

Intel once again cheaped out with the TIM, and their chips get hot quite easily. To really push a Coffee Lake CPU, you have to spend more on cooling than you currently do with AMD. At 1080p, AMD is a bit behind, but still quite capable. At 1440p and 4K, those differences all but disappear, as the GPU is more heavily relied on.

I love my 6700K, but if I had to buy new today, I would definitely be considering AMD for my build. Long gone are the days of i3 performance on an 8-core FX. AMD is finally worth considering again.
 

Colin_10

Reputable
I always follow AMD vs Intel and Nvidia vs AMD as they are great examples of what happens to an industry when one side is overly dominant for years and then suddenly gets challenged. Both Nvidia and Intel were sort of asleep at the wheel, and AMD has done a nice job of rocking the boat, much to the benefit of consumers (minus the damn crypto market.....)
 

aTomek

Prominent
The most important fact is that the i7-9700 still doesn't have hardware mitigations against Spectre/Meltdown attacks.
 

regs01

Honorable
@redgarl
Bear in mind that with FinFET, those numbers are pure marketing. They aren't a specification characteristic; they mean nothing.
Samsung's 10nm density is almost the same as Intel's 14nm. GloFo's and Samsung's 7nm are just slightly denser than Intel's original 10nm. They are in the same generation; Intel just isn't playing that marketing game.
 
ANONYMOUS SAID:
The TURBO button on old PCs, if nobody has said it yet, was to SLOW DOWN the computer, not to speed it up. Old games were tied to the clock speed, so when newer PCs ran faster, so did the games. The Turbo button was phased out pretty quickly, since it only reduced the speed by a fixed amount, but for a short time it was useful. Then people realized a TIME-BASED method made far more sense (seriously, what was the thought process there... nobody will ever play this game on a CPU much faster than a 75MHz model?).
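To illustrate the design mistake being described, here's a minimal sketch (hypothetical game code, not from any actual title) contrasting clock-tied movement with the time-based fix:

```python
import time

SPEED_UNITS_PER_SEC = 100.0

# Naive approach from old DOS games: movement per frame is fixed, so a
# faster CPU -> more frames per second -> the whole game speeds up.
def update_frame_tied(x):
    return x + 1.0  # 1 unit every frame, whatever the frame rate is

# Time-based approach: scale movement by real elapsed time, so the game
# runs at the same speed on a 4.77 MHz 8088 or on a modern CPU.
def update_time_based(x, dt):
    return x + SPEED_UNITS_PER_SEC * dt

x, last = 0.0, time.monotonic()
for _ in range(10):
    now = time.monotonic()
    x = update_time_based(x, now - last)  # dt = seconds since last frame
    last = now
```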
 

Eximo

Titan
Ambassador


I had so many chassis with a turbo button, but it was only functional on my old 133MHz SX; we used it for some never-updated games that had no speed limiters.

 

bit_user

Polypheme
Ambassador

I swear I remember differently. We had an 8088-based IBM clone (PC's Limited, the precursor to Dell) that normally ran at 4.77 MHz, and the turbo button (which defaulted to OFF) boosted it to 6-something MHz.

Later, the trend probably flipped to having turbo default to ON. But without some well-established reference point (like 4.77 MHz), it ceased to be very useful.
 