Intel Unwraps The Rest Of Its Core X-series CPU Family


InvalidError

Titan
Moderator

People buying into HEDT often do so for SLI/CFX setups that use CPU lanes, instead of being "stuck" at x8 or having to buy a motherboard with a PLX or similar PCIe switch chip, so they would "need" a minimum of 32 PCIe lanes from the CPU just for the GPUs. Then you have to throw in a few extra lanes for the highest-speed IO, to avoid potentially bottlenecking the DMI link to the chipset with frequently used devices like the main NVMe drive(s), Optane cache/swap, USB 3.1 Gen 2/3.2, etc.
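To put rough numbers on that lane budget, here is a quick back-of-the-envelope tally; the device mix and per-device lane widths below are illustrative assumptions, not figures from the article:

```python
# Hypothetical HEDT lane budget: two x16 GPUs plus a few common high-speed devices,
# compared against what a 28-lane and a 44-lane CPU can actually offer.
devices = {
    "GPU 1 (x16)": 16,
    "GPU 2 (x16)": 16,
    "NVMe SSD (x4)": 4,
    "Optane cache/swap (x4)": 4,
    "10GbE / USB 3.1 Gen 2 controller (x4)": 4,
}

needed = sum(devices.values())  # 44 lanes in this example
for cpu_lanes in (28, 44):
    shortfall = needed - cpu_lanes
    verdict = "fits" if shortfall <= 0 else f"short by {shortfall} lanes"
    print(f"{cpu_lanes}-lane CPU: need {needed}, {verdict}")
```

Under those assumptions, a 28-lane part forces the GPUs down to x8/x8, while 44 lanes covers the lot with nothing to spare.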
 


Yes, and testing is showing that at 2K-4K resolutions with dual 1080 Tis, x16/x16 PCIe bandwidth does finally matter. The HEDT platforms are good for the 0.01% of gamers who can afford dual 1080 Tis.

 

spdragoo

Splendid
Ambassador


That reminds me of the quote about a person who represents themselves in court...

At least Tom's Hardware actually gave us information as to how they obtained their results: data on the testing rigs, information on the tests they ran, how they measured the temperatures & wattages involved, etc. We just have your word for it... & quite frankly, that & $900 (plus sales tax) will get me an i9-7900X at my local Micro Center (http://www.microcenter.com/product/480702/Core_i9-7900X_Extreme_Edition_Skylake_33_GHz_LGA_2066_Boxed_Processor).

[HINT: in case you didn't read the link, it's currently on sale for $900]

Not to mention I seriously doubt that "your experience" is actually based on hands-on work with the Skylake-X CPUs, & is probably based on a single CPU (maybe 2 or 3, tops) from a prior generation.
 

Guest

Guest
You people are clueless.

A 180 W 16-core/32-thread CPU that will suck in gaming compared to Intel's CPUs, with no overclocking room, generally high temperatures, and a good Cinebench score.
 

Guest

Guest


Yes, you do buy a $1,000 processor to do gaming and professional work, and the current CPU that gives you both is the Intel 7900X.

No, there are people like you who are cheap, but so pissed at Intel for whatever reason that they are going to buy a $1,000 AMD CPU that will suck in gaming compared to its Intel counterpart, an all-around worse CPU.

Oh, should I mention the TDP on Threadripper, which is 180 W? Suddenly that is not important?

The 18-core/36-thread Skylake-X is 165 W.
 

Guest

Guest


If you read the Newegg reviews of the 7900X, no one complained about heat, and people do overclock the CPU; in fact, for most people it runs below 60°C under stress. The overall rating of the 7900X is higher than the rating of the 1800X, which says enough. I am not saying the AMD CPU is bad, but I am saying that a lot of BS was released these days, including by Tom's Hardware. The best part of this is... "Hey, we overclocked a 10-core/20-thread Intel CPU to 4.6 GHz and it runs a bit hot." No kidding, Sherlock.

Next time, try to overclock a Ryzen to 4.6 GHz and let me know about the temperature readings, if you even get there with the overclock.

That's right... I don't think so.

 

Guest

Guest
I repeat: I tested the 6-, 8-, and 10-core variants of the new LGA 2066 Intel CPUs, and none of them showed temperature issues; they all overclocked past 4.4 GHz.

As I said, go ahead and overclock a Ryzen past 4.4 GHz and let me know about the temperatures. Do the same for Threadripper...

Cough cough... I don't think so.

Lesson learned?

Go do your own research before jumping to conclusions. The worst thing you can do these days is to form your view of the world based on the reading you do on Facebook, CNN, Twitter, Tom's Hardware, and similar sites. If your sole opinion is based on what these amateurs do and write... good Lord save us all.
 
How exactly are these going to avoid throttling?

On the 140 W TDP chips, you can easily push 180 watts through the CPU at stock, and the result is throttling on all forms of air cooling. What does 165 watts mean from a practical standpoint?
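For a rough sense of the difference, here's a back-of-the-envelope estimate using ΔT ≈ power × total thermal resistance; the resistance figures below are hypothetical ballpark values for a big air cooler plus the die-to-heatsink path, not measurements from the review:

```python
# Rough die-temperature estimate: delta_T = package power * total thermal resistance.
# Both theta values are assumed ballpark figures, not measured data.
ambient_c = 25.0
theta_cooler = 0.15    # deg C per watt, heatsink to ambient (assumed)
theta_package = 0.20   # deg C per watt, die to heatsink incl. TIM/IHS (assumed)

for watts in (140, 165, 180):
    die_temp_c = ambient_c + watts * (theta_cooler + theta_package)
    print(f"{watts:>3} W -> roughly {die_temp_c:.0f} deg C")
```

With those assumed numbers, the jump from 140 W to 180 W is about 14°C on the die, which is roughly the margin between "warm" and "throttling" on air.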
 

kinggremlin

Distinguished


Multi-GPU is dead. Nvidia doesn't support more than 2 cards on their high-end GPUs, and has disabled SLI altogether on their lower-range cards. People thought DX12 would breathe new life into multi-GPU support. It has been the exact opposite. Developers have no interest in coding for it.
 

InvalidError

Titan
Moderator

44 PCIe lanes aren't enough to fully support more than two GPUs anyway, and nobody would SLI/CF an HEDT platform with low/mid-range GPUs when a single GPU delivers far more predictable performance across all titles, rather than hit-or-miss, for the same price. AMD and Nvidia pulling support for those configurations is practically doing customers a favor.
 

atomicWAR

Glorious
Ambassador


4K gamers would disagree with you. As someone who has used multi-GPU since the first Nvidia SLI/AMD CrossFire days, I find that game support has pretty much stayed the same. Not all games need SLI for all the eye candy at high resolutions, and those games rarely, if ever, support SLI/CF. When games do need that much horsepower, SLI/CF is almost always supported. So yeah, I would have to disagree with that outright. And while 3-way+ SLI systems are dead in gaming, there are still rendering uses in content creation that have no such limits, and that market is very much alive. Plus you get users who run SLI and a PhysX card, or tri-fire+. Granted, this is only useful from high-refresh 1440p up to 4K, but it eats up at least 3 slots. Take one more add-in card that uses an x8-x16 link and all of a sudden you're out of lanes; if you were running a 28-lane CPU you couldn't even do that. That's why you have users like me who need a minimum of 40 lanes from the CPU, one of the many reasons HEDT CPUs existed in the first place. While this is not a huge chunk of the population, it is big enough to warrant needing more PCIe lanes than the lower-tier chips have. So whether it is the high-end gamer or content creation, your assertion is false.
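If you want to put numbers on the "out of lanes" point, here is a minimal sketch of the scenario described above (the lane widths are illustrative assumptions):

```python
# Illustrative tally: SLI pair at x16 each, a PhysX card, and one more x8-x16 add-in card.
slots = {"GPU 1": 16, "GPU 2": 16, "PhysX card": 8, "extra add-in card": 8}
requested = sum(slots.values())  # 48 lanes requested in this example
for cpu_lanes in (28, 44):
    print(f"{cpu_lanes}-lane CPU: requested {requested}, short by {requested - cpu_lanes}")
```

Even a 44-lane CPU has to drop something to x8 in that configuration, and a 28-lane part isn't close.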
 

sfcampbell

Distinguished


Freak, dude, enough with the trolling already. It's embarrassing. I do own i5s overclocked to 4.4 & 4.6 GHz (as well as i7s @ 4.6 & 4.7), but when I own a 16-core/32-thread Threadripper, why do you think I'd have to overclock it above 4 GHz to beat the pants off any two of my other Intel boxes combined? And please don't vomit your FPS garbage again (and again and again); I don't want to hear it.

FPS is an absurd, cheapskate, crap argument. The difference between Intel's cheapest and most expensive CPUs falls within a single-digit percentage of gaming FPS performance, and AMD's newest releases fall comfortably within that difference. And before regurgitating your worn-out, flippant fanboy-isms in response to this post, fix the seven grammatical errors you couldn't even get right in your initial one.


Cheers to TH!
 

sfcampbell

Distinguished


Lol ermahgherd Freak I had no desire to reply to any of your other posts, but this is just too juicy to ignore!!!!!

You know what: splitting hairs between the $1,000 32-thread CPU at 180 watts vs. the $2,000 36-thread CPU (with lower clocks) at 165 watts is totally a game changer!! Thank you so much for opening the eyes of We Lil' Peasants to this earth-shattering development!!!


...your overall lack of understanding of the topic in question is duly noted...



Bartender, another round so we can toast yet again to the site that you obviously read and use to spread your fanboy propaganda!!!! I'm going to pour myself another drink :D
 

TJ Hooker

Titan
Ambassador

Does non-real-time rendering and content creation even require or benefit from an x16 link over x8?
And having a dedicated Physx card is pretty niche...
Lastly, how many add-in cards other than GPUs do you know of that require x8-x16 links?
 

spdragoo

Splendid
Ambassador


Ahh...I see now....so we should listen to anonymous customer reviews over actual testing benchmarks, because no one ever left fake customer reviews on a retailer's website, ever.

And maybe I'll overclock a Ryzen CPU when I a) actually have one, & b) actually find a reason to overclock it. To be honest, while it's nice to see overclocking numbers on hardware (both new & old), it's kind of like adding nitro kits to automobiles: a niche area for a very small segment of the population, providing a "boost" that the vast majority of users neither need nor want.
 

kinggremlin

Distinguished


As a 4K gamer myself for over 3 years, and a user of the original holy grail of Voodoo2 SLI + Matrox Millennium II, followed by numerous dual-card/dual-GPU boards, including such obscure setups as the Rage Fury Maxx (almost unarguably the biggest piece of <preemptively redacted> either Nvidia or ATi/AMD has ever released), and ending most recently with a 7970 CrossFire setup, while currently using an Intel X99 system, I disagree with pretty much everything you have said. I will never use a multi-GPU setup again unless there are fundamental changes to how the technology performs. Friends don't let friends do multi-GPU setups.
 

InvalidError

Titan
Moderator

Tell that to Linus and his quad-Quadro 16K setup :)
 

atomicWAR

Glorious
Ambassador
@kinggremlin

You're completely entitled to your opinion on SLI, but that doesn't invalidate the users who do find it useful. It is not cheap, and it certainly is not for everyone, but for those who want the highest settings/resolutions in some of the most demanding games at a frame rate of 60 Hz/FPS or better, there can be no way around using it. I am not saying I wouldn't like to see the tech change for the better. I think MCM for GPUs using an interposer is the likely direction multi-GPU will go in the long term (say 3-4 years from now for mainstream; high end may well be sooner). This may kill off SLI/CF or, conversely, change how they work altogether; I am hoping for the latter. As long as GPU makers see a market, they will find a way to make it easier for software devs to use more GPUs. I think MCM will make the multiple GPUs "invisible" to the code they are running; seeing one GPU when there are in fact 2 or more, with excellent scaling, will help the high-end user a lot. Nvidia is claiming near-perfect scaling now in their R&D efforts. I know AMD is hopeful with Navi as well. Until then we are stuck with the tech on hand.

@TJhooker

I agree PhysX is fairly niche, but it does exist at the highest end of gaming and, sadly, in the uninformed lower/mid range, where folks are actually slowing themselves down because their PhysX card is too slow or the resolution/game settings are too low to really need it. As for other cards you could be adding: some of my rigs have eSATA cards (x4-x8), and I also prefer/need to use PCIe add-in cards for M.2 drives for better cooling (angel wings); shoving M.2 drives in tight, hot places just doesn't work for me. You also get users who use capture cards, video editing cards, etc. So the use case is still there even if you take SLI out of the equation.
 