AMD Ryzen Threadripper & X399 MegaThread! FAQ & Resources



At stock, the 1950X with its MCM package and 6 more cores consumes only 20 watts more! Overclocked, 45 watts more! Bwahahahaha! This shows the 7900X is getting stomped and you think it's a win? You have to take off those beer goggles you are wearing! They are distorting your ability to use logic and reason!
 


You're losing sight of the whole picture, Juan.

I don't expect TR to be a record-sales thing nor turn the tide; same for Ryzen, TBH. Intel fanbois are plentiful everywhere, including Germany, FYI. For heavy-duty tasks AMD has reached parity with Intel, but it's not *better* in all metrics, so you will always argue "but this! and this other important thing! not better, hence worse!", and it's an annoying circle-jerk you're creating.

What the numbers we're citing indicate is a simple thing: AMD has a competitive CPU. If you really want to understand that, go back and look at how the Bulldozer and Piledriver numbers looked in the first half-year of each. This is nothing like that time. And Steam, obviously, won't account for things outside the gaming market. More importantly, though, it does account for laptops and desktop PCs. I'm still awaiting AMD laptops, especially in the cheaper arena with their APUs, to start displacing some of Intel's dominance there. If you really want to play the "market share" figures game, wait until then for a fun time watching how the graphs *will* fluctuate... Well, if the APUs don't suck, obviously. I'm cautiously optimistic about them, though.

Cheers!
 


You have said this numerous times, in exactly the same words. Your wording is evasive. There is no exchange of technologies within most cross-licensing agreements. They cover separately developed technology and bear no resemblance to an agreement like AMD licensing Intel its Radeon iGPU tech, which would be specific, limited and heavily walled off technologically so advantages couldn't be transplanted to Iris Pro, especially VCA. I cannot waste more time on this, so I will direct you to the example we used and show you how, for all your supposed knowledge, you got caught in a rather obvious trap.

The MMX cross-licensing agreement did not give AMD the right to use Intel's MMX technology in their processors or any others... it existed purely because AMD developed a way to add matrix math instructions within their arch, then realised that Intel had trademarked the initials of Matrix Math Extensions, so they could say neither that they supported MMX nor that they supported Matrix Math Extensions. It's illegal to use abbreviations as trademarks for this reason. AMD sued, and the 'cross-license agreement' was a compromise: it allowed Intel to continue using a technically illegal abbreviation as a trademark without further challenge from AMD, while letting AMD say it supported MMX... sharing the name between them and still forcing others to litigate to claim MMX support without infringing on a trademark. It was purely marketing/legal/business based, and both AMD and Intel had already separately completed development. They didn't share a bit of tech at any stage, so you're full of proverbial.



I have 3 prototype Apple computers running on developer platforms. I can disclose this to you because I have a license, but if I kept my mouth shut I could have any prototype here I wanted without breaking any laws, and hearsay would be hearsay.




New to me, but I could never see the point of Phi on a PCI-E bus. I don't VM. For everyone else it's like trying to sell 50 P2 MMX cards on a single x16 slot. Now that's somewhere that could do with a talk about latency!



At 4.6GHz, Techspot quoted the Fully Integrated Voltage Regulator on the CPU as reporting 400W power draw. Hold that thought.



BIOS updates brought power draw down... and lower performance with it. That power efficiency came from somewhere. Further fuel for the flames of 'Skylake-X is a hack'. The BIOS and motherboard issues were likely down to the fact that the 7900X wasn't announced with the original lineup... because the original lineup still wasn't competitive.

Buying a VRM fan was what he said to do. People haven't needed to do that since the Pentium 4 EE... except under LN2 cooling. That's a bad sign, and not a mistake I can see so many vendors making at once, only on lower-end products, with no similar issues on competing platforms with similar TDPs, on boards they had been shipping for months before.



Then you surprise me. As someone who so keenly talked up the AMD thread, it would seem obvious to me that you of all people should know that Intel has a huge market-share lead, a 10-year performance lead and millions of faithful customers. It would take more than one product range to sway them from buying blue when they have for a decade. Must I mention the Pentium 4 again? We are both aware it sold quite well.
 


[Hardware Unboxed review benchmark graphs]


There is no limit to what you will do to try and skew some hypothetical win! People are not making negative reviews because they are getting paid! History tells us that is a dirty, underhanded Intel tactic that is well documented!

Edit: Since when does Intel offer discounts on chips that are selling well??? Bwahahaha, you make me laugh! They increase prices on chips that sell well! They are lowering the price to try and be competitive with the superior-selling 1950X!!!!!!!!
 


None of that invalidates the fact that the people who complained because the i9 was consuming ~400W are now silent when TR consumes more power. The double standard is evident.

Not all cores are the same. AMD uses simpler, narrower cores that consume less power each. No mystery here, simply a consequence of the laws of physics. That is the reason why Atom cores are simpler than Xeon cores.

Not all process nodes are the same. 14LPP prioritizes efficiency over performance. That is why the 1950X hits a wall at 3.9GHz, whereas the i9 can hit above 4.5GHz.

Not all overclocks are the same. A chip doesn't consume the same power at 3.9GHz as at 4.6GHz. Note that increasing clocks from 3.4GHz to 3.9GHz on the 1950X increased power consumption by more than 100W. Increasing clocks from 3.3GHz to 4.6GHz on the 7900X increased power consumption by only 76W.

Pushing the 1950X to 4.6GHz would increase power consumption to about 700W. The i9 does ~500W at the same clocks.
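To put rough numbers on that scaling argument: dynamic power grows roughly with frequency times voltage squared, so the last GHz is disproportionately expensive. A minimal back-of-the-envelope sketch in Python, with hypothetical voltages (the real V/f curves aren't public):

[code]
# Back-of-the-envelope dynamic power scaling: P_dyn ~ C * V^2 * f.
# The voltages below are hypothetical placeholders, not measured values.

def scale_power(p_ref, f_ref, v_ref, f, v):
    """Scale a reference dynamic power to a new frequency/voltage point."""
    return p_ref * (f / f_ref) * (v / v_ref) ** 2

# Hypothetical example: ~300 W of dynamic draw at 3.4 GHz / 1.20 V,
# pushed to 3.9 GHz at an assumed 1.35 V:
print(round(scale_power(300.0, 3.4, 1.20, 3.9, 1.35)))  # ~436 W
# A ~0.5 GHz bump can add 100+ W once the voltage has to rise with it.
[/code]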
 

You are just trying to be an Intel apologist! The fact is that for 20 more watts you get 6 more cores with ThreadRipper! That's an amazing performance metric in itself! It doesn't take a genius to see you're just griefing over Intel!
 


No one here has denied that Zen is more competitive than Bulldozer or Piledriver, because it is more competitive. The point was a different one...

The point is that we have been reading enough hype and bold claims! Before launch, the hype train pretended Zen was going to be the new K8 and take over. It didn't happen. Immediately after launch, reports about RyZen selling like hot cakes. It didn't happen, and AMD soon had to reduce the pricing of the chips. The 1700X launch price was $399. I can find it today for less than $300. That is a 25% discount... and it is not enough. I expect further discounts in the coming months.

A bit later came reports about RyZen giving AMD the greatest market-share increase in history. It was fake. Even today RyZen brings AMD only a 4% higher market share.

Now it is sales in a well-known pro-AMD country, where AMD has invested millions and millions.

Aren't three years enough of continuous hype and fake data?
 


The Phi moved to a standalone CPU with KNL. The PCIe bottleneck has been eliminated.

The PCIe card versions of the Phi are only for legacy users.
 


The launch price was $999. The current price reflects only a 4% discount.

The chips that are really being discounted are those from AMD. For instance, the 1700X launched at $399; you can find it today for $100 less. That is a 25% discount. Expect further discounts in the coming months.
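Those percentages are just the plain discount-off-launch computation. A trivial sketch; note the ~$959 current i9 price below is an assumption consistent with the 4% figure above, not a quoted number:

[code]
def discount_pct(launch, current):
    """Percentage discount from launch price to current price."""
    return 100.0 * (launch - current) / launch

print(round(discount_pct(399, 299)))  # 1700X: ~25% off its $399 launch price
print(round(discount_pct(999, 959)))  # i9: ~4% off $999 (current price assumed ~$959)
[/code]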

 


I explained in the spoiler the flaws with that statement.
 


 


Your statements didn't change anything.
 


The review you posted reports times of 200+ seconds for both the 1950X and the 7900X.

Official V-Ray submissions on 09/03/17:
1950X range: 00:40.608 to 00:48.409
7900X range: 00:52.392 to 01:03.037

The review I posted is in line with this data; yours is off by a factor of ~4x. That is certainly not the official V-Ray benchmark. I know someone who usually claims whole reviews are wrong because of small (<10%) discrepancies in a single benchmark.
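To make that factor concrete, here is a minimal sketch that converts V-Ray-style MM:SS.mmm times to seconds and compares the official 1950X range against the review's 200+ second figure (the helper function is mine, not part of any benchmark tool):

[code]
def vray_seconds(t):
    """Convert a V-Ray-style 'MM:SS.mmm' time string to seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + float(seconds)

review_time = 200.0  # the 200+ s figure from the disputed review
for t in ("00:40.608", "00:48.409"):  # official 1950X range quoted above
    print(f"{review_time / vray_seconds(t):.1f}x")  # ~4.9x and ~4.1x
[/code]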
 


The thoughts hidden in the spoiler explain why statements like your "for 20 more watts you get 6 more cores" are misguided. I reproduce them here:

Not all cores are the same. AMD uses simpler, narrower cores that consume less power each. No mystery here, simply a consequence of the laws of physics. That is the reason why Atom cores are simpler than Xeon cores.

Not all process nodes are the same. 14LPP prioritizes efficiency over performance. That is why the 1950X hits a wall at 3.9GHz, whereas the i9 can hit above 4.5GHz.

Not all overclocks are the same. A chip doesn't consume the same power at 3.9GHz as at 4.6GHz. Note that increasing clocks from 3.4GHz to 3.9GHz on the 1950X increased power consumption by more than 100W. Increasing clocks from 3.3GHz to 4.6GHz on the 7900X increased power consumption by only 76W.

Pushing the 1950X to 4.6GHz would increase power consumption to about 700W. The i9 does ~500W at the same clocks.

Of course, none of that changes one bit the main fact that the people accusing the i9 of consuming huge amounts of power remained silent when TR consumed still more power.
 


No one said anything about anything 'official'.

Also, a discrepancy between reviews that measure exactly the same thing is one matter; reviews that measure different scenes on a rendering engine are another. In the first case we have measurement errors. In the second case we have variation due to different workloads.

The scene used in the HFR review is more complex and more realistic, and that is why it takes ~4x more time to render. The same happened with Blender. The toy "Zen scene" used by AMD marketing gave RyZen a 3% IPC lead over Broadwell. A more realistic scene gives Broadwell the IPC lead.

http://media.bestofmicro.com/Y/3/657003/original/06-Blender-Loop-5.png


It would be weird to pretend that the Tom's Hardware review is "wrong" because the result with the Island scene doesn't agree with the Zen scene used by AMD marketing. So stop pretending that the HFR review is wrong only because the result of the benchmark is not what you want.

My point was that the HU review graphs posted above are biased because they only reproduce cases where TR was faster, without giving a single benchmark where the i9 is faster. But then, we know HU reviews are biased. It has been demonstrated.

But I have to thank you for mentioning the V-Ray 'official' benchmark. As you know, they give a CPU list and a GPU list. The best value obtained by the TR 1950X is 00:40.608. The best values obtained by GPUs are in the 00:14.359 range, i.e. GPUs are nearly 3x faster. The TR chips shine on workloads that are better run on GPUs.

As I have tried to explain more than once, the CPU must at least shine on latency workloads and the GPU must at least shine on throughput workloads. AMD's Zen design is oriented to throughput. Zen lags in latency-sensitive workloads, but works fine in workloads where a GPU works better.
 


What rewrite? As stated, it was a problem with some mobos and with the PSU he used:

In the video, der8auer elaborates to basically claim a complete lack of consistency in the quality of VRMs and their heatsinks across manufacturers. In his first test, he takes a CPU that is known to do 5.0 GHz and, on a Gigabyte Aorus-branded mainboard, found himself unable to even hit 4.6 GHz, with dangerously high VRM temperatures. He goes on to blame the heatsinks on the VRMs, going so far as to call the Gigabyte solution more of a "heat insulation" device than a cooler, as a simple small fan over the bare VRM array did many magnitudes better than a standard install with the stock VRM cooler attached. After an MSI-branded board behaved similarly, it became clear this was not an isolated issue.

der8auer also went on to criticize the lack of voltage input, in the form of many boards having only a "single 8-pin connector", which der8auer claims is not nearly enough. He claims a cable temperature of nearly 65 degrees Celsius on the 8-pin EPS cable, which is obviously disconcerting, though TechPowerUp has been in discussions with renowned PSU tester Jon Gerow (Jonnyguru), who feels the "all-in-one" cable design on the Super Flower PSU shown in the video may be partially to blame for the heat level at that current draw. It's hard to tell which part is more at fault for that temperature and we will update as we know more. Until then, here is Jon Gerow's direct comment on the matter:

If you used the SuperFlower PSU in the video with the crystal connectors, that's part of your problem. Those "universal 9-pin connectors" have less conductors than most other modular PSUs because the same connector that's used for EPS12V, PCIe, etc. has to also support +5V and +3.3V for Molex and SATA and then there's an "LED pin" which, when grounded to a ground pin, turns on the interface's LED. A horribly bad design. This is why the wires would be so hot. I suggest checking the voltage at the PSU and then at the motherboard's EPS12V to see what the drop looks like under load. If the voltage is significantly lower than +12V, the board is going to have to pull more current than it normally would. I then suggest using that AX1500i you have on the shelf behind you and see if you end up with the same results since that modular cable for the EPS12V is four +12V pins and four grounds. -- jonny
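Jon's suggestion to check the voltage drop is plain Ohm's-law arithmetic: the same load on a sagging rail means more current, and resistive heating in the cable grows with the square of that current. A rough sketch with illustrative numbers (the 400W load, the sag to 11.4V and the ~10mΩ cable resistance are assumptions, not measurements):

[code]
# Why a sagging 12 V rail makes an EPS cable run hot: I = P / V,
# and resistive heating in the cable goes as I^2 * R.
# All figures below are illustrative assumptions, not measurements.
def cable_current_and_heat(load_w, rail_v, cable_r=0.010):
    """Current through the EPS cable and the I^2*R heat it dissipates."""
    current = load_w / rail_v
    return current, current ** 2 * cable_r

for rail_v in (12.0, 11.4):
    amps, heat = cable_current_and_heat(400.0, rail_v)
    print(f"{rail_v:4.1f} V -> {amps:.1f} A, ~{heat:.1f} W dissipated in the cable")
[/code]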

There were also problems with some launch BIOSes. All those problems are gone today.
 


You are welcome. But it's me who should thank you.
The GPU time you quote required 4x Nvidia Quadro GP100 cards @ ~$8000 each. Comparing a $1000 CPU to $32,000 worth of GPUs is not fair at all. :no:

Not to mention the difference in power draw and the fact that you still need a CPU anyway.

 


My point was that CPUs would target CPU-like workloads, whereas GPUs would target GPU-like workloads. The fact that TR is being advertised for rendering tasks says it all.

Acquisition costs without considering total costs, including work-hour salaries, mean nothing.

It is evident that the 'official' benchmark is not using all the performance that the four Quadro cards bring, because a single 1080 Ti does 16 seconds. Therefore, pretending that one needs $32,000 in GPUs to do the work of a $1000 CPU is as ridiculous as pretending that a 1080 Ti is about as fast as four Quadros.

Yes, one has to acquire a CPU with the GPU, and while the GPU does the rendering the CPU can do other work. This is the basis of the HSA approach by AMD, which recently has been forgotten by almost everyone.
 


You are wrong again: a single 1080 Ti completes V-Ray in about 1 minute or more. To complete V-Ray in 16 seconds you need 7x or 8x 1080 Ti. You were clearly misled by a typo.

1 00:16.091 Intel(R) Core(TM) i7-6900K CPU @ 3.20GHz x16, GeForce GTX 1080 Ti 11264MB x7
2 00:16.305 Intel(R) Core(TM) i7-6850K CPU @ 3.60GHz x12, GeForce GTX 1080 Ti 11264MB (missing x7 or x8)
3 00:16.333 Intel(R) Xeon(R) CPU E5-2620 v4 @ 2.10GHz x32, GeForce GTX 1080 Ti 11264MB x8
4 00:16.601 Intel(R) Xeon(R) CPU E5-2630 v4 @ 2.20GHz x40, GeForce GTX 1080 Ti 11264MB x8

Results with single 1080 Ti:
153 00:58.298 Intel(R) Xeon(R) CPU E5-2683 v3 @ 2.00GHz x56, GeForce GTX 1080 Ti 11264MB
155 00:58.706 Intel(R) Core(TM) i7-5960X CPU @ 3.00GHz x16, GeForce GTX 1080 Ti 11264MB
156 00:59.206 AMD Ryzen 7 1700X Eight-Core Processor x16, GeForce GTX 1080 Ti 11264MB
157 01:00.021 AMD Ryzen 7 1700 Eight-Core Processor x16, GeForce GTX 1080 Ti 11264MB
158 01:00.315 Intel(R) Core(TM) i7-7700K CPU @ 4.20GHz x8, GeForce GTX 1080 Ti 11264MB
159 01:01.771 Intel(R) Core(TM) i7-5930K CPU @ 3.50GHz x12, GeForce GTX 1080 Ti 11264MB
160 01:02.236 Intel(R) Core(TM) i7-2600K CPU @ 3.40GHz x8, GeForce GTX 1080 Ti 11264MB
...
209 01:18.122 Intel(R) Core(TM) i7-6950X CPU @ 3.00GHz x20, GeForce GTX 1080 Ti 11264MB
210 01:18.818 Intel(R) Xeon(R) CPU E5-2696 v3 @ 2.30GHz x36, GeForce GTX 1080 Ti 11264MB
211 01:18.984 Intel(R) Core(TM) i7-4770K CPU @ 3.50GHz x8, GeForce GTX 1080 Ti 11264MB
213 01:21.472 Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz x32, GeForce GTX 1080 Ti 11264MB

Have a good weekend.
 
Juanrga, you claim there are many biased, paid reviews made to make Intel's Skylake-X look bad compared to AMD's ThreadRipper, but you are the only one, and I mean the only one, saying this is happening! You are scouring the internet finding any scrap of contradiction to help your point, no matter how invalid, to manufacture a hypothetical win in your own mind! No one else believes it! At some point rational people re-evaluate what they are saying in the presence of overwhelming evidence to the contrary! No matter how much you wish the sky were not blue, it won't change the fact that for the overwhelming majority the sky is blue!
 


Do not conflate the idea that x86 and x86-64 are cross-licensed with the idea that AMD or Intel could use their competitor's patents for various core technologies or memory technologies.

The only thing included in the cross-licensing deal is the ISA. Nothing else.
 


Not all cores are the same. AMD uses simpler, narrower cores that consume less power each. No mystery here, simply a consequence of the laws of physics.
True, and this only emphasizes my point about ThreadRipper's superior performance vs. power consumption compared to the 7900X, a design made for professional content creators!

Not all process nodes are the same. 14LPP prioritizes efficiency over performance. That is why the 1950X hits a wall at 3.9GHz, whereas the i9 can hit above 4.5GHz.
You admit the process that ThreadRipper is built on offers superior performance vs. power consumption compared to the 7900X for professional content creators!

Pushing the 1950X to 4.6GHz would increase power consumption to about 700W. The i9 does ~500W at the same clocks.
This is a hypothetical that does not exist, and it is the basis of your argument! I refer back to my statement: what you said didn't change anything! You are just arguing to argue, and you offer proof that you already know the truth! You are so biased that even when you present the truth to yourself you cannot find it!

Confirmation bias is the tendency people have to favor facts or arguments that confirm the beliefs and positions they already hold. The extreme form of this bias is referred to as "belief perseverance": holding onto beliefs even after they have been proven false.
 


You are deflecting from history by trying to focus on one part of it that added to the embarrassment! The history is that the entire Skylake-X product line made no sense and was an obvious cash grab by Intel! All the popular YouTube reviews expressed this! You use a German overclocking enthusiast's statements (Germany being, in your own telling, a pro-AMD country) to try to change the narrative about Skylake-X! The numbers speak for themselves! Look at the difference in views, likes, and dislikes between the most favorable YouTube commentary about Skylake-X and the most detrimental: Linus with 2,487,668 views, 117K likes, 1K dislikes vs. der8auer with 39,781 views, 651 likes, 116 dislikes. This is the history that you are dismissing or failing to recognize! And it's amusing to me that the reference you use is a German, from a country you think is predominantly pro-AMD, because for the first time in 15 years AMD outsold Intel there! Bwahahaha! The ridiculousness of your contradictory statements continues to be a great source of amusement for me! I suggest you watch the Linus video!
Linus
I have been conspicuously silent about Intel's new high-end desktop X299 platform, and the CPUs to go along with it. And this was on purpose, but probably not for the reasons that you think. I've spent the last few days walking the show floor, chatting with industry folks, and generally trying to wrap my head around what the actual hell it is Intel is trying to do with this launch.

[video="https://www.youtube.com/watch?v=TWFzWRoVNnE&feature=youtu.be&ab_channel=LinusTechTips"][/video]
I have some things to say - Core i9 & X299
2,487,668 views, 117K likes, 1K dislikes


der8auer
https://www.youtube.com/watch?v=kpoies2JcmI&feature=youtu.be&ab_channel=der8auer
39,781 views, 651 likes, 116 dislikes
 


My claim was "many biased reviews"; I didn't write "paid". And I demonstrated the sources of the bias affecting SKL-X:

- Using engineering samples instead of retail chips.
- Using mobos incompatible with the chip and burning the chip.
- Using launch BIOSes with problems and not retesting later with improved BIOSes.
- Cherry-picking benchmarks to guarantee that the chips don't win any benchmark.
- Measuring power on a 512-bit workload but measuring performance on a 128-bit workload, ruining the efficiency numbers.
- Crippling the CPU performance with GPU-bound and frame-limiting settings.
- Comparing SKL-X chips at stock settings to overclocked RyZen and TR chips, but labeling all chips in the graphs as stock.
- Inflating prices of SKL-X chips on performance/price graphs.

And that is accompanied by bad press and fake news. For instance, the fake news about AMD getting a 10% increase in market share, or the articles about the Skylake bug that forgot to mention the bug doesn't affect the SKL-X line, while the same media didn't publish anything about the RyZen bug.

As der8auer wrote after reviewing the i9-7900X: "so what's all this negative press about? I don't really understand it."
 