AMD Ryzen 7 1800X CPU Review

Status
Not open for further replies.
Wow, what a roller coaster... it always gets pretty good by page 4, after everyone you want to hear from has said their piece on these types of articles.

AMD has an amazing processor here, but in usual AMD fashion (maybe it's just part of being the underdog) it cannot come running out of the gate at full steam. I think we're all used to that by now. AMD releases something revolutionary, for better or worse, and then the rest of the PC world has to take stock of it and make adjustments. It has all the signs of being amazing, provided you are a well-informed buyer who needs what it can do.

I'll be waiting for their APU line.
 
Hi,
was anyone able to confirm the ECC memory support? In what setup - Ryzen + what motherboard/memory?
Thanks
 
It gives the real geeks a chance to see how the new processor's architecture stands up against a previous generation. And it gives overclockers a chance to estimate their performance at their targeted maximum fixed clock before they buy.

Right, because we removed that around a year ago. And I'm guessing you're going to say that we used a crystal ball to determine that this would be the BEST benchmark to remove...to get our MAXIMUM payout from Intel...right?

Wait, are you saying Tom's Hardware gimped its X99 system to help AMD look better? Because quad-channel is a feature of the X99 platform, and to disable that feature would be to cheat on AMD's behalf. Also, I think the conspiracy theorist above would like to have a word with you 😀
 


That is exactly why the questions are provocative, to say the least, but they are fair.
They will compete with Intel's ~$300 parts (for gaming, with the 1600X); that much is clear by now to 90% of people, judging by the opinions and facts written and reflected here by all of us. It's clear that the 1800X, and Ryzen 2 after it, will do well for games and VR within a year at least (and Ryzen 2 isn't out for another 4 years anyway). But we ALL agree, no discussion there, that the R7 parts aren't meant for that: they're top chips for pros, makers, etc., and will be great for gamers in the future, but there's no need to buy one now. Anyway, someone pointed out that games will never use as many cores as the 1800X has, so for everyone who wants to spend less than Intel charges (and Intel will have to drop its pants again), wait for the 1600X. For US people it will be under $300, so it's cheaper than Intel and also more ready for VR and future games, as development slowly moves to more cores over the next 1-3 years.

Why provocative? Saying that AMD is trying to go up against the Intel i5 with an 1800X is just trying to push something that isn't true. But I guess you were questioning it, or saying it rhetorically, like pretending to mean it. (They are not doing that.)

Once the whole Ryzen armada is out, they will be able to play the price game too (it's possible prices of the various R7/R5/R3 parts will go up and down, depending on how we respond, how Intel responds, how gamers and devs respond, and how the hardware market responds... we are FINALLY back to COMPLEXITY :)) ). For the moment, I remind you that commercially they are a lame duck: they have this good stuff out, but they also still have previous-gen stuff out. So do your own math and reflection. You all understand that Intel's days are numbered: in June Ryzen 5 will roll out, along with Vega 11 and 10, and R3 at the end of the year, so the fight (or anyway the things we need to know before we can really speak) goes far beyond what we have at the moment.
Aren't you positive, I mean optimistic or happy, about what is happening and these first results? How can we not be?
 


No. Disabling a feature is known as imparting bias. Think about this: You want to have a race between a marathon runner and a sled dog. So, you untie the dog's sled to make it fair. Or you tie a sled to the marathon runner to make it fair. Or you test it both ways: "Gaming" and "Professional Workloads". Done.

But the dog has FOUR legs and the person only has TWO legs. Because four legs is a dog feature, to tie its front legs to its torso would be to impart bias.
 


This brings to mind a question I had a few weeks ago, or rather something somebody told me:
Is it true, as I've heard, that using SLI or CFX is basically Russian roulette? It's a big risk. Many people apparently know which games they want to play and accept it, or buy video cards accordingly. I AM NOT LIKE THAT: I buy one card and then it needs to work great for everything.

So, as I suspected, is running two video cards still a gamble today, and will it maybe still be in 2018, when Radeon's huge Navi tech comes out (and we supposedly won't need two or more video cards in a row anymore, lol)? Or is that just storytelling?

Because somebody told me that using two cards (GPUs, I mean) really means running a compatibility risk with each game (I'm not talking about choosing the right GPUs that work well together; that's not the issue. I mean the results). So would you suggest having one good GPU instead of two (no matter how good), because of this problem that has existed for years, especially since 2011, and still today?
 
Let me challenge your best go-kart against mine. My go-kart has a V8 and yours a straight-8. Is it fair? Same with "best Intel" vs. AMD: a different-class comparison.
 
Another consideration is upgrade path. I'm buying the 1700 with the expectation that Zen 2 will work on my motherboard. I can upgrade as prices come down and have a nice path for the next 4 years (as AMD stated). Intel, by contrast, moves to a new socket or chipset every couple of years.
 

If AMD ends up identifying IO bottlenecks, gets many complaints about the IO complement or anything else that may require tweaking the socket interface, Ryzen+/2 could end up requiring an AM4+ motherboard to enable its new features.
 


I thought personal attacks and insults were not allowed here, and yet you can't control yourself as a moderator; you insult people and tell them to take their medication.

Shame on you. You are a moderator. CONTROL YOURSELF.

Once you become a mod, you are not allowed to speak like this, because no one can hammer you; it is called "abuse of power"...

Even if they talk badly, you just punish them; you don't get into a fight. I am a moderator on other sites and I know this very well.
 


Take it easy, girl. That's just a comment (and a CPU).
 
I see a CPU that at 1440p on a GTX 1080 runs the 1080 at 100%, exactly the same as the 7700K, basically nullifying the clock difference. So if you're on a higher resolution, you're effectively getting the same gaming performance within a couple of frames, but you're also getting all the benefits of the 16 threads of a Ryzen 1700 (better streaming, compression-type stuff, etc.) instead of the 8 threads of a 7700K, for a slightly lower price.
I honestly don't see why it's getting all the hate. No one expected it to beat a 7700K in single-thread performance. It's still not a bottleneck for the highest-end GPUs available at 1440p and above. Seems like a pretty good chip to me.
I think the 1800X and 1700X are a waste of money. XFR seems to be a dud feature (restricted to 100MHz), and everyone seems to be able to clock their 1700s to at least 3.9GHz on all cores anyway.
I'm going to upgrade my PC shortly, and I'm going to get a 1440p ultrawide monitor to go with it (lots of people are probably looking at 4K). I get a great CPU with all the benefits of Intel's expensive CPUs, one that can max out top-end GPUs, and it costs less than a 7700K. What's not to love?
 
Sure, the 1800X beats the 6800K hands down. But people seem to forget two glaring things: heat and available PCI-E lanes. My 6800K @ 4GHz runs at a VCore of 1.168V, whereas at 4GHz the 1800X's VCore is 1.425V. Also, with the 6800K I have the graphics card using 16 lanes while still being able to use 3 NVMe x3 drives and 10 SATA drives at the same time. With the 1800X, only 1 NVMe x3 drive and 6 SATA drives are available. Current prices in Australia: the 6800K is A$605, the 1800X is A$699.
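The lane-budget point above reduces to simple arithmetic. Here's a minimal sketch, assuming 28 CPU lanes on the 6800K and 24 on AM4 Ryzen (4 of which feed the chipset), with each NVMe drive on a hypothetical x4 link; SATA ports hang off the chipset and are ignored here:

```python
# Sketch of the PCIe lane-budget argument from the comment above.
# The lane totals and the x4-per-NVMe link width are assumptions
# for illustration, not board-verified figures.

def lanes_left(total_lanes: int, gpu_lanes: int,
               nvme_drives: int, lanes_per_nvme: int = 4) -> int:
    """CPU lanes remaining after the GPU and NVMe drives are allocated."""
    return total_lanes - gpu_lanes - nvme_drives * lanes_per_nvme

# 6800K (Broadwell-E): 28 CPU lanes -> an x16 GPU plus three x4 NVMe
# drives fit exactly, with nothing to spare.
print(lanes_left(28, 16, 3))   # 0

# 1800X (AM4): 24 lanes minus 4 for the chipset link leaves 20 usable,
# so an x16 GPU plus a single x4 NVMe drive exhausts the budget.
print(lanes_left(20, 16, 1))   # 0
```

That exhausted budget is why extra NVMe or SATA devices on AM4 have to share the chipset's downstream bandwidth rather than getting dedicated CPU lanes.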
 
"Instead of having a base clock of 3.3GHz and turbo frequency (single core boost) of 3.7GHz under load, AMD confirmed that the Ryzen 5 1600X will actually have the same boost frequency as the Ryzen 7 1800X at 4GHz – not a lowly 3.7GHz as some rumors have suggested. The base clock is higher too, at 3.6GHz compared to the earlier-reported 3.3GHz."
Well, that's what I was after personally; that's the chip the majority of us should be excited about. It's going to pretty much smoke the locked i5s if they get the price point right.


Another quote - direct from AMD
'To its credit, AMD isn’t trying to spin its deficit. “We’re still 15 percent behind on Kaby Lake. We own this stuff, all right? We’re still behind on single-threaded performance,” said Kevin Lensing, corporate vice president and general manager of AMD’s client business unit. “What I’d say about this, though, is that [Intel’s] advantage...has been almost exclusively relegated to frequency.”

Lensing declined to say what AMD has planned for future versions of its Zen architecture, but pointed to Intel’s success in eking out performance gains from its third (and soon to be fourth) generation of 14nm technology. AMD is on its first 14nm iteration. “There’s no reason we can’t do what they’re doing, and get this one-thread performance where we want it,” he said.
So no, there will be no heads rolling or loss of jobs at AMD over these "biased" reviews, because the results are exactly what AMD expected them to be.
 


I've been a mod of Tom's German site for over 10 years and can understand RealBeast really well. Moderators are also just people, and to be honest: if I had to read such biased, sodding crap again and again in my forum, combined with personal attacks and alternative truths about our site, I would have banned/punished him long ago. I was surprised the mod was more or less so restrained over the last pages. But there are moments when enough is enough... 🙁
 
For the Maya test: so you are measuring viewport performance as a CPU task? FPS of the viewport is a graphics card measurement. Shouldn't the benchmark be focused on CPU tasks like rendering with Arnold or V-Ray, or fluid simulation ("lower seconds is better" type)? As far as I can see, the other software used is also a bit older. My point is that these are not real-world benchmarks.
 
This part is in the OpenGL section and, like all the DX benchmarks, is there simply to show how the CPU limits the VGA output (or doesn't). What you see in the results is the same problem as in games. For combined results (viewport/rendering) I tested SolidWorks, Creo, and the Blender loop (also with rendering and real-time animation), plus standalone render benchmarks. For less than one day of testing, I think that's a big selection of different things. If you look at other sites that had over a week for tests, it really is a lot.

 
Here is one video with an actual side-by-side comparison of fps:
https://www.youtube.com/watch?v=nsDjx-tW_WQ

You can find tons like it on YouTube; some, like this one, have the necessary details to identify what is going on.

- the video shows an fps disparity of about 10-15%; I think I saw 20% once or twice
- the video explains that graphics are turned down as much as possible to create a CPU bottleneck
- the difference in fps is explained by:
-- different clocks (the 1700 is at 3.9GHz in this video, the 7700K at 5.0GHz; the reviewer is completely honest about this)
-- optimization: the 7700K is at about double the load of Ryzen overall, core for core, showing how much headroom each of these CPUs has that the game isn't using. In some titles this means the 7700K is at 80-90% load while Ryzen sits at 40-60%. That's ~50% unused compute that can be leveraged on Ryzen in future titles but simply isn't there for the 7700K.
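The headroom claim above is just percentage arithmetic. A quick sketch using the midpoints of the load ranges quoted in the video (picking the midpoints is my assumption):

```python
# Back-of-envelope "unused compute" estimate from the CPU loads quoted
# above. The midpoint load values are illustrative assumptions.

def headroom(load_pct: float) -> float:
    """Fraction of total CPU throughput the game leaves idle."""
    return 1.0 - load_pct / 100.0

ryzen_1700_load = 50.0   # midpoint of the 40-60% range
i7_7700k_load = 85.0     # midpoint of the 80-90% range

print(f"Ryzen 1700 headroom: {headroom(ryzen_1700_load):.0%}")   # 50%
print(f"7700K headroom:      {headroom(i7_7700k_load):.0%}")     # 15%
```

In other words, if future game engines ever spread work across more threads, the Ryzen chip has roughly three times the idle capacity left to tap.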


so is the 7700k the better gaming cpu for today? definitely. anyone telling you differently is kidding themselves.

will the situation shift in the next months and ryzen catch up with microcode updates, better ram compatibility and driver/game patches? yes, just like it did with every cpu/gpu-release ever.

will ryzen overtake the 7700k in any current game where it is behind right now? no, not unless the game engine is updated at the base level, which would be a major revamp, and i cannot see any company investing in that kind of project, save maybe multi-year mmo titles. today's games in many cases are not optimized for more than 4 threads, so the 7700k with its 8 is pushing it; ryzen's 16 are just not in the picture (yet).

so the bottom line for me is:
the ryzen launch went pretty much as expected. it is ahead in the areas they showed us, it is behind in gaming against cpus with higher clock speed and has a disadvantage in terms of optimization.

and my personal takeaway is:
for anyone doing productivity, ryzen is a no-brainer. there is no better value proposition for multithreaded workloads.

for a hardcore gamer, however, who does not stream or do any other thread/load-intensive tasks and who updates his system every 12-18 months, there is no scenario where ryzen makes sense. by the time the 16 threads become useful, this customer base will already have upgraded to next year's best and greatest.

for a gamer who does stream, record, or run other multitasking jobs at the same time, ryzen again is the path forward in terms of $/performance.

finally, for the enthusiast gamer who plays a lot but does not upgrade every year for the next 5% micro-evolution in performance, ryzen is also a logical upgrade path that will not become obsolete quite as fast as i think 4c/8t chips will, with the advent of dx12 and the focus on multithreading in games.


finally, i would like to see a revisited benchmark in 3-6 months when the bios/microcode/drivers have stabilized, to see where this leaves us in actual real world performance once the initial quirks have been ironed out.
 

XFR isn't restricted to 100MHz but it does require considerably more cooling to convince it to go beyond that, which does make it seem next to pointless in most cases.
 


If you actually want to see a fair review of the Ryzen with obviously no Intel input, check out hardwarecanucks.com. They even did a YouTube video of their review. They're definitely not using benchmarks cherry-picked by Intel.

 