AMD Vega MegaThread! FAQ and Resources

Radeon Vega: Frontier Edition Review | AMD's Enigma
Gamers Nexus
Published on Jul 1, 2017
We review the new AMD Vega: Frontier Edition (FE) video card in thermals, power, noise, Blender, Maya, 3DS Max, and gaming.
https://www.youtube.com/watch?v=ZHb7fC5zUdU

AMD Vega: Frontier Edition Review & In-Depth Benchmark
By Steve Burke Published July 02, 2017 at 12:58 am

[Images: Vega FE PCB, teardown, and cooler]

[Charts: workstation benchmarks in 3ds Max, Maya, Creo, and Catia; Blender render times vs. Titan Xp]

Gaming
[Charts: Firestrike Ultra/Extreme/Normal, TimeSpy, Doom (4K), Ghost Recon Wildlands (4K/1440p/1080p), For Honor (4K/1440p/1080p), Sniper (4K), Ashes of the Singularity (4K)]

Power and Thermals
[Chart: system power draw during SPECviewperf vs. Titan Xp]


That's a lot of tests; more power and noise results can be found on the website:
http://www.gamersnexus.net/hwreviews/2973-amd-vega-frontier-edition-reviewed-too-soon-to-call
 
680 W peak system power usage! Are they crazy? I can cook a steak with 400 W. I don't need an extra heater in the room; we already hit 34 degrees Celsius on summer days. Add Vega and I'd be baking at 40+ 🙁
 


You're looking at the wrong line; it's 425 W (the pink line on the right axis). The red and green lines on that graph are temperature (the left axis).
 


My bad, you're right. But still, even 425 W system power is a no-go in my climate. I hope RX Vega or Volta comes out with much better thermal performance; until that day I'll stick with midrange Nvidia GPUs.
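To put that in perspective, nearly all of a PC's electrical draw ends up as heat in the room. A rough back-of-the-envelope sketch, using only the 425 W figure read off that chart and a made-up session length:

# Back-of-the-envelope heat estimate: nearly all electrical power a PC
# draws is released into the room as heat.
system_power_w = 425      # peak system draw read off the chart (W)
session_hours = 3         # hypothetical gaming session length

heat_kwh = system_power_w * session_hours / 1000
print(f"Heat dumped into the room: {heat_kwh:.2f} kWh")
# About 1.3 kWh over 3 hours, i.e. like running a ~425 W space heater
# the whole time. No wonder the room gets warm in summer.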
 
Isn't this last bit rather positive, then? A gaming Vega with optimized drivers could punch just above the GTX 1080.

The high TDP is a turn-off for miners (especially if/when cryptocurrency prices go down) and a number of gamers. Power gamers would eventually get cards in their hands.

If the price is right there's going to be a bunch of happy people. But I'll stick with my GTX 1070 thank you.
 


AMD has a history of designing hardware with raw horsepower that software and drivers don't fully make use of for a while. Over time their GPU drivers generally mature and utilize the rest of that power. Pro cards have completely different optimizations than consumer cards for memory allocation block sizes and so forth, so wait and see.

Absolute worst case scenario is that it winds up between the 1080 and 1080 ti, both of which already had $100 knocked off their list prices. I don't believe that's a coincidence. Best case scenario it's faster than the 1080 ti although I doubt it can put much further price pressure on them given the reported costs of HBM2.

Sort of related to the paper specs vs. actual performance: I see a lot of people pointing out that CPU usage on Ryzen isn't maxed in games, while Intel CPUs sit at 100%. The funny thing is that if you look at the utilization between stock and OC versions of the same chip, the relative usage doesn't change even as FPS goes up, which seems to imply that Ryzen, too, can't make full use of its resources right now. And on the Intel side it's also weird, given that CPU utilization always reports 100% in games like BF1, even though the same CPU gets different FPS depending on which GPU it's paired with.
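One way to see whether a single thread is the real limiter (rather than trusting the aggregate "CPU usage" number) is to log per-core utilization while the game runs. A minimal sketch, assuming Python with the psutil package installed, which is not something any of the reviews here used:

import psutil

# Sample per-core CPU utilization once per second for 60 seconds.
# An aggregate figure well under 100% can still hide one core pegged
# at 100%, which is what actually caps FPS in many games.
for _ in range(60):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    average = sum(per_core) / len(per_core)
    busiest = max(per_core)
    print(f"avg {average:5.1f}%  busiest core {busiest:5.1f}%  all: {per_core}")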
 


I think we are moving past the sweet spot being 4c/4t, and it's been a while since Intel i5s were recommended over i7s for gaming. While an i5 might be at max usage, where do i7s sit? I really think it's sad that Intel is still making 2c/2t CPUs; they need to be cut from the product list. IMO, the low-end CPUs should be at least 2c/4t, but then what would Intel do with all the defective CPUs that only have 2 cores working...

What you are describing is a CPU bottleneck, but the difference in FPS shows how much of the slack the GPU is able to handle. DX12 is supposed to let the GPU do more of the work, so those who are CPU-bottlenecked will get some relief.

I know there is no, and will be no, affinity between CPUs and GPUs, but Vega and Ryzen should get Intel back in gear, and I strongly believe Vega will give Nvidia a run for their money.
 
I agree with Martell1977.

In computer science, a computer is CPU-bound (or compute-bound) when the time for it to complete a task is determined principally by the speed of the central processor: processor utilization is high, perhaps at 100% usage for many seconds or minutes.
https://en.wikipedia.org/wiki/CPU-bound

Ryzen, typically having more cores, will not hit 100% usage in the same scenarios because, like you said, it has more resources that aren't being used. This helps when you run more programs while gaming, like streaming; Ryzen was designed to do this. In Linus's video he shows that the 1800X is better than the 6900K at this task. Also, just to show how well the 1600X fares against an i5, I've added another video by Blunty.

Ryzen is THE BEST CPU for Game Streaming? - $h!t Manufacturers Say Ep. 2
Linus Tech Tips
Published on Apr 6, 2017
Is Ryzen REALLY the best consumer CPU option for video encoding and game streaming? Let's find out!
https://www.youtube.com/watch?v=jludqTnPpnU

Best CPU for Streaming? - RYZEN 1600x vs INTEL i5 - Streaming Shoot-Out
Blunty
Published on Apr 10, 2017
AMD Ryzen 5 is HERE! Today! I’m testing the Ryzen 1600x & Ryzen 1500x, With Gameplay, Overclocks, Streaming etc etc… This vid; is the AMD Ryzen 1600x good for Streaming? 1600x vs i5 for Twitch Streaming.
https://www.youtube.com/watch?v=P8bRqdFGCf0
 
Most games do have a thread or two that dictate the max FPS, so the strength of those one or two threads matters for benchmarking and for those who want to run very high frame rates. What I think many tests overlook is that 6-8 core CPUs tend to make gameplay very smooth or fluid: with extra cores going underutilized there is extra headroom, and you are not CPU-bound. I haven't seen anyone quantify this terribly well (the 1% lows do help), but I think there is more to the picture than the frame draws. I can certainly attest to the smoothness of gameplay using a Ryzen 1800X; my FPS isn't much higher than with my prior CPU, but the experience is much smoother.
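For what it's worth, the 1% lows and frame-time consistency mentioned above are easy to compute yourself from a frame-time log (PresentMon or FRAPS style CSV). A minimal sketch with made-up frame times, just to show the math:

# Compute average FPS, 1% low FPS, and frame-time spread from a list of
# frame times in milliseconds. The sample below is illustrative; in
# practice you would read the values from a PresentMon/FRAPS-style log.
frame_times_ms = [16.7] * 950 + [20.0] * 40 + [45.0] * 10

frames = sorted(frame_times_ms)                    # ascending frame times
avg_fps = 1000 * len(frames) / sum(frames)
worst_1pct = frames[-max(1, len(frames) // 100):]  # slowest 1% of frames
low_1pct_fps = 1000 * len(worst_1pct) / sum(worst_1pct)

print(f"average FPS : {avg_fps:.1f}")
print(f"1% low FPS  : {low_1pct_fps:.1f}")
print(f"frame-time spread: {frames[0]:.1f} to {frames[-1]:.1f} ms")
# A big gap between average FPS and the 1% low is what you feel as
# stutter, even when the average number looks fine.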
 
Vega FE Pro vs. Gaming Mode – AMD's Placebo Pill
By Steve Burke Published July 07, 2017 at 1:10 am

[Charts (gaming vs. pro driver mode): SPECviewperf, Metro: Last Light at 4K and 1080p, Ashes of the Singularity]

"Finally, Ashes of the Singularity with Dx12 shows more of the same – our numbers are basically tied here. There is no significant difference between the two driver modes."

"We have more data, but it’s really pointless to put it together – it’ll save us time on charts and save you time on reading: Everything is the same. Sniper, GTA, Battlefield, Ghost Recon – all the same.

AMD has shipped a driver package with a psychological switch: Toggling from “gaming” to “pro” mode results in nothing aside from a black flicker and the obfuscation of WattMan – an already broken option – and Chill, of arguable merit to begin with. This toggle is built to make you feel like the GPU is doing something better, but it’s actually not; the performance is exactly the same between modes, and the only change is that “pro” users cannot see the buggy mess that is WattMan. Perhaps that’s the benefit."

There are more benchmarks at the link below if you want to look at them.
http://www.gamersnexus.net/guides/2979-vega-fe-pro-mode-vs-gaming-mode-whats-amd-doing

 


Correct! I'm disappointed. It's taking them a long time to release Vega as it is, and they don't have good drivers. I really hope RX Vega on July 30th is going to be better than this. Somehow I don't think it will be.
 


The 1080 Ti was released at the same price as the 1080 FE, and the 1080's MSRP dropped by $100 at the same time. They didn't do that just to be nice, or because they wanted to make less money.
 


Nvidia, in the end, still needs to be realistic about the price. Just because the 1080 has no competition from AMD doesn't mean they can simply hold its price and sell the 1080 Ti at a much higher price; for some people there is a certain limit they can commit to, and $700 is already very expensive for a graphics card alone. Reducing the price of the 1080 makes them less money on each card sold, but it plays in Nvidia's interest in the end if they can sell more 1080s than before. And most importantly, Nvidia needs to sell as many as possible before Volta arrives.

 


Just imagine the level of performance we are going to get for our money in 2018. How close do you think we will get to 60 FPS at 4K with a $150-250 card?
 


If the trends continue, the 1170 or 2070 or whatever they call Volta should have performance at or around the 1080 Ti and cost around $300-$330. So in the $250 range, I'd expect 1080 or 1070 performance. For 60 FPS at 4K, you're probably looking at whatever comes after Volta before that level of performance reaches the $250 range (possibly even the generation after that). So... 2020 or 2021.
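That projection is really just performance-per-dollar extrapolation. A toy sketch with hypothetical relative-performance numbers (placeholders, not measured results) just to show the arithmetic:

# Toy performance-per-dollar extrapolation. The relative_perf values are
# hypothetical placeholders, not benchmark data.
cards = {
    "GTX 1070":    {"price": 380, "relative_perf": 1.00},
    "GTX 1080":    {"price": 500, "relative_perf": 1.25},
    "GTX 1080 Ti": {"price": 700, "relative_perf": 1.65},
}

for name, c in cards.items():
    print(f"{name}: {c['relative_perf'] / c['price'] * 1000:.2f} perf per $1000")

# If a new generation delivered roughly 1080 Ti performance at ~$330, its
# performance per dollar would about double versus the 1080 Ti at $700:
print(f"hypothetical Volta x70: {1.65 / 330 * 1000:.2f} perf per $1000")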
 


I think it depends on Nvidia a lot. If Nvidia decides to pull an Intel, it will take longer before we see such a thing happen (and AMD will use the chance as breathing room to catch up to Nvidia). But if Nvidia has no intention of changing what they are doing right now, then maybe in the 2020 time frame? In 2018 Nvidia will most likely bring 1070-1080 performance down to the sub-$250 segment; in the 2020 time frame (also the time for Volta's successor), 1080 Ti performance should come down to sub-$250. But this projection doesn't account for games using even more demanding graphics options. And from what I heard, AMD has finally decided to support FL12_1 on Vega. There are certain ways to use that hardware feature, but it will still be very demanding even with dedicated hardware (just think about tessellation: dedicated hardware doesn't mean you can use the feature freely; it only makes the previously impossible possible, still with a performance cost).
 


I read the same thing, 12_1!
 
Do you think RX Vega will be substantially cheaper than the 1080? If not, what's the point? Similar performance with more power consumption doesn't sound like something Nvidia should be afraid of if AMD doesn't want to start a price war.
 