AMD Vega MegaThread! FAQ and Resources

Ah... Fermi... "The way it's meant to be grilled"... Good times. I still have that image. Makes me laugh every single time.

I still have my GTX 675M rocking in my lappy (which is a GTX 580M, re-branded) going strong! I still play and stuff, so I really don't have complaints driver- or hardware-wise. I've also avoided the "GPU killing" drivers by doing the reasonable thing: wait until reviews show up saying it's safe to upgrade. I do that with AMD as well. Both camps have had a different degree of success/failure with each driver release that I remember. On AMD's worst side are the gray screens of the HD 5xxx gen. Man, I had a good laugh with that. I got it as well with my HD 4890, but reverted right away. When I had the GTX 670, I didn't have any issues with drivers per se, but nVidia has always sucked at TV support. I had a *lot* of issues trying to make my PC work with my TVs, whereas the AMD cards just worked perfectly. And I would imagine that is still the case.

In any case, from the leaked information, it smells like Vega 20 is the successor to Polaris 10; I can't believe, marketing-wise, they're scaling down Vega instead of beefing up Polaris (if the name is any indication of what they're planning).

Cheers!
 


Nvidia got a pass on it where it matters - people kept buying.
 
AMD RX Vega 64 Outperforms NVIDIA GTX 1080 Ti By Up To 23% In DX12 Forza 7:

http://wccftech.com/amd-rx-vega-64-outperforms-nvidia-gtx-1080-ti-23-dx12-forza-7/

And the RX Vega 56 as well, according to the bench.
 


Good news, until you read the part that says Nvidia has not had the chance to optimize for it like AMD has.

Still love my Vega; it looks cool and it performs well enough. The fact that it gets beaten by 5-10% by Nvidia in most cases is irrelevant. When it comes down to actually playing a game, it makes zero difference.
 
Good point, didn't realize that... AMD drivers are optimized for it:

http://www.pcgamer.com/latest-amd-gpu-driver-is-optimized-for-warhammer-ii-and-forza-motorsport-7/

But even still, it is a true DX12 title, designed originally to run on AMD hardware and the DX12 chip in Scorpio, I presume.
Plus Nvidia struggles with DX12 on a hardware level. It will be interesting to see their benchmarks after optimization, with more and more DX12 titles coming out. They'll probably still beat Vega, but maybe not wipe the floor with it.
 


I'm just wondering what kind of monitor are you using? Vega 64 is showing interesting FPS at higher resolution in some games. I was just wondering if you were seeing similar results.
[Attached image: Vega_14.png]
 


System is in my sig; I run the Samsung 34" CF791, 100 Hz, 3440x1440. 5% was a generalization; I know it's even closer, some games more, some less. Right now I'm playing ME Andromeda, and next up for me is Prey once I beat it. It runs smooth as silk and looks beautiful; if it ran 10 fps faster there would be no difference.
 


Yeah, a friend of mine bought the CF791 about three months ago when they were on sale for ~$750. He loves it. He said he realizes now how much he has been missing when viewing games. That's great to hear about the silky-smooth performance, which is ideally what you want for full gaming immersion.
 


Yeah, I got it for $750 on sale as well. It's a great monitor with a few caveats. One is that it has a dead pixel, and from reading online this is common. It's such high res you barely notice, and TBH it's not worth packing it back up and taking it to the store for another, when odds are I'll just get another dead pixel.

Now here's the part where it relates to Vega. Freesync is straight up broken on it. There's a warning that Freesync may cause flickering: with the Freesync Ultimate Engine it syncs from 48-100 Hz, and if you drop below 48 (i.e. on menu screens) or go over 100 Hz you get flickering, as in the screen goes on and off and sometimes is just blank, to where you need to turn the screen off and back on just to be able to see the menu or whatever (the sync-window behavior is sketched at the end of this post). The Standard Engine covers only 80-100 Hz and is slightly better, but still annoying. It is worse or better in different games; Mass Effect Andromeda, for example, was unplayable. Project Cars and DOOM also had the same issue; I was able to get DOOM OK, but not Project Cars.

So this issue was reported with pre-Vega cards. Then Vega came out and there was a review saying it was fixed. It's not, as I have seen and many others have reported. That said, I just run it at 100 Hz and it's smooth as can be; I don't notice any tearing or anything. If you look at reviews online, the monitor has sold a lot to Nvidia users because it works very well just as is.

That said the fact AMD was pushing this monitor in the "Radeon Vega Packs" and selling it on the Freesync feature when Freesync is so broken on it is pretty dumb. They should have been 100% sure it works as advertised.
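To make the sync-window behavior concrete, here is a minimal sketch (my own illustration; only the 48-100 Hz "Ultimate Engine" range comes from the monitor, everything else is assumed):

```python
# Toy illustration of a Freesync/adaptive-sync window on this panel.
# The 48-100 Hz range matches the CF791 "Ultimate Engine" described above;
# everything else is an assumption, purely for illustration.

FREESYNC_MIN_HZ = 48
FREESYNC_MAX_HZ = 100

def sync_state(fps: float) -> str:
    """Describe how the display would handle a given framerate."""
    if FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ:
        return "in range: refresh rate tracks the game's framerate"
    if fps < FREESYNC_MIN_HZ:
        # Menu screens and heavy scenes can dip below the window;
        # on this panel that is where the flicker/blanking shows up.
        return "below range: drops out of sync (flicker reported here)"
    return "above range: capped/unsynced (flicker reported here too)"

for fps in (30, 48, 75, 100, 144):
    print(f"{fps:>3} fps -> {sync_state(fps)}")
```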
 
I am so close to buying an MSI water-cooled Vega 64, BUT I'm not 100% settled yet. I have 2x R9 390s in CrossFire, and I am wondering what the price-to-performance difference will be between a single Vega 64 and 2x R9 390s.

I had a short thread about this before and I just settled on the fact that one GPU is better/less complicated and annoying than two.

Thank you,
Harrison
 


Yeah, it seems a lot of DX12 games are faring well on AMD's side of the fence. Heck, even my 290X can run Forza Horizon 3 at Ultra settings and maintain mid-80s fps (except ultra shadows, only because that sends me over the 4GB limit of my card, plummeting the fps into the teens), but that is to be expected.
 
It seems there is some kind of issue with Forza 7. If the game were simply that fast on AMD hardware, it should retain consistent results regardless of resolution, yet at 4K the 1080 Ti takes the lead. And there is a funny thing about CPU usage with the game: despite DX12 being advertised as using multi-core CPUs better, the game pretty much hammers one core as much as it can, which Turn 10 said was done on purpose to keep input latency as low as possible.
 
You know... I was just thinking that if the makers of 3DMark were to include a "Multi Player" type of benchmark, they'd be selling a loooot more copies of their benchmarking suite. Or at least doing a great favor to all gamers in the world.

Cheers!
 


That would be the only "counter" to having 3DMark do it.

Having a "MP" simulation is way more complex than having a GPU bench. They'd need to design a "common denominator" server-client(s) configuration that resembles a typical *design* of an MP component of a game.

Now, that being said, they could grab a few examples based on UE, Unity and Source. They are by far the most popular engines out there, along with Frostbite and whatever CoD uses.
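Just to make the server-client idea concrete, here's a toy sketch of what such a pass might measure (entirely my own assumptions: the tick rate, latency range and "late update" metric are made up for illustration, not anything Futuremark has announced):

```python
import random

# Toy model of a "common denominator" MP benchmark pass: a client renders
# frames while a simulated server sends state updates at a fixed tick rate
# over a jittery "network". All numbers are assumptions.

SERVER_TICK_HZ = 64            # assumed dedicated-server tick rate
LATENCY_MS = (5.0, 25.0)       # assumed one-way latency range
FRAMES = 2000

def run_pass(render_ms: float) -> dict:
    tick_ms = 1000.0 / SERVER_TICK_HZ
    frame_times, late_updates = [], 0
    clock = next_update = 0.0
    for _ in range(FRAMES):
        frame = max(0.1, random.gauss(render_ms, render_ms * 0.1))
        clock += frame
        frame_times.append(frame)
        # Count an update as "late" when its simulated latency exceeds the
        # tick interval -- a crude proxy for stale state / perceived stutter.
        while next_update <= clock:
            if random.uniform(*LATENCY_MS) > tick_ms:
                late_updates += 1
            next_update += tick_ms
    avg = sum(frame_times) / len(frame_times)
    return {"avg_fps": round(1000.0 / avg, 1), "late_updates": late_updates}

print(run_pass(render_ms=8.0))   # ~125 fps of pure render time
```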

One can dream, right? XD

Cheers!
 


So far only computerbase.de has Vega (even the 56) beating the 1080 at 4K. TPU's and Guru3D's results reflect each other more closely: while Vega 64 ends up being faster (at 4K), it is nowhere near the level where computerbase.de got their result.

https://www.techpowerup.com/reviews/Performance_Analysis/Middle_Earth_Shadow_of_War/5.html

https://www.guru3d.com/articles_pages/middle_earth_shadow_of_war_pc_graphics_performance_benchmark_review,5.html
 


Guru3D: "Our test system is based on the eight-core Intel Core i7-5960X Extreme Edition with Haswell-E based setup on the X99 chipset platform. This setup is running tweaked at 4.20 GHz".
TechPowerUP: "We used the latest public Beta release version of the game (not a press pre-release)".
Computer.de: "In addition, for 2560x1440 is switched to very high and for 3840x2160 to high."

That could explain some of the variance.

Cheers!
 
What's even more surprising is that it's a DX11 title... I initially thought it was DX12 because of the benchmarks.
I also presumed Nvidia hadn't optimized their drivers for it yet... but upon a quick search it looks like they have:
http://www.pcgamer.com/nvidias-latest-drivers-prep-your-pc-for-shadow-of-war-and-the-evil-within-2/

This is an impressive result from Vega: DX11 and optimized drivers from Nvidia... I'm finding it a bit hard to believe, to be honest.

Edit: HBCC is off in computerbase.de's test; not sure which way it is in the other tests.
 
HBCC is turned off by default by AMD. To benefit from HBCC, the game needs to use more VRAM than the local VRAM available on the card; otherwise, using HBCC will impact game performance negatively, according to AMD.
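That rule of thumb is simple enough to sketch (my own illustration, not AMD's logic; the 8 GB figure is just Vega's local HBM2 capacity, and the usage numbers are made up):

```python
# Rough illustration of the HBCC rule of thumb described above: it only
# helps when a game's working set exceeds local VRAM, because it extends
# VRAM by paging into system RAM. Usage figures are made-up examples.

LOCAL_VRAM_GB = 8   # Vega 56/64 local HBM2

def hbcc_worth_enabling(game_vram_use_gb: float) -> bool:
    """True only if the game consistently needs more than local VRAM."""
    return game_vram_use_gb > LOCAL_VRAM_GB

for usage in (4.0, 7.5, 11.0):
    verdict = "enable HBCC" if hbcc_worth_enabling(usage) else "leave HBCC off"
    print(f"game using ~{usage} GB VRAM -> {verdict}")
```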

AMD's problem with DX11 is their CPU overhead. That's why they're able to gain some ground on Nvidia's cards once you go up in resolution. DX12 was supposed to solve this issue entirely, but so far DX12 results can be inconsistent. In Battlefront 2, for example, AMD cards also take a performance hit once DX12 is enabled. It seems Frostbite does not play nice with DX12. Maybe DICE could tweak the engine specifically for DX12, but I doubt they will do so, looking at their history. Most often they will try to support older hardware as much as they can; they are not going to sacrifice potential player base just to support bleeding-edge tech.
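A back-of-the-envelope way to picture the overhead point (my own toy model with made-up numbers): if frame time is roughly the max of CPU time (game + driver) and GPU time, extra driver overhead costs fps at 1080p where the CPU is the bottleneck, and mostly vanishes at 4K where the GPU is:

```python
# Toy model of why DX11 driver overhead matters less at high resolution.
# Frame time ~ max(CPU time, GPU time); all figures below are assumptions.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

scenarios = {
    # name: (cpu_ms for game + driver, gpu_ms) -- hypothetical figures
    "1080p, low driver overhead":  (6.0, 7.0),
    "1080p, high driver overhead": (9.0, 7.0),
    "4K,    low driver overhead":  (6.0, 20.0),
    "4K,    high driver overhead": (9.0, 20.0),
}

for name, (cpu, gpu) in scenarios.items():
    print(f"{name:<30} -> {fps(cpu, gpu):6.1f} fps")
```

With these numbers the overhead costs roughly 30 fps at 1080p and nothing at 4K, which is the same shape as AMD closing the gap at higher resolutions.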
 
I agree, I don't think DICE did a great job with DX12 and Frostbite, and I agree I can't see it getting fixed either.

But as we are starting to see at the moment,
a lot of new titles coming in the future will be true DX12 (developed for DX12 from the ground up) or Vulkan, and we will have a lot getting ported from the Scorpio or the PlayStation.
Devs will mostly be aiming at this market and will have to get used to developing for DX12/Vulkan whether they like it or not.
So I expect to see a lot more titles like the new Forza, true DX12 titles, and Vulkan ones for that matter. Well, that is from here on in...

It has to be noted that Vulkan and DirectX 12 are very similar, as MS basically copied AMD's Mantle API, which later became the basis for Vulkan, so I presume that if you become proficient in one it shouldn't be too difficult to learn the other. I guess what I'm trying to say is that it is the future, and it's happening now.
 