Mass Effect Andromeda Performance Review

Status
Not open for further replies.

WINTERLORD

Distinguished
Sep 20, 2008
Glad to see Tom's Hardware doing articles like this. I think it might have been cool to see a Radeon Fury in there, since it's limited to 4GB of RAM; I don't own one, but I'm sure somebody does. Still, sticking to the cards listed in the article, it would have been nice to see how it plays at 4K resolution. At any rate, great article.
 

IceMyth

Honorable
Dec 15, 2015
Hmmm... I don't think this article is accurate when it comes to GPU FPS. Yes, the 1050 and 460 are the worst for this game, but the numbers for the other GPUs don't look right.

The problem I see is that you didn't eliminate the CPU bottleneck, which affects the GPU results as well. For example, PCGamer used a different setup to eliminate the CPU bottleneck, and all the GPUs they used were MSI cards. Their results show the MSI RX 480 getting slightly higher FPS than the MSI GTX 1060 (which is clocked 300MHz higher) on Ultra, while on Medium settings the RX 480 is faster by around 10 FPS.

I know this is not a GPU benchmark, but since you include FPS/CPU/memory figures, it effectively serves as a hardware benchmark.
http://www.pcgamer.com/mass-effect-andromeda-pc-performance-analysis/

Windows 10 has around 47% market share? Are you sure about this? So far, everything I've heard and found by Googling says the 47% market share belongs to Windows 7, not Windows 10. If you mean Windows 10 64-bit vs. 32-bit market share, then that's something else.
1- https://www.neowin.net/news/windows-xps-market-share-takes-another-hit-as-windows-7-and-10-rise
2- https://www.wincert.net/microsoft-windows/windows-10/windows-10-market-share-without-changes/
3- https://betanews.com/2017/03/01/windows-10-loses-share-again/

Regards,
 

rantoc

Distinguished
Dec 17, 2009
Personally, I've disabled the motion blur (you need to make a cfg file; Google it).
Changed to HALF16 instead of Compressed, for less washed-out graphics with more depth to them.
Changed from triple buffering to double buffering to get rid of the console-like high latency in the game = much faster response, and it also appeared to give better frame pacing.

3440x1440 running at around 80-90 fps depending on the area (G-Synced at 100Hz), with no stuttering after the above was fixed.

The above, along with some minor tweaks to light/shadow, and the game is both responsive and really beautiful.
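For reference, the cfg file I mean is the usual Frostbite-style user.cfg dropped into the game's install folder. I'm going from memory of other Frostbite titles here, so the filename and exact variable names below are assumptions and may differ for Andromeda:

```
// user.cfg in the game's install folder (sketch; variable names
// carried over from other Frostbite games, not confirmed for ME:A)
WorldRender.MotionBlurEnable 0
WorldRender.MotionBlurForceOn 0
```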
 

Jan_26

Commendable
Jun 30, 2016


I think you mean you changed from double buffering to triple buffering, as triple buffering is superior to double buffering in every aspect except GPU RAM requirements :)
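To put rough numbers on the latency point: the sketch below is a toy model, not the game's actual renderer, and it glosses over the difference between a render-ahead queue (which adds latency) and "classic" triple buffering (which overwrites the oldest frame instead of queuing it). All names and numbers are illustrative.

```python
def worst_case_latency_ms(num_buffers, refresh_hz=60):
    """With vsync, a finished frame can sit behind up to (num_buffers - 1)
    already-queued buffers before it reaches the screen; in this toy model
    each queue slot costs one full refresh interval."""
    scanout_ms = 1000.0 / refresh_hz
    return (num_buffers - 1) * scanout_ms

# Double buffering: at most one frame queued ahead of the display.
print(round(worst_case_latency_ms(2), 2))  # 16.67 ms at 60 Hz
# A render-ahead "triple buffer" queue: up to two frames ahead,
# which is the extra latency being described.
print(round(worst_case_latency_ms(3), 2))  # 33.33 ms at 60 Hz
```

This is why the two of us can both be right: a proper triple-buffer implementation shouldn't add delay, but a three-deep render-ahead queue does.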
 

Masterarms

Prominent
Apr 1, 2017
This game is a joke: horribly rendered, with more bugs and glitches than content. The game is a flop; play something else.
 
Quote "Then again, 600 years have passed since Mass Effect 3, so perhaps evolution is to blame?"

This could also be the reason for the horrible facial expressions. After having their faces in one position for so long (600+ years), they don't have much muscle control left to express themselves properly. lol

As for physics engine issues, they seem to pop up everywhere: dead bodies half in the ground, wildlife materializing in the ground, floating objects, and the occasional shot at an enemy that doesn't register. These can all be a turn-off, but when certain parts of the terrain are rendered just plain wrong, that's a full deal-breaker.

I see this most often on the Tempest's bridge, looking into the escape pod room where Peebee hangs out. Most times it will show space (stars, nebulas, and black), then slowly switch over to the actual room view.

GPU and CPU performance aside this thing needs some serious patch work.
 

dstarr3

Distinguished
I'm hoping to get another year and a half or two years out of my 980Ti. And judging by the 1060/1070 here, it's good to see that I'll still be able to run this year's AAA games at 1080p/60/Ultra. Hang in there, buddy! Once the 1180Ti or whatever comes out, then you can take a well-deserved vacation!
 



It is obvious the whole point of this article is completely lost on you, even though it was spelled out plainly. And I quote: "How does it run on mainstream gaming hardware? We benchmark it on eight different graphics cards to find out." This was not a benchmark of the top GPUs and top CPUs but a benchmark of mid-range hardware.

For that, the article served its purpose, but I would have liked to see other CPUs added into the mix to show at what points the CPU bottlenecks appear.

Because very few Steam users have 2K monitors, and even fewer have 4K, the mid-range segment sits at 1920x1080. For this game to run at 4K resolution, more than one GTX 1080 would be necessary, which again puts it well outside the mid-range hardware this article was meant to cover.
 

rantoc

Distinguished
Dec 17, 2009


Actually no. Proper triple buffering, as long as the VRAM can afford it, runs great on one card and is usually what I recommend myself, but ME:A's implementation with multiple cards / temporal AA causes frame time issues.

Double buffering with temporal AA works quite OK, considering temporal AA has to transfer the previous frame back and forth between the cards, and an Nvidia double SLI bridge seems to be enough for quite acceptable min/avg fps.
 

brandxbeer

Honorable
Oct 13, 2014


No, I fully understand the point of it. I still find it lazy, though. Most of the benchmarks were done during the EA Origin early-access period, without patches or driver updates. I was mostly hoping for proper benchmarks with all available GPUs to compare, and a proper 1080 Ti benchmark. I also thought it would go much deeper into the graphics settings, with screenshots and FPS gain/loss for each setting, like GeForce or GamersNexus does. I guess I just expected more from Tom's.
The game is great so far, though. I haven't seen any major glitches yet, only some minor clipping issues.
 

That is a good point, and it might have been nice if they had tested with a few different CPUs. Even an i5-6600 could have made a notable difference in performance if the RX 480 was getting CPU-limited more often on the 6500 than the GTX 1060, and that's still very much a "mainstream" CPU. They tested with eight different graphics cards, with prices ranging from under $100 for the RX 460 to around $300 for that GTX 1060 Strix OC (and someone could have paid even more for a GTX 970 or R9 390), yet they only tested with a single $200 CPU, in a game that's getting CPU-limited with some of the cards. Why not likewise test with a $100, $200, and $300 CPU to round things out?


Have a look at Steam's latest hardware survey, which should provide a better depiction of systems that are actually used for gaming...
http://store.steampowered.com/hwsurvey?platform=pc
Currently, Steam is showing Windows 10 64 bit at 52.22% and rising, while Windows 7 64 bit is at 31.20% and dropping, among Windows versions. Those articles are undoubtedly counting business systems in the mix, which tend to be slow to upgrade to avoid having to retrain staff, as well as any potential hardware and software conflicts that might arise from moving to a new OS.


That makes me wonder if that kind of pop-in could actually affect the results of these framerate tests. If a particular GPU took longer to load something in, it might not need to render that object until later, potentially resulting in increased framerates (or the opposite). Even when viewing Tom's video of their benchmark run, I noticed a character briefly "teleport in" at 0:13. Had that character loaded sooner, or not at all, it seems like they could have potentially affected the framerate in some way.
 

ledhead11

Reputable
Oct 10, 2014
I've been playing this on both my rigs. Yeah, the game is buggy. Smooth flowing is not something that really applies anywhere for more than a few seconds here and there. I do recommend installing it on an SSD. I tried it on SATA III RAID 0 platter arrays on both rigs, and when I moved it to an SSD I saw a pretty big improvement in load times; the annoying pauses between cutscene transitions were also greatly minimized. They're still there, but at least now each feels like a dropped frame or two instead of feeling like something just crashed.

Old rig: 2600K (4.2GHz), 16GB (1333MHz), Z68, 2x G1 970s in SLI, 1440p 144Hz G-Sync
New rig: 4930K (4.1GHz), 32GB (2133MHz), X79, 2x Xtreme 1080s in SLI, 4K 60Hz V-Sync

The old rig @ 1440p: all textures/shadows maxed, FXAA and G-Sync (all V-Sync off), averaging 40-80fps. It uses roughly 6-10GB of RAM, stays pinned at the 3.5GB VRAM limit of the 970, and according to MSI Afterburner uses a pagefile of around 10-20GB. The 970s stay fairly cool at 60-70C. CPU usage averages 40-50%. It's obvious the game wants more VRAM than the 3.5GB the 970s give it, but it's otherwise very enjoyable thanks to G-Sync.

New rig @ 4K/60Hz: everything maxed, V-Sync on. It averages 50-60fps most of the time and dips to ~40-45fps. It will use ~10-16GB of RAM, with VRAM mostly around 6-7GB and sometimes a little higher. Afterburner's pagefile report is the same 10-20GB. This game makes my 1080s run HOT; I've never seen this in any of the other demanding games I own. They average 70-80C, and only in this game at this time. Witcher 3 / GTA V / Fallout 4 / Doom don't do this; they all hang around 60-70C. CPU usage is around 35-45%. The game recognized and enabled HDR10 for the HiSense 4K HDR TV I have connected; I didn't have to do anything other than leave it on auto. Pretty cool on that front.

Play experience is nearly identical between both rigs now, other than resolution. Both rigs originally had the same RAID 0 platter setup (2x Seagate 500GB SATA III), but now the 2600K rig has a Toshiba OCZ 960GB SSD. I can't emphasize enough what a difference an SSD makes for this game. An SSD won't cure this game's bugs, but it will make them less painful.
 
*Some of your CONCLUSIONS are probably misleading. The difference in VRAM probably explains both the system and video memory usage differences.

For example, the RX 480 8GB may buffer more in video memory simply because it can (even if it doesn't NEED more than 6GB of what it buffers). Conversely, the GTX 1060 6GB probably buffers more in system memory because it doesn't have enough VRAM.

**Where the CONFUSION comes in, over and over with different games, is the assumption that the game NEEDS all the buffered data or issues happen. That is NOT necessarily true. Some games buffer as long as there's sufficient memory, and those that lack it may simply swap data during a level load or other such transition, with minimal impact on gameplay.
 

Achoo22

Distinguished
Aug 23, 2011
When I look at the system requirements, it's the CPU requirements that most pique my interest. The AMD part listed as recommended is something I value much less than the Intel part listed for basic operation, so I would've liked to see the game benchmarked on AMD CPUs in addition to Intel iron.
 

jerm1027

Distinguished
Apr 20, 2011
This game punishes my old 2500K @ 4.2GHz, 8GB of 1866MHz RAM, and R9 380X. I play with customized settings that are probably between Medium and High: native 1080p, high shadows, high textures, high AA, but most other things turned down to low. I'm particularly sensitive to aliasing and can't stand super low-res, blocky shadows. I also have FreeSync, so despite running a higher resolution and having all my hardware pegged, I'm getting a pretty good experience, bizarre animations aside.
 

Burstaholic

Prominent
Apr 3, 2017
That settles it: gotta be my CPU. My R9 Fury running at 1105/750 is still seeing 20-40 FPS at 1080/Ultra (automatic). My Phenom II X4 960T must be the culprit.

Can't wait to jump on the Ryzen 5 release next week for a shiny new 1600x :D
 


Have you noticed how the NPC humans stand? Chest and stomach puffed out, with a hand on the hip like a woman. :lol:
 
I am running this game at 5760x1080 on a 980 Ti and a 3770 @ 4GHz with no problems, smooth as silk, though I'm not sure of the actual FPS. On the other hand, I am only a few hours in, and the game kinda sucks. I have read it gets better, but the beginning is kinda painful... I just finished ME3 for the second time while waiting for this to come out, and that was more fun than I've had so far... Guess I'll grind forward and hope for the best.

Thent
 

jerm1027

Distinguished
Apr 20, 2011


The minimum requirements describe a mid-range gaming machine less than five years old: a quad-core CPU, 8GB of RAM, and a mid-range GPU from 2012 will meet them. Even the recommended GPU is a modern mid-range card (RX 480 or GTX 1060 3GB).
It is fairly demanding, but not impossible to work with. I have a 2500K, 8GB of RAM, and an R9 380X, and I can play at native 1080p UW on medium settings; not at 60FPS, but still fluid thanks to FreeSync.
 