'Fallout 4' Benchmarks, And How To Disable VSync

I'm sorry, but this article is close to useless for most of Tom's readers. "Most people will leave settings at default/auto" my tush. This is a hardware & tech site; most of us read these articles to gauge the relative performance of hardware running a particular piece of software. For the comparison to be meaningful AT ALL, the conditions have to be as equal as possible OR the settings need to be detailed in full. I have no way of knowing what specific settings were used on one system or the other; some graphs feature different resolutions, vsync on and off, all in the same bundle. In that case a single graph is meaningless (putting them all in one graph is only useful for comparison). You might as well just put the FPS beside each system in the rig description, and even then... what's the point?
 
The two benchmarks will have been run with completely different graphics quality settings. Read the article (the relevant bit is quoted below):

To evaluate how Fallout 4 behaves with different types of hardware, we tested out the game on several systems with a wide range of hardware. We decided to test with the auto-detected hardware settings, as this is likely how many gamers will leave it (and because this is not intended to be a performance tweak guide).


Obviously the auto-detected "optimal" settings will be radically different for these two cards, so there is no GPU pissing contest possible based on this particular article.

Doesn't that completely miss the point? Anyone can mess with the settings to get good performance, but that doesn't show us whether the claims of a huge Nvidia performance bias are accurate. That requires testing at the same settings.
 
Uhm... no VSync is great and all, but this obvious console port doesn't handle it well... the game actually speeds up from it. 550-600 fps in lockpicking is impossible... the pick turns almost immediately and just breaks the bobby pins... I had to turn VSync on again to play the game...

Busted ass console port!
 


https://www.youtube.com/watch?v=gv6Ufgz47bY

Skyrim had the same issues; it's more an issue with the way the game engine is designed than a result of being a console port.
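
Since the thread title promises a how-to: there's no in-game VSync toggle, so the commonly cited method is an ini edit. A minimal sketch (the exact path depends on your install, and some guides point at Fallout4.ini instead; the setting name is the same):

; Documents\My Games\Fallout4\Fallout4Prefs.ini
[Display]
iPresentInterval=0    ; 0 = VSync off, 1 = VSync on (the default)

Because the engine ties simulation speed to frame rate, it's worth pairing this with an external frame cap (a GPU driver limiter, for example) near your refresh rate; otherwise you get exactly the sped-up lockpicking described above.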
 
Funny thing: I ran a copy of this on a rig with a stock-speed FX-8350 (liquid cooled, though), 16GB of DDR3-1866 RAM, and a PNY GTX 660 Ti. Auto-detect placed it at Medium, but I said what the heck and hit Ultra; it ran at literally 30-60 fps, with 25 being about the lowest it would ever drop to. Not sure if it's a fluke or anything, but the game runs smooooooooth!!!
 
Skyrim had the same issues; it's more an issue with the way the game engine is designed than a result of being a console port.

Agreed, Skyrim was a disaster in terms of coding: before one of the many patches, they never set the LAA flag and were using freaking x87 code rather than SSE (fixed in later patches). It was such a trivial fix that a third-party coder on the internet was able to write a library that intercepted the x87 calls and substituted the far faster SSE instructions, getting a sizable increase in performance.

I doubt this game suffers the same fate; after all, it is a 64-bit binary, and 64-bit compilers assume at least SSE2, since the x86-64 spec requires it even on the earliest x64 CPUs. And of course, there's no need to set an LAA flag since it isn't 32-bit.
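
To illustrate (a minimal sketch in C, nothing from Bethesda's actual code): the same float math compiles to either instruction set purely via compiler flags, which is why a drop-in library could intercept one and substitute the other:

/* dist.c - identical source, two possible instruction sets:
 *   gcc -O2 -m32 -mfpmath=387 dist.c -o dist -lm        -> x87 stack-FPU code
 *   gcc -O2 -m32 -msse2 -mfpmath=sse dist.c -o dist -lm -> scalar SSE code
 * A 64-bit build (gcc -m64) uses SSE math by default, because the
 * x86-64 spec guarantees SSE2, which is the point about FO4 above. */
#include <math.h>
#include <stdio.h>

static float dist(float x1, float y1, float x2, float y2)
{
    float dx = x2 - x1, dy = y2 - y1;
    return sqrtf(dx * dx + dy * dy); /* x87 FSQRT vs SSE SQRTSS at -O2 */
}

int main(void)
{
    printf("%f\n", dist(0.0f, 0.0f, 3.0f, 4.0f)); /* prints 5.000000 */
    return 0;
}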

Bethesda makes epic, timeless RPGs, but I wish to high heaven that they'd license something like CryEngine or Unreal Engine, because it's obvious that game engine creation is not one of their strong suits. One thing I'll give their "Creation Engine": it is apparently very friendly to the modding community.
 


They probably recompiled it for the newer game and 64-bit, but the same issues can still exist, which was my point.
 
They probably recompiled it for the newer game and 64-bit, but the same issues can still exist, which was my point.

Oh, no doubt; the bitness wouldn't have a whole lot of effect if the game engine is only optimized to use, like, 2 cores, which, IIRC, unmodded Skyrim was susceptible to. But I was just speaking to the laughably bad optimization of first-release Skyrim.

 
18 fps minimum at 720p on lowest settings on a 620M is pretty good. GPU requirements aren't that huge at the lowest graphical settings, then; I may be able to play it on my old G50VT laptop with its 9800M GS video card, though it only has 512MB of VRAM.

It's interesting how the torrents for this game are about 25GB, less than half the size of GTA V! I wonder how expansive the world really is. No doubt it is great.
 
"The laptop build was put together earlier this year by our own Michael Justin Allen Sexton for a budget gaming laptop. Unfortunately, it had difficulty running Fallout 4; its highest average was just 26.817 fps, and that's with a 720p resolution and on the lowest graphical settings possible. This isn't exactly a high-end laptop, but it does indicate that if you plan to game on a laptop, you should get a reasonably powerful system. That, or just play on a desktop PC."

Or if you're broke and want to play games, and must have a laptop, get an AMD APU. A $500-600 laptop with an A10-8700P or similar will walk all over that 620M.
 
AMD Reference System:

CPU - AMD Black Edition FX‑8350 (4 GHz)
GPU - 2 x MSI Radeon R9 270X GAMING (2G) [CrossFireX]
RAM - G.SKILL Ares Series 32GB (4 x 8GB) DDR3 1866 (F3-1866C10Q-32GAB)
Storage - WD BLACK SERIES 2TB (64MB Cache)


I'm not sure if the lack of AMD CPUs in this "benchmark" was coincidence, accident, or pure bias, but I would like to offer my experience from a different perspective, if I may.

As you can see, my specs are pretty modest compared to most of the builds mentioned here. In fact, I had to replace my SSD because I simply didn't have enough room. (It's amazing how quickly 250GB fills up these days.)

At any rate, to those interested in AMD performance, this is for you:

First, when I read the recommended specs, I thought Bethesda was nuts!
However, after booting up the game for the first time, I was shocked to see that the game defaulted the settings to Ultra.

Yep, Ultra.
No i7 or FX-9590 needed. And I didn't have to go with a full-blown 4GB gfx card either.

And yes, the frame rate was maxed at 60 fps. And it's a very steady stream of 60 fps, with the occasional dip to 59 fps (but NO stuttering). Disabling VSync in this game is unnecessary if you're getting 59-60 fps. Leave those settings alone.

Graphics-wise, the game is unbelievably beautiful. Gameplay-wise, it is beyond out-of-the-box playable. I played for 2 non-stop hours last night without a single bug, crash, or ripple. I was expecting something out of the ordinary, but the game played flawlessly on my system. Still, I did change the FOV, mouse acceleration, and some other universal stuff mentioned in Gabi's Steam guide.

I would question how much "hardware" you really need for a game tailored to consoles. A budget AMD system should more than handle this game on Ultra.

Looking forward to reading other AMD experiences.

 


A game can be tailored for consoles yet still utilize higher-end PC hardware, and this game does have a lot of features that the consoles cannot support.

A good example would be Batman: Arkham City. It ran great on consoles but also included a ton of PC-specific features that needed better hardware to handle.
 
I'm running it on an i5-6600k, 8 GB of DDR4-2800, and an R7 265 (2GB). About 8 hours in.

My CPU use has never exceeded 50%. My old i5-2500K would probably have done fine.

Graphics are High, with Ultra textures and 8x AA. I'm seeing frame rates of 60 when looking at the sky, 50 when walking around most areas, and 40 when in intense combat.
 

It was clearly stated that they used their personal systems, and apparently none of Tom's staff has an AMD CPU in their rig. Besides, as others have noted, it's not even a proper benchmark; it's more like a test run.
 


Actually, plenty do; it's just that they only had so many copies, and the people who got them had Intel setups.

In fact, a lot of the staff and mods have multiple rigs with Intel, AMD, and Nvidia hardware (I happen to be one). They just might not have the game.
 
Very happy to see that FO4 is well threaded. It runs great on my AMD FX-9590 (yes, it uses all 8 CPU cores at 50-75%, YAY!) and GTX 970. 1080p at Ultra (just motion blur off, because I don't like it) stays pegged at 60 fps 99% of the time.

One noteworthy gotcha that drove me nuts for a day was the controller support stutter. If your game regularly stutters every second or so, that's likely USB polling for a controller. Turning Controller and vibration off in settings smoothed it right out.

Happy hunting, Fallout Fans. :)
 
Really? That's interesting. The only place I stutter is in Boston near Goodneighbor; too many assets rendering, since I have it up to Ultra. I do have another problem with LODs not switching to a higher res.
 


It's not meant as a complete guide, just what they had.

Also, there's rarely any difference between a desktop i5 and i7 of the same generation. Both have four cores, and Hyper-Threading rarely makes a difference for gaming (on a dual-core i3 it would).
 
According to the Steam survey, 48.3% of users have dual-core CPUs and 44.6% quad-core.

"Only" twice as many use Nvidia graphics cards as use AMD.
What must be disturbing for AMD, though, is that almost as many use Intel integrated graphics as use AMD's graphics cards.

40% of the premium we're-the-elite Mac users have 4 GB of RAM or less.
(They too have a lot of Intel integrated graphics, I guess.)
 