Fallout 4 Benchmarked


toddybody

Distinguished
I wanted to fall in love with this game...so much.

The game runs horribly; I get dips into the high 40s with god rays set to low at 1440p. It confounds me why I get better performance from Crysis 3, Witcher 3, and Metro LL, all of which make this 2015 title look like a re-skinned Skyrim.

Their decision to use the Creation Engine was horrendous, and it's highlighted by their poor GameWorks implementation (but I blame team green's forceful marketing for that).

If low frames and $hit textures and models weren't bad enough, the game is boring as Sunday night church. Seven hours in, I had a wonderful variety of "clear the raiders" missions to enjoy... and too many hours played for a Steam refund.

If I sound butthurt, it's because I am...Fallout 3 was one of my favorite games, and Fallout New Vegas was an incredible addition to the franchise.
 

Gillerer

Distinguished
Sep 23, 2013
You should *never* disable V-Sync in Creation Engine games. The game logic and physics are tied to the FPS. After playing a while at a faster or slower frame rate, NPCs' schedules shift out of place.
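
If you want to experiment anyway, v-sync isn't exposed in the launcher; as far as I know it's the same iPresentInterval key the engine has used since Skyrim, under [Display] in Fallout4Prefs.ini (this is from memory, so check your own file before editing):

[Display]
iPresentInterval=1   ; 1 = v-sync on (the default), 0 = off

Leave it at 1 unless you just want to see the speed-up weirdness for yourself.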
 

stoned_ritual

Reputable
Dec 23, 2014
I have a vanilla GTX 780 with 3 GB of VRAM, an i5-4670K, and 16 GB of RAM. I play with everything on ultra, god rays on LOW, and shadow distance set to MEDIUM. Changing the shadow distance all but eliminated the huge frame drops I was getting while exploring downtown.
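
If anyone wants finer control than the launcher presets, I believe the relevant keys are fShadowDistance and fDirShadowDistance under [Display] in Fallout4Prefs.ini; the values below are roughly the medium preset, quoted from memory, so double-check against your own file:

[Display]
fShadowDistance=3000.0000
fDirShadowDistance=3000.0000

Ultra pushes these way up (around 20000 if I remember right), which seems to be where a lot of the downtown frame drops come from.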
 

Biscuit42

Reputable
Aug 7, 2015
My experience (about 100 hours in): an i5-6600K @ 4.3 GHz and a 2 GB R7 265 at 1920x1080 gets me 40-45 FPS with no stuttering and everything on 'high'. Changing the shadow setting seems to have the biggest impact on frame rates. Oh, and my CPU utilization is consistently under 50%.
 

lilcinw

Distinguished
Jan 25, 2011
Why did you change CPUs across test sets? What happened to the old methodology of using one CPU for all test sets and then doing a separate run using multiple CPUs with the same GPU?

The way this was done you cannot compare results from the same GPU between, for example, Ultra and Medium.
 

Chris Droste

Honorable
May 29, 2013
Hey guys at Tom's: there's an HD texture package/project ongoing for Fallout 4. I would be VERY curious to see how, or if, it affects system performance. Most early reports say it doesn't, and it looks AMAZINGLY better.
 
You should *never* disable V-Sync in Creation Engine games. The game logic and physics are tied to the FPS. After playing a while at a faster or slower frame rate, NPCs' schedules shift out of place.

I'm with you on the v-sync. I thought it would be cool to run around at over 200 fps, and then I saw what happens, lol. It's reminiscent of those comedy skits that play in fast forward with the Benny Hill theme playing.

Anyway, the most relevant information for these benchmarks is the minimum fps, and that's not included.

The game's performance is also dynamic, just like the game world. Early in the main quest the world isn't as busy as it becomes later, when there's a lot more going on; areas that were previously empty now have NPCs, etc. Enabling invisibility in the console might also remove CPU calculations for the NPCs, like pathing and collision, which would certainly have some impact on performance.
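
If anyone wants to test that theory at home, the console has had toggles for this sort of thing since Oblivion, and as far as I know they still work here:

tai    //toggle AI processing for NPCs
tcai   //toggle combat AI

Of course, turning those off means you're no longer benchmarking the same workload the article did, so treat it as a curiosity rather than a comparison.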

I heard a rumor that Bethesda's Creation Engine license expires with this title. They could certainly license it again for the next titles, if that's the case. Who knows?

As far as GameWorks goes, I'd get used to it. Nvidia has something like 80% of the AIB market, and it seems like a lot of developers don't want to hire more people to code their own solutions, so it might be around for a while. Let's hope it gets better. ;)
 
My sons are playing F4 ....

4690K (4.5 GHz) / twin 970s @ 18% OC
4770K (4.6 GHz) / twin 780s @ 26% OC
2600K (4.8 GHz) / twin 560 Tis @ 28% OC

Trying to diagnose problems, the following have been attempted

1. Played as above
2. Played w/ SLI disabled
3. Played w/ GFX cards at stock
4. Played w/ both CPU and GFX cards at stock

The problem is that they will be sailing along quite nicely and then performance drops to single digit fps. Right now, they've stopped playing in the hope that patches will resolve the problem.

 

kewlbootz

Reputable
Jun 30, 2015
Why is minimum fps not measured here? FO4 is full of fps drops across all GPUs relative to shadow distance and god rays settings. Seems like it would be an incredibly pertinent metric.
 
I, for one, expected somewhat better visuals from a game that only gets a 970 into the high 60s at 1080p. I hope there is a texture pack and that it makes a difference. The game looks like it's carved out of stone and colored with paint.

**Braces for downvotes**
 

kewlbootz

Reputable
Jun 30, 2015


I agree. Some of the textures are fine (e.g. Maxson's (sp?) battlecoat), but they're wildly inconsistent, especially if you go off the beaten path or take a look at unimportant NPCs' clothing. It's especially evident in the architecture, the rubble, and the super mutants.

I've got about 40 texture and model mods and ReShade/ENB running, and it looks pretty solid now. Hell, some of the texture mods are less performance-heavy than the defaults. Vivid Landscapes, Rock On, and some WIP Commonwealth retexture pack I can't quite recall the name of are really great.
 

mamasan2000

Distinguished
BANNED
Did Tom's also turn on multithreading?
Could be interesting. You type these in the console in-game, or make a batch file (example below).
//enable CPU multithreading
tMta ON
tMtrdfl ON
tMtr ppld
thighprocess on //multi AI
SAM 1 //multi audio
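
If you go the batch-file route, the usual trick (same as in Skyrim) is to save those lines as a plain text file in the Fallout 4 install folder, say multithread.txt (the name is just an example), and then run it from the console with:

bat multithread    //executes every line of multithread.txt as a console command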

Could also be interesting to compare Intel vs AMD
And as said, turn off god rays. Losing a third of the FPS for something that is barely noticeable = bad idea.
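
If even Low costs too much, the launch-week tweak guides mentioned a console command for dialing god rays down further; I haven't verified the exact syntax myself, so take it as a starting point:

gr quality 0    //god ray quality, reportedly 0 (low) through 3 (ultra)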

For those having problems with stutter and shadows, I recommend Shadowboost over at
http://www.nexusmods.com/fallout4/?
It's not too easy to search for on the site, so here's the page for Shadowboost: http://www.nexusmods.com/fallout4/mods/1822/?

There's tons of visual detail updates out there as well (at Nexus Mods) if you want higher-detail textures etc. in the game, usually with no FPS penalty, just increased VRAM usage.

I'm running tons of mods as well, just like the poster above:
Enhanced Wasteland, for better colors.
Realistic Lights
True Storms
Vivid Fallout - Landscapes
Texture optimization project

To name a few

 
It would have been nice to see more variation in the CPUs tested, to see where any potential bottlenecks may be. At least an i5-4690 or similar, which is probably the most popular CPU for gamers. Throw in an FX-6300, an i3, and an Athlon X4 for good measure; most people gaming are using one of the aforementioned CPUs. Or even simulate the CPUs so you don't have to physically change them: disable cores/HT, etc.
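
A rough way to fake a smaller CPU without swapping hardware is to launch the game with an affinity mask from a command prompt (the path below is just an example; adjust it for your install):

start /affinity 3 "" "C:\Games\Fallout 4\Fallout4.exe"

Here 3 is a hex mask meaning the first two logical processors. It's not a perfect simulation, since the game still sees every core, but it constrains where the threads actually run; disabling cores or HT in the BIOS or msconfig is the stricter version of the same idea.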
 

DKL

Reputable
Dec 23, 2015
I wish the article had listed the settings and FPS the console versions of Fallout 4 run at! Would be nice to know.
 

blppt

Distinguished
Jun 6, 2008
...all of which make this 2015 title look like a re-skinned Skyrim.

Call me crazy, but I think Skyrim (with the HD texture pack) looks much better than FO4. Maybe it isn't as technically impressive, but there are an awful lot of blatantly ugly textures and bad LOD in FO4. Skyrim, despite being really outdated, is still very pleasing to the eye.
 

blppt

Distinguished
Jun 6, 2008
"Could also be interesting to compare Intel vs AMD"

It wouldn't go well for AMD's CPUs. The Creation Engine is notorious for only fully utilizing a couple of cores, and we know Intel slaps AMD silly in poorly optimized game engines like that.

I can tell you firsthand that my 9590 OC'd to a constant 5 GHz is still noticeably jittery compared to my new 4790K at stock speed, using a 290X. But that wasn't a shock, given the Creation Engine's known shortfalls. Now, in a game like GTA V, which is about as good as it gets for mainstream multi-core optimization, everyday gaming is nearly identical between that 9590 and the 4790K.
 
wow....the titan X kicks arse...impressive little card...love that the card is a dual purpose card....gaming and business oriented software

Just a heads up: only the original Titan had good double-precision performance. The Titan X does not, and was only released to milk Nvidia fanboys.
 
BTW, at 4K it seems Fallout 4 is one of the games that benefits from the extra memory on the 980 Ti and Titan X. Otherwise, why is the Fury X falling behind? It has shown comparable performance to the 980 Ti in other 4K benchmarks.
 