'Fallout 4' Benchmarks, And How To Disable VSync

Love seeing the 4790K and GTX 970 keeping up with the 5830K with a 980. The difference is 3 FPS, and the cost difference between those rigs is $1,000...


The 4790K is still king and will be for some time; even Skylake can't touch it in gaming. A 3 FPS difference, and that's a 980 vs. a 970!
 
Not sure exactly what my FPS is (never really cared, so I never looked it up before), but with my 8320 and GTX 480 on Ultra it runs not too bad. I might have to bring it down to High or so because of my video card.

I noticed that on my 8320 the game only uses the odd cores (1, 3, 5, 7) and not the even ones. Maybe it can only use up to four cores? If so, that's probably a good reason why the Intel chips do a lot better than the AMD ones.
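If anyone wants to verify this on their own machine, here's a minimal sketch using the third-party psutil package (my choice of tool, not anything official; install it with pip install psutil) that prints per-core load once a second while the game runs:

```python
import psutil

# Print utilization for every logical core once a second. If the game
# really is limited to four cores, half of them should sit near idle
# while the others stay loaded. Run this alongside Fallout 4.
while True:
    loads = psutil.cpu_percent(interval=1.0, percpu=True)
    print("  ".join(f"core{i}: {pct:5.1f}%" for i, pct in enumerate(loads)))
```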
 
I'm actually impressed at how well the 620M does, considering.
+1.
I am planning to give a copy to my cousin to play on his laptop.
He has an Ivy Bridge i7 CPU, 8GB of RAM, and a GTX 555M.
The 555M is around 50% more powerful than the 620M (144 cores vs. 96), and since the 620M averaged 27 FPS with an 18 FPS minimum, the 555M could actually run the game at around 40 FPS with a 27 FPS minimum.
Not bad for a four-year-old midrange laptop GPU...
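Just to show the math: this is a back-of-the-envelope sketch that assumes frame rate scales linearly with shader core count, which is optimistic since clocks and memory bandwidth differ between the two cards as well:

```python
# Linear-scaling estimate from the 620M numbers in the article.
cores_620m, cores_555m = 96, 144
avg_620m, min_620m = 27, 18  # FPS measured on the 620M

scale = cores_555m / cores_620m                      # 1.5x the cores
print(f"estimated avg: {avg_620m * scale:.0f} fps")  # -> 40 fps
print(f"estimated min: {min_620m * scale:.0f} fps")  # -> 27 fps
```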
 
I'm impressed by the performance of the 620M; I've played Skyrim using the Intel HD 4600 iGPU that came with my i7 CPU and get great performance out of it, around the same as a system with a 620M would get. This is just meant to be my work computer, but I've been pretty pleased with how much gaming I've gotten out of it so far (mostly things like Civ V and older titles, of course).

While that minimum frame rate suggests the game isn't going to be very playable as-is, there will no doubt be numerous performance-tweaking mods (and the article notes that default auto-detected settings were used). So I'm now trying to decide whether I should delay my plans for a new gaming PC so I can go for something much better, and just persevere with my current machine for Fallout 4 once performance mods start to appear…

It seems like the minimum requirement of an HD 7870 may be a bit high in reality; Fallout 4 is, after all, just running the Skyrim engine. I'm not sure there are any really new features, just bigger default textures and more detailed geometry.
 
Would you mind seeing how it runs on the 2600K with a 1GB GTX 460? I've been trying to find answers for that one, but no one seems to be running such old cards any more.
 
"Unfortunately, it had difficulty running Fallout 4; its highest average was just 26.817 fps, and that's with a 720p resolution and on the lowest graphical settings possible." So basically your laptop is a xbox one?
 
Would you mind seeing how it runs on the 2600K with a 1GB GTX 460? I've been trying to find answers for that one, but no one seems to be running such old cards any more.

I've been accumulating some info on my site comparing old GPUs to newer ones, including the 460. Some extrapolation is required, but there's enough data to go on, e.g. GPU-heavy tests like Unigine and Call of Juarez. I'm looking for a newer test at the moment, but the days of downloadable standalone benchmarks seem to have passed. Note that I use a 2700K @ 5GHz as a baseline, but various other CPUs are included as well.



What is an i7-5830K? When was that released?...

Presumably a typo.

So Rexly, is that a 5820K or a 5930K?

Ian.




 
NOT A SINGLE WORD on the CRAPTASTIC performance of the 390X?! Not a single comment either?
Hmmm...

Good point, I hadn't really noticed, though the chart style didn't help. Most likely it's due to very different default settings for each config.

Ian.

 


I have an 8-core AMD and a GTX 480, and on Ultra it does lag a little (this is also running at 1920x1200), but it is playable on Ultra. If you were to bring it down to Medium or High, or bring down the resolution, you should be OK. The CPU should be just fine.
 
I found the no-VSync benchmarks of the i7-2600K Sandy Bridge + GTX 970 rig vs. the i7-4790K Haswell + GTX 970 rig interesting: a solid 5%/7%/10% improvement for Haswell over Sandy in minimum/average/maximum frame rates. Since the Haswell turbos at 4.4GHz and the Sandy at a much lower 3.8GHz, it would be interesting to see equivalent 4.4GHz Sandy Bridge performance. And yes, I have the i5 variants of each generation.

With that said, the Excel-chart format chosen for the data does take some time to decode in this article... but still good info nonetheless.
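For anyone wondering how the no-VSync runs are done: the usual Creation-engine tweak (same as Skyrim, as far as I know; double-check against your own install) is setting iPresentInterval=0 in Fallout4Prefs.ini under Documents\My Games\Fallout4:

```ini
; Documents\My Games\Fallout4\Fallout4Prefs.ini
; 0 disables VSync; the default is 1. Note the engine ties physics to
; frame rate, so uncapped FPS can cause glitches.
[Display]
iPresentInterval=0
```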
 
I understand this is not a graphics card competition, but I would suggest a Fallout 4 benchmarks part 2 where you pit all this hardware against each other again, with more spreadsheets.

I'm pretty sure Fallout 4 also has general presets of the "Low / Medium / High / Ultra" sort. Why didn't you sort all the CPU/GPU combos that way? Could you do it in a part 2 article?

Examples:
Fallout 4, 1080p, no VSync, Medium settings (and here you test all the hardware setups).

Fallout 4, 1080p, no VSync, High settings

...and so on.

Can you do it?
 


I do, because I was planning to upgrade from a GTX 970 to an R9 390X.

 


It's probably roughly a 20% improvement between those generations, but it's hard to say where the CPU bottleneck kicks in. Since 4.4GHz vs. 3.8GHz is over 15% faster, that's probably enough to eliminate the CPU bottleneck in this example.
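For what it's worth, here's the quick arithmetic on that clock gap (it assumes performance scales with frequency, which only holds while the CPU is actually the bottleneck):

```python
# Turbo clocks quoted above for the 2600K and 4790K.
sandy_ghz, haswell_ghz = 3.8, 4.4
print(f"clock advantage: {(haswell_ghz / sandy_ghz - 1) * 100:.1f}%")  # -> 15.8%
```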
 
"Why does R9 390x perform poorly? Is this an NVIDIA Game?"

Well, I've confirmed that at the very least the god rays library is GameWorks; it wouldn't surprise me if other things are Nvidia-optimized as well.

AMD still hasn't released a beta driver for Fallout 4 (not terribly shocking); my guess is they have a lot of work to do to optimize their drivers for the tessellation-heavy GameWorks libraries.

On the other hand, Nvidia's "Fallout game ready driver" doesn't enable SLI out of the box, which is pretty funny. It still works great once you manually set "Use AFR 2" for Fallout 4 in the control panel. Supposedly "Optimize 1x1" works for AMD CrossFire in Fallout 4, but I'm only running a single 290X at the moment in this AMD box, so I can't confirm.
 
This is probably about the worst article I have ever read on this site.

Why are you testing the different setups at different graphics settings and then listing them in the same chart? It gives the illusion that lower-end systems are outperforming higher-end ones, which is only compounded by the charts' infinitesimally small text, almost illegible even on a 27-inch screen.
 