Gaming on a 15.6-inch FHD laptop, and how have games evolved in the past couple of years?

rav007

Honorable
Hello

I bought a gaming laptop last September. It has a GTX 765M and an i7-4700MQ, with a 15.6-inch full HD screen. I chose that GPU because it was relatively cheap and could play almost every game released at the time at high to ultra settings in FHD.

However, over the past 15 months many games have been released, a lot of them seemingly poor console ports, with ridiculous system requirements. A few released this year I can only play on medium, and Assassin's Creed Unity only on low, which is agonizing and one reason I haven't bought any new games.

Now, given that the laptop screen isn't a 40-inch panel, I find I usually don't need maximum settings like 8x MSAA and ultra shadows; one setting down will usually suffice for the screen's capability. Does anyone else find this?

Also, thinking about game settings: even though I can only run some of today's games on medium, is that because today's 'medium' is equivalent to 'high' from a couple of years ago and 'ultra' from years before? In other words, have games simply been given more detail now that GPUs are more capable? I would really like to buy and play some of these new games, but I don't want the disappointment of launching one only to find I have to play with settings and visuals that look ancient. I would like the new games to at least look similar to the games I can currently play on high and ultra, if that makes sense. Any advice and shared experiences will be much appreciated.

Thanks
 

chenw

Honorable
The jump in game requirements was, from what I can see, pretty sudden in the past year due to the console generation change. In the few years before that, game requirements were fairly steady because the console market was still Xbox 360 and PS3, which is why you didn't really notice it. I personally hadn't looked at game requirements for a while after I got my 570, but I have since upgraded to SLI 970s, and it looks like things will get worse before the requirements stabilise again.

That being said, mobile GPUs have also come leaps and bounds, thanks to Maxwell's 980M and 970M. The 980M is roughly equivalent in power to a desktop 970, which I find extraordinary, because mobile GPUs usually feel like they're generations behind, not a mere one or two cards behind. (For example, my brother's 860M didn't do well when compared against my 570.)
 

rav007

Honorable
I definitely saw the new console generation as a factor, but when I looked at the actual GPUs, the Xbox One has roughly a 7790 and the PS4 roughly a 7870, I believe. Going by performance statistics, the 7790 is some 20-30% better than my card when I overclock it, and the 7870 looks to be around 50-80% better, which suggests to me that my card will stay relevant for a while. In reality, both consoles struggle with Assassin's Creed Unity too, so it seems their upper bound of performance has already been tested, and if I can play that on low to medium, I should be able to run anything launched for the PS4 or Xbox One in the coming years. It would be interesting to know whether the graphics on low in these new games are comparable to high in games from years ago, though.
 

chenw

Honorable
That comparison doesn't always work, because games on consoles are locked at 720p @ 30fps, which makes them look a lot less demanding than they actually are once they are translated to the PC version; it wouldn't take much of a PC to match that resolution and framerate.

However, the main crux that has fuelled so many debates without a proper answer is how games handle RAM. In consoles, the system and GPU share their RAM, which totals 8GB. Assuming the console OS itself uses 2GB, that leaves about 6GB for the GPU, and that is the budget game developers get at 720p alone; it only gets worse at higher resolutions.

However, there haven't been many conclusive tests that could (at least somewhat) scientifically judge whether VRAM is actually bottlenecking current GPUs. Some say yes, because VRAM gets loaded to the brim in these games and they still stutter (though that can also be caused by a badly coded game simply using as much VRAM as it can grab); others say no. Tom's did a comparison between a reference 290X 4GB and a 290X 8GB, but I feel that test isn't enough, because a single GPU may not currently be powerful enough to make full use of that 4GB. The only accurate indication of a VRAM bottleneck may only show up in multi-GPU setups, where the GPU power ceiling is raised. There just aren't enough apples-to-apples tests out there.

Long story short, consoles use a low resolution and framerate in order to keep up over the number of years they are expected to run, which leads to games designed heavily around console architecture. That architecture is very different from PC architecture, and it negatively affects performance on PC.

There aren't enough developers out there willing to optimise their PC versions. While I cannot agree with that decision (being an exclusively PC gamer myself), I cannot help but understand their position: consoles are identical hardware, so optimising for a specific console is relatively trivial compared to the literally millions of PC part combinations, and high-end cards (in fact even mid-to-high-end ones) remain a small minority, leaving developers little incentive to optimise for them.
 
Solution