Can I run 1366 x 768

giangkiller20

Jan 31, 2018
Basically, I have an i3-4170, 8 GB of RAM, and a GT 705, and I want to upgrade to a GTX 750 (I'm on a budget, so cheaper is better). I'm wondering: can 1 GB of VRAM run modern games at 30 FPS (I don't care about graphics settings), or should I just buy a GTX 750 Ti?
 
At that resolution the 750 should be able to get you 30 fps in most games, as long as the game isn't CPU-demanding, since you have an i3. I'd still try for the 750 Ti, though, not just for the extra memory but because it is a faster card. Since your focus is modern games, you'll want all the performance you can get, even at 1366 x 768.

The amount of VRAM doesn't directly affect performance; there's no technical reason a 1 GB card can't run a game at 60 fps. Where VRAM comes into play is when the amount a game needs at a certain combination of resolution and settings exceeds what the card has. That's when you can lose performance.
 
Resolution by itself doesn't significantly impact VRAM usage. The difference between a 1366x768 framebuffer (about 4 MB) and a 4K framebuffer (about 32 MB) barely makes a dent in 1 GB of VRAM (roughly 3% of the total). Full-screen anti-aliasing will increase this substantially, and options like vsync add an additional framebuffer (a virtual screen in memory that the GPU draws into).
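If you want to sanity-check those framebuffer numbers yourself, here's a quick back-of-the-envelope sketch in Python. It assumes a plain 4-bytes-per-pixel (RGBA8) colour buffer; real drivers also allocate depth/stencil and extra swap-chain buffers, so treat the results as lower bounds.

# Rough framebuffer memory estimate, assuming 4 bytes per pixel (RGBA8).
# Depth/stencil buffers and extra swap-chain buffers are ignored here.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"1366x768": (1366, 768),
                     "1920x1080": (1920, 1080),
                     "3840x2160 (4K)": (3840, 2160)}.items():
    mb = framebuffer_mb(w, h)
    print(f"{name:>15}: {mb:5.1f} MB ({mb / 1024:.1%} of 1 GB)")

Even the 4K buffer comes out around 32 MB, roughly 3% of a 1 GB card, which is the point being made above.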

The vast majority of VRAM is used for holding textures. Each notch you bump up texture quality increases VRAM use by roughly 4x. A 512x512 texture (at 4 bytes per texel) takes about 1.33 MB (1 MB for the base texture, 0.25 MB for the half-size mipmap, 0.06 MB for the quarter-size mipmap, etc.), so 100 textures will eat up around 130 MB of VRAM. Bump up to 1024x1024 textures at roughly 5.3 MB each and now you're over 500 MB and well on the way to filling 1 GB.
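As a sketch of that texture math (assuming uncompressed RGBA8 textures at 4 bytes per texel with a full mipmap chain; real games mostly use block-compressed formats such as DXT/BC, which cut these numbers by a factor of 4-8):

# VRAM for one size x size RGBA8 texture plus its full mipmap chain.
# The chain adds about a third on top of the base level (1/4 + 1/16 + ... ≈ 1/3).
def texture_mb(size, bytes_per_texel=4):
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total / (1024 ** 2)

for size in (512, 1024, 2048):
    per_tex = texture_mb(size)
    print(f"{size}x{size}: {per_tex:5.2f} MB each, about {per_tex * 100:.0f} MB for 100 textures")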

https://upload.wikimedia.org/wikipedia/commons/5/5c/MipMap_Example_STS101.jpg

Anisotropic filtering (basically, textures which are re-rendered for a tilted orientation) will eat up a lot of VRAM too.

https://upload.wikimedia.org/wikipedia/commons/d/dc/MipMap_Example_STS101_Anisotropic.png

When you run at a higher resolution, those low-quality textures start to look blurry at that resolution, so you're tempted to increase the texture quality. *That's* how increasing resolution eats VRAM. If you're content to leave the texture setting where it is, you can crank up the resolution and only use a few dozen more MB of VRAM.

Likewise, when you start running low on VRAM, the first things to go are textures. Games will start dumping textures from VRAM to make room. When an object that needs one of those textures appears on screen, the game will stutter briefly: it freezes until the texture can be moved from system RAM into VRAM, or freezes for longer while it waits for the texture to be read off disk.
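Here's a toy model of that streaming behaviour, purely illustrative (the budget, eviction policy, and stall times are made-up assumptions, not how any particular engine works): drawing with a texture that has been evicted stalls the frame while it is re-uploaded from system RAM, or much longer if it has to come off disk.

# Toy texture-streaming model: VRAM as an LRU cache of textures.
# The budget and stall times below are illustrative, not measured.
from collections import OrderedDict

UPLOAD_FROM_RAM_MS = 5
LOAD_FROM_DISK_MS = 80

class TextureCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB, in LRU order
        self.in_ram = set()            # evicted textures still cached in system RAM

    def used(self):
        return sum(self.resident.values())

    def draw(self, name, size_mb):
        """Return the stall (ms) caused by drawing with this texture."""
        if name in self.resident:
            self.resident.move_to_end(name)
            return 0                   # already in VRAM: no hitch
        stall = UPLOAD_FROM_RAM_MS if name in self.in_ram else LOAD_FROM_DISK_MS
        while self.resident and self.used() + size_mb > self.budget:
            evicted, _ = self.resident.popitem(last=False)  # drop least recently used
            self.in_ram.add(evicted)   # keep a copy in system RAM
        self.resident[name] = size_mb
        return stall

cache = TextureCache(budget_mb=8)      # tiny budget so eviction kicks in quickly
print(cache.draw("rock", 5.3))         # 80 ms: first use, loaded from disk
print(cache.draw("tree", 5.3))         # 80 ms, and "rock" gets evicted to system RAM
print(cache.draw("rock", 5.3))         # 5 ms: re-uploaded from system RAM, not disk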

So if you're content to play with low or medium textures, 1 GB will be plenty. High textures (usually a mix of 512x512, 1024x1024, and a few 2048x2048) will start to be iffy, and very high and ultra are out of the question. The numbers I've heard are that most games keep about 100-150 textures loaded into VRAM at a time, with some modern titles up around 200-300 (the load screens in a game are primarily there for the video card to load new textures for the new zone).
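Putting those rough numbers together (same uncompressed-RGBA8-with-mipmaps assumption as above; the tier mixes and texture counts below are illustrative guesses, not figures from any particular game):

# Rough total-VRAM estimate per texture-quality tier.
# Texture mixes and counts are illustrative, not taken from a real game.
def texture_mb(size, bytes_per_texel=4):
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total / (1024 ** 2)

FRAMEBUFFERS_MB = 8  # a couple of 1366x768 buffers, from the earlier estimate
tiers = {
    "low":    {256: 150},
    "medium": {512: 150},
    "high":   {512: 40, 1024: 100, 2048: 10},
    "ultra":  {1024: 50, 2048: 100},
}
for name, mix in tiers.items():
    total = FRAMEBUFFERS_MB + sum(n * texture_mb(size) for size, n in mix.items())
    verdict = "fits in 1 GB" if total <= 1024 else "exceeds 1 GB"
    print(f"{name:>6}: ~{total:4.0f} MB ({verdict})")

The "high" tier lands around 800 MB, close enough to the 1 GB limit that geometry, render targets, and driver overhead make it iffy, which matches the conclusion above.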
 