How long do graphics cards last compared to consoles, performance-wise?


connor_6

Oct 7, 2015
If I were to buy a graphics card similar in performance to a PS4, e.g. an R9 270 or a 750 Ti, would it always be able to play games at least to the equivalent of a PS4 for the life of the console? Or do PC games become more demanding, so I would need to upgrade again in a couple of years to match the PS4? Thanks.
 
Solution


Yeah, if your PSU is good enough then you can get a good deal there; the 670 is roughly a GTX 950-960 in terms of performance.
 


Monitors do not upscale; they fit the image and redistribute pixels.
 


Most companies have stopped producing native 240 Hz TVs and anything higher as well, including TVs such as Sony's 930C and 940C. "Motion rate" now dominates the scene. It was determined that as long as the hardware behind the 120 Hz panels was sufficient and the pixels were fast enough to cope with a true 120 Hz signal, better interpolation resulted in a better picture.
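To show where those extra frames come from, here's a minimal sketch of the crudest possible interpolation: blending adjacent 60 Hz frames into in-between frames for a 120 Hz panel. Real TV processors use motion estimation rather than a plain average, and the function names below are made up for this illustration, not any vendor's actual pipeline.

```python
# Naive frame interpolation sketch (illustrative only, not a real TV algorithm).
import numpy as np

def interpolate_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Return a synthetic frame halfway between two source frames (plain blend)."""
    return (frame_a.astype(np.float64) + frame_b.astype(np.float64)) / 2.0

def to_120hz(frames_60hz: list[np.ndarray]) -> list[np.ndarray]:
    """Interleave original 60 Hz frames with synthesized in-between frames."""
    out = []
    for a, b in zip(frames_60hz, frames_60hz[1:]):
        out.append(a)
        out.append(interpolate_midframe(a, b))
    out.append(frames_60hz[-1])
    return out

# Three dummy 2x2 frames of increasing brightness.
frames = [np.full((2, 2), v, dtype=np.uint8) for v in (0, 64, 128)]
print(len(frames), "->", len(to_120hz(frames)), "frames")   # 3 -> 5
```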

By "adding detail" the up-scaling in a TV uses algorithms to place the correct colors in the correct spots, as well as move lighting and brightness. They don't just open the same image on a higher resolution. If you have no idea what I'm talking about I suggest you read a few articles about Sony's "Mastered in 4K" compression and upscaling algorithms and how the math depicts the digital signals in both cases. You will get better lighting and shading, as well as AA and sharpness from a better up-scaler.

Or just go outside and look at some new TVs. I would take some game recordings from your PC at home, put them on a flash drive, go to a store that carries TVs (like Best Buy), and just throw them up on multiple TVs. You can even see places where the math in the algorithm makes serious mistakes. If you want to see drastic errors in the process, throw an extremely high-action scene onto LG's OLED TVs and you will see things actually change shape and color, or get hidden behind objects. It's weird the first time you notice it, but after a while you'll see what I mean by TVs changing the image significantly.
 
You realise that you're talking about the X1 chip, which only upscales, it doesn't add detail; what it's doing is maintaining (most of) the quality of the original source when it's upscaled to "4K". You're also talking about HDR, but in order to get better lighting and brightness, the content needs to be HDR compatible as well: not just Rec. 709, but Rec. 2020, 10-bit. There's no magic involved, and no special processing. You can't add detail that isn't there in the source. And 240 Hz TVs are still being manufactured; high-end active 3D sets use native 240 Hz panels. Motion rate doesn't dominate the scene, not sure where you got that from?
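On the 10-bit point, a quick back-of-the-envelope shows why the bit depth of the source matters for HDR grading; this is just the basic arithmetic, nothing specific to any one TV.

```python
# Bit depth vs. available tonal steps per channel.
for bits in (8, 10):
    steps = 2 ** bits
    print(f"{bits}-bit per channel: {steps} levels, "
          f"{steps ** 3:,} representable colors")
# 8-bit:  256 levels  -> 16,777,216 colors
# 10-bit: 1024 levels -> 1,073,741,824 colors
```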

 


There isn't any redistributing to be done with 1080p on a 4K monitor; the image fits perfectly without it. These are issues for 720p on a 1080p display or 1080p on a 1440p display, not for 720p on a 1440p display or 1080p on a 4K display (at least assuming the typical 4K resolution for monitors, 3840x2160).
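To make that arithmetic concrete, here's a quick check of which common scaling paths are integer multiples (assuming the standard 16:9 resolutions above); the helper function is purely illustrative.

```python
# Integer vs. non-integer scale factors for common 16:9 resolutions.
RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def scale_factor(src: str, dst: str) -> float:
    """Linear scale factor between two 16:9 modes (width and height ratios match)."""
    return RESOLUTIONS[dst][0] / RESOLUTIONS[src][0]

for src, dst in [("720p", "1080p"), ("1080p", "1440p"),
                 ("720p", "1440p"), ("1080p", "4K")]:
    f = scale_factor(src, dst)
    kind = "integer (clean fit)" if f.is_integer() else "non-integer (interpolation needed)"
    print(f"{src} -> {dst}: {f:.3f}x  {kind}")
# 720p -> 1080p: 1.500x non-integer, 1080p -> 1440p: 1.333x non-integer,
# 720p -> 1440p: 2.000x integer,     1080p -> 4K:    2.000x integer
```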
 


TVs do not simply display the image. Every single TV on the market applies some form of alteration; it's as simple as that, dude. The X1 chip is just the one that does it with the least amount of error and lag.

Don't even get me started on the technologies in the screens themselves that aren't in monitors either...