The end of graphics cards?

SortNVF

I'm not asking if we are all going to switch to onboard graphics.

The question is simply this: given the limits of the human eye and all, eventually cards will be so powerful that more frames or pixels become pointless, unless screens grow to wall size. Actually, even if that happens, there's still a limit to how much we will ever need... or is there?

Anyway, I played games when screens were monochrome, so to some extent I'm easily satisfied. I'm probably not able to visually tell the difference between 60 and 160 fps, etc.

But still, aren't we getting close to little or no visual benefit from GPU upgrades these days?
Obviously there's power usage and cooling to consider as well, but the basic question remains: when will enough be enough?

All this is obviously without considering virtual reality, holographic projectors, etc.
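
Out of curiosity, here's the raw arithmetic on the frame-time side of the 60 vs 160 fps question (a quick back-of-the-envelope C++ sketch, nothing vendor-specific):

```cpp
#include <cstdio>

int main() {
    // Frame budget at each refresh rate: 1000 ms divided by frames per second.
    const double rates[] = {30.0, 60.0, 160.0};
    for (double fps : rates)
        std::printf("%6.1f fps -> %5.2f ms per frame\n", fps, 1000.0 / fps);
    // 60 fps leaves ~16.7 ms to render a frame; 160 fps leaves only ~6.3 ms.
    // So the jump cuts the render budget by ~2.7x, whether or not the eye can tell.
    return 0;
}
```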
 
In a long, long time. With the addition of 4K and 5K resolutions, we are slowly getting closer to what the human eye actually sees. One day we will be able to play games as if they were real life.
 
Of course there will come a time, just like there's functionally no point in buying a dedicated sound card these days (if you ever see a subjective analysis, many cheap onboard sound chips actually beat pricey dedicated cards).

I don't think we're close to the end of improvement in graphics. We're still using raster-based rendering that approximates 3D using triangles. To get to the level of perfection you're talking about, we're going to need real-time ray tracing, which is not feasible with current or even near-term foreseeable technology. I'm pretty sure we'll be seeing video processing upgrades for years to come. I'm not sure silicon can even provide the theoretically perfect result, so we may get bogged down before we achieve it. We could all be long dead before then.
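
To make the cost concrete, here's a minimal sketch of the operation a ray tracer repeats constantly: testing a ray against a triangle (the standard Möller-Trumbore intersection; a toy example, not any engine's actual code). A real renderer fires millions of rays per frame, plus bounce rays for shadows and reflections, which is why real-time ray tracing is so expensive:

```cpp
#include <cstdio>
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Returns true and sets t (distance along the ray) if the ray with
// origin o and direction d hits the triangle v0-v1-v2.
bool rayTriangle(Vec3 o, Vec3 d, Vec3 v0, Vec3 v1, Vec3 v2, double& t) {
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(d, e2);
    double det = dot(e1, p);
    if (std::fabs(det) < 1e-9) return false;   // ray parallel to triangle plane
    double inv = 1.0 / det;
    Vec3 s = sub(o, v0);
    double u = dot(s, p) * inv;                // first barycentric coordinate
    if (u < 0.0 || u > 1.0) return false;
    Vec3 q = cross(s, e1);
    double v = dot(d, q) * inv;                // second barycentric coordinate
    if (v < 0.0 || u + v > 1.0) return false;
    t = dot(e2, q) * inv;
    return t > 1e-9;                           // hit must be in front of the origin
}

int main() {
    Vec3 v0{0, 0, 5}, v1{1, 0, 5}, v2{0, 1, 5};  // triangle 5 units down the z axis
    double t;
    if (rayTriangle({0.2, 0.2, 0}, {0, 0, 1}, v0, v1, v2, t))
        std::printf("hit at t = %.2f\n", t);     // prints: hit at t = 5.00
    return 0;
}
```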
 
Right now not even the most powerful supercomputer can simulate the human brain; even simulating real turbulence in water takes weeks to complete. So no, there will probably never be too much power in a computer, and even at Moore's-law pace it will take centuries, maybe even millennia, to get there, so don't worry about it 😉
 
When I was a kid my parents bought me Pong - that old game with a white dot bouncing around the screen. Today there's BF4 and whatnot. The difference in 35 years is amazing and if the trend continues at the same rate we'll be looking through live windows in 30 years. But I don't think that'll happen. The trend has curved down.

Heck, the 6870 I bought four years ago can still play every game on the market today and probably will for the next 3 years or so at least.

Before 4K, it seemed like high-end GPU owners had maxed out speed/fps and were simply waiting for programmers to make more GPU-challenging games. But 4K has taken care of that.
 
Indeed, 4K set back my assumption a little. I realize that some new and better screens will come along to get the ball rolling again. But like Snookslayer is saying, we were kind of getting there, weren't we?

I tried to exclude new tech like VR and bigger screens, but I see how 4K slipped past all that. Obviously I'd like 4K myself, eventually, because it's a big step up in gear expenses.

Also, I think I'm old enough to claim that quite a few of the visual improvement steps were followed by beautiful games with no real content, until eventually decent games caught up with the tech and the cycle started over.
 
Well, it seems that way, doesn't it? But I think graphics cards have more room to grow. It might slow down a bit, but GPU makers will think of something so discrete GPUs stay relevant as long as possible. Even if graphical complexity reaches the point of diminishing returns, GPU makers will come up with other ideas, like much bigger resolutions. And right now we already know that 8K will be next after 4K.
 
The new console releases might also help them a bit. With the new consoles, some devs are aiming for much more complex graphics than were possible on the 360 and PS3. Though it is funny when you see that the latest console hardware still has problems pushing 1080p, especially the new Xbox.
 
Consoles honestly aren't helping anyone. They dumb down game titles to fit controllers while drawing in customers early on, because "ooh, Halo looks so awesome and it's not for PC". But with the money sunk into a console, all the game devs now need their games to fit the format or they can't sell. Hence dumber games, dumping the fun of PC gaming...

I'm thinking we're at a point where graphics evolution can't be tweaked enough to make the next console all that awesome. And finally, because I hate the damned boxes and the focus-group game development that came along with them.

Personally, it was never about the last pixel for me. I want a game to be like a good book on the story side; from there, the visuals are just support. But obviously a new shiny thing is, well, new and shiny, and thus awesome even to me.

It would concern me if it were like the '90s, when your computer was basically outdated between the time you ordered it and the time it arrived, when you needed a P3 to get the next title running at all, and so on. These days I see brand-new games running at a playable rate on late DDR2-era machines, which is rather impressive considering DDR4 is taking over soon-ish. It's also pretty obvious that i7s aren't really taking giant steps ahead, especially with games barely able to use multicore CPUs and clock speeds not really rising. Even if games and OSes somehow get a handle on multicore CPUs now, most people have at least 4 cores, and 6 isn't uncommon.

It's not that I don't see possible development in GPUs; I'm just seeing that it's slowing down, and generally PCs are getting ahead of the tasks we're able to put to them. Or is that just me?
 
I've heard that argument about developers "dumbing down" games to fit consoles, and that's probably true. But they're also probably doing it for all the owners of less powerful PCs too. Not everyone owns an i7 and a GTX 980.

I do agree with your assertion that advancements in CPUs seem to have greatly outpaced GPUs.

 
Well, Snookslayer, it's not so much an argument as it is a fact. Have a look at titles like Diablo 3, X Rebirth, etc.
Focus groups added the junk that everyone liked in WoW, and interfaces were dumbed down to work with two-bit consoles. All that left was bland games nobody really liked, because they had all the stuff the others had but no real appeal to a dedicated fan group.

As it is now, CPUs aren't really faster or stronger than GPUs; we just lived through an era of awesome graphics development and very little risk-taking on the investor side. So every new game had to be awesome-looking, contain focus-group junk (random loot boxes, "colorful" repetitive characters) and preferably fit tablets, consoles and other inferior media.

But then look at a game like Elite Dangerous. Definitely a limited fanbase, sure. It had to be crowdfunded to ever get made. Is it going to work with an 8-button console controller? Nope.
Will it be playable on a tablet? Nope.
Will it have full-scale star system models with orbits and simulated navigation, tons of buttons to do things, complicated submenus? Yup, and your 2009 CPU probably won't break a sweat running it.
Your GPU, however...

For ages, making games harder was all about adding DPS and HP to villains. I must have fought Tavion thousands of times in Jedi Outcast https://www.youtube.com/watch?v=2OhdmFQhX-A
Even recently.
Not because it looked any good, or even because I'm all that good at the game.
But that boss is a proper monster: she can do most of the same things as me and take the same hits, and the way the AI runs the combos and counters... Well, there's that, and the old UFO/X-COM that'd tear you a new one and call you a crybaby if you quit.

You just don't see enemies like that in focus-group games. It's all fairly beatable, sometimes even comes with cheat codes, because you wouldn't want people feeling actually challenged by the game. No, you want them pulled along with a feeling of barely making it all the time, preferably buying new stuff online to keep everything just above easy mode, constantly cheered on as if playing a slot machine where that extra buck will always take you to heaven.

Game developers forgot that back in the day, people took to pirating games not just to get out of paying, but because titles came out with big commercials, flashy graphics and zero content. I remember paying 100+ euros for a game that took me 3.5 hours to blaze through. Hell, I picked up girls in bars spending less and getting more replayability.

So in part my question is really: "Will focus shift to actual game content soon?"
The reason I think it could happen is the gap to 4K. We can pretty much run everything in HD, and honestly I don't care for more detail; I already think HD TV looks like it was filmed in a cardboard box. Also, there's price to consider, at least until current monitors wear out and 4K price tags drop.
 
Your post reminds me of the Elder Scrolls, which I'm slowly nearing the end of (yes, I'm always years behind everyone else). In any case, the most fun I had playing the entire game was one particular boss a while back that for whatever reason I couldn't beat. I think I accidentally ran into something I wasn't supposed to until several levels later. After much frustration I ended up going back to my weapons stash and looking for stuff that might help. I tried a bunch of weapons/spells before finally finding one that barely destroyed it. It was the most fun I had the entire time playing, before or since. The game is pure bling, like you're saying, but challenge-wise... meh. It's mostly mouse clicking and moving around until you beat whatever you're facing.

So I agree that it's too bad programmers seem to focus more on bling than content. At least with grand strategy games like Crusader Kings 2, you do get a challenge (although no bling).

The good thing about games like Battlefield is that the boss/foe is an online person. You get the bling and the challenge all rolled into one.
 
A 480p TV show still looks more real than a 4K game on three monitors at ultra settings, so I think we have much improvement left indeed. I think we should not assume that graphics cards will become "too good" one day; people were saying similar things all the time: "Wow, look at these PS2 graphics!" If you watch Toy Story 1, the first fully 3D computer-animated movie, it still looks better than a game on ultra settings today, and it's from 1995. Compare that to animated movies today, and Toy Story is way worse. I think there's much room for improvement.
 
As for Skyrim, I just picked up the Legendary Edition. I had the original release finished but started from scratch to play the added content as it unfolds alongside the original storyline.
Skyrim, like all the Elder Scrolls games, is highly focused on content and storyline, and that is at least a branch away from pure bling, though the challenge level and hands-on skill required are set kind of low and uncomplicated.

Indeed, turkey, 480p TV looks more real. But setting aside actual movies filmed IRL, the animated films you see have undergone thousands of hours of rendering, shading, etc. to appear as "real" as they look. Doing that on a server farm, with as much time and power as it takes, cannot be compared to real-time interactive graphics.
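
Just for scale, here's a rough C++ sketch of the gap (the hours-per-frame figure is a loose assumption for offline CG film rendering, not an actual production number):

```cpp
#include <cstdio>

int main() {
    // Assumed offline budget: ~4 hours of farm time per finished frame.
    const double farm_seconds_per_frame = 4.0 * 3600.0;
    // Real-time budget at 60 fps: one sixtieth of a second per frame.
    const double game_seconds_per_frame = 1.0 / 60.0;
    std::printf("offline budget is ~%.0fx the real-time budget\n",
                farm_seconds_per_frame / game_seconds_per_frame);  // ~864000x
    return 0;
}
```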

However, you do have a point: if 480p can look that "real", why would I want 4K looking less real over just 480 lines of interactive "real"-looking pixels?

The big issue is things like shading and reflections, all handled by algorithms running on the GPU. At our current tech level, these algorithms are set to do the job more or less right, but preferably fast, so we can keep FPS up.
Obviously at some point we will want it to look better, but there's no way we're giving up our FPS standards. So in that area, sure, we want better algorithms running, and for that we need more computing power, hence more powerful GPUs.
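
To give an idea of what "shading algorithms" means in practice, here's a toy sketch of one per-pixel lighting computation (simple Lambert diffuse plus Blinn-Phong specular; real engines layer many more terms on top, like shadows, reflections and ambient occlusion, and every extra term costs frame time):

```cpp
#include <cstdio>
#include <cmath>
#include <algorithm>

struct V3 { double x, y, z; };
double dot(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
V3 normalize(V3 v) { double l = std::sqrt(dot(v, v)); return {v.x/l, v.y/l, v.z/l}; }

int main() {
    V3 n = normalize({0, 1, 0});                      // surface normal
    V3 l = normalize({1, 1, 0});                      // direction to the light
    V3 v = normalize({0, 1, 1});                      // direction to the camera
    V3 h = normalize({l.x+v.x, l.y+v.y, l.z+v.z});    // half vector between the two

    double diffuse  = std::max(0.0, dot(n, l));                    // Lambert term
    double specular = std::pow(std::max(0.0, dot(n, h)), 32.0);    // shininess 32
    std::printf("pixel brightness = %.3f\n", diffuse + specular);
    return 0;
}
```

The GPU runs something like this (and usually much more) once per pixel, per frame.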

It's possible that with 4K's pixels the algorithms will actually do better, because there's more detail to work with, and thus maybe it will be easier to emulate "real"-looking things by just plain drawing them rather than running tricks and guesswork to get the shades and reflections right. But I'm not really that into how pictures are made, or even any good at judging picture quality.
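
For what it's worth, the raw pixel arithmetic behind the jump to 4K (a trivial sketch using the standard 1080p and 4K UHD resolutions):

```cpp
#include <cstdio>

int main() {
    long hd  = 1920L * 1080L;   // 1080p:  2,073,600 pixels
    long uhd = 3840L * 2160L;   // 4K UHD: 8,294,400 pixels
    std::printf("1080p: %ld pixels\n4K:    %ld pixels (%.1fx)\n",
                hd, uhd, (double)uhd / hd);  // 4K is 4.0x the pixels
    return 0;
}
```

So at the same settings and frame rate, the GPU has to shade four times as many samples per frame.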

I do know that for some reason HD does not always look very real to me. I can't put my finger on it, but somehow the sharp detail of the picture makes the background look too close, which makes even high-quality movies (e.g. the LOTR trilogy) look kind of cheaply made in some scenes.

Not that I'd recommend the movies, but the Twilight series, especially the first movie, at least on early HD TVs, was a clear example of what I'm trying to describe.
 
@SortNVF: You're incorrect that comparing server farms to gaming cards is a bad comparison; it just shows that if server farms keep getting better, maybe someday a single card will be as powerful as a whole farm is today.

Also, IMO LOTR looks more real than The Hobbit.
 
I think they know very well that PCs have flexibility: if you can't run it on high, you set it to medium or low. So it depends on the developers themselves whether they want to push the graphics higher on PC (and that depends on whether they think it's worth spending the resources to do it). But what I don't really like is the talk about how MS and Sony try to force developers to "dumb down" the PC version so the console version doesn't look bad compared to its PC counterpart.
 
Interesting. Reminds me of the time my 16 y.o. nephew who plays consoles came by my house a couple years back and played BF3 on my computer. The first thing he said was "wow, look at the graphics!" It was the first time I realized there was any difference.

Sucks to know the graphics could be even better if developers built games to maximize a PC's potential.
 
Well, turkey, I wasn't comparing server farms and post-editing to graphics cards. I was saying it's distinctly different from handling game graphics live, and thus you can't just demand the same and claim developers aren't doing their jobs.

As for corporations that can and have, in the past, sued people to stop criticism, bought into titles that didn't fit their agenda and destroyed them, and deliberately run artists so hard that they eventually died or quit... Well, I'm not naming anyone, but I could be saying that kind of business is not improving the final products in our hands.
 
Better relative to consoles, no doubt. That was my point. I don't play consoles anymore so I didn't realize BF3 on console had significantly different graphics than on PC.

Hypothetically, it would be neat to see a game specifically designed for, say, the top 5% of PCs. I couldn't play it (my 'puter ain't top 5%), but just to see what developers could do with the best hardware currently available.
 
Graphics on PC compared to console... Well, consoles will always get the optimized version, because it's just that one GPU in every single Xbox, PS, whatnot. So as long as you code to its strengths, you'll get a decent result.

The PC crowd, however, is going to be running a mix of the last five years of hardware, and you've got to get your game running on as many of those cards as possible, or live with staggered sales as people upgrade. Hence the PC version may not be as shiny as the console version. But then again, how long did it take from the Xbox 360's release until PC graphics beat it completely?
Once you get that 360, you're stuck with what it can do. With a PC you can just upgrade, and as long as your PCIe slot can keep up, you're pretty much good. Also, with the proprietary formats of console games, you're kind of buying in blind trust that whatever company you bought from will supply games worth playing.

But all that aside, what I originally meant by dumbing down was simply... anyone remember flight-sim games? 30+ function keys, fuel gauges, temperatures, hours of work to get anywhere, etc.
How's that done today? Oh, you have 8 buttons and a joystick, and if you crash you instantly reload with barely any loss. To make sure you get your little victory, you get on-screen pictures of the buttons you're supposed to be pressing, and even if you fail half of that you're probably still going to make it. If you don't, you can always buy some DLC and make it even easier.

But sure, it looks pretty good; it's just not an actual challenge. You could of course go online and play against human opposition, but that's just bound to kill the immersion of it all, due to half the people being 14-year-olds named after recent action heroes and moving around as if they were on drugs. Worse yet, if the console-like interface isn't constantly forcing you to click combos, you run the risk of them writing things to you.
 
Nah, Snook. That's just the '90s all over again.
I remember half the block not being able to play Morrowind, due to specs set for early next-generation CPUs and GPUs.

It's just going to drive people off the game until they get the tech, and once they upgrade, they won't buy it at a premium price.

Oh, on that note, Bethesda sure thinks a whole lot of themselves and Skyrim, with the price tag they keep on Steam. I paid full premiere price for the original game, but they want either that same amount for a brand-new copy with all expansions included, or like 1.5 times the price to upgrade step by step.
Honestly, how old is that game? 4-5 years? Full price?

Can't say I blame kids for adjusting the budget a little with a bit of piracy here and there.

Now, I'm not against companies making money off their products; I even go the extra mile to make sure I pay some companies for a good product. But I may as well admit that not every software, music or movie maker out there still enjoys that courtesy, partially due to severe mishandling of products and customers... In no particular order: EMI, Sony, Microsoft, EA Games, Ubisoft.
 

Crysis was a game more or less only the top few percent of computers could run at an acceptable FPS.
Crysis 3 is another.
I'd like to say GTA 4, but I just can't; that game is just unoptimized, rather than being something designed for high-end gaming hardware.