PC GPU less powerful than a console's?

Status
Not open for further replies.

henrytowns

Distinguished
Feb 20, 2011
Hi,

I don't really know what category this question fits in, but here it is.

I am amazed by the graphical detail of new CONSOLE games like Uncharted 3: Drake's Deception, L.A. Noire, BF3, Crysis, Gears of War 3, God of War 3, etc.

What I don't understand is how consoles' DX9 GPUs (e.g. the PS3's GPU, based on the GeForce 7800) are able to show such remarkable shading detail (seemingly Shader Model 4.0 or 5.0 quality in the latest Uncharted), motion blur and other graphical effects which, if rendered on PC, would seem to require a modern GPU like a GeForce 560 Ti.

Maybe I am missing something here, but as far as I can tell, a 7800 GT on PC would not be able to run Uncharted or God of War 3 with the same quality and speed as they run on a PS3. I am sure the games would not even start. Metro 2033, for example, would not start on my ATI X1950 Pro, which was faster than a 7800 GT, yet Metro 2033 looks great on the Xbox 360. And Metro: Last Light is coming to the PS3 with graphics sick enough to easily devour a single poor GTX 560 Ti, while still managing to show its colors on consoles.

This really bugs me. Even if the games are a tad toned down on consoles, why can't we tone down all the modern games and run them on our old DX9 GPUs? Or is it some gaming industry conspiracy to make us buy new GPUs every year?

I want to clear my mind on this one. Everything I stated above is my opinion, which might be wrong. Please respond with your thoughts; all are welcome!
 
It's amazing what you can do when you know exactly which GPU the game will be running on: you can optimize your software for it and compile it in the optimal way for all the nuances of the hardware, while PC games must be written in a way that is friendly to a variety of hardware. It's also important to note that most console games run at much lower resolutions. Many 360 and PS3 games are only 720p, which is much easier on a GPU than running a PC game at 1920x1200; that's only about 40% as many pixels for the console to deal with.
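If you want to check that pixel-count claim yourself, here is a rough back-of-the-envelope sketch in Python (the resolutions are just the ones mentioned above):

# Rough pixel-count comparison between a typical console render target
# and a common PC desktop resolution (numbers from the post above).
console_w, console_h = 1280, 720    # many 360/PS3 games render at 720p
pc_w, pc_h = 1920, 1200             # common 24" PC monitor resolution

console_pixels = console_w * console_h   # 921,600
pc_pixels = pc_w * pc_h                  # 2,304,000

print(f"Console pixels per frame: {console_pixels:,}")
print(f"PC pixels per frame:      {pc_pixels:,}")
print(f"Console / PC ratio:       {console_pixels / pc_pixels:.0%}")  # ~40%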


You can tone down games to still run them on your old DX9 GPU; it's called 1280x1024 resolution and lowest settings for many games. For DX10-only titles you are out of luck. But if developers keep providing backwards compatibility there is little incentive to upgrade, and software gets stuck in a rut where it can't move forward without risking alienating its customer base, so they try to alienate only a few customers at a time; people upgrade and the developers still make money. You will notice there are no DX11-only games out yet; they all have a DX10 or DX9 path to support older systems, so I would say we do still tone games down to run on old GPUs.
 
That's because most of today's PC games are console-optimised. The console API is also more optimised than the PC's. Game companies think they lose a lot to PC piracy, but they ignore the illegal downloading of console games. So developers optimise games only for consoles, not for PCs, and we get good PC games with crap graphics even though we have more horsepower than the consoles.
 
I don't know what consoles you are looking at, but my PS3's graphics are poop compared to my PC's, and I don't even have a very high-end PC. Two major things that are lacking in console games are TEXTURE DETAIL and DRAW DISTANCE. These are kept to a minimum to speed things up and save RAM, which consoles are short on. And as said above, console games are coded directly for the hardware: no need to go through DirectX, no loading the game on top of Windows, virus scanners, etc., and no need to make the game work on 100 different types of video cards and CPUs, all of which makes the PC less efficient.
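To put the RAM point in perspective, here is a rough sketch of how fast uncompressed textures eat into a console-sized memory pool (the 2048x2048 texture size and the 256 MB video memory figure are illustrative assumptions, not numbers from any particular game):

# Rough texture-memory budget sketch (illustrative numbers only).
width, height = 2048, 2048          # one "high-detail" texture, assumed size
bytes_per_pixel = 4                 # uncompressed RGBA8
mipmap_overhead = 4 / 3             # a full mip chain adds roughly one third

texture_bytes = width * height * bytes_per_pixel * mipmap_overhead
texture_mb = texture_bytes / (1024 ** 2)

video_memory_mb = 256               # assumed console-class video memory pool
print(f"One texture: ~{texture_mb:.1f} MB")
print(f"Textures that fit in {video_memory_mb} MB: ~{video_memory_mb / texture_mb:.0f}")

Only about a dozen textures that size fit before the pool is gone, which is why console versions ship with smaller textures and shorter draw distances.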
 
What about the shader detail in Uncharted, in the latest installment? I can see the detail on Drake's face. I can imagine a console GPU might not be powerful and things need to be toned down, but how come it shows off pixels that look like they come straight from Shader Model 4.0?

I don't own a PS3 or Xbox; I have a good PC. But I have seen gameplay videos of Uncharted and GoW 3 and they look stunning! I don't know how dull they can really look if they look jaw-dropping in gameplay videos.

@hunter315
And The Witcher 2 was a PC exclusive at first, and it does not support DX9; it needs an 8800 GT minimum. If you lower the settings the game starts to look like crap, but even that crap does not work on a DX9 GPU.
 
It's amazing how much better things look when you scale them down to the size of a YouTube video. And are those videos actual gameplay or pre-rendered cutscenes? Cutscenes are always of higher quality, as they aren't rendered on the spot and take very little GPU power to show. Once you spread those games across a 50" TV, running at 1080p at best, they have to be much grainier than a PC running at 1080p on a 24" screen; the pixels on the TV are just so much larger.
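As a rough illustration of that pixel-size point, here is a quick pixels-per-inch calculation for 1080p on a 24" monitor versus a 50" TV (the screen sizes are just the examples from this post):

import math

# Pixels per inch for a 1920x1080 image on two different diagonal sizes.
def ppi(width_px, height_px, diagonal_inches):
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

for size in (24, 50):
    print(f'1080p on a {size}" screen: ~{ppi(1920, 1080, size):.0f} PPI')
# ~92 PPI on the 24" monitor vs ~44 PPI on the 50" TV: each pixel is
# roughly twice as wide, so jagged edges are far more visible up close.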

Shader models are a standardized baseline, so a game knows that if a card has shader model X it can do Y. With a console, though, you know exactly which machine it is, so you already know exactly what it can do, and you can use interesting combinations of commands to get higher-quality graphics out of a weaker card in ways that wouldn't be doable for something made to run on many different cards. Almost all of the higher-level commands in the newer versions of DirectX are simply automated routines that execute many simpler commands that would be tedious for a person to code repeatedly. Never underestimate what can be accomplished with simple commands and a lot of creativity.
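As a toy sketch of that difference (hypothetical names, not any real engine's code): a PC renderer has to ask what shader model it was given and pick a safe path, while console code can simply assume the one GPU it will ever run on.

# Hypothetical illustration of capability-driven vs. fixed-hardware rendering.

def pick_render_path_pc(shader_model):
    """PC path: decide features at runtime from the reported capability level."""
    if shader_model >= 4.0:
        return ["per-pixel lighting", "soft shadows", "full-screen motion blur"]
    elif shader_model >= 3.0:
        return ["per-pixel lighting", "simple shadows"]
    else:
        return ["per-vertex lighting"]

def pick_render_path_console():
    """Console path: the hardware is known, so the best tricks for this exact
    GPU are baked in at development time rather than chosen generically."""
    return ["hand-tuned lighting", "custom blur tuned to this GPU", "baked shadows"]

print(pick_render_path_pc(shader_model=3.0))
print(pick_render_path_console())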


Edit: As for the comment about The Witcher 2, allow me to sum it up this way: "Oh noes, my beautiful brand new game requires that I have a card newer than four years old! Oh nooooooo!!!!" Seriously, if you are planning to run brand-new games and you are still two versions of DirectX behind because you have a four-year-old card, that's a different problem altogether from optimization. The latest Steam hardware survey shows only 21.24% of gamers using a system that does not support at least DX10; that's a fairly small number, and new games likely don't affect most of them.
http://store.steampowered.com/hwsurvey
 


Even if you scale Mafia 2 down to 640x480, it still won't work on DX9. New games don't even start on DX9 cards because they need Shader Model 4.0. Crysis 2 used DX9 and yet it won't run on a DX9 GPU. Later the DX11 patch came out and the game looked even prettier, but the game was already pretty on DX9, and all that time I was thinking it was DX11!

So if DX9 could do so well, why do other games that look less polished than Crysis 2 demand a DX10 or DX11 card? So I am guessing Uncharted and GoW 3 detail could be produced on a PC DX9 GPU if a game were written specifically for the PC? At least as a test?

Edit:
@hunter315
I have a GTX 560 Ti, 8 GB RAM and a Core i5 2500K. I just wanted to know why certain games look so good on DX9 while others need a higher DX version to look the same.
 
If we had PC games that were as optimized as console games are today, then on a high-end PC (580 SLI) the graphics would be at Avatar-level quality. Maybe one day soon.......
 


Is a PC-exclusive game not a game optimized for PC? If so, then some PC-exclusive games have already come out and they did not look special. If PCs can do Avatar, then it's a shame we don't even have a demo we can run to test our GPUs' capability. I have tried the latest version of 3DMark, but of course it's not Avatar quality.

But if games will not be written specifically for the PC, it means we will forever be upgrading our GPUs, DX10, DX11, possibly DX12, DX13, all while never taking full advantage of their power. That's amazingly disturbing.
 


It's specific to the PC, but there are endless numbers of PC configurations and architecture types, so it is impossible to fully optimize the way the consoles have with only two specific configurations. Plus, the console developers have had 6+ years to work on and perfect their graphics engines and code for one or two specific systems. My GTX 590's graphics still look way better, though. 😀
 
You will never have games optimized for specific PC hardware; that's why hardware abstraction layers and APIs like DirectX exist. They keep programmers from having to compile 200 different versions of their code to support all the hardware combinations. DirectX comes with overhead, both in itself and in the drivers that convert the DX commands into hardware-level commands for the cards, but your only other alternative would be recompiling every game every time you made a hardware change, which would take weeks for each one. Consoles don't need drivers, APIs or hardware abstraction layers in the same way because there is only a single hardware combination per console, so the game can be compiled directly into code for the hardware rather than into code that another program converts into hardware instructions at runtime.


If they were to optimize a game for specific PC hardware, it would only work on a very few specific hardware combinations (a dozen at most); any deviation from that would cause errors. But removing the drivers' need to convert commands into instructions would probably net another 10% of performance.
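Here is a purely illustrative sketch of that layering (none of these classes correspond to real DirectX or driver code): on the PC a draw call passes through the API and a driver before reaching the hardware, while console code targets the hardware directly.

# Purely illustrative: generic API + driver layering vs. direct hardware code.

class Hardware:
    def execute(self, native_command):
        pass  # pretend this runs the command on the GPU

class Driver:
    """Translates generic API commands into this card's native commands."""
    def __init__(self, hardware):
        self.hardware = hardware
    def submit(self, api_command):
        native_command = f"native({api_command})"   # translation = extra work per call
        self.hardware.execute(native_command)

class GraphicsAPI:
    """Hardware-agnostic layer the game talks to (a DirectX-like role)."""
    def __init__(self, driver):
        self.driver = driver
    def draw(self, mesh):
        self.driver.submit(f"draw {mesh}")

# PC-style: game -> API -> driver -> hardware (works on any card with a driver).
api = GraphicsAPI(Driver(Hardware()))
api.draw("character_mesh")

# Console-style: game -> hardware (only works on this exact machine).
Hardware().execute("draw character_mesh, tuned for this one GPU")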
 
Would it even be possible to "optimize" a game for a PC without completely bypassing Windows, which itself has to accommodate many different hardware configurations?
 

You seem to forget that Crysis came out four years ago and was only released on consoles a few days ago, and even after an extra four years of learning the consoles inside and out, they still can't make it look as good.
Games like Crysis are an example of a good PC exclusive. Another one is The Witcher 2; yes, it was released on consoles, but have you ever seen the comparisons? My 6950 (soon to be two 6950s) can make that game look a billion times better than on console, AND it's just DirectX 9.
 

While your point is valid, keep in mind that many people believe Crysis 1 was a very inefficiently programmed game, which makes the thought of a well-programmed Crysis scary 😀
 


I'm confused. The Witcher 2 is a DX9-only game; that is all it supports. Where did you read that it's not DX9?
 

True... I guess that might not have been the best example! Regardless, there are plenty of games that are made for PC and look great. I mean, go on YouTube and search "Robbaz Comparison" to see the dozens of games that look gorgeous on PC.
 


No joke, most games list it, even the ones in the bargain bin at the local big-box store, and most of them were 1080p. (I didn't buy any.)
 


Oh, there is no doubt that PC graphics look amazing and can topple any console game. But if they were optimized the way console games are, the graphics would blow our minds. I would pay top dollar for just one game optimized for my 590 and my specific PC setup. Top dollar, I tell you!! :bounce:
 

Not JUST a dollar, but a TOP dollar! That's like...worth a billion times a normal dollar, right?
 

Most console games that run at "1080p" are actually rendered at a lower resolution and then upscaled.
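The saving is easy to quantify; here is a quick sketch (the 720p internal resolution is just an assumed example, since actual render resolutions vary by game):

# How much work is saved by rendering below 1080p and upscaling (illustrative).
native_w, native_h = 1280, 720      # assumed internal render resolution
output_w, output_h = 1920, 1080     # what the console outputs to the TV

rendered = native_w * native_h
displayed = output_w * output_h
print(f"Pixels actually rendered: {rendered:,}")
print(f"Pixels shown after upscaling: {displayed:,}")
print(f"Rendered fraction: {rendered / displayed:.0%}")   # ~44%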
 
I play a lot of DiRT (2 and 3) on PC, and while I was at Future Shop one day I decided to try it on the Xbox 360 on a full-HD TV.
The first thing I noticed was that the edges were very jagged; it looked like there was definitely no anti-aliasing being used.
The second thing I noticed was the frame rate... it was seriously hampering the gameplay compared to my PC, which runs it at 50-60 fps; this was probably closer to 30 fps.

I noticed the same two problems while playing GTA4 at a friend's house on his PS3. It was so laggy I couldn't even play it; it seemed like 20-35 fps. Another game I play a lot of is Street Fighter 4; besides taking several times longer to load on my Xbox than on my PC, it is much less smooth, even viewed from a distance, compared to playing it on PC.

I own an Xbox 360 and play Halo and various other games now and then. I connected it to my computer monitor once and played Halo: Reach at the same distance I sit from my PC games. It looked absolutely terrible, several times crappier than on my 42" TV from far away. Both were running at 1080p (actually the TV is only 1080i...).
 