How bad are "next-gen" consoles for PC gaming?




You see, there's a big difference between what people say and the actual, realistic numbers.

When you say people on PC game in 4K, that is, practically speaking, not true. Can you game in 4K? Yes, you can. But almost no one does.

Steam releases a hardware survey every month in which it publishes the specs of its users. This is hard data: the actual hardware people have.

http://store.steampowered.com/hwsurvey/

The most popular single GPU is Intel integrated graphics, and only 0.06% of users actually run a 4K display.

Think about it: these are gamers, many with very strong systems, and yet fewer than 1% of them use a 4K display.

Consoles do not hold anyone back; most PC users have only recently settled firmly on 1080p.

You have to understand that very, very few people can spend $500 on a monitor and another $500 on a GPU; for many people that's a month's rent or a mortgage payment.

Some can afford it, but as a group they are not significant enough.

Also, up until the 980 Ti you needed a Titan to game in 4K, sometimes Titans in SLI. A console costs $300, and two Titans are about as big as an Xbox One; I don't think the Titan was even out when these consoles went on sale. So, realistically speaking, consoles were never going to run 4K, simply because of the cost.

Also, as far as I know, 4K TVs are still only around 6%-9% of TV sales, so once again it makes no sense to build a 4K console when the vast majority of people cannot use the feature. And, once again, it was not even possible for these consoles to be 4K in the first place.

From a business perspective, what is the point of, say, Honda selling an Accord with a 1,000 hp engine when they know full well most people cannot use all that power?

I personally game on both Xbox and PC, and both are great.

------------------------------------------------------------------
PS

Just to add something else: if I were to buy a TV right now, there's an over-90% chance it would not be a 4K TV. Since most people keep their TVs for 5 to 7 years, some even longer, there is once again almost no reason to give people a console that supports 4K gaming, because very few would ever experience it (and, once again, it was never possible to build a 4K console at the time, when the GPUs alone cost over $2,000).

Wasn't it here that I read that some gaming executive said it is realistically very possible the next-gen consoles will not be 4K either?

We need to understand that consoles are a mass-market product aimed at average users with average budgets and average technology.
 


Thank you for paraphrasing my statement and making it more concise...
 
An i5 and a 970 cannot do what an i7 and a Titan X can do...

I'll give you two examples:

An i5 and a 970 cannot play The Witcher 3 with HairWorks at 1080p or 1440p with acceptable frame rates...
An i5 and a 970 don't have enough VRAM for Shadow of Mordor with Ultra textures, even at 1080p.

There are many more examples...
 
I can't even get a solid 60 fps in The Witcher 3 at 1440p with max settings including HairWorks. I get close when I overclock my GPU to 1390 MHz on the core and 7600 MHz effective on the memory, which delivers pretty much the fps of two 970s in SLI, if not more...
 


We were already talking about the Steam survey. The Steam survey is probably the best statistic available.

I think we both understand what you're trying to say; however, you are all over the place and don't have any new points.
 


UMAs do nothing to affect performance; they're just a coding convenience.

The reason why you can't have Ultra textures in those games is largely down to VRAM bandwidth limitations. NVIDIA has ALWAYS lagged behind in memory bandwidth, opting for pure shader performance instead. While this gives NVIDIA an edge in roughly 90% of games, it tends to lag behind AMD in the few that are bandwidth-limited, and Watch Dogs and Shadow of Mordor happen to be two such examples.
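
As a rough illustration of where such a gap comes from: theoretical peak bandwidth is just effective memory clock times bus width. The sketch below uses generic example figures (a 7 Gbps card on a 256-bit bus versus a 5 Gbps card on a 512-bit bus), not measurements tied to any particular game:

```python
# Back-of-the-envelope sketch: theoretical peak VRAM bandwidth.
# bandwidth (GB/s) = effective memory clock (Gbps per pin) * bus width (bits) / 8
# The two configurations below are illustrative examples, not benchmarks.

def peak_bandwidth_gbs(effective_clock_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_clock_gbps * bus_width_bits / 8.0

print(peak_bandwidth_gbs(7.0, 256))  # 224.0 GB/s (7 Gbps GDDR5 on a 256-bit bus)
print(peak_bandwidth_gbs(5.0, 512))  # 320.0 GB/s (5 Gbps GDDR5 on a 512-bit bus)
```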

Interestingly, my GTX 770 (2GB version) can pretty much handle High/Ultra with Medium textures, or High across the board, which is about where it should be.

That's one area where HBM is actually going to help. Nowadays you shouldn't expect to run Ultra settings with anything less than 4GB of VRAM, and I suspect 6GB is going to be mandatory soon.

 


SC has overpromised so much that pretty much everyone outside its ecosystem is expecting a massive bust at this point.

As for ray tracing/casting, any GPU can do it; it's just a different rendering technique. But it's still slow as molasses, especially as the number of light sources rises (hence why we'll get simplified ray casting first). All those high-end lighting effects that kill FPS? They come for free as part of ray-tracing engines.
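
To make the light-source scaling concrete, here's a toy ray-casting sketch (a single hard-coded sphere, orthographic rays, no acceleration structures, and visibility-only shading; everything in it is purely illustrative): each pixel fires a primary ray, and every hit then fires one shadow ray per light, so the work grows roughly with pixels times lights.

```python
import math

# Toy ray caster: one sphere, a handful of point lights, orthographic camera.
# The point of the sketch: every primary-ray hit spawns one shadow ray per
# light, so the cost grows with the number of light sources.

SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0
LIGHTS = [(2.0, 2.0, 0.0), (-2.0, 1.0, 0.0), (0.0, -3.0, 1.0)]

def hit_sphere(origin, direction):
    """Distance t along a (normalized) ray to the sphere, or None on a miss."""
    oc = tuple(origin[i] - SPHERE_CENTER[i] for i in range(3))
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(v * v for v in oc) - SPHERE_RADIUS * SPHERE_RADIUS
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def shade(point):
    """One shadow ray per light -- the part that scales with light count."""
    brightness = 0.0
    # Nudge the shadow-ray origin off the surface along the normal so it
    # doesn't re-hit the sphere due to floating-point error.
    normal = tuple((point[i] - SPHERE_CENTER[i]) / SPHERE_RADIUS for i in range(3))
    origin = tuple(point[i] + normal[i] * 1e-3 for i in range(3))
    for light in LIGHTS:
        to_light = tuple(light[i] - origin[i] for i in range(3))
        dist = math.sqrt(sum(v * v for v in to_light))
        d = tuple(v / dist for v in to_light)
        blocker = hit_sphere(origin, d)
        if blocker is None or blocker > dist:   # nothing between point and light
            brightness += 1.0 / len(LIGHTS)
    return brightness

WIDTH, HEIGHT = 48, 22
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # One primary ray per pixel, fired straight down +z from an image-plane grid.
        origin = (2.0 * x / WIDTH - 1.0, 1.0 - 2.0 * y / HEIGHT, 0.0)
        t = hit_sphere(origin, (0.0, 0.0, 1.0))
        if t is None:
            row += " "
        else:
            p = (origin[0], origin[1], t)  # hit point (ray starts at z = 0)
            row += " .:-=+*#"[min(7, int(shade(p) * 7.99))]
    print(row)
```

Even in this tiny example, doubling the number of lights doubles the shadow rays per shaded pixel, which is exactly why light-heavy scenes are where ray-based rendering hurts the most.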

There are still a lot of things GPUs do badly because they're so expensive to compute. Anything with realistic reflections is a no-go. Realistic smoke effects? Forget it. Multi-object physics interactions? Enjoy <10 FPS. These dynamics are simply too expensive for GPUs to handle, but they're pretty much the only things we can't do currently. So we're basically tapped out, which is why the leap from GTA IV to GTA V is minimal compared to the jump from GTA: San Andreas to GTA IV. We're down to increasing texture resolution because there's little else left that we can do.
 


https://www.youtube.com/watch?v=SN2ayVd9-3E

Real smoke looks real
 


No, it doesn't; it has 8GB of UNIFIED memory, meaning 8GB for the entire system. Of that, 2GB is reserved for application/OS use, leaving 6GB available to the developer. Assuming the game program itself uses somewhere around 2GB of RAM, that would leave roughly 4GB free for the GPU to use, similar to top-tier consumer GPUs.
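
A minimal sketch of that back-of-the-envelope budget, using the same assumed figures as above (the 2GB reservations are assumptions from this post, not official platform numbers):

```python
# Back-of-the-envelope unified-memory budget (figures are the assumptions
# from the post above, not official platform numbers).

TOTAL_GB = 8              # total unified memory
OS_RESERVED_GB = 2        # reserved for application/OS use
GAME_CPU_SIDE_GB = 2      # assumed CPU-side footprint of the game itself

available_to_developer = TOTAL_GB - OS_RESERVED_GB                 # ~6 GB
roughly_free_for_gpu = available_to_developer - GAME_CPU_SIDE_GB   # ~4 GB

print(f"Available to the developer: {available_to_developer} GB")
print(f"Roughly free for GPU use:   {roughly_free_for_gpu} GB")
```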
 
6GB is still more than the average PC has, and as soon as we open programs we use video RAM; Windows will use VRAM right away. Basically, many game developers put more focus on the console architecture. The console has approximately 6GB usable. Your 770 has less than 2GB available because Windows uses 100+ MB of VRAM; at 1440p on Windows 8.1 I use slightly over 200 MB.
 
Apparently the console can only use 6 or 7 threads for games, because one or two are reserved for the operating system. Six or seven is more than the four on an i5, and that's why console ports like Wolfenstein strongly recommend an i7 or an 8-core AMD CPU.
 
As is now the standard for "NEXT GEN"

Sorry, but console marketing is pathetic. Why lie? We can all agree it's a great, easy way to game. Why pretend it's anything more than entry-level plug-and-play hardware?

Sony and Microsoft should be ashamed
 
But but but
30 fps is closer to the 24 fps frame rate of films, thereby giving a more authentic experience.

Man, I can't even type that with a straight face.

I understand the (console) CPU architecture is more closely related to what we (PC users) have than in the past. Can we expect better utilization of CPU power with this new generation?
 


24 fps was standardized back in the early days of sound film and has been used ever since because people got used to this old and clunky frame rate...

The sheep-like masses were badly educated by Hollywood and now associate modern, high-quality frame rates of 60+ with soaps, bad daytime TV dramas and B movies...

When in fact it is a far better way to view content and gives a far more realistic viewing experience.


So BOOOOO Hollywood
 
The Ubisoft guy said 30 fps is ideal, and then a bunch of people quit. I'm glad Unity isn't capped at 30. Apparently since AC3, a lot more focus has been put on PC development.
 
Indeed, times have changed. I wouldn't mind if they slowed down the hardware advancements a little so we could max out the current generation and maybe polish some games. But I would still be disappointed if we "went back". Remember, companies are here to make money; there will always be something better!

That said, I'm really happy that almost anyone can now buy a not-so-expensive PC and still play the latest AAA games with some "great" graphics. I think Steam is going to gain more users over time, since you can use it pretty much like a console on a budget PC.



Interesting post!