What Does DirectCompute Really Mean For Gamers?

Status
Not open for further replies.

hunshiki

Distinguished
Dec 4, 2011
58
0
18,630
0
[citation][nom]hotsacoman[/nom]Ha. Are those HL2 screenshots on page 3 lol?[/citation]

THAT. F.... FENCE. :D

Every. Single. Time. With every single Source game: HL2, CSS, mods, CS:GO. It's everywhere.
 
G

Guest

Guest
[citation][nom]hunshiki[/nom]THAT. F.... FENCE. Every, single, time. With every, single Source game. HL2, CSS, MODS, CSGO. It's everywhere.[/citation]

Ha. Seriously! The Source engine is what I like to call a polished turd. Somehow, even though it's ugly as f%$#, they still make it look acceptable... except for the fence XD
 

theuniquegamer

Distinguished
Sep 7, 2011
279
0
18,790
1
Developers need to improve the API's compatibility with GPUs. Consoles use very low-power, outdated GPUs yet can play the latest games at good frame rates, while our PCs have top-notch hardware but render games at almost the same quality as the consoles. The GPUs in our PCs have a lot of horsepower, but we can't utilize even half of it (I don't know what our PC GPUs are really capable of).
 

marraco

Distinguished
Jan 6, 2007
671
0
18,990
1
I hate depth of field. Really hate it. I hate Metro 2033 with its DirectCompute-based depth of field filter.

It’s unnecessary for games to emulate camera flaws, and depth of field is a limitation of cameras. The human eye is able to focus anywhere, and is free to do so. Depth of field doesn't let the user focus where they want to, so it's just an annoyance, and worse, it costs FPS.

This chart is great. Thanks for showing it.



It shows something missing from many video card reviews: the 7970 frequently falls under 50, 40, and even 20 FPS. That ruins the user experience. While it is hard to tell the difference between 70 and 80 FPS, it is easy to spot those moments when the card falls under 20 FPS. It's a showstopper, and an utter annoyance, to spend a lot of money on the most expensive cards and then see those 20 FPS moments.

That’s why I prefer TechPowerUp.com reviews. They show frame-by-frame benchmarks, not just a single, less meaningful FPS number. TechPowerUp.com is a step above Tom's Hardware because of this.

Yet that way of showing GPU performance is hard for humans to read, so the data needs to be sorted to make it easily understandable, as this figure shows:





Both charts show the same data, but the lower one has the data sorted.

Here we see that Card B has higher lag spikes and higher FPS, while Card A is more consistent even though it has lower FPS.
It shows on how many frames Card B is worse than Card A, and it is more intuitive and readable than bar charts, which lose a lot of information.

Unfortunately, no website offers this kind of analysis for GPUs, so there is a way here to gain an advantage over the competition.
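The sorted frame-time comparison described above can be sketched in a few lines of Python. The frame times below are made-up illustrative numbers, not real benchmark data; real input would come from a frame-by-frame log like the ones TechPowerUp publishes.

```python
# Hypothetical per-frame render times in milliseconds for two cards.
card_a = [16.0, 17.0, 16.5, 18.0, 16.2, 17.5, 16.8, 17.1]  # consistent
card_b = [10.0, 10.5, 11.0, 30.0, 10.2, 10.8, 33.0, 10.5]  # fast but spiky

def sorted_profile(frame_times_ms):
    """Sort frame times worst-first, like the lower (sorted) chart."""
    return sorted(frame_times_ms, reverse=True)

def avg_fps(frame_times_ms):
    """Average FPS over the run: frames rendered per second of wall time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Card B wins on average FPS, yet its worst frames are far slower,
# which is exactly what the sorted view makes obvious and a bar chart hides.
print(avg_fps(card_a), avg_fps(card_b))
print(sorted_profile(card_a)[:2], sorted_profile(card_b)[:2])
```

The point of sorting is that the few worst frames, the ones a player actually notices, end up at the front of the curve instead of being averaged away.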
 

hunshiki

Distinguished
Dec 4, 2011
58
0
18,630
0
I don't think you have owned a modern console, theuniquegamer. Games that run fast there would run fast on PCs (if not blazing fast); PCs are simply faster. Consoles are quite limited by their hardware. Games that are demanding and slow, or that just have awesome graphics (BF3, for example), are slow on consoles too. They can usually only squeeze out 20-25 FPS. This happened with Crysis too. On PC? We benchmark at Full HD and get 91 FPS. NINETY-ONE. Not 20. Not 25. Not even 30. And at Full HD, not 1280x720 like the Xbox. (Also, on PC you have tons of other visual improvements that you can turn on and off, unlike on consoles.)

So, in short: consoles are cheap and easy to use. You pop in the disc and play your game. You won't become a professional FPS gamer (blame the stick), and the graphics won't amaze you. But it's easy and simple.
 

kettu

Distinguished
May 28, 2009
243
0
18,710
5
[citation][nom]marraco[/nom]I hate depth of field. Really hate it. I hate Metro 2033 with its DirectCompute-based depth of field filter.It’s unnecessary for games to emulate camera flaws, and depth of field is a limitation of cameras. The human eye is able to focus everywhere, and free to do that. Depth of field does not allow to focus where the user wants to focus, so is just an annoyance, and worse, it costs FPS.[/citation]

'Hate' is a bit strong a word, but you do have a point there. It's much more natural to focus my eyes on a certain game object than to do it with my hand (i.e., turn the camera with my mouse). And you're right that it's unnecessary, because I already get the depth-of-field effect for free from my own eyes when they're focused on a point on the screen.
 

npyrhone

Honorable
Mar 12, 2012
22
0
10,510
0
Somehow I don't find it plausible that Tom's Hardware has *literally* been bugging AMD for years, to any end (no pun intended). Figuratively, perhaps?
 

xenol

Distinguished
Jun 18, 2008
216
0
18,680
0
There's one thing I hate about current implementations of AO: they're too coarse. An object that's no more than, say, two feet in front of something still causes that surface to receive some AO treatment. I want to say it's a shadow, but it's clearly not in the direction of the light source.
 

gtguy257

Distinguished
Feb 12, 2010
13
0
18,510
0
The human eye cannot focus everywhere at once. In fact, it has a very limited depth of field. But the human eye can refocus so quickly that you rarely notice, unless you shift focus from up close to far away. The effect isn't a flaw in camera systems; it is a function of how any optic works, whether that's your eye or a camera.
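The "any optic" point above can be illustrated with the standard thin-lens circle-of-confusion formula, which gives the blur-spot size for a subject off the focus plane. This is a generic optics sketch, not anything from Metro 2033's filter, and all the numeric values are hypothetical examples.

```python
def circle_of_confusion(focal_mm, f_number, focus_dist_mm, subject_dist_mm):
    """Blur-spot diameter (mm) for a subject away from the focus plane,
    using the thin-lens approximation: c = A * |S2 - S1| / S2 * f / (S1 - f),
    where A = f / N is the aperture diameter."""
    aperture = focal_mm / f_number
    return (aperture
            * abs(subject_dist_mm - focus_dist_mm) / subject_dist_mm
            * focal_mm / (focus_dist_mm - focal_mm))

# A subject exactly on the focus plane is perfectly sharp...
print(circle_of_confusion(50, 2.8, 2000, 2000))
# ...and blur grows the farther the subject sits from that plane.
print(circle_of_confusion(50, 2.8, 2000, 3000))
print(circle_of_confusion(50, 2.8, 2000, 4000))
```

The same relationship holds for the eye's lens; the eye just refocuses fast enough that we rarely perceive the blur.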
 

TeraMedia

Distinguished
Jan 26, 2006
904
0
18,990
3
@xenol:

Looking at the cartoonish pic above (last page), I would have to agree with you. It looks like they turned the effect up too strongly because they wanted to make it easily visible. The real world doesn't look like that at all. Look at the corner of your room: you can see a faint darkening right in the corner that gradually brightens as you move away a few inches. But in the above pic, the darkening is strong enough to almost black out the pixels. To be more realistic, I think it needs to be more subtle.
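The corner behavior being described, strong darkening right at the corner fading to nothing within a short distance, is what the distance falloff in a typical AO term is supposed to produce. Here is a minimal sketch of such a falloff; the linear shape and the radius value are illustrative assumptions, not any specific engine's implementation.

```python
def ao_contribution(occluder_distance_m, falloff_radius_m=0.3):
    """Occlusion in [0, 1]: 1.0 = full darkening, 0.0 = none.
    An occluder beyond the falloff radius contributes nothing, so a
    surface two feet behind an object should stay undarkened."""
    if occluder_distance_m >= falloff_radius_m:
        return 0.0
    return 1.0 - occluder_distance_m / falloff_radius_m

print(ao_contribution(0.02))  # right in a corner: strong darkening
print(ao_contribution(0.6))   # ~two feet away: no darkening at all
```

A coarse implementation effectively uses too large a radius or too steep a curve, which is why distant objects pick up the shadow-like smudge xenol complained about.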
 

TeraMedia

Distinguished
Jan 26, 2006
904
0
18,990
3
It would have been great to see a lower-end GCN card, such as a 7750, so that we could see the frame-rate impact of this feature when the card is already stretched but still capable. The inclusion of the 5870 kind of approximates this, I suppose, but the 7750 would have been a decent match for an A8-equipped computer.
 
Rather than continuing to throw resources at approximating the rendering equation, can we PLEASE move to ray tracing already? All these little effects that are hard to implement in terms of the rendering equation fall out naturally from ray tracing.
 

bloc97

Distinguished
Sep 12, 2010
1,030
0
19,460
83
[citation][nom]gtguy257[/nom]The human eye cannot focus everywhere at once. In fact it has a very limited depth of field. But, the human eye can focus so quickly that you rarely notice unless you focus from up close to far away. The effect isn't a flaw in camera systems it is a function of how any optic works. Whether that is your eye or a camera.[/citation]

But it is an annoyance, since when something isn't in focus on the screen, you cannot see it clearly until you turn the camera toward it...
Whereas in real life, when you look at something off to your side, you don't even need to turn your head...
 

bloc97

Distinguished
Sep 12, 2010
1,030
0
19,460
83
[citation][nom]gamerk316[/nom]Rather then continuing to throw resources at approximating the Rendering equation, can we PLEASE move to Ray Tracing already? All these little problems that are hard to implement in terms of the rendering equation are a natural outcome of Ray Tracing.[/citation]
Really? Ray tracing would run the game at 0.3 FPS unless you had a quad-CrossFire setup of HD 7990s, and even then you would only get 4 FPS...
 

shin0bi272

Distinguished
Nov 20, 2007
1,103
0
19,310
12
I don't know if it's because I just woke up or if I'm right, but those three BF3 screens showing no AO, SSAO, and HBAO look identical. I don't see any difference whatsoever, save for some small shadows around the edges of the boxes and pallets.

I would also have liked to see them include an Nvidia card in the benchmarks, to see the FPS you'd get doing the same tests with the competitor's products.
 

phuzi0n

Distinguished
Mar 21, 2011
8
0
18,510
0
I'm really confused as to what the point of this article is, other than stating an obvious fact: fast GPUs are fast at GPU computing. The article focuses on AMD's push for GPU computing and throws in APU benchmarks, but it completely misses the point of AMD's APU computing push, which is to use the APU for GPU compute while a more powerful GPU does the rendering work. I want to see benchmarks of games using the APU for compute while a variety of other GPUs do the rendering, versus those GPUs doing both compute and rendering without the APU.
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
2
"If you have a quad-core host processor and a graphics engine with 2000 ALUs, are there any guesses as to which approach has more potential to make efficient use of available compute resources?"

Obviously the quad-core! It has a higher clock rate!

/sarcasm

On a serious note, I wonder what clock rate a quad-core CPU would need to run at to match even a low-end GPU. God forbid it's a single-core CPU.
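A back-of-envelope answer to that question: equate raw operations per second on both sides. This naively assumes one operation per ALU (or core) per cycle and ignores SIMD units, memory bandwidth, and utilization; the GPU clock is a made-up example figure, and only the 2000-ALU count comes from the quoted sentence.

```python
# Naive throughput parity: cpu_cores * cpu_clock == gpu_alus * gpu_clock
gpu_alus = 2000          # from the quoted sentence in the article
gpu_clock_ghz = 0.8      # hypothetical shader clock for a low-end GPU
cpu_cores = 4

required_cpu_clock_ghz = gpu_alus * gpu_clock_ghz / cpu_cores
print(required_cpu_clock_ghz)              # 400.0 GHz for the quad-core
print(required_cpu_clock_ghz * cpu_cores)  # 1600.0 GHz for a single core
```

Even under these generous assumptions the quad-core would need a clock in the hundreds of gigahertz, which is the article's point about where the compute potential lies.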
 

atikkur

Distinguished
Apr 27, 2010
327
0
18,790
1
GPGPU in games? Only SSAO? It's a bit late to be talking about this; Nvidia has long been toying with it. Just name it: bokeh filters, DoF, the CUDA water-simulation effect in Just Cause 2 (the best water I've seen), fluid, smoke, particles, hair, fur, cloth, destruction, rigid bodies... everything PhysX you can think of, just ready to be picked up by developers for free.
 