How the GPU Accelerated Effects in Borderlands 2

Status
Not open for further replies.

echondo

Honorable
May 29, 2012
250
0
10,810
Yeah it's an amazing technology, but I don't understand how or why people think their game looks great with PhysX on. It's just too much sometimes, it feels as if my screen is filled to the brim with effects and it just looks ugly.

I'm more interested in CUDA. This is a nice show-off demo and presentation of PhysX, though; I wish they'd used a more recent game. Oh wait, there isn't a recent game to show PhysX in, because it's useless: less than 1% of games have PhysX effects.
 

excella1221

Honorable
Aug 23, 2012
2,415
0
12,160
[citation][nom]echondo[/nom]Yeah it's an amazing technology, but I don't understand how or why people think their game looks great with PhysX on. It's just too much sometimes, it feels as if my screen is filled to the brim with effects and it just looks ugly.I'm more interested in CUDA, but this is a nice show off demo and presentation of PhysX, wish they did a more recent game, oh wait, there isn't a recent game to show PhysX on, because it's useless as less than 1% of games have PhysX effects.[/citation]
It makes Nvidia fanboys feel better about themselves.
 

kartu

Distinguished
Mar 3, 2009
959
0
18,980
[citation][nom]mayankleoboy1[/nom]Works shitty on a AMD GPU.[/citation]
It's artificially crippled by nVidia when an AMD GPU is detected.
The code itself came from a third party (Ageia) that nVidia bought.

That alone is a reason for a sane consumer to wish this proprietary crap died.
 
[citation][nom]excella1221[/nom]It makes Nvidia fanboys feel better about themselves.[/citation]

After switching from an ATI 5-series to an NV 5xx, I love PhysX; it's great in the games that support it.

BUT I did play Borderlands 2, and frankly, running full settings on both cards (7850/GTX 570), I was nearly unable to notice any difference between the two.
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
[citation][nom]mayankleoboy1[/nom]Thay are just using PhysX.Works great on a Nvidia GPU.Works shitty on a AMD GPU.[/citation]

Why support your competitors?
 

JJ1217

Honorable
Since I got my two 670s I've barely even touched PhysX in the Batman games and Borderlands. Too much performance hit for such little benefit. That video looks to me like they used different textures and video settings to make the difference appear much bigger than it would be with PhysX alone.
 

omnimodis78

Distinguished
Oct 7, 2008
886
0
19,010
[citation][nom]JJ1217[/nom]Since I got my two 670's I've barely even touched PhysX in batman games and Borderlands. Too much performance hit for such little benefit. That video to me looks like one uses different textures and video settings to make it look much more different than it would with PhysX[/citation]
I really don't know how you can claim that your two 670s (SLI, I assume) take a performance hit with PhysX on. On my single 670 I run full-quality PhysX and everything on max (though in DX9 mode) in BAC and there's no framerate drop. True, I have vsync on so it's always capped at 60 frames, but the point is that even with PhysX turned on, it stays at 60. Maybe your problem is that you're running Batman in DX11 mode, which is useless and pointless; that's when performance goes way, way down, because it's a horribly implemented "feature" (thanks, Rocksteady Studios).
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
In Borderlands 2, if I could turn everything about PhysX on or off individually, I would do this:

particle physics on
debris on

everything else, off...

I have a Phenom 955 Black and an HD 5770. I played with just particle physics (there's a place in the game where I can only make particles) and the game played great, but my god, did the debris sticking around add so much to the overall feel of the game... like a battle went on there...

 

jordanjkj

Honorable
Feb 27, 2013
8
0
10,510
I liked it in BL2 because of how over the top it is when you have someone covered in corrosive matter, then the siren uses her ability and you see everything flying around. It was OK for that game because it was meant to be over the top. Now, if you were to put it in a modern-day shooter and have particles flying everywhere, it would just look silly.
 
How is BL2 PhysX a realistic representation of PhysX? They purposefully crippled the game for AMD GPUs. They take out so much stuff when PhysX is off that it almost requires it to be on. For Christ's sake, they even take water pipes out of the environment when you don't have PhysX on. It's crazy; I am positive AMD GPUs can do water pipes...

Guessing Gearbox got a huge check from Nvidia for this one. Oh, and unless you are on an i5 you really can't set PhysX to medium; the performance hit is far too big.
 
[citation][nom]A Bad Day[/nom]Why support your competitors?[/citation]

It's not about supporting it, it's about being jackasses about the tech. When they approached AMD for "licensing", I'm pretty sure Jen didn't want a "fair" deal with AMD, so AMD brushed them off and went with OSS alternatives (at the time).

If you notice, nVidia is in a position where they could contribute a lot to Bullet, but they don't. They could make PhysX OSS, but they won't. They could move from CUDA to OCL, but they won't.

They're desperately trying to push 3D, CUDA, and PhysX down consumers' throats, and among those three, CUDA is the least useless tech they provide (actually, it is very good).

TL;DR: No, it's about them being assholes with the tech.

Cheers!
 

contrasia

Honorable
Mar 23, 2012
19
0
10,510
[citation][nom]back_by_demand[/nom]Unfair monopoly practices?[/citation]
It's not a monopoly if they developed it and it isn't the standard. ATi are free to develop their own physics if they want, but they don't invest in the development costs.

Some people prefer more, others prefer clean; it's a simple matter of preference. I'd personally prefer more eyecandy, and also less predictability in everything that occurs during my gaming session. In fact, since you can tone the eyecandy down a ton and still get the benefits of PhysX, I still think it's a step in the right direction.
 

demonhorde665

Distinguished
Jul 13, 2008
1,492
0
19,280
[citation][nom]contrasia[/nom]It's not a monopoly if they developed it and it isn't the standard. ATi are free to develop their own physics if they want, but they don't invest in the development costs. Some people prefer more, others prefer clean, it's a simple matter of preference. I'd personally prefer more eyecandy, and also less predictability in everything that occurs during my gaming session. In fact, since you can tone the eyecandy down a ton and still gain the benefits of the PhysX then I still think it's a step in the right progression.[/citation]


Nvidia didn't develop it, though; they bought it from Ageia. And it's not just a matter of them not handing the license over, it's a matter of them intentionally recoding it to be super gimpy when AMD hardware is detected. Tom's even did a test of this.

They ran PhysX in software on an Nvidia card, meaning PhysX was running on the CPU, NOT the GPU. Frames dropped maybe 10-12 fps in the game they tested this on.

Then they ran it in software mode with an AMD card in use, and it locked to 15 fps (around a 50 fps drop) regardless of what was on screen at the time.

You can talk about how "oh, they just aren't optimizing for their competition" all you want, but it all means jack, because if they were just optimizing for their own video cards, PhysX would show the same 10-12 fps hit running in software REGARDLESS of what brand of video card you are running.

Furthermore, they also disabled the ability to use an Nvidia card as a dedicated PhysX card while running AMD for rendering.
This is NOT a case of them just not optimizing; this is a case of them intentionally building the software to run like crap if you have an AMD card in your system, period. That is a monopolistic tactic.

If MS coded Windows to be crippled any time you used a competitor's web browser, you can bet your a-- there would be a monopoly lawsuit getting slapped on them again.
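The behavior described in that test, a CPU fallback that locks to ~15 fps only when a competitor's GPU is present, amounts to a vendor check gating the simulation rate. A minimal illustrative sketch of that logic in Python (all function names and numbers here are hypothetical, chosen to mirror the figures quoted above; this is not actual PhysX code):

```python
# Hypothetical sketch of vendor-gated throttling. NOT real PhysX code;
# all names and rates are invented for illustration.

def detect_render_gpu_vendor() -> str:
    """Stub standing in for a driver query of the active display adapter."""
    return "AMD"  # imagine this came from the graphics driver

def software_physx_fps(vendor: str) -> int:
    """Frame rate the CPU-based physics fallback is allowed to sustain."""
    cpu_limited_fps = 48  # what a vendor-neutral CPU path could manage
    if vendor != "NVIDIA":
        # The alleged behavior: a hard cap applied only when a
        # competitor's GPU is detected, regardless of CPU headroom.
        return min(cpu_limited_fps, 15)
    return cpu_limited_fps

print(software_physx_fps(detect_render_gpu_vendor()))  # 15
print(software_physx_fps("NVIDIA"))                    # 48
```

The point of the sketch: a genuinely unoptimized CPU fallback would return `cpu_limited_fps` whatever the vendor string says; only an explicit vendor branch produces a fixed cap on one brand of hardware.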
 

nolarrow

Distinguished
Mar 27, 2011
87
0
18,640
The water looks so reaaaaaallll! lol, j/k. I appreciate the effort and I look forward to new tech, but the current implementation of PhysX is weak. The spark effects from shooting metal are a bit over the top. The water effects are garbage; the water looks more like goop than fluid. I do like the cloth effects, though. Not trying to be a nerd, but I always thought Half-Life 2 really set the bar for physics in video games, and we've actually taken a step backwards. I liked how you could launch an oil drum across a room and watch it bang around and finally settle down, but the chunky water and sparks stuff kind of kills it for me.

Oh yeah, and textures first, guys. PhysX after, plz.
 

matt_b

Distinguished
Jan 8, 2009
653
0
19,010
[citation][nom]A Bad Day[/nom]Why support your competitors?[/citation]
More importantly, why take a concept as natural as real-world physics and try to make it closed-source technology? Physics in a video game is as natural as gravity, yet Nvidia has tried very hard to push PhysX on game developers and make it Nvidia-only tech. That's fine (and part of why it's dying a slow death), but you cannot tell me that they cannot license it out the way Havok is licensed. I have a bigger problem with this: Nvidia moved PhysX to their GPUs and claimed only their cards had the performance to run it; then it was made to work on ATI cards, and Nvidia purposely put code into the driver package to cripple physics performance when an ATI card was present. That wasn't just lame, it also proved the claim was marketing BS: their cards had no special hardware making them any better suited to run the software than the competition's.
 

back_by_demand

Splendid
BANNED
Jul 16, 2009
4,821
0
22,780
[citation][nom]contrasia[/nom]It's not a monopoly if they developed it and it isn't the standard. ATi are free to develop their own physics if they want, but they don't invest in the development costs[/citation]
People said exactly the same about IE being included in Windows, and before that WMP. Microsoft developed those for their own product at their own expense but were still forced to accommodate competitors. The only difference here is that nVidia haven't yet managed to dominate market share in a 90%+ fashion. But doing this is a way to try to force more people to buy your product, increase your market share, and invite the inevitable intervention by the DoJ.
 