Nvidia busted again for cheating.


Actually I did read the article, and if you're wondering what I thought of it, re-read my first post in this thread.
 


Re-read my post.
 


Yeah, obviously ATI have found a way to magically put tessellation into Crysis 2? I mean, come on, who are you trying to kid with that nonsense?

As for Crytek being bad at coding, yeah, I can believe that for the water, but there is no way at all they would put such massive tessellation on a flat object unless they were trying to slow the game down deliberately.

Put that together with Nvidia's bribe cash and it's pretty clear to most people what is going on here. It's pathetic, and defending it is pathetic as well. Nvidia or AMD fanboy, there is simply NO NEED to defend this blatant cheating and lying.
 

ATI have had dedicated tessellation hardware on their cards since the 2xxx series, and yet as soon as it actually needs to be used and can't cope, people like you start whining. They were the first out of the door with DX11 GPUs, so shouldn't they know what to expect when lots of tessellation is used on multiple objects? :heink:
 
I'll take the flame bait, but in a constructive way:

@AMD Fanboy

I'm not spewing garbage. All AMD cards are practically crippled under heavy tessellation. The 6-series was a slight improvement, but not by much; "crippled" might be an exaggeration, but like I said, it is sad that AMD has to scale down the tessellation in their drivers for their cards to pull acceptable frame rates in some cases. If Nvidia is cheating by being involved with a developer, then AMD is cheating by using scaled-down "AMD optimized" tessellation. That is crap, and this is a nonsensical argument.
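To give a rough idea of what that driver-level scaling amounts to, here's a hypothetical sketch in Python (not AMD's actual driver code, and the cap value is purely illustrative): the "optimized" setting effectively caps whatever tessellation factor the game requests.

```python
# Hypothetical sketch of a driver-level tessellation override.
# Not AMD's actual implementation; the 16.0 cap is just an illustrative value.

def apply_tessellation_override(requested_factor: float, mode: str) -> float:
    """Return the tessellation factor actually used, given the override mode."""
    if mode == "application_controlled":   # use whatever the game asked for
        return requested_factor
    if mode == "optimized":                # clamp extreme factors
        return min(requested_factor, 16.0)
    if mode == "off":                      # no tessellation at all
        return 1.0
    raise ValueError(f"unknown mode: {mode}")

# A game requesting factor 64 on some object would be clamped to 16:
print(apply_tessellation_override(64.0, "optimized"))  # -> 16.0
```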

@ AMD Fanboy

Btw, what AMD and Nvidia rigs have you actually owned, so you have the experience to judge?

Personally, I've owned several AMD/ATI 2-, 3-, 4-, 5- and 6-series GPUs, and I've owned Nvidia 6-, 8-, 9-, 2xx-, 4xx- and 5xx-series GPUs, so I am not talking out of my butt.
 


@ Nvidia Fanboy

Did you even read the article? Let me quote from it, so you understand just how little you...understand.


The question is: Why?

Why did Crytek decide to tessellate the heck out of this object that has no apparent need for it?

Crytek's decision to deploy gratuitous amounts of tessellation in places where it doesn't make sense is frustrating, because they're essentially wasting GPU power.

Why are largely flat surfaces, such as that Jersey barrier, subdivided into so many thousands of polygons, with no apparent visual benefit? Why does tessellated water roil constantly beneath the dry streets of the city, invisible to all?

The trouble comes when, as sometimes happens, the game developer and GPU maker conspire to add a little special sauce to a game in a way that doesn't benefit the larger PC gaming community. There is precedent for this sort of thing in the DX11 era. Both the Unigine Heaven demo and Tom Clancy's HAWX 2 cranked up the polygon counts in questionable ways that seemed to inflate the geometry processing load without providing a proportionate increase in visual quality.

Unnecessary geometric detail slows down all GPUs, of course, but it just so happens to have a much larger effect on DX11-capable AMD Radeons than it does on DX11-capable Nvidia GeForces.
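To put rough numbers on that "geometry processing load" bit, here's a back-of-the-envelope sketch in Python. It assumes the usual DX11 behavior where triangle count per patch grows roughly with the square of the tessellation factor (the exact count depends on the partitioning mode, but the quadratic growth is the point):

```python
# Back-of-the-envelope: triangles per patch grow roughly with the square
# of the tessellation factor. Exact counts depend on the partitioning mode,
# but the quadratic blow-up is what matters here.

def approx_triangles_per_patch(tess_factor: float) -> int:
    return int(tess_factor ** 2)

for factor in (1, 4, 16, 64):   # 64 is the DX11 maximum factor
    tris = approx_triangles_per_patch(factor)
    print(f"factor {factor:>2}: ~{tris:>4} triangles per patch")

# factor  1: ~   1 triangles per patch
# factor  4: ~  16 triangles per patch
# factor 16: ~ 256 triangles per patch
# factor 64: ~4096 triangles per patch
# On a largely flat object, those thousands of extra triangles per patch
# change nothing on screen -- they only add geometry work for the GPU.
```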

Are you understanding this yet or is it still too hard?
 


I've owned an MX 440, 5200, 6600 GT, 7600 GT, 8800 GT

and

x1950, 4870, 5770 and currently a 6850.

Your point was?
 
^ Good, it's just that we get a lot of people talking out of their ass who have never actually owned a rig; sorry for mistaking you for one of those people. I couldn't really care less about this thread in general. What are you arguing?
 
I'm arguing -

Nvidia and Crytek have conspired to cheat in the Crysis 2 benchmarks, making Nvidia cards look way better than they actually are.

They did this by adding far too much tessellation to the game, on objects that wouldn't normally have that kind of tessellation.

This slows down Nvidia and AMD graphics cards alike, but it slows down AMD cards more. That would be "OK" if it weren't for the fact that there is practically ZERO benefit to the player. You can't tell the difference visually at such extreme levels of tessellation; all it does is slow down the cards.

This only happens in Unigine, Stone Giant, HAWX 2 and Crysis 2 - there are no other games that slow down systems deliberately by overdoing the tessellation for no visual benefit. All of these have had massive Nvidia "sponsorship".

It's not hard to figure out what is going on. Nvidia is clearly bribing games devs to slow down games, knowing that even though their own cards will be affected, AMD cards will be affected more.

Do you think that's acceptable? I don't. Look at the Tom's Crysis 2 article - how many people do you think went and bought Nvidia cards because of that? And it's a total lie. Nvidia is buying game devs and the tech press with YOUR money, because you are spending more on Nvidia cards that are actually SLOWER but perform better in these 3-4 benchmarks, which you see in most reviews.
 
 
Random fanboy doesn't "buy that they are conspiring"; some of the most respected tech journalists on the web do. I think I know who I'm more likely to believe. :)
 


Lol, me? I'm not the one who started throwing "troll" and "fanboy" around 😀

Nah kids, I know who is angry. :) But you know, you are only angry at yourselves for falling for Nvidia's crap. I'm just the messenger.

And believe me I am quite happy to spread the word about Nvidia's cheating and lying on this forum, not angry. 😉
 


Lol. I believe he purposely created this thread to generate a flame war and piss off some people, that's all. Maybe I should get my popcorn; it has been a while since I saw this kind of thread here at Tom's 😀
 
I hope that's not the reason for this thread. I for one would like to know if Nvidia is back to their old ways.

For those of you new to this: just like AMD is "famous" for their bad drivers, Nvidia was famous for their driver cheats. From Futuremark banning certain drivers from 3DMark, to games running faster than they should unless you renamed them to game.exe, it seems that Nvidia has done all kinds of "odd" things. I myself don't see what Eyefinity sees, but it's certainly possible Nvidia has gone back to their old ways. And I see nothing wrong with making a thread to warn/tell others.
 


Crysis: this game uses very high graphics; if your computer isn't perfect, it won't play it. :ouch: :ouch:

 
I think Occam's razor applies here. It is much easier to explain the over-tessellation of the barrier, and the existence of the water mesh, as just poor coding practice, especially in the case of the barrier. There are thousands of objects defined in these games, and as the article explains, tessellation is scalable. It is very possible that the designers simply forgot to turn down the tessellation for that one object.
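For what it's worth, a safeguard against that kind of oversight is cheap to add. Here's a hypothetical sketch in Python (made-up names, not Crytek's actual tooling) of the sort of export-time check that would catch a nearly flat mesh being handed an extreme tessellation factor:

```python
# Hypothetical asset-pipeline check (not Crytek's actual pipeline): if a mesh
# is essentially flat, tessellation adds nothing visually, so clamp its factor
# at export time. The reference plane is hardcoded here purely for the sketch.

def max_deviation_from_plane(vertices, plane_normal, plane_point):
    """Largest distance of any vertex from a reference plane."""
    nx, ny, nz = plane_normal
    px, py, pz = plane_point
    return max(
        abs((x - px) * nx + (y - py) * ny + (z - pz) * nz)
        for x, y, z in vertices
    )

def clamp_tess_factor(vertices, requested_factor, flatness_epsilon=0.01):
    """Drop the tessellation factor to 1 for meshes that are essentially flat."""
    deviation = max_deviation_from_plane(vertices, (0.0, 0.0, 1.0), vertices[0])
    return 1.0 if deviation < flatness_epsilon else requested_factor

# A perfectly flat quad asking for factor 64 gets clamped to 1:
flat_quad = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(clamp_tess_factor(flat_quad, 64.0))  # -> 1.0
```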

The only thing that is absolutely obvious in this thread is that eyefinity is a troll who likes to indulge in conspiracy theories and instigate unnecessarily heated debates.
 


Old ways? TWIMTBP is still around in games... That's an "all-time" behavior 😛



As a fellow programmer, all I can say is that I really don't doubt Crytek's engineers/programmers/architects at all, but I do doubt the lousy management that pushed deadlines and made a rushed patch because of prior bad calls.

Even if nVidia had their claws in all the mess described in the article, I'm sure there is more blame to pass up the management chain for a rushed product (the patch).

Cheers!
 
TWIMTBP doesn't bother me that much. If someone wants to provide help to programmers so that they can make their games run better/smoother, then I don't have a problem with it. There is that whole "AA will only work on our cards" thing that is shady. But I'm not bothered by it too much.
 


It does bother me, because although developers do get improvements, it polarizes optimization. In a development cycle, if you get a chance to improve/optimize your code, it's not because you love your work or because the company feels it's their duty to do so; it's because someone is paying more money to cover the time and labor it requires. So "TWIMTBP" means nVidia buying up programmers' time to shift the balance towards their line of cards or technology; in other words, "wetting" companies to shift the balance. I'd say it's a borderline case for an antitrust lawsuit. And we all know a programmer's time isn't cheap, and sending a lot of people out to teach others how to do specific coding isn't cheap either. AMD also does this, though I've never seen or read anything about how much "training" they give to programmers (game devs, if you like to call them that).

Anyway, that's how I see it.

Cheers!
 


Yeah, Occam's razor, let's see.

Nvidia has a history of bribing game devs to hobble AMD cards.
Nvidia bribed Crytek $2 million (that we know of, god knows how much they ACTUALLY paid in the end).
Crysis 2 has very "strange" graphics quirks which just so happen to slow down AMD cards more than Nvidia cards.

Yes, I think Occam's razor applies here, and it's pretty clear what the most obvious explanation is. It's the same conclusion that the industry-respected author of the article came to: Nvidia and Crytek are up to no good.

Oh and don't call me a troll, sheep.
 