Can we get Tom's onto this bandwagon about Eyefinity/Surround?

theholylancer

Distinguished
Jun 10, 2005
1,953
0
19,810
http://www.hardocp.com/article/2010/01/08/nvidia_3d_surround_multidisplay_gaming_editorial
FTA: "For god’s sake NVIDIA, if you pull this bullshit where NVIDIA Surround "games" will not work on Eyefinity configurations, we are going to beat NVIDIA down repeatedly and publicly for harming the PC gaming industry. Keep those crappy proprietary PhysX policies, but if you start messing with OUR multi-display gaming and not letting it remain "open platform," I will personally lead the mob with the burning torches to the castle gates. And we will be fully prepared to use the torches. I will personally lead a boycott of NVIDIA products if I see NVIDIA harm multi-display gaming in the marketplace through an IP grab. Multi-display gaming belongs to gamers, not NVIDIA."

Please, someone from Tom's, pick this up and make it a front-page issue. Support it so we don't end up with another situation like PhysX, where available CPU resources go unused.


I will bump this until someone from Tom's responds (either yay or nay). Who is with me?
 

theholylancer

Distinguished
Jun 10, 2005
1,953
0
19,810
Well, see, if they leverage TWIMTBP titles to disallow non-standard resolutions when an ATI card is plugged in (i.e., anything that isn't 1920x1200, 1024x768, etc.), since those non-standard resolutions are precisely what ATI's Eyefinity relies on, then I would imagine it would border on criminal. And even if they got sued, it would take years, and they might be able to blame it on a technical issue and just overwhelm the judge with technical information nobody needed.
 
I agree: the more proprietary crap we get, the less we consumers end up with.
While these features are nice to have, at some point the devs have to take responsibility as well.
They cater to the low end, so why not the high end too? If they haven't figured out where the butter for their bread comes from, they may as well quit making games for PC. And if nVidia sloughs it off and says it's up to the devs, well, they're also the ones pounding their chests about how much they do with those devs, and how often.
ATI had better pick it up as well and make it common access, no flags.
 

theholylancer

Distinguished
Jun 10, 2005
1,953
0
19,810
Aye, which is why I want to see this issue pushed by Tom's. Compared to HardOCP or other places, Tom's seems to have the most readers AND the most, errrrr, first-time (yeah, first-time, let's go with that) enthusiasts, thanks to the forum and its tolerant nature.
 

notty22

Distinguished
I like (respect) HardOCP and Kyle, but I don't agree with about half of what he says in his editorials or reviews. Good for him, but I don't even think he needed to write this piece except maybe to hear himself bitch. He spews his bias with the "crappy PhysX" rant. After a couple of his video reviews, his attitude towards some things is obviously predetermined, and that turns me off.
edit: I don't think any multi-monitor gaming output would ever become non-proprietary (if that's the term you want to use), but Nvidia has been developing 3D and still is, and that will take some unique code (maybe) on the game's side. But you see, Kyle has already condemned 3D gaming as well.
 

theholylancer

Distinguished
Jun 10, 2005
1,953
0
19,810
But he does present a valid point: TWIMTBP titles favor nvidia, and Batman showed us they were willing to take it to the next level by disabling universal features because an ATI card was present...

Most games can and will scale to oddball resolutions. It's not like people are asking them to include special textures for some of these Eyefinity resolutions, just that those resolutions work and aren't locked to Surround.
 

notty22

Distinguished
It's not part of Kyle's point, but I would think "Eyefinity" is trademarked. What part of the technology (3 monitors, 1 resolution) is ATI's intellectual property? I don't know. Does Nvidia have a name for its multi-monitor solution yet? Can they do the same thing and just call it something different?
 
Yea, they do, and they will. But will it work on other GPUs? If it's flagged out, there may be a few more people defecting. This is a tough scenario for nVidia, since they've come in last.
Being first, as with PhysX, lets you slide a little until there's real competition; not this time, though.
 

paperfox

Distinguished
Mar 1, 2009
1,207
0
19,460
One question: so you can have Nvidia's 3D Vision technology running alongside an ATI card, just like you can run PhysX off an Nvidia card with an ATI card as your main? And the author of the article is scared that Nvidia will block such support when its drivers see an ATI card? My point being that, if I read the article correctly, I didn't know you could run an Nvidia card with an ATI main and still have 3D Vision.

As a side note, though, I think it's great that Nvidia can enable its Eyefinity equivalent through drivers, giving older cards access to it. And I also agree with the article that it's disappointing the Fermi cards don't have 3 outputs.
 

IzzyCraft

Distinguished
Nov 20, 2008
1,438
0
19,290
IMO this is just holdover tech until the economy picks up. A few years from now I'm going to be rocking a 34" curved OLED screen at 5000-by-something-that-makes-sense for my surround setup.
 

paperfox

Distinguished
Mar 1, 2009
1,207
0
19,460
Since the games in question are only Nvidia's TWIMTBP games, Nvidia has the ability to tell the developers to add code that has the game ask the drivers whether an ATI card or its drivers are present on the system. I'm sure there is some code that lets you do this; if you can open Device Manager or Add/Remove Programs and see whether an ATI card or driver is installed, then a game (or Nvidia's drivers) certainly can too.
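
Something like the sketch below would be all it takes. This is purely my own illustration, not code from any actual driver or TWIMTBP title; it just enumerates adapters through DXGI and checks the standard PCI vendor IDs (0x10DE for NVIDIA, 0x1002 for ATI/AMD):

#include <windows.h>
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    // Enumerate every display adapter the OS knows about and read its PCI vendor ID.
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        if (desc.VendorId == 0x1002)
            std::printf("Adapter %u: ATI/AMD\n", i);
        else if (desc.VendorId == 0x10DE)
            std::printf("Adapter %u: NVIDIA\n", i);
        adapter->Release();
    }
    factory->Release();
    return 0;
}

Flip that around and the same check is enough for a game to refuse oversized resolutions whenever the "wrong" vendor ID shows up, which is exactly the scenario Kyle is worried about.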
 
That's true; however, this is nvidia. No doubt they could try to make sure that if an ATI card is detected, the available resolutions are limited, since they will have "IP claims" over any assistance in "making multi-displays work".

I see what you are saying, but I think it's a very thin case at the extreme; I just can't see a way of making that fly.
I think the guys in the link need to take a chill pill. Not having access to in-game AA because, according to AMD, Nvidia blocked it is an arguable case: "well, our techs helped with the game, and we never told them they couldn't include your code, but it seems you never provided it."
That's one thing, but if they try to stop the basic functionality of a competitor's card, then that's a see-you-in-court kind of case as far as I'm concerned.

Mactronix
 

sabot00

Distinguished
May 4, 2008
2,387
0
19,860


How is that question relevant?
[H]ardOCP doesn't want nVidia to leverage the TWIMTBP program to make games incompatible with ATI Radeon cards in multi-monitor setups.
He doesn't care about PhysX & 3D here.
 

sabot00

Distinguished
May 4, 2008
2,387
0
19,860


The game must support the resolution, which is why Half-Life doesn't work at 2560x1600. The game can also detect the hardware it's running on (that's how it sets the default graphical quality), so when it detects GPU A, dev X can code it to stop, or even close outright.
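
To make that concrete, here's a rough, hypothetical sketch of what such a gate could look like inside a mode-list builder. None of these names come from a real engine; assume vendor_id was read the same way the DXGI example above reads it:

#include <algorithm>
#include <cstdint>
#include <vector>

struct Mode { uint32_t width, height; };

constexpr uint32_t kVendorNvidia = 0x10DE;  // PCI vendor ID

// Build the list of selectable resolutions; trim the triple-wide ones
// unless the detected adapter is on the "approved" vendor list.
std::vector<Mode> BuildModeList(uint32_t vendor_id) {
    std::vector<Mode> modes = {
        {1280, 1024}, {1680, 1050}, {1920, 1200},   // single-screen modes
        {3840, 1024}, {5040, 1050}, {5760, 1200},   // triple-wide surround modes
    };
    if (vendor_id != kVendorNvidia) {
        // Drop anything wider than a single panel.
        modes.erase(std::remove_if(modes.begin(), modes.end(),
                                   [](const Mode& m) { return m.width > 2560; }),
                    modes.end());
    }
    return modes;
}

A handful of lines in the mode-list builder and the "unsupported" resolutions never even show up in the options menu.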
 

IzzyCraft

Distinguished
Nov 20, 2008
1,438
0
19,290

Well, given the limited compatibility to begin with, any failure of Eyefinity to display things properly is going to get blamed on TWIMTBP if the game has it, instead of on the game just plain not working. ATI has yet to enable CrossFire with Eyefinity across the board, which I find odd considering Eyefinity pushes the high resolutions that CrossFire is usually used for.
 
I'm surprised to see Kyle write something like this that I agree with. He has seemed fine with all the shenanigans in the past; it's nice to see something finally getting him to take a neutral stance rather than the 'what's good for the market leader is good for everyone' stance.
 

sabot00

Distinguished
May 4, 2008
2,387
0
19,860
This isn't exactly what I would call neutral, straddling the fence, in the middle, or any such term.
He has made his position clear and rather forceful, and it's a position with which I completely agree.

@IzzyCraft, all of my modern games work (Crysis, BioShock, L4D, HL2, Assassin's Creed, Far Cry 2); the 9.12 hotfix added CrossFire + Eyefinity for all CF setups.
 


A lot easier than you'd think.

It would only affect those too 'mainstream/naive' to 'fix' it themselves; all you'd have to do is auto-detect the hardware and build the .ini like this (or, more likely, hide it amongst hundreds of lines):

...
[Display]
uVideoDeviceIdentifierPart1=222222222
uVideoDeviceIdentifierPart2=222222227
bForcePow2Textures=1
bForce1XShaders=1
bHighQuality20Lighting=1
bFull Screen=1
fDefaultFOV=75.0000
fMaxFOV=90.0000
bIgnoreResolutionCheck=0
.
.
.

and then the other one shows:

[Display]
uVideoDeviceIdentifierPart1=111111111
uVideoDeviceIdentifierPart2=111111116
bForcePow2Textures=1
bForce1XShaders=1
bHighQuality20Lighting=1
bFull Screen=1
fDefaultFOV=75.0000
fMaxFOV=270.00
bIgnoreResolutionCheck=1
.
.
.

Which one do you think would disable surround gaming on any hardware, including Eyefinity or Matrox's TH2Go?

Very easy to do, equally easy to undo, but then one is 'officially supported' and the other 'is not supported and may cause instability, blindness, anal leakage, and the apocalypse'?

I would be surprised if nVidia did so after this public statement, but then again, this kind of thing still happens. :pfff:
 


True, I meant a stance 'for neutrality', not a neutral stance.
Trying to type while watching The Daily Show. :whistle:

 

sabot00

Distinguished
May 4, 2008
2,387
0
19,860
They can code this into the game engine, into a file that would be much harder to open than a simple .ini; however, I doubt they will go to such lengths, and I hope they won't.
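
For illustration only (the names and numbers are mine, not pulled from any engine): the same gate baked into compiled code instead of a text file looks something like this, and at that point there is nothing left for a user to edit.

namespace engine {

// The cap is a compile-time constant inside the binary, not an .ini entry.
constexpr float kMaxAspectRatio = 16.0f / 9.0f;

bool IsResolutionAllowed(int width, int height, bool vendor_whitelisted) {
    const float aspect = static_cast<float>(width) / static_cast<float>(height);
    // Without the vendor flag, anything wider than a single 16:9 panel is refused.
    return vendor_whitelisted || aspect <= kMaxAspectRatio + 0.01f;
}

}  // namespace engine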
 

sabot00

Distinguished
May 4, 2008
2,387
0
19,860
Well, as the co-founder/co-leader of an indie game studio, I know we won't intentionally lock out any hardware, and we won't implement PhysX.
PhysX doesn't make sense from a dev's standpoint: on the latest Steam survey, PhysX-capable GPUs are probably less than 10% of discrete cards, while a CPU physics path works for roughly everyone. Plus nVidia's outlook isn't great, so their 60% market share will go down.