New GeForce GTX 260 to Feature 216 Shaders - Beats Radeon 4870

DX10.1 is what DX10 should have been... which Microsoft failed to achieve when they claimed a performance increase of over two times or something.
And we all saw the DX10.1 performance difference: Assassin's Creed runs a LOT faster on a DX10.1 card than on DX10. I am obviously talking about the version with DX10.1 support.

 


You're right, I partially mentioned Crysis because it's the game everyone else mentions. Crysis in and of itself is nothing super spectacular as a game: HL and HL2 had a guy in a suit that gave him extra "powers", so nothing new there (though the idea of a nano-suit is friggin' bad-a$s), the gameplay in Crysis is/was nothing extraordinary compared to any other shooter, and the storyline is/was just mediocre.

So Crysis as a game is nothing super special; it's just the first game that was popularized to showcase DX10 support, along with all the hype nVidia and M$ threw behind it.

You're right, what other compelling reason, aside from gaming, is/was there to install/upgrade to Vista and DX10?! None that I can think of... I'm still very disappointed that DX10/DX10.1 isn't supported on XP. But how else was M$ supposed to force Vista on the masses, aside from OEMs?

As far as DX10 games go, Assassin's Creed looks fantastic, and so do Gears of War and Age of Conan.

I impatiently await the release of StarCraft 2...
 


Add to that better buffer management for deferred shading, material management, rendering from MSAA buffers, and virtualized memory: almost all of the performance improvements we were expecting from DX10 ended up in DX10.1.

Remember when the Company of Heroes DX10 patch came out, most of us said "WTF, did they forget any of the performance tweaks?" It was small image changes and a big performance hit. IMO DX10.1 offers the second part of the equation people wanted: improved graphics quality WITH IMPROVED PERFORMANCE.

But devs still need to make use of them. It may take until the first DX11 games for it to have a killer app, but hopefully they can make good use of it before then.
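To see why per-sample access to MSAA buffers matters for deferred shading, here's a toy numeric sketch (plain Python, not real D3D code; the vectors and lighting function are made up for illustration): lighting is a nonlinear function of the G-buffer data, so resolving (averaging) the samples before lighting gives a different, wrong answer at geometry edges than lighting each sample and then averaging, which is what per-sample access enables.

```python
# Toy illustration: with 2x MSAA, an edge pixel stores two G-buffer
# samples with different normals. Compare resolve-then-light (the only
# option without per-sample access) against light-then-resolve.

def lambert(normal, light):
    # clamped diffuse term -- nonlinear in the normal
    dot = sum(n * l for n, l in zip(normal, light))
    return max(dot, 0.0)

light = (0.0, 0.0, 1.0)
sample_a = (0.0, 0.0, 1.0)   # surface facing the light
sample_b = (0.0, 0.0, -1.0)  # back-facing surface behind the edge

# Resolve the G-buffer first, then light once: normals cancel out.
avg_normal = tuple((a + b) / 2 for a, b in zip(sample_a, sample_b))
resolved_then_lit = lambert(avg_normal, light)   # 0.0 -- the edge goes black

# Light each sample, then average the lit colors: correct antialiased edge.
lit_then_resolved = (lambert(sample_a, light) + lambert(sample_b, light)) / 2  # 0.5

print(resolved_then_lit, lit_then_resolved)
```

The two results differ, which is exactly why deferred renderers on plain DX10 either dropped MSAA or paid a heavy workaround cost.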

 


Meaning their 55nm cards are far off. Obviously they are running into problems if their solution is to unlock some shaders on the still-expensive G200 chip rather than to start selling a significantly cheaper and faster 55nm refresh.

So all the hype/hope of the 55nm part by September seems unlikely now. The bigger question is when it will arrive, and whether it can do anything to challenge the X2. A small boost in shader count alone sure doesn't seem convincing for a later launch; it's more likely to continue the current win-some/lose-some situation, maybe breaking or creating a tie compared to before.

I'm also skeptical that adding a few shaders to the GTX 260 will help all that much. In the shader department the GTX280 is below the HD4850, so games that are shader- and bandwidth-bound will still favour the HD4870, while those that are texture-, ROP- and VRAM-bound will still favour the GTX260/280.

All this essentially does is make any tie or close battle a little bit better for the GTX260+ or whatever it's called, nothing revolutionary.
And for $50 more, would you even bother?

Seriously, this news is as boring as the GF8800GTS-SSC with 112 SPUs; the real news back then was the G92, and it's the same situation here. Who cares about the 260+? The 55nm refreshes are what really matter; until then it's window dressing on a delay.
 



In spite of what homerdog said, I'm willing to question this, as it's not the first time this has happened.

A GTX260 is a GTX280 with some shader units disabled, NOT removed completely.

Likewise, the GTX260's clocks can ramp WAY higher than stock; on average it can hit pretty much any clock speed the GTX280 can.

I'd be willing to bet that you could flash a GTX260 with a GTX"260+" BIOS; unless the manufacturing process literally laser-cuts the shader groupings to damage them so that they aren't usable, I don't see why you couldn't. It's already been said that "defective" GTX280 shader cores were in turn binned into GTX260s.
 

That's basically it, they're physically disabled and there is no way ever ever ever to bring them back. I'm afraid the good old days of unlocking 'pipes' are gone 🙁
 

Has it been confirmed that the tessellator in RV770 is DX11 compliant?
 



I QQ endlessly

Ok, I feel better now.
 
I posted elsewhere that DX10.1 is getting closer and closer, and eventually denying it will be to nVidia's disgrace, as they'll be seen as cavalier again. Even if the games coming out that'll use DX10.1 aren't uber games, there'll be more than just one instance for comparisons across manufacturers. With three or four such examples, the tide could turn quickly, and not having DX10.1 support could become ugly. If this card comes out at the same price as the current 260 and they lower the price of the 260, then there'd be real competition; but unless it does, it's another sticker-change/get-more-money attempt by nVidia.
 

If DX10.1 could somehow be patched into the current UE3 games I would go out and buy a 4870 today. Just imagine Bioshock and Mass Effect with true MSAA that works right, doesn't have to be forced in the driver, and doesn't kill performance 😍

S.T.A.L.K.E.R. too :sol:
 
Because ATI aren't about making money, advertising 10.1 when there's nothing really around that uses it, and people who upgrade every 12 months will have changed cards by the time there is.

I wish there would be some balance posted here sometimes.
 


People stick with a card as long as there's not a "good" upgrade for them; they're not bound to a fixed amount of time. There are non-price-sensitive folks around, but I wouldn't say they're the majority.

DX10.1 has been out a long time, can't deny it, but because nVidia was selling well and ATi wasn't, DX10.1 games were an illusion. Now that ATi is on top and nVidia has to struggle, we might see the tables turn and actually see some DX10.1 titles. Maybe not big ones, but titles that show off DX10.1 and its performance (like Crysis did for DX10).

Esop!
 
I'll never understand the negative, nonchalant attitude towards DX10.1. Like was said, it was part of the original DX10 package, dropped due to certain circumstances, added on later, and in the only example we have it shows high promise. Do people feel that way because nVidia doesn't have cards for it? Let's put the shoe on the other foot: since Crysis is really the only true example of DX10, should we stop caring about it as well, or forsake DX10 altogether? If Crysis were DX10.1, it might be playable today, certainly closer than what we have. Maybe if devs saw a desire in the gaming community for it, it would have some effect as an exclusive, which I think AC tried to do but was shot down. And by whom? Too much influence, and bad influence at that.
 


Most people couldn't care less about DX10 because Microsoft made it Vista-only. And if DX10 is irrelevant to most people because they're not running Vista, DX10.1 is pointless.

Eventually most people will end up on Vista or 'Windows 7', but it will still be a couple of years before DX10-only games make any sense to developers; and by then we'll probably have DX11 anyway (which, the way they're currently going, Microsoft will only release on 'Windows 7').
 
Sorry to say, but XP is slowly losing out, and faster and faster as time goes by. So XP and DX9 won't matter; they'll be history. How much longer can we deny DX10.1? Or Vista? Soon a few games will be out, and we'll see how well it's accepted by the gaming community. I'm sure you know the whole story behind DX10, and you know that it's impossible to run it as-is on XP. What I don't understand is this: if you buy a new CPU, you may have to buy a new mobo for it, and that's been accepted. I'm not an M$ fan, but I am all for going forward in tech, and this isn't forward-looking. All I heard before was that nVidia would just muscle its way through, raw power, yadda yadda. Now what's happened? ATI is out-muscling, out-hustling nVidia, and guess what? It still has DX10.1.
 

DX11 will be available on Vista.

It isn't just Vista keeping devs away from DX10, it's the hardware too. There are still a lot of DX9 cards out there.
 


Is AoC even DX10? I've got Vista, a DX10 GPU and AoC and I don't recall seeing any options for it. I don't play it any more, but I was curious.
 
Nevermind... I answered my own question. So there's one LESS reason to make the switch to Vista LOL. Oh well, I've already got AoC AND Vista. Drat.

Funcom has regrettably announced that the DirectX 10 version of its MMO, Age of Conan, will not ship with the initial launch. Funcom says it has decided to ship only the DirectX 9 version and spend more time on building a DirectX 10 version "worthy of Microsoft's great vision for the future of PC gaming".

The extra development time will give Funcom time to implement more features in the DirectX 10 version of Age of Conan than originally planned.

The new, enhanced DX10 version will be premiered at Games Convention in Leipzig in August 2008. A special preview will be unveiled this summer at Nvidia's NVISION event in San Jose, California, August 25 - 27, 2008.
 


XP will still be popular by the time 'Windows 7' is released, if Microsoft actually keep to their schedule (it's late 2009, isn't it?). Anyone releasing a DX10 game will either have to abandon the XP market completely, or write a DX9 renderer too.

How much longer can we deny DX10.1? Or Vista?

Probably at least until 'Windows 7' comes out unless it's delayed as much as Vista was. Most graphics chips sold are either Intel or Nvidia, so an ATI-only DX10.1 is not a very interesting proposition for most games companies; if they already have to write a DX10 renderer and a DX9 renderer, why bother writing a DX10.1 renderer as well?

I'm sure you know the whole story behind DX10, and you know that it's impossible to run it as-is on XP.

LOL, I know a lot more than that. The only thing stopping DX10 running on XP is Microsoft; and not allowing it to run on XP was an absolutely colossal blunder on their part.
 
Hi Guys

Which of these cards will give better performance with games yet to be released? gtx260 or HD4870. DX 10.1 / 11 excluded.
 

I would answer that but I seem to have misplaced my crystal ball.
 
