GT200 only high-end for launch timeframe


The 9900 won't be out in just a month. Two 8800 GTS cards will significantly outperform one 9800 GX2, but they use up your SLI option. Up to you.
 
The GF9900 is supposed to be out at the end of June, so if you can wait, it's probably the best choice.

Otherwise I prefer the 2 x GTS idea for two reasons. First, it performs better; second, you can re-sell them individually or together for probably more than a single GX2, should you decide to do so when the GF9900 or HD4xxx does ship.
 
I don't know, what's your opinion on SLI? I hated CF, and I had a 9800GX2; even when it worked, the SLI sucked, lol. I prefer a single-card setup. I don't think I can wait over a month for the 9900GTX, grrrrr. Are there any Nvidia cards that can do tri-SLI, not including the 9 series?
One more thing: if I wait for the 9900GTX, do you think my Antec 500W EarthWatts would be enough for that single card? I know the specs aren't out, but will it actually use more power than a 9800GX2??
 

The 9800GX2 is a single card, but dual GPU. Its performance scales the same as an SLI setup, not a single GPU, so there's no advantage there. The only advantage is that it will work on a non-SLI motherboard.
 
The GF9800GX2 is actually dual card, single slot.

The thing is, while I dislike SLi/CF in general, if your options are 2 GTS or 1 GX2, then I'd go with the 2 GTS.

I think, but don't know, that the GF9900GTX would require about the same amount of power as a GX2. Likely noticeably more than the HD2900 and GF8800GTX, but probably at or below a GX2.

The only cards other than the GF9 series that can do Tri-SLI are the 8800GTX/Ultra.
 
A name change is a very good sign.

When something's name is changed, it means those who changed it don't want the product or service to be associated with what came before.

If they had named it the 9900GTX, I would be worried.

GTX 280 is a sign of whole new tech, which should never be confused with the 4-digit series.

I would guess that this card will sell for $599 / €450 and will be more than twice as fast as an 8800 Ultra (not SLI) at 2560x1600.

If an 8800 Ultra can get 10 fps in Crysis at 2560x1600, the GTX 280 will do 22 fps, IMO.

Meaning Crysis at Very High will run like butter on a 22" monitor.
 
Yeah, so I heard it uses a good amount of power, even more than the ATI HD 2900!!
Why so much? But if you think its power consumption is around a 2900's, would you say an Antec 500W EarthWatts would work?
 


Should be fine.

Your PC will never pull that amount. I've run 6 HDDs, tri-SLI, etc., etc. on a 620W Corsair PSU, and it was grand.

An example for you: a friend of mine who is a sparky decided to see how much his PC would use under full load. Not a fantastic spec, but a good C2D, an 8800GT, a few HDDs and all that jazz, and it never pulled more than 200W. The only thing you need to look for is strong rails.
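
(A rough back-of-the-envelope sketch of that kind of power budget. The wattages below are illustrative guesses rather than measured figures or specs, and the 80% headroom figure is just a common rule of thumb.)

```python
# Rough power-budget sketch: sum estimated component draw and compare
# against PSU capacity. All wattages here are illustrative guesses.

COMPONENT_DRAW_W = {
    "Core 2 Duo CPU": 65,
    "GeForce 8800 GT": 110,
    "Motherboard + RAM": 50,
    "HDDs (3x)": 30,
    "Fans / misc": 20,
}

PSU_CAPACITY_W = 500      # e.g. an Antec EarthWatts 500
HEADROOM = 0.80           # keep sustained load at or below ~80% of the rating

total = sum(COMPONENT_DRAW_W.values())
budget = PSU_CAPACITY_W * HEADROOM

print(f"Estimated full-load draw: {total} W")
print(f"Comfortable budget on a {PSU_CAPACITY_W} W PSU: {budget:.0f} W")
print("Within budget" if total <= budget else "Consider a bigger PSU")
```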
 
Yup, hopefully it will work with a 9900.
Now I'm gonna pack my 9800GX2 off to Newegg and patiently wait till the end of June. What a long wait!!! It better pay off.
 

Hopefully you are right and it deserves the name change, rather than it just being NV's desire to simplify their naming and avoid hitting a GF10 series (like ATI going with X instead of 10).

The latest from Fudzilla says the GTX 280 will be out June 18th, and so will the GTX 260. Both are single-GPU solutions.
http://www.fudzilla.com/index.php?option=com_content&task=view&id=7364&Itemid=1
 


We can hope that ATI will come out with a high-end card by 2010, but I bet AMD is more interested in putting R&D into Bulldozer instead.

I still think the monster GPUs need to get hit by their own virtual asteroid. ATI's high end for this launch will be the 4870 X2, and that is the wave of the future that even Huang will have to acknowledge. Nvidia came out with the 9800GX2 just to beat ATI for a bit, but they aren't really reconciled to dual-GPU cards and didn't even attempt a single PCB.

GPUs are basically back in the same situation Intel was in with Prescott, and Huang is promising monster GPUs the way Intel once promised 10 GHz on the NetBurst architecture. Intel had to change; Nvidia will too.

Perhaps Nvidia doesn't like dual-GPU cards and future dual-core GPUs because they rely on games actually supporting SLI well?
 
Dual GPUs have yet to have their day, and until then Nvidia is going to go with the sure-fire thing: a single core with properties they can manage. I mean, really, besides Crysis, which was a dumb idea in the first place, where does a dual GPU have a place right now? Especially when dual GPU can barely stagger to its own feet.
 


Well, Intel did well when they glued two dual cores together to get a quad-core processor to market. It's a solution where a defective chip works as a single-core low-end variant (by disabling the non-working core), a normal dual core serves as the mid-range CPU, and two of them together make the high-end variant.
It's not so easy to make a fully integrated multicore solution, so putting multiple single-core chips together seems to be the easier alternative.
I think yipsl is right that, in the long term, multi-core GPUs will see the light of day.

It's quite possible that ATI will try to make, let's say, a four-core GPU and then sell two-, three- and four-core variants of it if they try to get back into high-end solutions. I am not expecting ATI to go back to one monster-sized single-core GPU anymore. With a multicore solution you can use the same basic core design in your low-end and high-end products, which is much cheaper than making separate chips for different market segments.
It's even possible that the same GPU core could be part of that Bulldozer CPU-GPU solution, while also being used in normal graphics cards.

Can anyone see having 4 CPU cores + 2 GPU cores in your CPU-GPU, plus a separate 4-core GPU graphics card? That way we could have 6 identical GPU cores in the same enthusiast system.
The low-end solution would be a 2 CPU + 1 GPU core CPU-GPU system on its own. If you want more speed, you can buy a graphics card with 1 or 2 GPU cores to help with graphics. All this with only one CPU core design and one GPU core design. A very economical solution.
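
(Just to illustrate the binning idea above: a minimal sketch with made-up tier names and core counts, showing how one multi-core die design could be sorted into low-, mid- and high-end parts depending on how many cores pass testing.)

```python
# Illustrative die-binning sketch: one physical design, several products.
# Tier names and core counts are invented for the example.

def bin_die(working_cores: int) -> str:
    """Map the number of functional GPU cores on a die to a product tier."""
    if working_cores >= 4:
        return "high-end (4-core part)"
    elif working_cores >= 2:
        return "mid-range (2-core part, remaining cores disabled)"
    elif working_cores == 1:
        return "low-end (1-core part, faulty cores fused off)"
    return "scrap (no working cores)"

# Example: dies coming off the line with varying numbers of good cores.
for good_cores in (4, 3, 2, 1, 0):
    print(good_cores, "working core(s) ->", bin_die(good_cores))
```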

 
I agree that we're going to see dual GPUs in the future, and it's gonna be the standard; the games just haven't come out that really take advantage of it. It's like multicore CPUs (of which I own one): nothing in the game market is really optimized for them yet. There are only a handful of games really taking advantage of them, but for some guy like me, who likes to multitask, they work great. Same with the GPU: nothing is optimized, but a few people have high enough resolutions, or play Crysis enough, to justify purchasing them. But the one thing I don't really see coming to light is the CPU-GPU combination. For now it just takes too much money to build a core that serves two purposes. Also, AMD-ATI are the only ones currently able to try this, and they're in a financial bind right now, playing catch-up in both the CPU and GPU markets. Maybe in a few years we might see something like this. But also keep in mind, most of us enthusiasts put our own computers together for one very big reason: freedom of choice. And if a CPU-GPU maker is doing it for us, it detracts from the overall product.
 



There is much more to the R700 architecture than just using multiple cores to step from the bottom to the top end.

For a start, there is the shared memory on the X2. Not a big deal, you might think... until you consider how this integrates with, say, 3 CPU cores on a future Fusion die.

Then there is the GP-GPU nature of R600 being extended onto R700. OK, this was coming before AMD arrived, but it was a reason for AMD buying ATI: again, your GP-GPU incorporated onto the same die as the CPU and sharing main system memory.

One problem AMD have on the horizon for Fusion is minimising memory latency for the GPU.

Split power planes... Griffin, anyone? It's an extension of this concept for use in notebooks with Fusion.

AMD have a long-term plan, and R700 is another step towards executing it.
 
I feel we have maybe two more generations of single-core killer GPUs. That being said, I still believe the X2 models will be the fastest of the fastest. Going with GDDR5 shows some promise if it comes through, with the ability to share on-card memory with some built-in corrections. I find this to be very interesting. Let's hope that GDDR5 doesn't bring the same problems that GDDR4 did, which prevented nVidia from using it, or that nVidia solves their noise problem and it'll work on their cards.
 
Single-core monster GPUs will be around for quite some time, I think; we're still only on 55nm, so we've still got 45nm and 40nm to go yet. The G280 is nVidia's first monster GPU since the G80, just wanted to say that by the way.
 

Yep, it just seems like a monster because with a moderate OC it runs like an Ultra, which was considered a monster... a year ago. The Ultra is old. It's long past its prime. 😛
 
