Radeon 5870

Intel's just an investor, not an owner, nor, to my knowledge, the controlling stakeholder, so Hydra for everyone.
There are also other rumors that ATI's SidePort, plus a few other things, eliminates the need for Hydra itself by doing what Hydra does. Time will tell here. MCM and other approaches are taking shape. It's all part of Fusion, the very reason AMD bought ATI.
 

+1; ATI is going 40nm for sure to be able to capitalize on price/performance, especially for low-end graphics cards. This SidePort/Hydra could very well be ATI's ace up their sleeve for high-end cards.
 
I think the main problem with this multi-chip design is that it increases PCB complexity, and thus cost. That's probably why nVidia hasn't done it yet, even though they acquired much of the IP to do so when they bought 3DFX. A multi-die solution, like what you have in the 360, might make more sense. I guess we'll just wait and see.
 


nVidia can't put 2 chips on one PCB, so what makes you think they have the know-how to do this? That's the reason nVidia hasn't done it; if they could, they would, but they can't.
 
rangers, man, what did I tell you about talking without info? lol

How do you know that? It's not that they don't know how (there's no information, so really we don't know if they can or can't).

Putting 2 PCBs together is much easier than doing a dual-GPU PCB. It costs less for the company, though they don't charge like it costs less 😛.

Come on rangers, at least act like you're neutral between the two companies; you talk from the heart too much.

I'll spoil it for you and tell you that ATI doesn't exist anymore :) It's all AMD now :)
 
Researching and redesigning the manufacturing is not easier. Any innovation costs money to develop. Once it's made it might be cheaper, but some companies might choose not to risk innovation given the chance of it failing.

There have been benchmarks that show the 4870 X2 being slower than two 4870 1GB cards in CrossFire, for example.

So really it's a situation of safety and stability.

Think about it: an all-in-one phone that requires shrinking the components, versus taping a phone and an MP3 player together?

Compare the cost of that one phone and one MP3 player... making the all-in-one costs thousands to research, test that it works, etc.

:) Logic

Now let's move on: why stop if it works?

3870 X2 vs 9800 GX2: ATI lost

4870 X2 vs 295 GTX: ATI lost

Now tell me, why would they change if it works?

Remember, this card is cheaper than the ATI 4870 X2 was when it came out.

 



Ohhh, bad reference point. Market price is dictated by supply and demand, not by what a previous competing product sold for.
There's also the minor fact that the 4870 X2 has been out 5-6 months already.

Of course it's the best comparison available currently, as both cards are the top-of-the-range models and the pinnacle of what each company can achieve. But, as you reminded rangers, it's always a good idea to have the facts laid out 😉
 
Of course that's taken into consideration; if you read my posts, I flame the late arrival of the 295 GTX. It might be too late, but honestly, with the prices here in Toronto, it's the better price/performance buy.

Personally I avoid both; after the 9800 GX2 I just gave up. And I'm honestly surprised that the 4870 X2 drivers aren't what they should be right now.

Kinda makes the 4870 X2 on the secondary market a waste of cash.
 


You hit the nail on the head there: AMD/ATI innovate, nVidia stagnates. If ATI weren't there to push nVidia, they would churn out the same old crap year in, year out: 8800gtx106.0000386 and a half.
 
I switched from ATI after their 2900 XT blunder :)

They made the half-step refreshes so they could lower the price; although you and I don't agree on them doing that, it doesn't mean it hasn't led to the decline in prices.

I mean, the new chips were 65nm refreshes from 80nm, which is a lot bigger a jump than the 65 to 55nm switch.

Yes, the FX series was a blunder as well.

The 8800 GTX was definitely not crap (I know you didn't say that), considering it still maxes games better than most cards that are supposedly stronger.

The 8800 GTS was pretty much as fast, used less power, and was half the price. I'd say that's a win.

Then there was the 8800 GT that came out before it, which was even cheaper, thinner, and less of a power hog.

The only cards I think were a waste were the 9800 GTX and the +, the 9800 GT, and the 9600 GSO, which are direct refreshes of what we had before (although some were 55nm).

The 9600 GT was the card that started the price war (other than the 8800 GT): it could be found on release for $150 while being only 10% slower than the 8800 GT in a lot of games.
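Quick back-of-the-envelope math on that value claim (the ~$200 8800 GT street price used here is an assumed ballpark, not a quoted figure):

```python
# Rough perf-per-dollar check on the 9600 GT vs 8800 GT claim above.
# The 8800 GT price of ~$200 is an assumed ballpark figure.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance per dollar (higher is better)."""
    return relative_perf / price_usd

gt9600 = perf_per_dollar(0.90, 150)  # ~10% slower than an 8800 GT, $150
gt8800 = perf_per_dollar(1.00, 200)  # baseline, assumed ~$200

print(f"9600 GT delivers {gt9600 / gt8800:.0%} of the 8800 GT's perf/$")
# -> 120%, i.e. about 20% more performance per dollar
```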

Remember, ATI also made the 3870 X2 mistake, and QuadFire never did scale the way it was supposed to for that card.

And let's not forget that ATI just recently changed their naming scheme... remember GTO, GTS, XL, XT, Pro, etc., etc.

So everyone makes mistakes 😀 but the card to go for now is probably the 4850 X2, the old 260 GTX, or the 4870 512... prices have shrunk. (Also, the 280 GTX surprisingly dropped a lot overnight 😛)
 
I don't have anything against ATI; in fact I have never owned an nVidia card. It's just that sometimes price/performance has to come first.
I wouldn't have either card myself, as I am of the "single card, single chip = best, most reliable performance" mindset.
I don't need any more GPU power than I can get from a single core anyway.

Mactronix
 
I read the reviews of the 2900 XT and thought, I'll skip this gen. The heat on the thing and the 8-pin PSU requirement put me off, but I understand that some were stupid enough to buy it; that's their fault, not ATI's.
 
OK, first of all, if nVidia had what ATI uses, they'd most likely use a single PCB. Since they aren't using GDDR5, it's less beneficial for them to do so, though ATI can and does. GDDR5 doesn't need the elaborate PCB layout that GDDR3 needs, so the cost and space savings are incredible.
Now, sticking with GDDR5: not only do the traces not need to be laid out as exactly, the other benefit is bandwidth. Since nVidia isn't using it, it's stuck using a much larger bus just to get the same bandwidth. With all those traces for that bus, plus having to route them so the electrical resistance/power draw is matched for the GDDR3, there are over twice as many traces that all have to be precisely laid out. So, L1qu1d, nVidia can't do what ATI is doing; it'd be impossibly expensive, or the PCB would be too large. I'd suggest you read up on GDDR5, as well as GDDR3, and see what I'm talking about.
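To put some rough numbers on the bandwidth point, here's a minimal sketch using the commonly quoted specs for the HD 4870 and GTX 280 (purely for illustration):

```python
# Peak memory bandwidth = bus width (bits) / 8 * effective data rate (MT/s)
def bandwidth_gb_s(bus_width_bits: int, data_rate_mt_s: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_mt_s / 1000

# Commonly quoted specs, used for illustration:
print(f"HD 4870, 256-bit GDDR5 @ 3600 MT/s: {bandwidth_gb_s(256, 3600):.1f} GB/s")  # ~115.2
print(f"GTX 280, 512-bit GDDR3 @ 2214 MT/s: {bandwidth_gb_s(512, 2214):.1f} GB/s")  # ~141.7
```

In other words, GDDR5 lets ATI get within striking distance of nVidia's bandwidth while routing half the bus width.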
The 4870 X2 seems to be just fine, and those drivers, I'll remind you just one more time, are coming, and soon, and the potential in DX10 will be greatly enhanced. I wouldn't deny this, as we've already seen some of the improvements, where the 4870 and the X2 edge ever nearer the 280 and make the 295 look so-so.
Long before the launch of the 4xxx series, everyone knew the pricing, and some were let down; the thinking was, at these prices, how good could these cards be? So, again, read up on a few things, and don't go pointing nVidia to places where it either doesn't or can't belong.
 


Well, I was a stupid fanboy then :) I bought it, didn't like it, so I went to exchange it for an 8800 GTS 320, which ran better with a $200 cheaper price tag.
 


Well, that's what I said :) I said it wouldn't benefit them; I suggest you read up on what I wrote. For them to make a dual-GPU PCB requires innovation they don't have. It would require changing the chip into something that could work.

So to me, if it's not broken and it works better, don't change it.

Is the 4870 X2 going to edge out the 295 GTX with the driver updates? No. That's my answer, because I've always been told "oh, the 8.11s will do it... next the 8.12s... then the hotfix." I'm sorry, but right now I'm not getting my hopes up, and I'll say let's just wait for it.

To me, power is defined by a single GPU.

2 GPUs are a way of extending a card's life, not something to build your primary setup around.

OMG, I'm getting 26 fps in a game... add in a 2nd card... OMG, I'm getting 35 now! This will hold me over till I can save up for a new-generation card or till something appealing comes along :)

That's what a 2nd card is supposed to be for... or for if you're an AA junkie or just a plain enthusiast :)
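For what it's worth, here's the scaling math behind that hypothetical 26 → 35 fps example (the fps numbers are just the made-up ones from the post above):

```python
# Scaling math for the hypothetical 26 -> 35 fps example above.
def scaling(fps_single: float, fps_dual: float) -> tuple[float, float]:
    """Return (% uplift over one GPU, % of ideal 2x scaling)."""
    uplift = (fps_dual - fps_single) / fps_single * 100
    of_ideal = fps_dual / (2 * fps_single) * 100
    return uplift, of_ideal

uplift, of_ideal = scaling(26, 35)
print(f"+{uplift:.0f}% over a single card, {of_ideal:.0f}% of perfect 2x scaling")
# -> +35% over a single card, 67% of perfect 2x scaling
```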

Anyways, I think I'm done here; the ATI people outweigh the nVidia or neutral people, so I can't get any arguments out.

I'm never making nVidia look amazing or saying they're bad... I'm just saying both companies approach the same goal from different points of view.

Remember, fanboys: to love ATI is to love AMD :) They come hand in hand now 😀
 
I did; that's why I wrote what I wrote.

You're saying it's more expensive to use a single PCB, as well as easier to use 2, which is both wrong. The heating/cooling solution (which is the most important part of the final design) is very difficult to manage, thus the unheard-of sounds, heheh, made by the 295. Then there are the costs, which are sky-high as well, as it's using 2 PCBs vs 1, with twice as many traces to boot. Plus, you're using a GPU that's twice the size of the competition's, with a more expensive bus, and there are 2 of them; more expensive boards, and again there are 2 of them. All that vs slightly more expensive RAM. nVidia is taking a beating just trying to sell these things.
 
I'm saying that THE INNOVATION COSTS MORE for the company, for god's sake, not the actual parts.

Yes, it costs more to have 2 things. Did you not see my cell phone example? Even though you're paying more for 2 parts to put together, it costs less than designing a custom phone from scratch... you know, a prototype... the parts, labour, and research. Not the actual selling of the product!!!!!

Thank you!!!!!!!!


Oh yes, from what I understand GDDR5 benefits from a bigger bus width, which the nVidia cards have, unlike the ATI ones, although it would cost more. I remember reading that GDDR5 is a lot better and cheaper to make, but I'm not sure about that.

Given that, I think GDDR5 is wasted on a 256-bit bus width.
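For perspective, here's a rough sketch of what GDDR3 bus width it would take to match a 256-bit GDDR5 setup (both data rates below are assumed ballpark figures):

```python
# What GDDR3 bus width matches a 256-bit GDDR5 bus in raw bandwidth?
# GDDR5 moves ~4 bits per pin per clock vs GDDR3's 2, so its effective
# data rate is roughly double at comparable memory clocks.
gddr5_bus_bits = 256
gddr5_rate_mt_s = 3600  # HD 4870-class, commonly quoted
gddr3_rate_mt_s = 2000  # high-end GDDR3, assumed ballpark

equiv_gddr3_bus = gddr5_bus_bits * gddr5_rate_mt_s / gddr3_rate_mt_s
print(f"256-bit GDDR5 ~ {equiv_gddr3_bus:.0f}-bit GDDR3")
# -> ~461-bit, close to the 512-bit bus nVidia actually has to route
```

So 256-bit GDDR5 isn't necessarily wasted; it's roughly equivalent to a much wider GDDR3 bus without the routing cost.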

I'll say this again: we're not defending companies, we're defending facts :)
And most of the things I hear about the future are just theories (the 9.X drivers).

Let me restate my main argument in simpler terms.

Innovation (cost) > super-gluing 2 PCBs together

2 PCBs (cost) > 1 dual-GPU PCB

I = sqrt(GOD)

Now then, that's all I wanted to say; nothing about how bad ATI is. I only say that either company is bad in order to upset fanboys 😀

So when I see someone say "OMG, nVidia is amazing" I'll start going for ATI, and when I see someone saying "OMG, nVidia sux", the other way around :)

It's really fun, and I learn a lot sometimes.

But thank you for the suggestion to read about GDDR5; it was a fun and time-consuming read :)

Now if you'll excuse me, I think I'm going to go celebrate my birthday by buying an ATI card and an nVidia card and smashing them together to make a 5850 GTX CroSSLI 😀

Bye Bye!
 