Leak: ATI Radeon 4800 Gets 480 Stream Processors

Status
Not open for further replies.

caamsa

Distinguished
Apr 25, 2006
1,830
0
19,810
I'm in the market to pick up a new video card in the near future, maybe this fall. Hopefully these will be decent cards and won't be overpriced.
 

korsen

Distinguished
Jul 20, 2006
252
0
18,780
Those idiots will never learn... 16 ROPs? Why the hell did everything else increase except that?

ATI keeps screwing up royally, and unless the 48x2 can even make a dent vs the 9900s, AMD is going to drown in its already screwed-up debt.

The same way people get pissed off at companies releasing beta software as GOLD, I'm sure most of us tech geeks are getting tired of NOBODY TESTING THEIR OWN CRAP FOR PROBLEMS.
 

lopopo

Distinguished
Apr 18, 2008
82
0
18,630
This is kind of upsetting. Here is a product that sounds like it will perform well, and it's released at a time when AMD's future is dubious. Even if it gets rave reviews, I would be scared to buy.
 

Gravemind123

Distinguished
Aug 10, 2006
649
0
18,980
Since they are opening up all of the documentation necessary to make third-party drivers for Linux, I would assume that someone could make a Windows driver from that data if AMD/ATI went under.

I suppose we will see whether it was the lack of TMUs or the lack of ROPs that was holding back the 3800s. Hopefully increasing the shaders and TMUs will boost performance enough to make it a viable competitor to the 8800GTX. I still can't believe the 8800GTX has hardly been surpassed in almost a year; graphics companies both need to get off their asses and make something new.
 

korsen

Distinguished
Jul 20, 2006
252
0
18,780
But you also need to consider that graphics refreshes come out every 6 months, as opposed to every two years for processors. We're looking at huge jumps here. I say they should move to processor-style schedules and start making something that's worth upgrading to.

It's like, meh, this new generation is only slightly better than the last, I'll pass. It's like comparing an old Celeron to a new Celeron instead of a P4-HTT vs an Athlon vs a Core 2 vs a Nehalem vs AMD's brainchild. Celeron-on-Celeron action is a stalemate, and LAME.
 

sailer

Splendid
If AMD/ATi can get these cards out and they perform close to what's promised, I can see a couple of them going into my computer. But I'm not holding my breath, as I can remember the anticipation I had last year for the 2900, after which I bought an 8800 GTS 640.
 

Andrius

Distinguished
Aug 9, 2004
1,354
0
19,280
Either ATI has accepted its role as "the second-class GPU provider" or something went wrong with the other 16 ROPs.

If something bottlenecks an architecture, the logical step is to increase it until it stops being the bottleneck. It would seem ATI thinks the 64x5 stream processors were the bottleneck, not the 16 ROPs.

The strange decisions at AMD/ATI have continued ever since AM2 came out 2 years ago :heink:
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780

Because ATI is doing AA in the shaders instead of the ROPs (they call them RBEs by the way). 16 should be enough.
 

thomasxstewart

Distinguished
Jan 16, 2006
221
0
18,680
Before 4x, there were dynamic controllers on mainboards; they were 80x2, then 80x4, and so on. Dynamic controllers are more situation-specific: 8x80, or today 16x80. Isn't a game card a little mainboard, with each unit a little transistor controller or effector? 8x80 being half the total potential? Or is the top at ~640, as the design seems here. It is a great improvement over far fewer assembly machines on board, I'm sure. Is this multiboard thing forced upon us simply because no one has designed more complex mainboard host controllers for years?


Signed: PHYSICIAN THOMAS STEWART VON DRSHEK M.D.
 

eodeo

Distinguished
May 29, 2007
717
0
19,010
The new GPU has 480 stream processors, or shader units (96+384).

What does this mean? 96 shaders (same as the old G80 GTS) + 384 vertex processors? I know that the 2900 and 3870 have about 3x better geometry performance than nvidia's G80/G92, but I never did understand why their shader performance is so low compared to nvidia's solution. Does anyone have an explanation for this?

And 32 TMUs? What the bleep is that? Texture units? What part of UNIFIED processors am I not understanding?

And another thing: the 3870 supposedly has 64x5 processors, making the x5 seem wasted, since it competes with the 9600GT, which has 64x1 processors. What is ATI doing that it needs x5 to get the same results nvidia gets with x1?

Confused.
 
G

Guest

Guest
Flagship dual-GPU 4870 X2 cards will include 2048 GB of GDDR5 memory clocked at 1.73 GHz. The Radeon HD 4870 X2 will be introduced at a later date (and could see spec revisions).

2048 GB's!! AWESOME!!
 

yadge

Distinguished
Mar 26, 2007
443
0
18,790
So how would a 4850 perform compared to, say, a 3870? I know no one really knows, but maybe someone can speculate based on these specs.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
Nordic Hardware reported in February that a 4870 would compare to a 3870X2 in capability. That bodes well for the 4850.

I don't see ATI as the second-class GPU company. If AMD tanked, I'm sure ATI would be purchased by someone other than Nvidia. Even in games, Nvidia's not that far ahead (despite what must be an expensive The Way It's Meant to be Played program). ATI's image quality is top notch, and ATI cards do better at video playback than Nvidia's.

The only time I owned a recent-generation Nvidia card, I was disappointed in Pure Cinema, in the blurry image compared to the equivalent ATI card, and in the Vista driver support. So I went ATI when I had the cash for a high-end card.

 

Gravemind123

Distinguished
Aug 10, 2006
649
0
18,980


nVidia also uses TMUs; it's unified SHADERS. Texture mapping units are not shaders; vertex and pixel shaders are. As for the 64x5: they are not the same style of stream processor that nVidia uses. They are a completely different approach to the unified shader idea, so you can't directly compare 64 nVidia shaders to 64 ATI shaders (5-way superscalar shaders, for a total of 320 stream processors). On top of that, there are large gaps in clock speed between the ATI and nVidia shaders, with ATI's being lower clocked. For example, the shaders in the HD 3870 run at less than half the clock speed of those in the 9600GT. But as with processors, clock speed alone is not a good measure of performance; it depends on the architecture.
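To put rough numbers on that: a back-of-envelope peak-throughput comparison, using the commonly quoted clocks and FLOPs-per-clock figures for these two cards (treat it as a sketch, not a benchmark; real-world results depend heavily on how well each architecture's units stay fed):

```python
# Back-of-envelope peak shader throughput. Peak GFLOPS says nothing about
# how efficiently the 5-wide ATI units are filled in practice.
def peak_gflops(shader_count, flops_per_clock, clock_mhz):
    return shader_count * flops_per_clock * clock_mhz / 1000.0

# HD 3870: 320 stream processors (64 x 5-wide), ~775 MHz shader clock,
# 2 FLOPs/clock per unit (MADD)
hd3870 = peak_gflops(320, 2, 775)

# 9600GT: 64 scalar shaders, ~1625 MHz shader clock,
# commonly rated at 3 FLOPs/clock (MADD + MUL)
gf9600 = peak_gflops(64, 3, 1625)

print(f"HD 3870 ~{hd3870:.0f} GFLOPS, 9600GT ~{gf9600:.0f} GFLOPS")
```

So on paper the 64x5 design is well ahead despite the much lower clock; the gap in delivered game performance comes down to how often all five slots of each unit can actually be used.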
 

leo2kp

Distinguished
NVidia COULD be very far ahead if they tried, but why innovate when you're sitting pretty on top with little threat from Mr. 2nd Place?

All they need to do is up the stream processor count like ATI, widen their memory bus to 512-bit on a single card, go for GDDR4 or 5, and with the current core architecture I'm guessing it would nearly double the speed of the GX2. I think NVidia, if they really tried, could completely destroy ATI, which would be bad for competition and the future of both companies IMO.
 

enewmen

Distinguished
Mar 6, 2005
2,247
3
19,815
So!
I hope some expert here can make an educated guess whether this new 4800 is a knockout or just "competitive" with the NV 9800/9900.
I'm guessing a 4870 has about 1.5x the performance of a 3870. Hoping for a 2x increase per socket.
Just thinking.
Thanks!
 

bwdsmart

Distinguished
Jul 22, 2007
68
0
18,630
The ATI architecture won't benefit much from more than 16 ROPs; the TMUs were where the bottleneck was. And chances are we won't see many 512-bit memory buses, because the required PCBs are much more expensive than 256-bit ones (hence why AMD is using high-clocked GDDR4/5 to eliminate that bottleneck). A 256-bit bus is plenty with 4 GHz effective on the memory.
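The bus-width trade-off above is easy to check with the standard peak-bandwidth formula (bus width in bytes times effective transfer rate); the clock figures are the ones from this thread, not confirmed specs:

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8 bytes) * effective
# data rate in GT/s. Shows why a fast 256-bit GDDR5 bus can match a wider,
# slower 512-bit bus without the expensive PCB.
def bandwidth_gbs(bus_bits, effective_ghz):
    return bus_bits / 8 * effective_ghz

print(bandwidth_gbs(256, 4.0))  # 256-bit @ 4 GHz effective -> 128.0 GB/s
print(bandwidth_gbs(512, 2.0))  # 512-bit @ 2 GHz effective -> 128.0 GB/s
```

Same peak bandwidth either way; the narrow-but-fast option just moves the cost from the PCB to the memory chips.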
 

Andrius

Distinguished
Aug 9, 2004
1,354
0
19,280
I don't see them that way either. The problem is that the things ATI cards are great at, they are great at even at the cheap IGP level. The HD 3200 IGP offers the same video-enhancement features as their highest-level HD 3870 cards. They just lack performance in the top-tier gamer cards. Why an HD 3870 is still $40/40EUR more expensive than a 9600GT is a bit of a mystery to me, and that is what is killing ATI's sales at this level.
 

TemjinGold

Distinguished
Dec 15, 2007
143
0
18,680
Flagship dual-GPU 4870 X2 cards will include 2048 GB of GDDR5 memory clocked at 1.73 GHz.

Dang... where can I get one of them 2048 GB GDDR5 cards? :D
 

Andrius

Distinguished
Aug 9, 2004
1,354
0
19,280
What does one need a 2GB framebuffer for? F@H? 64xAA at 2560x1600? It's a waste of memory. XP 32-bit will only have 1.5GB of system memory left! And most current Intel chipsets have 8GB limits. 24 months from now, when maybe something comes out that could use a 2GB framebuffer, it'll be just as obsolete as a 1GB version.
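A rough estimate backs up the overkill point. The sketch below counts only render-target memory (color plus depth/stencil) at 2560x1600 with 8x MSAA; the buffer layout is a simplified assumption, and in practice textures, not render targets, are what actually fill VRAM:

```python
# Rough render-target memory at a given resolution and MSAA level,
# assuming 32-bit color and a 32-bit depth/stencil buffer per sample.
# This ignores textures, which are the real VRAM consumers.
def render_target_mib(width, height, bytes_per_pixel, msaa_samples):
    color = width * height * bytes_per_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples  # 24-bit depth + 8-bit stencil
    return (color + depth) / (1024 ** 2)

mib = render_target_mib(2560, 1600, 4, 8)
print(f"~{mib:.0f} MiB")  # ~250 MiB for color + depth at 8x MSAA
```

Even at that extreme setting the render targets are a fraction of 2GB, so the rest of the buffer only helps if games ship with far bigger textures.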
 



Also curious to read educated guesstimates on this. Though my question is a little different: Wondering when it'll be justifiable to upgrade my 8800GTX. Something along the lines of... say... 20% better on many benchies would have me thinking.

So - Does this look like it'll do that?







Well, since the "memory" limit you are referring to is actually OS-based (it runs out of addresses), and the 8GB limit is chipset-based... if your OS can handle the addressing requirements, it won't be a problem. For example, Vista x64 (Business/Ultimate; also XP x64, I believe) is set up to handle up to 128GB. 500MB for the system, 8GB for RAM, 1 or 2GB for the card... 10.5GB from a pool 128 gigs deep?
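Summing up that arithmetic (the 500MB system reservation and 128GB ceiling are the figures from this post, not numbers from a spec sheet):

```python
# Address-space consumers from the post, checked against the quoted
# 128 GiB limit of early x64 Windows editions. All figures are the
# poster's estimates, not specification values.
consumers_gib = {
    "system reservation": 0.5,
    "RAM": 8.0,
    "video card": 2.0,
}
total = sum(consumers_gib.values())
print(f"{total} GiB used of 128 GiB")  # 10.5 GiB of 128
```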

Also, are we implying playable frame rates while gaming at 2560x1600?? From a single card? If so, then I may just get rid of the TV entirely and use a 30"+ monitor. I'd spend money for that, for sure.


Those of us with X64 Operating Systems are standing at the ready! BRING IT! :D
 

Andrius

Distinguished
Aug 9, 2004
1,354
0
19,280
The chipset limits are meant as a statement about the obvious overkill of a 2GB framebuffer. If a game needs that much framebuffer, imagine its system memory consumption.
If the memory controller on the chipset cannot address more than 8GB, where does it map the extra 2GB framebuffer? I'm not really sure how this works on a modern chipset. Does a chipset have special addresses reserved for PCI Express controllers outside of that 8GB limit?

I doubt it will have the shader "horsepower" to run the most recent "benchmark" games at playable rates on a 30" 2560x1600 screen.
Most games are already quite playable at 2560x1600 on an overclocked 8800GTS with some eye candy turned off.
 
The memory controller is concerned with the DIMMs installed in the four slots. It has no control over or concern with other addressable items, such as video RAM. That is controlled by the card itself and its associated drivers via the PCI bus.
 