AMD/ATI Beats Nvidia to the 1 GHz GPU Milestone

[citation][nom]bobbytsia[/nom]Nvidia unified shaders clock have been 1.2GHz+ since G80. Nvidia was 1GHz+ almost 2.5 years ago.[/citation]
That's not the only part of the GPU...
 
[citation][nom]bobbytsia[/nom]Nvidia unified shaders clock have been 1.2GHz+ since G80. Nvidia was 1GHz+ almost 2.5 years ago.[/citation]

Shader and core clocks are separate on Nvidia parts, while ATI's are linked.

ATI is the first to ship a core clocked at 1 GHz+. Nvidia's cores are still around 700 MHz.
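
To make the distinction concrete, here's a minimal Python sketch using representative reference clocks (the exact numbers are illustrative, quoted from memory of the spec sheets, so treat them as assumptions):

[code]
# Representative reference clocks in MHz (illustrative values, from memory).
clocks = {
    "GeForce GTX 280": {"core": 602, "shader": 1296},  # separate shader domain
    "GeForce GTX 285": {"core": 648, "shader": 1476},  # separate shader domain
    "Radeon HD 4890":  {"core": 850, "shader": 850},   # shaders run at the core clock
}

for card, c in clocks.items():
    domains = "linked" if c["core"] == c["shader"] else "separate"
    print(f'{card}: core {c["core"]} MHz, shader {c["shader"]} MHz ({domains} clock domains)')
[/code]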

[citation][nom]Pei-chen[/nom]So what? Nvidia has higher performance per clock which was what AMD used to preach back in the Athlon XP/64 vs. Pentium 4 days. Now AMD is bragging the high clock speed of PII and 48xx GPUs.The table has finally turned and AMD is on the receiving end.[/citation]

Actually, it was the R500 X1800 that had a unified shader architecture with 64 unified shaders. Xenos was developed from that architecture, and the HD 2000 series then pushed the shader count even higher (320).
 
I don't get it...

Pardon my ignorance, but is this something major in the GPU industry? If it is, then congrats to ATI. It just seems that the technology behind ATI and Nvidia is different anyway, so I don't understand why you would compare clock speeds. Don't ATI cards have way more shader processors too? Do they get a round of applause for that as well? I didn't think that ATI having more SPs than Nvidia gave them more performance.

Heck, if this is some breakthrough in the industry, then I will gladly say ATI has done something awesome. But in the end, is this giving them any more of a performance edge against Nvidia? This seems a lot like street racing to me: you can find imports that will rev higher than 8,000 rpm, but does that give them more power than a big-block V8 revving at 6,000 rpm? Maybe, but maybe not.

I'm just trying to find the purpose behind this award, so I will gladly listen to any worthy justification.
 
[citation][nom]cynewulf[/nom]It's a symbolic victory. Means nothing in true performance terms.[/citation]
I would be curious to see the performance difference it makes when overclocking from 850 MHz to 1 GHz. My guess is 5-10%: not that impressive in itself, but that is also the average performance gain of a 4890 over a 4870, or a GTX 285 over a GTX 280...
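
For the curious, here's the back-of-the-envelope math behind that guess as a quick Python sketch (the 5-10% figure is my own estimate, not a benchmark):

[code]
# Clock-scaling estimate for an 850 MHz -> 1 GHz overclock, everything else held constant.
stock_mhz, oc_mhz = 850, 1000
clock_gain = oc_mhz / stock_mhz - 1
print(f"Clock increase: {clock_gain:.1%}")  # ~17.6%

# If the real-world gain lands at 5-10%, scaling is well below linear,
# which is what you'd expect with memory bandwidth left untouched.
for fps_gain in (0.05, 0.10):
    print(f"{fps_gain:.0%} fps gain -> {fps_gain / clock_gain:.2f}x of linear scaling")
[/code]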
 
Wait wait wait... Chris Angelini used to be on Sharky Extreme?? Back in the day, Sharky Extreme was the best, better than Tom's in many regards, but then they sold out and things went downhill fast, as I recall.
 
The good old days... I got myself a 1.33 GHz Athlon the year after. I had been chugging along on a 400 MHz K6-2 that I got in '98 before that. Built them both myself. That was back when dial-up modems weren't the easiest things to get going, due to COM port and I/O conflicts. Not that most of you know about COM ports.
You also had to set clock speeds with motherboard jumpers back then. With the K6 you had to set the multiplier and the base clock; I think by the time of the Athlon all you had to set was the base clock. Anywho... can't wait to see the #'s on the new part!
 
So the processor is 250 MHz faster. But is the card faster than Nvidia's flagship?
 
[citation][nom]sublifer[/nom]...in '98 before that... That was back when dial-up modems weren't the easiest things to get going, due to COM port and I/O conflicts. Not that most of you know about COM ports.[/citation]

Does being a pretentious ass always come easy to you? Let me see if I can remember 1998...hold on...oh wait, it was a little over 10 years ago.

Yeah, I think the average Tom's reader remembers those days just fine. Glad you were a big boy/girl and built your machines by yourself though, take two gold stars out of petty cash and go see mommy, she has a cookie for you.
 
[citation][nom]jrnyfan[/nom]take two gold stars out of petty cash and go see mommy, she has a cookie for you.[/citation]
Nice flame, but this isn't the place for it... go play on Yahoo or somewhere they might care.
If you're old enough to know and were into hardware back then, then you know it was a completely different world. I'm guessing you're a punk kid, maybe 16, who thinks he knows everything. You've got a lot to learn, boy.
 
Just wait for the 40 nm chip to be used in a high-end card other than the 4770, or get a couple of 4770s now. I prefer single-card solutions. Keep it up, AMD/ATI.
 
Actually, many of you are wrong. ATI does not have 800 physical stream processors. Nvidia's stream processors perform one MADD per unit, if my memory serves me well, so if they have 128 or 216 shader processors, they physically have that many. ATI actually has FEWER shader/stream processors (SPs) than Nvidia, because each of their SPs performs 5 MADDs, giving them 800 shader processing UNITS (again, they physically don't have 800). So take 800 and divide it by 5:

800 / 5 = 160 ALUs/SPs

So ATI actually has 160 ALUs/shader processors that each perform 5 MADDs, giving them 800 stream processing UNITS. The problem is that games today do not really take advantage of those 5 MADDs, so AMD/ATI has to try harder, unless the game was made for ATI.

I thought most people on here knew more about technology!
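
For what it's worth, here's that arithmetic as a quick Python sketch, extended to the peak-throughput figure it implies (assuming 2 FLOPs per MADD and the HD 4870's 750 MHz reference clock):

[code]
# VLIW5 arithmetic from the post above, plus the peak-FLOPS figure it implies.
stream_units = 800        # the marketing count of stream processing units
madds_per_sp = 5          # each physical SP issues 5 MADDs per clock (VLIW5)
physical_sps = stream_units // madds_per_sp
print(f"Physical SPs: {physical_sps}")  # 160

# A MADD is a multiply plus an add, i.e. 2 FLOPs. At the HD 4870's 750 MHz clock:
core_mhz = 750
peak_gflops = stream_units * 2 * core_mhz / 1000
print(f"Peak: {peak_gflops:.0f} GFLOPS")  # 1200, i.e. the advertised 1.2 TFLOPS
[/code]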
 
[citation][nom]spanky deluxe[/nom]Wait wait wait... Chris Angelini used to be on Sharky Extreme?? Back in the day Sharky Extreme was the best, better that Toms' in many regards but then they sold out and things went downhill fast as I recall.[/citation]

LOL, yeah. We spent many caffeine-loaded nights up in San Jose testing hardware. It was me, Alex (who now overclocks Porsches) and Ben (who now works for Nvidia). Those were the days, man!
 
[citation][nom]RicardoK[/nom].. Remember the P4 bug with 1ghz speeds? lol..[/citation]

Ummm... wasn't that a Slot 1 P3 bug? (Maybe it was Socket 370, but I keep thinking Slot 1.)

Actually, from searching, it seems it was more of a recall of the 1.13 GHz P3 CPU...

http://www.tomshardware.com/reviews/intel-admits-problems-pentium-iii-1,235.html

http://www.tomshardware.com/reviews/intel,219.html

http://www.encyclopedia.com/doc/1G1-77081927.html

...heck, I don't even remember a P4 clocked that low. I thought the slowest was the 1.4 GHz P4, but my buddy says he thinks they came in 1.2 GHz versions.
 
[citation][nom]cangelini[/nom]LOL, yeah. We spent many caffeine-loaded nights up in San Jose testing hardware. It was me, Alex (who now overclocks Porsches) and Ben (who now works for Nvidia). Those were the days, man![/citation]

Someone who works for Nvidia and someone who tunes Porsches: great contacts to have, I bet!! Looking at the site brings back waves of nostalgia; the design has hardly changed! Back in the day, Sharky Extreme was like gospel. I don't think I bought any components back then that hadn't been recommended by the site. I have to say, having some of the greats from Sharky Extreme's golden era working here now is a serious benefit to the site!!
 
Either something has happened that I'm not aware of, or people are confusing shaders with GPU cores. Since I haven't seen a multi-core GPU yet, I'll assume for the moment that there is no such thing. SLI and CrossFire don't count, of course; those would be more like "multi-socket." Though I can't wait for the day they "Crossfire," for lack of a better word, some quad-core 4890s on a single die with 4 GB of RAM that can be shared 😱
 
[citation][nom]EnFoRceR22[/nom]Since I haven't seen a multi-core GPU yet[/citation]
How could you miss the seventh comment, with 15 upvotes?
[citation][nom]thundercleese[/nom]GPUs are multi-core. They have been for a while now.[/citation]
 
"GPUs are multi-core. They have been for a while now."

Now I realize spreading misinformation is popular... BTW, where did thundercleese show a multi-core GPU? You're just going to take his word for it? I'm assuming so, since you're following the crowd of people who think a shader is a CPU/GPU core.

I saw it just fine. It's completely wrong, but I saw it. Shaders are not cores, and CrossFire is not multi-core; it's closer to multi-socket. Each of these things is different. Just because this guy said it, does that make it true? Actually, doing some research, I found that they do plan on making some multi-core GPUs sometime in the future, but the people thinking CrossFire/SLI and shaders are cores are the same dumbasses who think the core CPU is multi-core because it has multiple shaders on the die with the actual core. Find me a single board, from either ATI or Nvidia, that has more than a single core. Not more than a single chip, but more than one core.
 
I meant Cell CPU :/

Please don't use the "if it can process, it's a core" argument, because then my sound card, network card, ATA controllers, blah blah blah all become cores as well.
 
[citation][nom]EnFoRceR22[/nom]I meant Cell CPU :/ Please don't use the "if it can process, it's a core" argument, because then my sound card, network card, ATA controllers, blah blah blah all become cores as well.[/citation]
No, those are all single-core processors, each executing a single thread of execution.

GPUs have multiple cores; giving them multiple processors doesn't make as much sense, considering a single processor with the same number of cores will almost always perform better. Just take a look at the Core 2 Quads, which were essentially two dual-cores on the same package, versus the Phenom IIs, true quad-cores, which give better performance in most cases.
 