Another Long-time Graphics Leader Leaves AMD

[citation][nom]alidan[/nom]I hate Nvidia's business practices, and I won't go into detail, but in the past they royally screwed me and I won't forgive them. This is just me, but if AMD's GPU division died, I would go Intel before I'd go Nvidia; that's how deep my hate goes.[/citation]

Nvidia doesn't sell the cards; whatever beef you had with an Nvidia-based GPU should be taken up with the board partner that sold it. But hey, why not hate an entire company.
 
[citation][nom]sp0nger[/nom]Doesn't surprise me. AMD will always be #2 to Nvidia as long as they keep playing catch-up. "AMD believes its New Zealand and 'Sea Islands' architectures will be a potent challenge for Nvidia's GK104" Always playing catch-up. Their next cards come out just barely better than Nvidia's year-old cards, then Nvidia releases their next gen and puts AMD way behind again.[/citation]

The 7970 is right behind the GTX 680 most of the time, and it beats the 580 handily (by more than it loses to the 680), so no, AMD isn't a generation behind like you imply. Besides that, the only reason Nvidia is winning with GK104 like this is that AMD designed GCN with compute in mind (spending a large fraction of the die area and power budget on it), and AMD also clocked its cards way too low.

Also, the only reason the GTX 580 was so much faster than the 6970 (the difference was actually fairly small) is that its GPU was far larger: roughly 530 mm² compared to 375 mm² or so. What do you think will win when they're on the same process node? Nvidia, for whatever reason, decided to abandon compute performance (the 680 loses in DP compute to even the GTX 470 and 560 Ti) so it could focus purely on gaming throughput.

The problem with this approach is that more and more games are coming to rely on compute performance, so the 7970 is the more future-proof option. The 7970 also has more memory and more memory bandwidth, which future-proofs it further. In fact, the GTX 680's memory-bandwidth bottleneck, coupled with its poor compute performance, is what causes it to lose to the 7970 in some games despite supposedly being the faster card. Of course, games often favor Nvidia or AMD, so this is nothing new, but all Nvidia had to do to fix this was improve compute a little (just moving from a 1:24 to a 1:12 DP:SP ratio would have let the 680 match the 580 in DP performance) and give the 680 a 384-bit bus.

AMD wanted to hammer Fermi in compute, and that it does. The 7970 is more than three times faster than the 580 in DP, and I don't remember how many times faster in SP (the 680 is okay at SP compute, but most compute work is DP, and it still loses to the 7970 by a wide margin even in SP).
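
To put rough numbers behind that, here's a back-of-the-envelope sketch of the peak-FLOPS math (a minimal illustration; the shader counts and reference clocks are the commonly published figures, and the 1/24, 1/8, and 1/4 DP caps for the 680, 580, and 7970 are how those parts were specced, so treat everything as approximate):

[code]
/* Back-of-the-envelope peak-FLOPS math behind the DP:SP ratio argument.
 * Shader counts and reference clocks are approximate published figures. */
#include <stdio.h>

/* Peak single-precision GFLOPS: one FMA = 2 FLOPs per shader per clock. */
static double peak_sp_gflops(int shaders, double clock_ghz)
{
    return 2.0 * shaders * clock_ghz;
}

int main(void)
{
    double sp_680  = peak_sp_gflops(1536, 1.006); /* GTX 680: ~3090 GFLOPS SP */
    double sp_580  = peak_sp_gflops(512,  1.544); /* GTX 580: ~1581 GFLOPS SP */
    double sp_7970 = peak_sp_gflops(2048, 0.925); /* HD 7970: ~3789 GFLOPS SP */

    printf("680 DP: %.0f GFLOPS at 1:24, %.0f if it were 1:12\n",
           sp_680 / 24, sp_680 / 12);
    printf("580 DP: %.0f GFLOPS (1:8 cap on the GeForce part)\n", sp_580 / 8);
    printf("7970 DP: %.0f GFLOPS, %.1fx the 580\n",
           sp_7970 / 4, (sp_7970 / 4) / (sp_580 / 8));
    return 0;
}
[/code]

At 1:12, the 680's roughly 258 DP GFLOPS would clear the 580's roughly 198, which is the "match the 580" point above.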
 
[citation][nom]mister g[/nom]Question: why would you want the graphics chip integrated with your main CPU on one huge chip? Server chips are built with reliability in mind, and the reliability of integrated CPU-GPU packages still hasn't been fully tested. Those kinds of APUs seem like a better fit for consumer devices.[/citation]Well, if reliability is your question, it still doesn't explain why AMD didn't bundle Opteron and GCN together as a stronghold for the server market. Now we've got Nvidia laughing their asses off with Tesla, and Intel smiling with 90% of the CPU market. Tell me, what good is it for AMD to have a gun it never puts to good use?
 
AMD worked out that integer performance is far more important than floating point in the server world, hence the significant boost in integer resources with Bulldozer, which does pay off where it's needed. Intel's CPUs are damned good at integer work, and they own the vast majority of the server market without any onboard GPUs. So why would AMD want to throw a GPU into its server CPUs?

Not to mention that this wouldn't be good for power consumption at all, something AMD is already struggling with thanks to the first-gen 32nm process at GlobalFoundries.
 
[citation][nom]stm1185[/nom]So the graphics guys who came up with good products are leaving, yet the CPU guys who made failure after failure are still there.[/citation]

This world's awesome logic :L
 
Ok, it sounds like AMD lost a manager, not an engineer or developer; a coordinator (and maybe a good one), but not a creator. Minor hitch maybe, but I see no great loss here. Moving right along...
 
[citation][nom]vrumor[/nom]Nvidia doesn't sell the cards; whatever beef you had with an Nvidia-based GPU should be taken up with the board partner that sold it. But hey, why not hate an entire company.[/citation]

Before Windows 7 came around, Nvidia had 3D Vision... I was willing to spend as much money as it took to play games in 3D. I played once with a head-mounted display and a game built for it, and I've always wanted to play that way again...

So I'm ready to buy a new monitor, a top-end GPU, and a 3D kit from them... I double-check everything... wait... what's this... Vista only... no XP version... They managed to crush my hopes and my dreams in one fell swoop.

Then there are the business practices: in TWIMTBP ("The Way It's Meant to Be Played") games, they locked down code and forced ATI to come up with workarounds, artificially making games look worse on ATI cards. Not sure if this still goes on.

And there's making PhysX an Nvidia-only thing. They could easily have ported the code to modern x86 and let it run well on any CPU, selling PhysX as the best physics toolkit available (which it currently is). But no: to make their cards look better, they left the CPU path on x87. It runs like crap there, makes their GPUs look better by comparison, and screws over half the PC gaming market.
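
(For anyone wondering what the x87 complaint means concretely: it's largely a code-generation choice at compile time. A minimal sketch, assuming GCC targeting 32-bit x86; the flags shown are standard GCC options, and the loop is just an illustrative stand-in for physics math:)

[code]
/* The same C source compiles to very different floating-point code:
 *   gcc -m32 -O2 -mfpmath=387        -> stack-based x87 instructions, scalar only
 *   gcc -m32 -O2 -msse2 -mfpmath=sse -> SSE registers, eligible for vectorization
 * A CPU physics path built the first way leaves the CPU's SIMD throughput unused. */
#include <stddef.h>

float dot(const float *a, const float *b, size_t n)
{
    float sum = 0.0f;
    for (size_t i = 0; i < n; i++)
        sum += a[i] * b[i]; /* x87: fld/fmul/faddp ; SSE: mulss/addss */
    return sum;
}
[/code]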

Let's also not forget the drivers that burnt out cards.

I'll say it again: many reasons for me to despise Nvidia.
 
[citation][nom]vrumor[/nom]Nvidia doesn't sell the cards; whatever beef you had with an Nvidia-based GPU should be taken up with the board partner that sold it. But hey, why not hate an entire company.[/citation]

Not true if his beef is with Nvidia's mobile parts. The debacle with the mobile 8600GT that led to a major class action against Nvidia still leaves a bad taste in my mouth, especially since only a few of the big-name companies were actually part of that class action and everyone else just got screwed over.
 
[citation][nom]stm1185[/nom]So the graphics guys who came up with good products are leaving, yet the CPU guys who made failure after failure are still there.[/citation]

I don't know if I should cry or laugh at this comment; it's so true.

I really hope AMD does something!!!!!! And by something, I mean SOMETHING GOOD!
 
If the Intel fanboys get their way and AMD goes under (not going to happen), they'll complain that prices are too high. Hey, don't get me wrong, I'd love to have an i5-2500, but I'm not going to pay twice as much for a 10-20% performance increase.
 
[citation][nom]antilycus[/nom] 2) MS didn't write NTFS, it came from IBM along with the NT KERNEL (formerly known as OS/2 Warp). [/citation]
Must set this one straight: NT most certainly did not come from OS/2 Warp, or any other OS/2. That becomes apparent if you have ever used both. If memory serves, NT actually has its roots in DEC's VMS.
 
Sounds like poor management that doesn't listen to the people who know their stuff. Pretty sad that they're so deluded that they let their people quit before conceding.

Let's see how their future reflects these decisions.
 
[citation][nom]alidan[/nom]Before Windows 7 came around, Nvidia had 3D Vision... I was willing to spend as much money as it took to play games in 3D. I played once with a head-mounted display and a game built for it, and I've always wanted to play that way again... So I'm ready to buy a new monitor, a top-end GPU, and a 3D kit from them... I double-check everything... wait... what's this... Vista only... no XP version... They managed to crush my hopes and my dreams in one fell swoop. Then there are the business practices: in TWIMTBP games, they locked down code and forced ATI to come up with workarounds, artificially making games look worse on ATI cards. And there's making PhysX an Nvidia-only thing. They could easily have ported the code to modern x86 and let it run well on any CPU, but to make their cards look better they left the CPU path on x87, which runs like crap and screws over half the PC gaming market. Let's also not forget the drivers that burnt out cards. I'll say it again: many reasons for me to despise Nvidia.[/citation]
Your reasons for hating Nvidia over bad drivers are well founded. This ATI guy who got axed was one of the people who helped release all those terrible ATI drivers before AMD bought ATI and straightened out most of the mess ATI was.
As far as your 3-D argument goes, I'm laughing in your face, because NOBODY has produced a real working 3-D imaging solution for gaming or movies. Red-blue glasses and shutter glasses are not 3-D; if it's displayed on a 2-D screen, it's still 2-D.
Any more of this 3-D rubbish and I'm liable to buy a steamroller, run over everyone saying "3-D," and put them in a picture frame just to show them what the image they claim to be 3-D really is.
 
Today AMD needs to make waves in the mobile graphics market with a killer low-watt APU. For all the talk of Bulldozer, there's Tegra 3!!! Hello! McFly!!!
 
[citation][nom]olaf[/nom]and the same (over)price as weak a$$ consoles[/citation]
Yes, but then even the weakest CPUs and graphics cards would cost more than a Core i5-2500K or a GTX 580 does now, since the survivor would have a monopoly. You NEED competition to keep prices down.
 
[citation][nom]f-14[/nom]Your reasons for hating Nvidia over bad drivers are well founded. This ATI guy who got axed was one of the people who helped release all those terrible ATI drivers before AMD bought ATI and straightened out most of the mess ATI was. As far as your 3-D argument goes, I'm laughing in your face, because NOBODY has produced a real working 3-D imaging solution for gaming or movies. Red-blue glasses and shutter glasses are not 3-D; if it's displayed on a 2-D screen, it's still 2-D. Any more of this 3-D rubbish and I'm liable to buy a steamroller, run over everyone saying "3-D," and put them in a picture frame just to show them what the image they claim to be 3-D really is.[/citation]

You've never played a game with a head-mounted display, or you've never played a 3D game that was actually optimized for 3D.

Let me put it another way. In real life I have no depth perception; I don't know why, but I can't judge distance at all. Yet with any style of 3D, be it red-blue, magenta-green, shutter, passive, or head-mounted (two displays, one per eye), I can see depth, and I would pay damn near anything for that. After the Nvidia thing, though... I decided to hold off for either a head mount or a universal standard to emerge.
 
[citation][nom]blazorthon[/nom]The 7970 is right behind the GTX 680 most of the time, and it beats the 580 handily (by more than it loses to the 680), so no, AMD isn't a generation behind like you imply. Besides that, the only reason Nvidia is winning with GK104 like this is that AMD designed GCN with compute in mind (spending a large fraction of the die area and power budget on it), and AMD also clocked its cards way too low. Also, the only reason the GTX 580 was so much faster than the 6970 (the difference was actually fairly small) is that its GPU was far larger: roughly 530 mm² compared to 375 mm² or so. What do you think will win when they're on the same process node? Nvidia, for whatever reason, decided to abandon compute performance (the 680 loses in DP compute to even the GTX 470 and 560 Ti) so it could focus purely on gaming throughput. The problem with this approach is that more and more games are coming to rely on compute performance, so the 7970 is the more future-proof option. The 7970 also has more memory and more memory bandwidth, which future-proofs it further. In fact, the GTX 680's memory-bandwidth bottleneck, coupled with its poor compute performance, is what causes it to lose to the 7970 in some games despite supposedly being the faster card. Of course, games often favor Nvidia or AMD, so this is nothing new, but all Nvidia had to do to fix this was improve compute a little (just moving from a 1:24 to a 1:12 DP:SP ratio would have let the 680 match the 580 in DP performance) and give the 680 a 384-bit bus. AMD wanted to hammer Fermi in compute, and that it does. The 7970 is more than three times faster than the 580 in DP, and I don't remember how many times faster in SP (the 680 is okay at SP compute, but most compute work is DP, and it still loses to the 7970 by a wide margin even in SP).[/citation]

Can you elaborate in more detail on why you think the 7970 is better than the 680, please?
What's compute performance? Is this where the GPU is also used for work that a CPU would usually have to do?
How would this apply to games? What games are using this now?

I'm saving up for a graphics card, and my candidates are the 7970 and the 680.
My current card, a 9600 GT, has lasted me four years, and I spent $160 on it.
I'm hoping that if I drop $550 on one of those cards, I can get another four years out of my new rig.
I plan to be playing a lot of first-person shooters, including BF3.

So what's the scoop?
 
[citation][nom]horaciopz[/nom]I don't know if I should cry or laugh at this comment; it's so true. I really hope AMD does something!!!!!! And by something, I mean SOMETHING GOOD![/citation]

I just can't take a person seriously when they write like this: "It's so truuee!!!!!" To this day, Tom's has yet to bring up any facts showing that Bulldozer is a "failure". Look at AMD's own marketing for the FX:

Features & Benefits
Experience the world's first native 8-core desktop processor.
Overclock for a big boost in performance and speed.
Perform mega-tasking and get pure core performance with the new CPU architecture.
Get an extra burst of raw speed when you need it most with AMD Turbo CORE Technology.
Push your performance with tuning controls in the easy-to-use AMD OverDrive™ software.
Enjoy stable, smooth performance with impressive energy efficiency thanks to the 32nm die.

Where does it say it'll be any match for Intel in gaming? Nowhere. Tom's is the largest site that hyped the FX before launch with all the b*lls*it about it being twice as powerful, when AMD never actually said that, so stop the hate. Bulldozer is a server CPU brought to the consumer market, and they're fixing the flaws with Piledriver. It's just funny to see how many of you think you know it all but actually just make fools of yourselves.
 