AMD Retires Legacy GPUs, GCN Only Going Forward


none12345

Distinguished
Apr 27, 2013
I've got no problem with them discontinuing major driver support for 5+ year old architecture.

Nvidia does it too, and don't give me "but they still release driver updates!" Sure they do, but nothing major changes. I'm still kind of annoyed about the Nvidia 6600 GT and its horrible drivers. It had major problems with either 2D or 3D depending on the driver used; not a single driver Nvidia released, to this day, ever fixed both problems at once. It seemed like every other driver release would fix one problem, then bring it back when they fixed the other, over and over. Even if I could find a working driver for one game, I'd need to install a different one for another game. It got old fast.

That card was in use in one of my parents' computers until this week, when it finally died for good (it had been having some hardware issues over the last couple of years). They don't game, so it was just using a driver that worked in 2D, ignoring the 3D issues. But I got reminded of the issue earlier this year, when I had to try a bunch of driver versions to find one that would work correctly with TurboTax on it. It had been so long I had forgotten just how annoying that was, but it's fresh in my mind now.

Nvidia's driver failure with their 6600 cards is what made me seriously consider ATI graphics cards when it came time to replace it. And ultimately I went from almost a decade of Nvidia GPU use to almost a decade of ATI/AMD use because of it.

My current graphics card, a 4850 bought almost six years ago, has been on legacy status for a while. But you know what? It still works just fine. I've never had a problem in any game that made me want to go try different drivers to fix it. I can still play every game just fine, though I finally have to start lowering settings in titles from about a year ago.

But I'm not upgrading till the 16/14nm stuff comes out. Had I known that both GPU vendors would be stuck on 28nm for three years, I would have bought a card three years ago. I'm not buying end-of-life 28nm stuff with 16/14nm just on the horizon.
 

boytitan2

Honorable
Oct 16, 2012
I wrote above that cards from 2013 are too young to be retired, but I went to Nvidia's website to see their legacy products, and it looks like almost the entire 700 series (the 750 Ti is not included) is there, even the Titan. So retiring cards from 2013 is an industry thing, not an AMD thing.


http://www.nvidia.com/page/legacy.html

The 700 series still gets new drivers. It's on the legacy products page, but you can still download the newest Game Ready drivers for the 700 and 600 series.
 

Brian Blair

Reputable
Mar 20, 2014
Well, this is complete BS! Especially since some of AMD's 5000/6000 series cards can still run games at medium to high settings, even today! And how long someone has owned a GPU is irrelevant as a reason to replace it! This is a very shameful way to try to boost their failing market share, and in fact it will have the opposite effect on most buyers, especially since the 6000 series is only about five years old (actually a little less). This will steer many over to Nvidia. Nvidia still supports the 400 and 500 series cards that were out during the late 5000 series era and the 6000 series era, and that's because, much like the 5000/6000 series, most 400/500 series cards can still run modern games!

I myself have a GTX 970 in my current rig and an R9 270 in an older rig, but this still hits me where it hurts, because I do not like this idea! I may want to keep my GPU for five years; that is kind of the point of buying a high-end GPU, is it not? Long story short: forcing people to buy new cards when their older cards still run modern games just fine is completely unacceptable! For this I will no longer buy an AMD product ever again! They sealed their fate with me as a customer, and with many other people! Anyone thinking of buying a new AMD card to replace their older one: shame on you! You are telling them this is OK, and it's only screwing you over (not to mention everyone else) in the future! I SAY THIS IS NOT OK, AMD! You can keep your outdated GCN and stick it!
 


Yes, sort of. Nvidia needs more cores now because Fermi and older ran their shader cores at a higher frequency; the shaders used to run at double the GPU's base clock. We know these days that it's far more power efficient to have two cores running at 1GHz instead of one of the same cores at 2GHz. They are still the same basic cores, just at a lower frequency.
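A minimal back-of-envelope sketch of that trade-off (the voltages are illustrative assumptions, not measured figures for any real GPU; the point is just that dynamic power scales roughly with C x V^2 x f, and higher clocks need higher voltage):

    # Two cores at 1 GHz vs. one identical core at 2 GHz.
    # Dynamic power ~ C * V^2 * f; the voltages below are assumptions.
    def dynamic_power(cap, volts, freq_ghz):
        return cap * volts**2 * freq_ghz

    CAP = 1.0                                    # normalized capacitance per core
    two_slow = 2 * dynamic_power(CAP, 0.9, 1.0)  # two cores at 1 GHz on 0.9 V
    one_fast = 1 * dynamic_power(CAP, 1.2, 2.0)  # one core at 2 GHz on 1.2 V
    print(two_slow, one_fast)                    # 1.62 vs 2.88
    # Same theoretical throughput, but the pair of slower cores burns
    # roughly 44% less dynamic power in this toy example.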

The 580 is strong in double-precision math because Nvidia didn't limit it nearly as much as they limit their current cards. That's not an architectural problem, that's a soft limit. AMD does a similar thing. For example, the Hawaii silicon in the 290/290X and 390/390X can run DP math at 1/2 the rate of SP math (though the consumer cards are capped at 1/8), but most of their other cards right now run at 1/16 or 1/32, except Tahiti-based cards, which run at 1/4. Nvidia is mostly at 1/24 or 1/32 as well; the Titan and its family run higher. The 580 is 1/8, if I remember right. The cores haven't changed much.
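To put rough numbers on those caps, here's a quick sketch (the single-precision peaks are approximate launch specs, used only for illustration):

    # Peak FP64 throughput = FP32 peak * the DP:SP rate cap.
    def dp_tflops(sp_tflops, dp_ratio):
        return sp_tflops * dp_ratio

    cards = [
        ("HD 7970 (Tahiti, 1/4)",   3.79, 1 / 4),
        ("R9 290X (Hawaii, 1/8)",   5.60, 1 / 8),
        ("GTX 580 (Fermi, 1/8)",    1.58, 1 / 8),
        ("GTX 980 (Maxwell, 1/32)", 4.60, 1 / 32),
    ]
    for name, sp, ratio in cards:
        print(f"{name}: ~{dp_tflops(sp, ratio):.2f} TFLOPS FP64")
    # The old 580 still beats a much newer GeForce at FP64 purely
    # because of where the cap was set.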

Technically, AMD's GCN cores are also pretty similar to VLIW4 in hardware, but the instruction set and scheduling model are very different, and that's where the problems come in. The cores are similar in hardware, but how they operate has changed radically. Nvidia's cores are similar both in hardware and in operation across many generations.
 


In no way does this force anyone to stop using older GPUs. I, for one, have a laptop with an old Trinity APU and this news doesn't change anything. I have a driver that works just fine and I doubt I'd update it several years later even if I could because I don't need to.

People cite how Nvidia still updates old things, and you're kind of right, but except for cards using the same basic cores (read: Fermi and up), Nvidia actually hasn't released a new driver in quite some time. They just tweak an old driver a bit.

In what way is GCN outdated? The Nano proves that it's competitive with Maxwell in power efficiency when Powertune is working properly, so I don't see how it can be called out of date unless you also think that Maxwell is out of date.
 

Creme

Reputable
Aug 4, 2014


We're talking about DX10 cards that can't run modern DX11 games anyway, but at least they get support for any problems until April 2016. AMD dropped support for DX11 cards that can easily play games in this day and age. Nvidia still releases updates and game-specific drivers for the 400 and 500 series.



It's competitive because of HBM, which lets them brute-force it and cram in more shaders.
 


The DirectX version has nothing to do with it and Nvidia doesn't develop specific drivers for the 400 and 500 series cards. The 400 and 500 series, as I've said many times now, use the same drivers as the 600, 700, and 900 series because they have the same basic cores (among other reasons). Developing drivers for both VLIW5/4 and GCN would be like trying to develop drivers for both an Itanium CPU and a Skylake CPU at the same time to run the same software.

HBM doesn't make that big of a difference. Yes, it helps, but it is far from everything; otherwise the Fury and the Nano (which have fairly similar performance) wouldn't have such a huge difference in power consumption. The biggest improvement in power efficiency was fixing Powertune. We can also note how the 290X/390X cards and the Fury/Fury X don't have such a big disparity in power efficiency, so although HBM uses a lot less power than similarly performing GDDR5, the memory is still far from the majority of a card's power consumption. HBM allowed for higher memory performance more than it reduced the overall power consumption of the card.
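As a back-of-envelope check (the bandwidth-per-watt figures are from AMD's own HBM launch slides, and the board power is an assumption for illustration, so treat the result as a rough estimate):

    # Rough estimate of what the memory swap saves on a ~275 W board.
    BANDWIDTH_GB_S = 512               # Fury X class bandwidth
    BOARD_W = 275.0                    # assumed total board power

    gddr5_w = BANDWIDTH_GB_S / 10.66   # ~48 W for that bandwidth on GDDR5
    hbm_w = BANDWIDTH_GB_S / 35.0      # ~15 W on HBM
    saving = gddr5_w - hbm_w
    print(f"~{saving:.0f} W saved, about {saving / BOARD_W:.0%} of the board")
    # A real but minority share of total power, which fits the point above.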
 

7Fred7

Distinguished
Feb 5, 2009
Most of the driver updates in the past couple of years have been mods for more recent cards, so the news of no further driver updates should hardly make anyone wet their pants.
 


Plus I think it is a current HBM 1.0 limitation.
 

Tzn

Honorable
Nov 4, 2013
I have an HD 7750, so I'm safe for a year or two, maybe. I must say it's a wonder card from AMD; I've played all my games with it except the broken Batman: Arkham Knight.
 
HBM 1.0's 4GB limit is a hardware limitation: 1GB per stack, four stacks maximum. However, Fury uses GCN 1.2, which includes lossless texture compression, and that proved efficient enough to let the 2GB R9 285 get close to the 3GB R9 280 and R9 280X in texture-heavy benchmarks, at least as far as data transfer went. As such, 4GB of RAM on a Fury would be rather close to 6GB on a GCN 1.1 card.
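For reference, the stack math behind that ceiling, plus the resulting bandwidth (a quick sketch; the figures match HBM1's published configuration):

    # HBM1: up to 4 stacks, each with 1 GB and a 1024-bit interface,
    # clocked at 500 MHz double data rate -> 1 Gb/s per pin.
    STACKS = 4
    GB_PER_STACK = 1
    BITS_PER_STACK = 1024
    GBPS_PER_PIN = 0.5 * 2                                   # 500 MHz DDR

    capacity_gb = STACKS * GB_PER_STACK                      # the hard 4 GB cap
    bandwidth_gb_s = STACKS * BITS_PER_STACK * GBPS_PER_PIN / 8
    print(capacity_gb, "GB,", bandwidth_gb_s, "GB/s")        # 4 GB, 512.0 GB/s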
 


Seriously though, if you need 4 or more GB of VRAM on a GPU, then for the tasks that need that kind of VRAM you might as well go for a workstation card over a mainstream one.

 

GObonzo

Distinguished
Apr 5, 2011
Maybe it is time to replace my HD 6870. I probably need to replace my i3 2120 as well. Too bad I don't really do anything with my computer that needs more than that.
You can absolutely be forgiven for the delay. They haven't really released compelling price/performance upgrades in your bracket. The same $170 that bought a 6870 can finally buy a rebranded 7850/7870. Upgrade, yes. Something to get excited about? No.
For a few dollars more, an R9 280X. Going from a 6870, there will be quite an improvement.
 

GObonzo

Distinguished
Apr 5, 2011
Seriously though, if you need 4 or more GB of VRAM on a GPU, then for the tasks that need that kind of VRAM you might as well go for a workstation card over a mainstream one.
The 2GB R9 380 is only $170, which is roughly a third of the price the 7950 was at launch. The Fury series cards are much more expensive, and HBM doesn't require even half the memory to perform the same tasks as GDDR5.
 


Right, because price is the only difference between a workstation card and a desktop one. Seriously, if that were all it was, then why bother making workstation-grade GPUs?

 

Johnpombrio

Distinguished
Nov 20, 2006


There are two types of legacy lists for Nvidia. The one you have mentioned is for Linux:
http://www.nvidia.com/object/IO_32667.html

As for end of support for the Windows drivers, here is the list. The GTX 780 is not even close to being on it:
http://nvidia.custhelp.com/app/answers/detail/a_id/3473
 

Johnpombrio

Distinguished
Nov 20, 2006
I just recycled my last three AMD cards today. They were all 57XX cards, and I was still using one in my HTPC. It doesn't matter if a card still "works"; at some point, some program or other will stop allowing it to be used. My friend also had a 57XX series card, and his astronomy software told him the card was no longer able to work with the program. I got him to buy an EVGA superclocked Nvidia card and he loves it.
Those were the last AMD products in the house. So long AMD, it was nice while it lasted.
 


What Nvidia card did you get him and what exact program, if you don't mind me asking?
 

Creme

Reputable
Aug 4, 2014


My mention of DX was to show how Nvidia separated their drivers between DX10 and DX11 cards. I know the 400 and 500 series use the same driver as the latest ones. VLIW5 and GCN being different doesn't excuse AMD from dropping support; they chose to keep using that architecture after the 6000 series, even after GCN was out, so they should have owned up to it and supported the 2013 APUs longer too.

HBM did a lot for the Fury and Fury X. With GDDR5 they would have been over 300 watts easily. It still saved them a respectable amount of power.
 