ATI Radeon HD 5830: Bridging The 5700- And 5800- Price Gap


liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
Haha, I have a suspicion that these 5830s are rejected 5870 Cypress chips allocated to a NEW SKU: when a defect or part of the chip wasn't up to standard, they simply disabled the faulty section, allowing ATi to at least sell and market some leftovers. Expect to see quite a few of these babies compared to the 5770s and 5850s, though. This is just a suspicion, but in an industry as big as microchips, processors, silicon, and graphics, I think this is a perfect marketing strategy...
 

ordcestus

Distinguished
Feb 9, 2010
156
0
18,690
[citation][nom]noob2222[/nom]seeing as these are just binned defective 5870 processors, I wonder if we will ever see a video card with a socket for swapping gpus. Keep the memory during upgrades. Of course this would mean standardizing the cards to be compatible with each other, and companies would fight this to the end.As for unlocking it, its probably possible, but the results would be unpredictable.[/citation]
I doubt it; it would be harder to do properly. While it's nice to imagine video cards being split up the way motherboards/processors/RAM are today (video board/GPU/VRAM, maybe), it's unrealistic because of the pace of video card technology (imagine a new processor socket every 9 months to a year).
Maybe it could be engineered in a more compatible way, but the benefits to consumers wouldn't be worth the headaches, the retooling, and therefore the higher cost of GPUs.
 

terr281

Distinguished
Dec 22, 2008
261
0
18,790
If things continue at the rate they have been in the industry:

1. 3D graphics first required a separate card, then could be integrated onto the motherboard, and are now moving into the CPU. (AMD's next processors will have integrated graphics, just like the newest Intel chips.)

2. Memory controllers moved from being a separate component on the motherboard, to being integrated into the Northbridge, and now into the CPU.

...

The thing that graphics card manufacturers, both GPU designers like Nvidia and board partners like Sapphire (as examples), have to worry about is the era when integrated graphics are "good enough" for the highest settings in most games. This will become even more of a problem as we continue to slowly move away from the desktop as the platform of choice. (And, instead, everyone begins carrying around "desktop replacement" notebooks.)

And, in regard to the highest settings for games working on integrated graphics...

Short of the game that gets mentioned in every new video card post here, there are very few games out today that actually REQUIRE a $300+ video card to play. Most of them run perfectly well on their highest settings at typical consumer resolutions (still 1680x1050 and smaller).

Moore's Law will eventually become impossible for CPU manufacturers to follow, even if they bend the rules and put multiple cores on the same die instead of a single one. Graphics card manufacturers have to follow the same law... and will eventually fail at it too.
 


From the AnandTech article:-
On the one hand it’s a card to fill a perceived gap in their product line, and on the other hand it’s an outlet for less-than-perfect Cypress chips. Particularly when yields could be better, AMD wants to take every chip they can and do something with it. The 5850 line sucks up chips that can’t meet the 5870’s clock targets and/or have a 1-2 defective SIMDs, but until now AMD hasn’t had a place to put a Cypress chip with further defects. With the 5830, they now have a place for those chips.

Source

According to that article, they have been stockpiling the defective chips since August and have several skiploads by now; so many, in fact, that they can produce a whole new line of cards.
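For what it's worth, the salvage binning AnandTech describes is easy to picture with a toy model. The sketch below is purely illustrative: the per-SIMD defect rate is a made-up number, and only the SIMD counts per SKU (20 on a full Cypress/5870, 18 on the 5850, 14 on the 5830) come from the actual parts.

```python
import random

# Toy model of Cypress salvage binning (illustrative only).
# Cypress has 20 SIMD engines; the bin rules below mirror the shipped SKUs:
# HD 5870 = 20 SIMDs enabled, HD 5850 = 18, HD 5830 = 14.
SIMDS_PER_DIE = 20
DEFECT_RATE = 0.03  # assumed per-SIMD defect probability, not a real yield figure

def bin_die() -> str:
    defective = sum(random.random() < DEFECT_RATE for _ in range(SIMDS_PER_DIE))
    good = SIMDS_PER_DIE - defective
    if good == 20:
        return "HD 5870"  # fully working die (clock-speed binning ignored here)
    if good >= 18:
        return "HD 5850"  # 1-2 bad SIMDs, fuse down to 18
    if good >= 14:
        return "HD 5830"  # 3-6 bad SIMDs, fuse down to 14
    return "scrap"

counts = {}
for _ in range(100_000):
    sku = bin_die()
    counts[sku] = counts.get(sku, 0) + 1

for sku, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{sku:8s} {n / 1000:.1f}%")
```

Even at a low assumed defect rate, a steady trickle of dies lands in the lower bins, which is exactly why stockpiling them until there's a SKU to put them in makes sense.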
 

rambo117

Distinguished
Jun 25, 2008
1,157
0
19,290
I'm glad that AMD filled that massive price gap, but still, the performance of the 5830 was kind of disappointing. Oh well, it makes my overclocked CrossFired 5750s look even better ;)
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
[citation][nom]Fortunex[/nom]What? The amount of stupid in this post is astounding.[/citation]
This sounds like some rogue computer from a 1970s sci-fi movie gone bad... NO SALE!
 

brisingamen

Distinguished
Feb 3, 2009
201
0
18,680
These things easily overclock to a 1 GHz GPU clock, and at that speed they're within 1% of a 5850; there's tons of potential in these cards with aftermarket coolers.

AMD is rocking it hard right now. If this last round of 58xx chips is this overclockable, imagine the next spin; rev 2 of these should just be insane.

Way to go, AMD. Can't wait to buy as many of these as possible... depends on how much overtime I get :p
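A rough sanity check on that claim, using the published shader counts and clocks (5830: 1120 stream processors at 800 MHz; 5850: 1440 at 725 MHz). This only multiplies ALUs by clock, so it deliberately ignores the 5830's halved ROP count and memory bandwidth:

```python
# Back-of-the-envelope shader throughput (single-precision GFLOPS).
# Published specs only; real game performance also depends on ROPs and
# memory bandwidth, which this ignores.

def gflops(stream_processors: int, clock_mhz: int) -> float:
    # Each Evergreen stream processor can issue a multiply-add (2 FLOPs) per clock.
    return stream_processors * 2 * clock_mhz / 1000

hd5830_stock = gflops(1120, 800)   # ~1792 GFLOPS
hd5830_oc    = gflops(1120, 1000)  # ~2240 GFLOPS at the 1 GHz overclock
hd5850_stock = gflops(1440, 725)   # ~2088 GFLOPS

print(f"HD 5830 stock : {hd5830_stock:.0f} GFLOPS")
print(f"HD 5830 @1GHz : {hd5830_oc:.0f} GFLOPS")
print(f"HD 5850 stock : {hd5850_stock:.0f} GFLOPS")
print(f"OC'd 5830 vs stock 5850: {hd5830_oc / hd5850_stock:.2f}x on raw shader math")
```

On paper the overclocked card actually pulls ahead on raw shader math; the fact that it only reaches rough parity in games suggests the 16 ROPs (versus the 5850's 32) and memory bandwidth are the real limits.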
 

hixbot

Distinguished
Oct 29, 2007
818
0
18,990
I find it funny to see all the delighted fanboys who cheer for ATI's dominance, while at the same time complaining about prices. Cheer for competition, you fools!

Dominance by any one company is bad for everyone.
 

WarraWarra

Distinguished
Aug 19, 2007
252
0
18,790
As much as the AMD fanboys log in with their 20 user IDs voting thumbs down on my posts, this is and will stay a thumbs-down product, and at that price it is a blatant slap in the face of any intelligent person to suggest they would buy this.

I love AMD and would buy everything AMD, but too far is too far, even for AMD.

As an end user it is a useless item to me and a waste of money for AMD to have developed. The damage AMD is doing to itself by releasing this instead of putting it in a wood chipper is amazing.
If AMD continues like this I might start switching to Nvidia.
 

Do you not think that if the GPU were defect-free it would have been used for a 5870? Or are you under the impression that they had a bunch of perfect GPUs and then said, "let's cripple them and sell them for less money"?
 

jdog1089

Distinguished
Feb 28, 2010
38
0
18,530
I was wondering: is this card PCIe 2.0? When I looked on newegg.com, all I saw were 5830s listed as PCIe 2.1. Can a PCIe 2.1 card work in a PCIe 2.0 slot?
 

the_brute

Distinguished
Feb 2, 2009
131
0
18,680
[citation][nom]WarraWarra[/nom]I love AMD and would buy everything AMD, but too far is too far, even for AMD. As an end user it is a useless item to me and a waste of money for AMD to have developed. The damage AMD is doing to itself by releasing this instead of putting it in a wood chipper is amazing. If AMD continues like this I might start switching to Nvidia.[/citation]

If you haven't been paying attention to the new die process, there are a LOT of crippled chips (far fewer than when they started on the new 5800 series, though). So instead of throwing them away for a complete loss, why not sell them for some money? I will agree it is depressing to see performance per dollar go down across ATI's lower-end cards, but I feel this card will be better in the long run thanks to its full 256-bit memory bus versus the 5770's 128-bit; see the quick bandwidth math after this post. Drivers will improve and performance will go up.
But if you really want them to throw these chips away, be prepared to pay more for the fully functional ones.

A lot of your posts are rated down because they aren't logical, not because they sound like Nvidia/Intel fanboyism. So before you state what you think as fact, ask a question or look it up.
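To put the 256-bit point in numbers, here's a quick sketch assuming the reference memory clocks (1000 MHz GDDR5 on the 5830, 1200 MHz on the 5770); board partners may ship different clocks:

```python
# Quick memory-bandwidth comparison at assumed reference memory clocks.
def bandwidth_gb_s(bus_width_bits: int, mem_clock_mhz: int) -> float:
    # GDDR5 transfers 4 bits per pin per memory-clock cycle.
    return bus_width_bits * mem_clock_mhz * 4 / 8 / 1000  # GB/s

print(f"HD 5830 (256-bit @ 1000 MHz): {bandwidth_gb_s(256, 1000):.1f} GB/s")  # 128.0
print(f"HD 5770 (128-bit @ 1200 MHz): {bandwidth_gb_s(128, 1200):.1f} GB/s")  # 76.8
```

So even as a salvage part, the 5830 keeps a sizable bandwidth advantage over the 5770.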
 

xcamas

Distinguished
Mar 3, 2010
17
0
18,520
Needless to say, the Radeon HD 5830 is not intended to fill any gap between the 5700 and 5800 series. These are just defective chips that can still make AMD money. They don't care whether you love this product, because it is just an LE version of the Cypress chip that TSMC failed to yield.
 

tmc

Distinguished
Aug 22, 2007
99
0
18,630
ATI is finally recognizing the innovation gap between making better products and finding a "killer application" for those products. Who can justify these quad+ core CPUs without applications that take advantage of the computing power? Sure, Vista and Windows 7 are supposed to bridge that gap for software that wasn't "specifically" written for it, but when you're talking about $200+ video cards, or even $100+ cards, the killer app besides gaming is multi-monitor support and the kinds of media display/streaming things you do with an HTPC (home theater PC).

Oh yeah, about this thread: a $200+ video card is generally for a PC around the $1k mark. That said, you will (this year?) have an Nvidia-based DirectX 11 series of cards (finally), so get ready for the competition. I haven't owned an Nvidia graphics card in over 15 years, back when Mortal Kombat and the like were all the rage. We all know Nvidia had a nice run of superior market strength, but ATI (now AMD) seemed to keep legacy driver support across its series, which I liked, while Nvidia pulled a few **Apple-style** nasty tricks on consumers which cost them dearly. I'm willing to forgive and forget, but not having a high-performing, competitively priced DX11 card for so long cost them again, as I've got the 5750 and will keep it for however long it takes the industry to get to the next DX version(s) and to the PCIe 3.0 spec. I don't see an enthusiast need for dual cards (or faster cards) even though my motherboard and power supply can support it. By then, lots of things will be "standard", such as the next SATA and USB on motherboards. It would be sweet if Intel leaves LGA 1156 around long enough for motherboard manufacturers to tempt consumers into upgrading a motherboard 3+ years out.

I don't know if waiting for Nvidia is a good move or not. Much like the telecom industry, we have these duopolies which seldom get into "price wars" the way they did in decades past. I think that 97% of consumers would be better served by a cheaper card than the 58xx/59xx+ series. The other 3% are already committed to spending thousands on a PC and can fully justify the expense.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
Another crappy move by the gray-screen, green-dotted, 2D-crashing, random-gaming-BSOD, crawling-DX11-FPS, firefight-lockup, bloated-cumbersome-CCC, billions-of-dollars-a-year-in-the-red, no-profiles, no-CUDA, no-PhysX, and yes, even no-OpenCL, "D" compute 5.0-only (and an epic failure at that) ATI card company.
 

redraider89

Distinguished
Feb 4, 2009
109
0
18,680
Some are saying that $240 is too much. That may be true. However, I got my 5830 in a combo deal on Newegg where I got an XFX 650-watt power supply, for a total of just under $300 drive-out. I couldn't see myself affording an extra $70 or more for a 5850, and the $100 power supply thrown in for about $60 made the XFX 5830 worth it for me.
 

redraider89

Distinguished
Feb 4, 2009
109
0
18,680
All you complainers about how ATI/AMD is doing this: if you don't want it, you know what? DON'T BUY IT. If you don't like it, buy a 5770 or a 5850. Otherwise, stop bad-mouthing AMD/ATI.
 

redraider89

Distinguished
Feb 4, 2009
109
0
18,680
All AMD/ATI is doing is making the most of what they have. There isn't anything wrong with that. Let the market decide; you influence the market by either buying what they are offering or not buying it.
 

Guest

Guest
I get 62 and 53 fps with 8xAA at 1650x1050 with all settings maxed out (DX11). Something is wrong here...
 

cleeve

Illustrious
[citation][nom]Belferu[/nom]i get 62 and 53 fps with 8xAA in 1650x1050 all settings max out(dx11). Something is wrong here...[/citation]

Yes. What's wrong is that we have no idea what game, benchmark, or hardware you're using, what you're comparing it to, or what the point of your post is.
 
Status
Not open for further replies.