AMD Radeon HD 7730 Review: A Harbinger Of The Kaveri APU?

Status
Not open for further replies.

rmpumper

Distinguished
Apr 17, 2009
459
0
18,810
Just look at that performance difference between DDR3 and GDDR5. AMD sure needs to implement GDDR5 in Kaveri in order not to waste all that GPU potential.
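The bandwidth gap behind that performance difference is easy to put in numbers: on the same 128-bit bus, GDDR5 moves roughly four transfers per clock versus DDR3's two. A back-of-the-envelope sketch (the clock figures are illustrative assumptions, not numbers from the review):

```python
# Peak theoretical memory bandwidth = effective transfer rate (GT/s) x bus width (bytes).
def bandwidth_gbps(effective_gtps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a given effective data rate and bus width."""
    return effective_gtps * bus_width_bits / 8

# Illustrative clocks for a 128-bit entry-level card (assumed, not measured):
ddr3 = bandwidth_gbps(1.8, 128)   # e.g. 900 MHz DDR3, double data rate -> 1.8 GT/s
gddr5 = bandwidth_gbps(4.5, 128)  # e.g. 1125 MHz GDDR5, quad data rate -> 4.5 GT/s

print(f"DDR3:  {ddr3:.1f} GB/s")                # 28.8 GB/s
print(f"GDDR5: {gddr5:.1f} GB/s")               # 72.0 GB/s
print(f"GDDR5 advantage: {gddr5 / ddr3:.1f}x")  # 2.5x
```

With a 2.5x gap in raw bandwidth under these assumptions, it's no surprise a bandwidth-hungry GPU leaves performance behind on DDR3.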
 

designasaurus

Honorable
Sep 1, 2012
52
0
10,630


Your comment, while perfectly accurate, actually made me notice something interesting. Take a look at the charts and you'll see that the GCN 7730s are less affected by the switch from DDR3 to GDDR5 than the VLIW 6670s! That means GCN leaves less performance on the table when it's paired with slower memory, which is ideal for the situation an APU is usually in (PS4 aside).
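One way to quantify "leaving performance on the table" is the percentage frame-rate uplift each architecture gets from the DDR3-to-GDDR5 switch. The frame rates below are hypothetical placeholders to show the shape of the comparison, not figures from the review's charts:

```python
def memory_uplift_pct(fps_ddr3: float, fps_gddr5: float) -> float:
    """Percent frame-rate gain from moving a card from DDR3 to GDDR5."""
    return (fps_gddr5 / fps_ddr3 - 1.0) * 100.0

# Hypothetical frame rates purely for illustration:
vliw = memory_uplift_pct(30.0, 42.0)  # larger uplift: VLIW is starved by DDR3
gcn = memory_uplift_pct(33.0, 42.0)   # smaller uplift: GCN copes better with slow memory

print(f"VLIW 6670 uplift: {vliw:.0f}%")  # 40%
print(f"GCN 7730 uplift:  {gcn:.0f}%")   # 27%
```

The smaller the uplift, the less the architecture was being held back by the slower memory in the first place, which is exactly the property you want in an APU fed by system DDR3.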
 
At the right price, this looks like a real winner, especially the GDDR5 version. If a $90 HD 7750 is too rich, a $65 HD 7730 still looks capable of playing most games, especially at the 720p resolutions where a lot of HTPC cards operate.
The "secret sauce" that could really catapult this one would be if some of its disabled pieces could be switched on.
 

ET3D

Distinguished
Mar 12, 2012
117
59
18,760
Regarding Kaveri, it looks from this that GCN means better performance with DDR3, but also higher power consumption. If AMD wants to keep the same power envelope, it might have to reduce GPU clock speeds, which will eat into the performance advantage. In the end, it feels from this that Kaveri will offer only a minor performance boost.

Hopefully AMD has done more power optimisations and that won't be the case.
 

shikamaru31789

Honorable
Nov 23, 2012
274
0
10,780
I'll be a little disappointed if this is what we can expect from Kaveri. I was hoping for something that would come closer to matching the APU in the Xbox One. I want to build a small HTPC for gaming in my living room, and an APU would have been ideal for that, since smaller HTPC cases don't have room for large discrete graphics cards. I don't know, maybe Kaveri could still be useful if they actually get Hybrid CrossFire working properly; a Kaveri APU paired with a discrete 7730 and DDR3-2133 might just work out for my purposes.
 

slomo4sho

Distinguished
I don't see why manufacturers continue to use this RAM in newer products... Just get rid of DDR3 already...


Also, how is this Cape Verde GPU a "Harbinger Of The Kaveri APU"? It's a trimmed-down 7750, and if a 7750 can provide no real insight into the performance of upcoming Kaveri APUs, how does this entry-level card provide any better insight?
 

army_ant7

Distinguished
May 31, 2009
629
0
18,980
On page 10, does the reference HD 7750 really have a load temp of 97C?
(73C + 24C ambient = 97C)

Anyway, the GDDR5 HD 7750 looks like a viable game-enabler for PC gamers on a budget, and a nice refresh of the options at that price point. I do hope it sells for around $60 or less. :-D
 


I have found a Trinity-based APU more than enough for an HTPC. I would not have banked on Kaveri matching the XB1's and PS4's custom-built silicon; as the name suggests, the APUs designed for MS and Sony are custom designs, and that hardware was always going to scale beyond the desktop parts. If you offered me HD 7730-7750 performance on a Kaveri, I would be very pleased with that. What hasn't been brought to light is that the Spectre iGPU on Kaveri features around 512 stream processors and increased ROPs and compute units, so it may very well be a potent iGPU.

Dual Graphics has improved with Catalyst 13.8 to the point that it is now playable. If Richland can run Dual Graphics with the HD 7730s, that would be tremendous for gaming under $200 for chip and card.

 

animadvert

Distinguished
Oct 26, 2011
1
0
18,510
Wouldn't it be cool if the DDR3 version was made because, in Hybrid CrossFire with a Kaveri APU, the hUMA architecture lets the CPU and integrated graphics use the 2 GB of dedicated VRAM as well? Maybe it's AMD's secret super weapon? That way a system with 8 GB of DDR3 would have 10 GB :0. Or at the very least, maybe it'll sync better with the integrated graphics, since memory isn't added together in CrossFire?
 

hannibal

Distinguished
If Kaveri comes even near the performance of the 7730, it will blow Haswell's GPU out of the sky! But, as has been said, they will have to reduce the clock speed, so the difference would not be so huge. Then again, as has also been said, Trinity is already very good, and if the CPU gets even better, Kaveri will be a great HTPC CPU. It may even be a reasonably good all-around CPU, and that is a big deal. Of course, if you want a good gaming rig, you need something much, much more powerful, like 3-way Titans and so on, but that would cost nearly $5,000 without a decent monitor (a 4K monitor is $5,000+), so they are not exactly in the same league :)
All in all, not a bad budget card!
 


Trinity's HD 7660D and HD 7650D are already faster than Intel's desktop graphics of the HD 4600 ilk, and not just in frame rates but in latencies as well. There has been a review of the HD 5100 and 5200 Pro, with the HD 5100 still slower than Richland's HD 8670D by a margin; Iris is faster, but it costs around $600 in all-in-one BGA setups that prevent expansion.

Kaveri will be based on the smaller Steamroller cores, which will improve performance per watt, clock for clock, over Vishera-based cores, and the iGPU will move to the more efficient GCN architecture in place of the older VLIW design. If Trinity and Richland are already impressive, I think Kaveri will top them, at least doubling iGPU performance over Trinity/Richland and improving x86 performance anywhere from 20-40% depending on the nature of the application. Then there is hUMA and the HSA environment; I think it's going to be a very exciting release if you accept it for what it is. It will not be a champion chip in the traditional computing sense, but in an HSA environment it will blow away everything before it. Cue the Adobe Premiere Pro benches, where the 6800K is already around 500% faster than the 3770K in that environment. AMD is king in integrated graphics, and it will be head and shoulders above its competition.

 

emad_ramlawi

Distinguished
Oct 13, 2011
242
0
18,760
Hmm... why is there no HD 7750 DDR3 in the charts? If you want to mention Kaveri in the article, then please remember that high-end Kaveri will ship with 512 shaders, not 384, so comparing it with the 7730 doesn't make much sense. I like the review of the 7730, but it's not comparable with Kaveri; the HD 7750 1GB DDR3, which is missing from this article, is much closer to Kaveri.
 

Maxime506

Honorable
Apr 22, 2013
1,032
0
11,960
That's impressive news for people who only have OEM rigs; it helps them easily upgrade their GPU without worrying too much about power consumption.

But for me, I care more about the Kaveri APU. I've heard rumors that the flagship Kaveri APU, probably the A10s, would sport an iGPU like the HD 7750/7770 with 512 or 768 shaders (though 768 might cut into HD 7790 sales, so 512 shaders is more likely), so we would see better performance. This review gives us a better picture, and even if the flagship is equipped with only 384 shaders, the performance could still be impressive, especially using 2133 or 2400 MHz memory. We're looking forward to the launch of the new APUs and hope they will be a great leap forward compared with the Richland and Trinity APUs.
 

megamanxtreme

Distinguished
May 2, 2013
59
0
18,630
I have an HD 7750 GDDR3 and I want to know where it stacks up, since I also have a GTX 650 GDDR5 and a GT 440 GDDR5. If the 7750 is better than the GT 440, I will replace it in my tower.
 


Just going by Tom's graphics hierarchy chart, the regular 7750 is six tiers above your GDDR5 GT 440. Given that you have the GDDR3 version of the 7750, that probably puts it no lower than three tiers under the regular 7750, which is still three tiers higher than your GT 440 and should honestly be a pretty solid upgrade. The chart isn't perfect, but it's a good general reference for where things stack up, and that 7750 should give you a nice boost.

http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html
 

twstd1

Distinguished
Nov 26, 2008
104
0
18,710
I fail to understand why you tested all of these cards at 1920x1080 when they will most likely not be used at that resolution. Why not test them at the resolution people who buy these cards would actually use, like 720p? I think that would make for a better review and come closer to real-world use.
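The resolution complaint has real arithmetic behind it: 1080p fills well over twice the pixels of 720p, so an entry-level card that stumbles in these 1920x1080 tests can still have plenty of headroom at the resolution many HTPC builders actually use. A quick check:

```python
# Compare the pixel counts an entry-level GPU has to fill at each resolution.
pixels_1080p = 1920 * 1080  # 2,073,600 pixels
pixels_720p = 1280 * 720    # 921,600 pixels

print(f"1080p: {pixels_1080p:,} pixels")
print(f"720p:  {pixels_720p:,} pixels")
print(f"1080p renders {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")  # 2.25x
```

All else being equal, a GPU that is fill-rate or bandwidth limited at 1080p has roughly 2.25x less work per frame at 720p, which is why 720p results would better reflect this card's target use.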
 