AMD 785G: The Venerable 780G, Evolved

mcnuggetofdeath

Distinguished
Oct 9, 2008
"refined architecture" ? To my knowledge, and please correct me if im wrong, all that was changed between the original phenom and the phenom 2 was the addition of more L3 cache allowing it to do more simultaneously and a die shrink allowing for higher clocks. That does not a refined architecture make. When AMD added an on die memory controller to their processors years ago they had made a huge advancement in architecture. Im sad to see them fall away from the performance crown. Here's hoping their new Bull Dozer architecture brings something genuinely intriguing to the table.
 

anamaniac

Distinguished
Jan 7, 2009
Very interesting.
An integrated GPU that can game. =D

Makes my lil Pentium D with a 4670 seem puny...
3.3 GB/s memory bandwidth (single-channel DDR2-533... even with two sticks, it runs in single channel... damn prebuilts) also seems sad on my rig...

[citation][nom]macer1[/nom]the real question is how would this perform if mated to an Atom processor in an nettop.[/citation]

Good question. A dual-core Atom with a 4200 integrated would be nice.
We all know Intel makes shitty motherboards and AMD makes kickass motherboards anyway.
 

SpadeM

Distinguished
Apr 13, 2009
[citation][nom]mcnuggetofdeath[/nom]^^^ and support for DDR3. Although thats a change to the board, not the CPU.[/citation]

Not correct; the Phenom II has a built-in memory controller, so the switch to DDR3 affected that controller, not just the board.
 
[citation][nom]anamaniac[/nom]Very interesting.A integrated GPU that can game. =DMakes my lil Pentium D with a 4670 seem puny...3.3GB/s memory bandwidth (single channel DDR2 533... though 2 sticks, it runs in single channel... damn prebuilts) also seems sad on my rig...Good question. A dual core Atom with a 4200 integrated would be nice.We all know Intel makes shitty mothebroards and AMD makes kickass motherboards anyways.[/citation]


The native RAM for a Pentium D is PC2-4200 (DDR2-533), which has a peak of 4.2 GB/s per channel, and the FSB tops out at 6.4 GB/s.
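For anyone curious where those peak figures come from, here's a rough sketch (theoretical peak = transfer rate times the 64-bit bus width; real-world throughput is lower):

# Theoretical peak bandwidth = transfers per second * bus width in bytes
def peak_bandwidth_gbs(mega_transfers_per_sec, bus_width_bytes=8):
    # a 64-bit memory or FSB bus moves 8 bytes per transfer
    return mega_transfers_per_sec * bus_width_bytes / 1000.0

print(peak_bandwidth_gbs(533))  # single-channel DDR2-533: ~4.3 GB/s (hence PC2-4200)
print(peak_bandwidth_gbs(800))  # 800 MT/s front-side bus: 6.4 GB/s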

The Intel Atom would most likely bottleneck any video card out there, and Intel does actually make a good, reliable business platform for situations where video performance isn't required.
 
Guest
I'm sorry, is this an Intel benchmark site? All other reviews put SYSTEM power consumption for the Athlon II 250 well below that of the Intel E7200.
 

aproldcuk

Distinguished
Aug 4, 2009
This article raised a lot of questions for me. What about Hybrid CrossFire, for example? What kinds of cards can be used together with this new IGP? Is the discrete graphics card on standby when no extra performance is required? If not, how much extra wall-outlet wattage should be expected? And how much extra when it's actively in use? I'm interested in using the 785G in a 24/7 HTPC setup with the option to do occasional gaming as well. My current setup with a 690G chipset and an Athlon 64 X2 BE-2350 CPU draws around 50 watts most of the time and up to 90 watts under heavy load. Is it too much to expect similar levels from a 785G and Phenom II X3 705e combo, for example?
 

wh3resmycar

Distinguished
When can we see the mobile version of this? It's most certainly a welcome update compared to the 780G/HD 3200 chipset, and it beats any Nvidia IGP hands down. I'd love to see this in a $700-$800 laptop. Good thing I'm still holding off on buying a new notebook.
 

neiroatopelcc

Distinguished
Oct 3, 2006
[citation][nom]Article[/nom]There are two lessons to be learned here: first, if you really care about the environment, turn your PC off (or at least configure it to enter sleep mode) when you're not using it, and second, don't be afraid of purchasing a better processor for fear that it will cost you big money in power consumption.[/citation]

Perhaps the next task could be a power comparison telling us how long a computer needs to sit idle before it consumes more power than turning it off and booting it back up again (including starting MSN, AV software, and the bunch of other stuff running in the background).
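Just to sketch what I mean, here's the kind of break-even math I'm imagining (every number below is a made-up placeholder, not a measurement):

# Break-even sketch: all values are hypothetical placeholders
idle_power_w = 90.0       # assumed idle draw at the desktop, in watts
restart_energy_wh = 5.0   # assumed extra energy for a shutdown, boot, and relaunching apps

# minutes of idle time after which leaving the PC on costs more energy than restarting it
break_even_minutes = restart_energy_wh / idle_power_w * 60
print(f"Break-even after roughly {break_even_minutes:.0f} minutes of idle time")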

Anyway good article :)
 
G

Guest

Guest
McNuggetOfDeath: There were changes to the Phenom II architecture. The 45 nm shrink is not what enabled the higher clocks; it was architectural changes (mostly regarding internal latencies). There were other changes as well that enabled higher IPC and smoother overall performance.

PS: Phenom II does support DDR3; only 2 models out of the 12 don't...
 
[citation][nom]Pei-chen[/nom]Good timimg, I was wondering if 785G is better than 790GX or not yesterday. Thanks.[/citation]
========
My take on it is that, except for some specific HTPC features, the 790GX is still the better of the two, especially if any gaming is involved. They compared an overclocked 785G to a stock 790GX; what if they'd overclocked the 790GX as well?
And, lest anyone develop any false hope, the Intel IGP has once again been shown to be a toad.
 

DarkMantle

Distinguished
Aug 6, 2008
One of the best things this chipset brings is lower-cost AM3 motherboards: if you want to use Phenom II processors paired with DDR3 RAM and a single video card, you can pay $89-99 for the motherboard. I think that's important.
 

judeh101

Distinguished
Nov 27, 2008
I would totally use this for my home theatre PC.
Let's see... Decent performance, able to play HD video, low cost. That covers everything I need for an HTPC!
 

cleeve

Illustrious
[citation][nom]aproldcuk[/nom]This article raised a lot of questions for me. What about Hybrid Crossfire for example? What kind of cards can be used together with this new IGP? [/citation]

We concentrated on the new aspects of the 785G in this article; Hybrid CrossFire is exactly the same as it was with the 780G, which is to say it maxes out with a 3450 card.
 

Ryun

Distinguished
Oct 26, 2006
133
0
18,680
"At idle, the Phenom II X2 is drawing the highest load: 92 W on the 790GX motherboard. In contrast, the E7200 is drawing 68 W on the most efficient platform, Intel's G45. It looks big on the chart, but it's a difference of 14 W."

Nope, it's a 24 W difference (92 W - 68 W). I think that's why your numbers are off, too. I get:

24 W * 24 hours = 576 Wh per day = 0.576 kWh per day; 0.576 kWh/day * $0.15/kWh * 365 days = $31.54 per year
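For anyone who wants to plug in their own numbers, here's the same math as a quick sketch (the 24 W delta and the $0.15/kWh rate are just the figures from above):

# yearly cost of a constant power-draw difference
delta_watts = 24.0        # idle power difference between the two platforms
rate_per_kwh = 0.15       # assumed electricity price in dollars per kWh

kwh_per_day = delta_watts * 24 / 1000             # 0.576 kWh per day
cost_per_year = kwh_per_day * rate_per_kwh * 365
print(f"~${cost_per_year:.2f} per year")          # ~$31.54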

Good article otherwise, thanks.
 
Guest
[citation][nom]mcnuggetofdeath[/nom]...all that was changed between the original phenom and the phenom 2 was the addition of more L3 cache allowing it to do more simultaneously and a die shrink allowing for higher clocks. That does not a refined architecture make.[/citation]

That is incorrect. If that were the case, the Phenom II wouldn't benchmark so much better, and it wouldn't overclock so much better either. Just because it carries the Phenom name doesn't mean all they did was give it a bit more L3 cache and call it a day. You could have given the original Phenom more L3 cache all day long and it still would have run like poop. Well, not necessarily poop, just not as well as the Phenom II.
 

KT_WASP

Distinguished
Apr 16, 2008
[citation][nom]cleeve[/nom]We concentrated on the new aspects of the 785G in this article; hybrid crossfire is exactly the same as it was with the 780G, that is to say it maxes out with a 3450 card.[/citation]

If this is true, then why does the Hybrid CrossFire graphic on the first page show the HD 4350, HD 4550, and HD 4650 as compatible Hybrid CrossFire GPUs?

It would make sense: the 780G used an integrated 3200-series GPU, so it was compatible with lower-end dedicated 3000-series GPUs. The 785G uses an integrated 4200-series GPU, so it should be compatible with the lower-end dedicated 4000-series GPUs.

Can you clear this up? I was also wondering which GPUs can be used in Hybrid CrossFire with the 785G. I thought I knew from that graphic on page 1, but your response confused me.

Thanks
 

aproldcuk

Distinguished
Aug 4, 2009
[citation][nom]cleeve[/nom]hybrid crossfire is exactly the same as it was with the 780G, that is to say it maxes out with a 3450 card.[/citation]
Thanks for clearing that up, Cleeve! There's not much sense in using Hybrid CF then. However, my original question still stands: how much extra wattage can one expect with a mid-range 4600- or 4700-series card added, for example? Does disabling the card when it's not in use help much here? Hope this isn't too far off-topic already...
 

cleeve

Illustrious
[citation][nom]KT_Wasp[/nom]If this is true, then why does the Hybrid crossfire graphic on the first page show HD4350, HD4550 and HD4650 as compatible hybrid crossfire GPUs?[/citation]

It doesn't - it says "add faster graphics".

AMD specifically made a point of letting me know that Hybrid CrossFire is no different than it was with the 780G.

Remember, the 4200 has the exact same number of shaders as the 3200. Hybrid CrossFire is limited by the power of the integrated GPU, and on that front the 3200 and 4200 are absolute equals.
 

cleeve

Illustrious
[citation][nom]mcnuggetofdeath[/nom]That does not a refined architecture make.[/citation]

The word 'refined' can be interpreted a great many ways. I think we might be splitting hairs here, fellas.
 
Guest
This board, to me, seems like an interesting option for a budget gaming system. I say this because, at least with the ASUS M4A785TD-V EVO, you get both Hybrid CrossFire and CrossFireX for $99.99.

According to the graphic on the first page, the 785G chipset allows Hybrid CrossFire with up to an HD 4650 card, so if you already have Vista, you possibly have a decent budget video setup.

For those of us holding onto XP, you can have a CrossFire-capable (8+8) AM3/DDR3 board for $100.
 

KT_WASP

Distinguished
Apr 16, 2008
[citation][nom]Cleeve[/nom]It doesn't - it says "add faster graphics". AMD specifically made it a point to let me know Hybrid Crossfire is no different than it was with the 780G.Remember, the 4200 has the exact same number of shaders as the 3200. Hybrid Crossfire is limited by the power of the integrated GPU, and on that front the 3200 and 4200 are absolutely equals.[/citation]

Ahh... OK, I see it now. So basically it's an HD 3000-series GPU, but ATI pulled an "Nvidia move," tweaked it slightly, and relabeled it so people think they're getting the next generation... This is starting to get old, with these companies pulling this crap...
 