AMD Radeon HD 8790M: Next-Gen Mobile Mainstream Graphics Preview

Status
Not open for further replies.
I see this making a HUGE difference in battery life, especially if lower power modes are made available.
Think of a software toggle: almost double the performance, with similar battery life, or similar performance with 50% more battery life (not "double" because of the CPU/platform).
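A back-of-the-envelope sketch of that toggle idea, with assumed (not measured) battery capacity and power figures, shows why halving GPU draw buys roughly 50% more runtime rather than double:

```python
# Rough battery-life model for the software-toggle idea above.
# Capacity and power draws are illustrative assumptions, not measurements.

def battery_hours(capacity_wh, platform_w, gpu_w):
    """Runtime when total draw = platform (CPU, screen, chipset) + GPU."""
    return capacity_wh / (platform_w + gpu_w)

capacity = 60.0   # Wh, assumed battery
platform = 10.0   # W, assumed non-GPU platform draw
gpu_full = 20.0   # W, assumed GPU draw at full performance
gpu_half = gpu_full / 2  # low-power mode at half the GPU draw

full = battery_hours(capacity, platform, gpu_full)  # 60/30 = 2.0 h
low = battery_hours(capacity, platform, gpu_half)   # 60/20 = 3.0 h
print(f"full perf: {full:.1f} h, low-power: {low:.1f} h "
      f"(+{(low / full - 1):.0%})")  # +50%, not +100%: the platform still draws
```

The fixed platform draw is exactly why the gain caps out below "double" even when the GPU's own consumption is halved.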
 

tipoo

Distinguished
May 4, 2006
1,183
0
19,280
Kind of funny that AMD is sending people Intel processors to test AMD GPUs with. But it does make sense; even the highest-end AMD processors would still limit frame rates at times.
 

tipoo

Distinguished
May 4, 2006
1,183
0
19,280
[citation][nom]maxinexus[/nom]Great preview but I kinda expected desktop version first.[/citation]

The desktop version has been out for quite some time...This is the GCN architecture, aka the 7000 series on desktop. The 8000M series uses a tuned version of that architecture, it isn't a new one.
 

BulkZerker

Distinguished
Apr 19, 2010
846
8
18,995
I know people play BO2, but why would you include it in the benches? It's the Q3 of its time: a game that hasn't really needed a stronger rig since its original release in 2007. Of course it's going to be playable at any resolution or quality setting you put it on, with a toaster.
 

piesquared

Distinguished
Oct 25, 2006
376
0
18,780
[citation][nom]BulkZerker[/nom]I know people play BO2, but why would you include it in the benches? It's the Q3 of its time: a game that hasn't really needed a stronger rig since its original release in 2007. Of course it's going to be playable at any resolution or quality setting you put it on, with a toaster.[/citation]


Then Intel oughta start shipping toasters with their slide-show projectors...
 

markem

Honorable
May 1, 2012
37
0
10,530
Looks like Nvidia is late to the party again!

Somebody wanna take a guess by how long? Five months would be a good guess.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
Hitman uses a proprietary game engine that (likely) won't be used in anything else... Not sure if it means anything other than "Hitman" performance in today's Hitman... LOL. I'd rather see them add games based on engines used in MANY games, so one benchmarked game gives you an idea of perhaps 5-10 games on the market rather than just ONE.

It would have been nicer if there weren't such an advantage for the tested 8790M. It's really not doing that much more when you consider it has a 300 MHz clock speed advantage (50%?) and a 20+% memory bandwidth advantage. I'd have been more impressed if it were doing this at 600-700 MHz, but at 900... this is about right on target. The clock/memory advantage should yield what it did... Gee, I wonder why they didn't send a more comparable model? :) Isn't this kind of like testing a 650 vs. a 680 and saying the 680 blows away the 650? Well, duh: more memory bandwidth, more clock speed, etc... No surprise here. Move along... LOL.
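As a rough sanity check on that argument, a blended scaling estimate (the clock figures and the compute/bandwidth weighting below are assumptions for illustration, not measurements) puts the expected gain in the same ballpark as the spec advantage alone:

```python
# Crude speedup estimate from the clock and bandwidth advantages above.
# Clocks and the compute/bandwidth weighting are assumed for illustration.

clock_ratio = 900 / 600  # ~50% core-clock advantage (assumed clocks)
bw_ratio = 1.20          # ~20% memory-bandwidth advantage

# Assume games are partly compute-bound, partly bandwidth-bound.
compute_weight = 0.6     # assumed split
expected = compute_weight * clock_ratio + (1 - compute_weight) * bw_ratio
print(f"expected speedup from specs alone: ~{expected:.2f}x")
```

If the measured lead roughly matches that number, the comparison says more about the spec gap than about the architecture.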

It looks like these will be a 10-15% bump over a like model from the previous family. The 8790M is not a replacement for the mid-7600 series (7670M); more likely a replacement for the 7690M, correct? That's not to say this card is junk, just that it's not quite the gain the article seems to show. I hope we get more from NV's refresh. I can't get excited about replacing anything for less than 40%. I end up skipping generations these days on either side (I do the same on desktop). It takes two or more GPU generations these days to make any real gain worth noting or spending an arm and a leg on (yet again).
 

army_ant7

Distinguished
May 31, 2009
629
0
18,980
I believe it was mentioned in the article that AMD sent them.

 

dvo

Distinguished
Jan 16, 2008
90
0
18,660
[citation][nom]Robert Pankiw[/nom]If you look, there is only one chip (shown on this page) which means it is not being CrossFired. I agree that 8780M would be a better name than 8790M. Andrew Ku, maybe on the front page you can clarify this?About using desktop parts, it is my understanding that they sometimes do exactly that. Take the 7970M, which as far as I can tell, is an 78XX part (I forget which one) except the mobile chip has MUCH higher binning than the desktop 78XX.[/citation]

They do that all the time. I have a 5870M, which has a Juniper core from the desktop 5770.
 

whyso

Distinguished
Jan 15, 2012
689
0
19,060
This card is an absolute piece of crap. Basically GT 650M performance, but almost a year later and with driver (Enduro) issues. And they're trying to sell it as an upgrade. A 7670M is a 6770M rebranded.

A GT 650M gets about 2300 points in 3DMark 11; a 6770M gets about 1500 points. The GT 650M is about 50% better (source: Notebookcheck). The 8790M does not look like it has any advantage over the 650M.
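The "about 50% better" figure checks out, taking the scores as quoted in the comment:

```python
# Relative advantage from the 3DMark 11 scores quoted above.
gt650m = 2300    # points, per the comment (Notebookcheck)
hd6770m = 1500   # points, per the comment

advantage = (gt650m - hd6770m) / hd6770m
print(f"GT 650M advantage: {advantage:.0%}")  # ~53%
```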
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
Wisecracker: How many people do you think are using their laptops for Bitcoin mining?... LOL.

There are not many people with compute on their list of most important things to do (at home). You can't sell a vid card to a home user claiming it wins in compute and expect it to mean much. Saying you win in GAMES, well that's a better sell.

Folding@home is pointless except to run up your electric bill. If I solve cancer, will they send me a portion of the pills they sell after solving it?... LOL. NOPE. With Bitcoin, the easy ones are gone, and it takes much more to find them now anyway. You used to be able to pay your cards off with this (last year), but now it costs too much in electricity to find enough to say it works.

So what are you using compute in a laptop for? This isn't going to slobberknock a Tesla. I wish they'd just remove this from home cards and go all-out on gaming perf, because that's what we do with our HOME cards (laptop or desktop). If you really need compute (as in, make money from it), you don't waste your money on some regular desktop/laptop card; you buy a FireGL/Quadro/Tesla, etc. I'd venture to guess most people don't even know what Folding@home or bitcoins even are. Why waste the GPU space on this junk? Pro cards are for that.

What will you do with a laptop based on this card/650m where compute matters? It's a pretty silly argument.
 

army_ant7

Distinguished
May 31, 2009
629
0
18,980
@somebodyspecial
Hello again! (Not sure if you remember me.) :)

I'm thinking Wisecracker was addressing whyso about whatever advantages the 8790M has over the 650M, and I think his/her response was valid. It was just one statement with a plain and simple point (though comically written) without any other thoughts attached. :)

But if anything, there is a case to be made. Laptops may be used for photo and video editing, file compression, etc., and the most apparent one, gaming. All of these have examples of programs (or at least functions) that can utilize GPGPU. I won't deny that it isn't that big of a thing (yet), but it still proves to be an advantage nevertheless. (I would call the PhysX feature of Nvidia cards an advantage, though it really is only so in some games.)
Also, there's a case to be made about "future-proofing." Though the future is always uncertain, I'd rather have something around that may prove useful in the future. :)
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


How could I forget someone who insulted me 42 different ways defending blazorthon & luciferano's ridiculous statements all while claiming not to be on the "team"... :hello: :) The repeated attacks were something you just don't "forget" :)

Since the things you mention can all be done with CUDA (Adobe plugins, etc.; CS6 directly supports it now, or there's direct support in apps themselves, or via other APIs, OpenCL, etc.), I'm not sure his statement holds any weight even if it ever were used on a laptop with this card or NV's. I'd say things are more future-proofed on NV currently:
http://www.hpcwire.com/hpcwire/2012-11-12/nvidia:_70_leading_apps_add_support_for_gpu_accelerators.html
"GPU computing first gained momentum among researchers who could download CUDA to accelerate their own applications for scientific discovery and research," said Addison Snell, chief executive officer of Intersect360 Research. "We are now in a new era where more commercial software is GPU-optimized, providing accelerated options across the full spectrum of engineering and business computing."
A partial list of other GPU-accelerating applications shipping or in development include:
Computer-aided Engineering: Abaqus/Standard, Agilent ADS & EMPro, ANSYS Mechanical, CST MWS, MSC Nastran, Marc, OpenFOAM solver libraries, RADIOSS
Defense & Intelligence: DigitalGlobe Advanced Ortho Series, Exelis (ITT) ENVI, Incogna GIS, Intergraph Motion Video Analyst, MotionDSP Ikena ISR, PCI Geomatics GXL
Media & Entertainment: Adobe CS6, Autodesk 3ds Max & Maya,Cinema 4D, Houdini, Lightwave, Blackmagic DaVinci Resolve, Chaos V-Ray RT, Elemental Server, Telestream Vantage
Oil & Gas: Acceleware AxRTM, ffA SVI Pro, Headwave Suite, Paradigm Echos RTM, Schlumberger Visage, WesternGeco Omega2 RTM
Scientific Computing: AMBER, CHARMM, Chroma, FastROCS, GAMESS, GROMACS, GTC, WL-LSMS, MATLAB, MILC, NAMD, QUDA, VASP, VMD
Weather & Climate Forecasting: COSMO, GEOS-5, HOMME, HYCOM, WRF, NEMO, NIM
A complete list is available at www.nvidia.com/teslaapps. "
http://www.nvidia.com/object/gpu-applications.html
http://www.nvidia.com/docs/IO/123576/nv-applications-catalog-lowres.pdf
Sure some of that stuff is HPC serious work, but some of the apps in there are desktop engineering etc. CAD, Lightwave, PTC etc...Done on any cuda gpu.

An entire site is dedicated to this stuff, with 415 million CUDA GPUs supported (I think all NV GPUs that are still running... LOL... but at least since 2006, as noted in the article). Do all AMD GPUs support GPGPU, and to this extent? Nvidia has been laying the groundwork on this since 2006, which, as you can see above, is now coming into full use. Year after year of not being in debt while sitting on billions allowed them to invest in this and foster its development to the point where now you really can do almost everything on their proprietary platform. With the recent release of CUDA 5, this is only growing faster.

It's kind of like being in the Apple ecosystem at this point: it gets harder and harder to leave all the compatibility/interoperability, etc. (I own no Apple product, just comparing situations). Whether a fan of proprietary crap or not (I'm not, but it's good for my stock eventually... LOL), NV has pretty much won this war. Let's not forget where more money has recently gone: Nvidia VGX, full access to your GPU's abilities, direct, with no abstraction, virtually. Open stuff is one thing (but NV supports that too, same as AMD, like OpenCL), but NV has a whole slew of things optimized that AMD can't claim for now, like it or not. We're not talking Folding@home or bitcoin mining here. The future is NOW on NV; it doesn't need to be "hope it's future-proof" or maybe one day OpenCL will rule. CUDA isn't going away; rather, it's growing by leaps and bounds. I think I "made my case," no? :kaola:

I understood exactly what he was saying and who it was too. Not a valid response IMHO. I see no advantage AMD. Don't start this crap again please. :sarcastic:
 

freeads360

Honorable
Jan 4, 2013
2
0
10,510
Good one. Is anyone using this board? What about the power and heat consumption? I'm planning to use it as a sharing system for my local network; can anyone suggest?

Just last week I sold my old chipset with DDR2 RAM on sites. Can anyone suggest whether I can go for this board?

Thanks & Regards,
Raja S
 

whyso

Distinguished
Jan 15, 2012
689
0
19,060
With regards to compute, the 8790M gets about 62 GFLOPS DP, which is about equal to what you can get from a mobile i7 CPU. AMD crippled their mobile GPUs just as badly as Nvidia did. The GT 650M gets about 35 DP GFLOPS. The Radeon is better, but not much better than the CPU.
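For context, peak FLOPS figures like these come from shader count × clock × 2 (FMA), scaled by the double-precision rate. A generic sketch, where the shader count, clock, and 1/16 DP rate are illustrative assumptions rather than confirmed 8790M specs:

```python
# Generic peak-FLOPS estimate of the kind behind the DP figures above.
# Shader count, clock, and DP ratio are illustrative assumptions.

def peak_gflops(shaders, clock_ghz, flops_per_clock=2, dp_ratio=1.0):
    """Peak GFLOPS = shaders * clock * 2 (one FMA/clock) * precision rate."""
    return shaders * clock_ghz * flops_per_clock * dp_ratio

sp = peak_gflops(384, 0.9)                   # single precision
dp = peak_gflops(384, 0.9, dp_ratio=1 / 16)  # if DP runs at 1/16 the SP rate
print(f"SP ~ {sp:.0f} GFLOPS, DP ~ {dp:.0f} GFLOPS")
```

The DP ratio is the lever vendors use to segment consumer from pro cards, which is the "crippling" the comment refers to.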
 