AMD CPU speculation... and expert conjecture

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Testin

Two things jump as a concern for AMD here:

AMD CrossFire and Eyefinity Concerns

It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand. As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, in many cases in a nearly perfect alternating pattern. Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case. Obviously the story then becomes, "In Battlefield 3, does it even make sense to use a CrossFire configuration?" My answer, based on the below graph, would be no.
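The runt-frame effect described above can be sketched in a few lines: a runt still increments the raw FPS counter while contributing almost nothing on screen, which is why FRAPS-style numbers overstate CrossFire gains. The 20%-of-mean threshold below is an illustrative assumption, not PCPer's exact scanline-based rule.

```python
# Sketch of runt-frame filtering, in the spirit of PCPer's Frame Rating.
# The runt_fraction threshold is an illustrative assumption.

def observed_fps(frame_times_ms):
    """Plain FPS from raw frame times (roughly what FRAPS reports)."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def runt_filtered_fps(frame_times_ms, runt_fraction=0.2):
    """FPS counting only frames at least runt_fraction of the mean frame time.
    Runt frames still consume wall-clock time but add no perceived smoothness."""
    mean = sum(frame_times_ms) / len(frame_times_ms)
    useful = [t for t in frame_times_ms if t >= runt_fraction * mean]
    return 1000.0 * len(useful) / sum(frame_times_ms)

# Alternating full/runt pattern like the CrossFire captures described above:
times = [30.0, 2.0] * 50   # 100 "frames", half of them runts
print(observed_fps(times))       # → 62.5 (inflated dual-GPU figure)
print(runt_filtered_fps(times))  # → 31.25 (roughly the single-GPU figure)
```

With half the frames being runts, the filtered figure collapses back to single-GPU territory, which is the point of the article's "completely useless" verdict.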

And perhaps even worse:

At first I was afraid something was going on with our capture hardware, that somehow the EDID of the Datapath VisionDVI-DL card was incorrectly communicating with the AMD GPUs. But in fact, we saw several problems and inconsistencies with AMD's graphics performance when more than one display was attached to the system, even if we were NOT in an Eyefinity setup! As I later learned, enabling Vsync does not work at all with Eyefinity, and that, combined with the results I have seen in our testing (of which the screenshot above is an indicator), led me to believe that something is fundamentally wrong with AMD's Eyefinity implementation. And if it's not "wrong", it is definitely counterintuitive. We'll be asking AMD for more information in the coming weeks and hope to get more information from them as our Frame Rating process evolves.

I'm hoping Techreport, PCper, and others continue to press AMD until they fix their drivers. I mean, when I still had a 4890, the driver install program crashed and I had to install drivers via command prompt. Throw latency and non-working Vsync on top, and right now, I would never even consider an AMD GPU.
 
bonaire shows that future gcn cards have potential. and drivers likely won't get worse, since reviewers will catch amd (and nvidia) if they slack off. my only concern is at which point the upcoming price war will stabilize. amd has dropped 7850 prices quite a few times already, and phasing out the 1gb models doesn't help. the 1gb 7850 is still better than the gtx650tib though.
 
Only your loss.
If CFX is your game, then yes, they have much work to do, but it's easily solved.
It's also ignoring the gains, not only in smoothness, but:
Perhaps the most interesting thing about this entire process – and the most embarrassing thing for AMD – is not just that stuttering was occurring and they weren’t looking for it, but by not looking for stuttering they were leaving performance on the table. Stuttering doesn’t just impact the frame intervals, but many of those stalls where stuttering was occurring were also stalling the GPU entirely, reducing overall performance. One figure AMD threw around was that when they fixed their stuttering issue on Borderlands 2, overall performance had increased by nearly 13%, a very significant increase in performance that AMD would normally have to fight for, but instead exposed by an easy fix for stuttering. So AMD’s fixing their stuttering has not only resolved that issue, but in certain cases it has helped performance too.
http://www.anandtech.com/show/6857/amd-stuttering-issues-driver-roadmap-fraps/5

As stated in the Anand article, the CFX fixes come later, but many games are already done, with more every day.
At the current pricing, any higher-end card from AMD is a great deal.
 

Cazalan
Last year AMD had all those same things: Brazos/Hondo/Llano/Trinity/7xxx/FirePro. The marketing has stepped up a bit and the game partnerships will help. It's not like night and day, though.
 

griptwister
My opinion is that the HD 7790 is the perfect budget card... it has a low power draw so it requires a smaller PSU, and it's crazy fast for how small it is.

The HD 7870 LE is the perfect mid-range performance card... slightly more powerful than a GTX 580 and just under GTX 660 Ti performance.

However, if you can afford it, the GTX Titan is currently the perfect enthusiast card. But the HD 7990 should be a good dual-GPU card too... can't wait to see that thing running Crysis 3! I'd be curious to see whether Nvidia puts out a dual-Titan card.

As for CPUs, an FX 8320 or an i5 would do well for gaming... People over-dramatize this... it's freaking nuts trying to convince someone to buy an AMD CPU!

*Edit* What is the Sky GPU thing, and what would I use it for exactly? Sorry if this is a noob question, but I can't make heads or tails of it... it's not for gaming, I'm sure... but is it a new server GPU?
 

Cazalan


It is impressive considering it only has a 128bit (96GB/s) memory interface. Unfortunately for AMD the GTX 650 Ti Boost has a 192bit (144GB/s) interface. Big advantage for Nvidia at the same $150 price. The 7790 does have a massive (50W) power advantage for those under constrained power budgets.
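The bandwidth gap quoted above follows directly from bus width, since both cards ship with 6.0 Gbps effective GDDR5 (an assumption based on the figures in the post; peak bandwidth = bus width in bytes × data rate). A quick sanity check:

```python
# Peak GDDR5 bandwidth in GB/s: bus width (bits) / 8 bytes * effective data rate (GT/s).
def peak_bandwidth_gbs(bus_bits, data_rate_gtps):
    return bus_bits / 8 * data_rate_gtps

print(peak_bandwidth_gbs(128, 6.0))  # → 96.0  (HD 7790)
print(peak_bandwidth_gbs(192, 6.0))  # → 144.0 (GTX 650 Ti Boost)
```

Same memory speed, 50% wider bus, 50% more bandwidth — which is why the 650 Ti Boost pulls ahead at the same price despite the 7790's power advantage.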
 


My best guess is that the next gen/8000 series, with no real new node, is where they can find a sweet spot for a larger SKU, or one higher in their hierarchy.
 


I wouldn't say 2 gens, but definitely behind.
But then again, nVidia is also ahead in their compute designs by a gen or so.
GCN is still being discovered by AMD.
 

mayankleoboy1


How so? The 7970 kills the GTX 680 in compute, and is within ±5% in games. Plus, it is priced lower.
Compared to a Titan, the compute perf is strong, considering that the Titan has 7.1 billion transistors and a really huge die.

Plus, the yields of Tahiti are much better than the GTX 680's, even though the die is bigger.

I would say that AMD comprehensively wins on compute this time. But they need support from commercial software developers to incorporate compute. This is where the driver team and marketing come into the picture.
 
What people are forgetting is that the HD 7790 was created for AMD to phase out the low-end HD 7700 parts in preparation for the HD 8600 family, and also to remove the HD 7850 1GB model; that's fewer SKU options, making choice at the low end easier. The HD 7790 was also created to ensure low power and performance met at the intersect of price to performance. The 650 Ti Boost is Nvidia's unsold silicon from the GTX 660 family recycled at a lower price point, but make no mistake, it is a higher-end part pandering around a low price bracket, which Nvidia are selling at a loss.

The attractive features of the 7790 over the 650 Ti Boost include a more streamlined, efficient design for HTPC/gaming SFF setups, where 30 degrees centigrade at load is quite severe on internal thermal balance, not to mention that ITX boards do not like a lot of power through them. The 7790 is not obsolete, contrary to some reviews; it still suits a low-power-conscious builder well.

Yes, the ultrathin and tablet market is saturated, but AMD have released a very good 1080p Win 8-ready tablet based on their APU technology; to be playing Dirt Showdown at 1080p on a tablet is quite impressive. Although gaming wasn't the focus, things like streaming and playback quality were tested, and with AMD Quick Stream technology the opinion in the field is that it represents a seriously good product for a very competitive price. Kabini notebooks and ultrabooks: 50-60% improved iGPU performance over HD 4000, 30% better power efficiency and usage over Trinity-based parts, added APU-exclusive features, lower thermals. I don't think it's quite as uncompetitive as it's been made out to be.

Yes, it is about sustainability, but AMD this year and next year have something they haven't had in a long time: a product line and partnership deals.
 


nVidia's design is set to their platform, one they're miles ahead of AMD on.
Not widely used, but you too recognize the value, as we all see AMD also trying to create such a platform.
Starting off with great HW as AMD has done is the first step, but only the first step, and they're still cutting their teeth on GCN; it will get better, as assuredly will the following iterations.

Listening to devs about nVidia and their platform, some use it not only because it's being sold to them.
Oh, and by the way, I prefer red over green, but I also recognize green's contributions.
 

mayankleoboy1
^ A lot use green CUDA because earlier they had no choice but to go CUDA. So they converted all their code. Now they are tied in a vicious circle from which they can't be freed. It is far easier to throw money at green than to convert the code from CUDA back to x86/OpenCL.

Also, isn't AMD's proprietary FireStream/AppStream/whatzisname dead? They have moved mostly to OpenCL/HSA. I wonder what happens to the users of HD 6000-based compute cards.
 

which one is better? the higher performance part (7850 1gb) that's being phased out, or the crippled part (7790) replacing it? which one costs less to market, a new gpu or a lower binned one? is a higher end part selling at a lower price a good thing or a bad thing? although gk106 is hardly higher end. it's more like mid to upper mid range depending on binning.
nvidia's loss/gain will be apparent in their quarterly report. :)

orly. temps depend on ambient and cooler design. however, power consumption is a much easier way to find out how these cards really behave in htpc tasks:
7790
http://www.techpowerup.com/reviews/Sapphire/HD_7790_Dual-X/30.html
http://www.techpowerup.com/reviews/Sapphire/HD_7790_Dual-X/24.html
gtx650tiboost
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/29.html
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/23.html
ofc, if you can find something to back up the 7790 being better for blu-ray playback than, well... a gtx titan...... :D
the only one using the term 'obsolete' is you. :)

the real reason is that intel locked amd out by trademarking the form factor and by literally paying oems to build ultrabooks in various ways. amd doesn't even have the money to save themselves, let alone fund something like this. but they didn't try much either. they should have pushed harder.

 
So, the answer is to push a product that's not as near to the current competition, using low resources, rather than wait til you're closer in power/perf to do so?
AMD can't be as diversified as Intel, and trying to be, or fooling themselves into thinking they had the mindshare and resources of an Intel, didn't work.
Or have people forgotten the "we aren't competing with Intel" thing already?
Put it into perspective first.
 

truegenius

Charlie at Semiaccurate says that Nvidia is 1-2 generations BEHIND AMD in GDDR5 memory controllers.

How?
 


There's a reason why CUDA is still used more than OpenCL: NVIDIA has done a better job with the development tools, integration, and showcasing what the tech can do.

If dev tools didn't matter, everyone would be using ICC over MSVC for the performance benefit.
 


On the HD 7790: it is pure and simple AMD streamlining its SKUs, replacing the 7750, 7770 and 7850 1GB with a solitary SKU on a more efficient TSMC process. Since it a) consumes less power than the 7850 and 7770, barely more than the non-auxiliary-powered 7750, and b) delivers 85% of the 1GB 7850's performance, it is not really hard to understand why. Sell a good product relative to your line at a good price intersect, and cut the cost of batch-producing silicon for three SKUs.

Since there is no way any AMD solution or step will ever appease you, maybe they should just go out of business; AMD's position, to some people, is one of continual bleakness. You say they're not trying; how is having a diversified product line and an actual marketing plan not trying? Nvidia and Intel, but in this case Nvidia, are in the position where they can bin excess silicon at lower cost and the losses will not be as hard felt, but by any principle of business, if you are selling at a loss, how is that sound? That is like our business selling components off at below-VAT pricing: sure, the first 6, maybe 12, months will be fine, but months 12-24 will see us carrying debts, forcing liquidation. Same principle: Nvidia priced its 660 family in the $250-350 bracket; selling that at $150-180 is a loss.


 