AMD CPU speculation... and expert conjecture

Page 629

jdwii

Splendid


I know this is mean, but how can AMD improve their efficiency 200% in one generation at 28nm? Even last gen AMD was behind 30%, which is why I went with the 770 over the 280X. The issue is they haven't done anything to their design for efficiency since the 7970 GHz Edition came out. That was two and a half years ago.
 


The 280X consumed pretty much the same amount of power as the 770. It's all within 10 W of each other.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
347 W is not 38% more power than 214 W, but 62% more!

But those numbers have been cherry-picked, because they compare an overclocked 280X against a stock 770. It is unfair!

And that is only power consumption; we are discussing efficiency. Efficiency numbers are found here:

http://www.techpowerup.com/reviews/Gigabyte/R9_280X_OC/27.html

The stock 770 was only ~8% more efficient than the stock 280X. AMD's efficiency was not "behind 30%" as some pretend.
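For the record, here is the arithmetic spelled out as a quick Python sketch (the wattages are the figures quoted above; the guess about where the 38% figure comes from is mine):

    # Sanity check of the power-draw arithmetic quoted in this thread.
    oc_280x = 347   # W, overclocked 280X (figure quoted above)
    gtx_770 = 214   # W, stock GTX 770 (figure quoted above)

    extra = (oc_280x - gtx_770) / gtx_770           # relative to the 770
    print(f"{extra:.0%} more power than the 770")   # ~62%

    # The ~38% figure most likely comes from dividing by the larger number,
    # i.e. how much *less* the 770 draws relative to the OC 280X:
    less = (oc_280x - gtx_770) / oc_280x
    print(f"770 draws {less:.0%} less than the OC 280X")  # ~38%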
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780
The GTX 770 is awesome, but I would pick the 280X: they offer essentially the same performance, and since I have a weak CPU I would benefit from Mantle in the long run.

The 280X JDWII compared is a heavily overclocked version; the stock 280X draws just 20-30 W more than the stock 770.

AMD is preparing its R300 series, and I bet it will stomp the 980, but I also bet Nvidia is preparing a 980 Ti and a 980-class Titan for when that happens. AMD should be aware of this.
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780


Where I live, electricity is very expensive... I pay around 35 USD every two months, but 20-30 extra watts is not a problem; that's basically what an energy-efficient light bulb draws. Also, the 280X is aimed at high-end gamers, and I don't think any of them would care much about 30 extra watts. If it were a laptop, then hell yeah, but not on a desktop. If it were maybe 80 W or higher, then it would be a BIG issue.
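As a rough illustration of how little 30 extra watts costs on a desktop, here is a back-of-the-envelope sketch (the gaming hours and electricity price are assumed for illustration, not figures from this thread):

    # Rough cost of 30 extra watts under load (all inputs are assumptions).
    extra_watts = 30        # W, extra draw under load
    hours_per_day = 4       # assumed gaming time per day
    price_per_kwh = 0.15    # assumed electricity price, USD/kWh

    kwh_per_two_months = extra_watts / 1000 * hours_per_day * 61
    cost = kwh_per_two_months * price_per_kwh
    print(f"~{kwh_per_two_months:.1f} kWh, about ${cost:.2f} per two-month bill")
    # ~7.3 kWh, roughly one dollar on a two-month bill

Even if you double those assumptions, it stays a small fraction of a ~35 USD two-month bill.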
 

jdwii

Splendid


http://tpucdn.com/reviews/MSI/R9_280X_Gaming_6_GB/images/power_maximum.gif
I never go by averages, just the max and idle when it comes to power consumption. And perhaps video playback.
Edit: I now see I did mess up the math, but it's still pretty inefficient. No one should stick up for lesser parts; doing so makes you an enabler.
See what I mean? He overlooks the bad and favors the good when it comes to AMD:
"FurMark on the other hand, being the TDP buster that it is, paints a different picture of the situation. The 280X can generate and sustain a much higher power workload than the comparable GTX 770, and still more yet than the original 7970. FurMark isn’t a game and that’s why we primarily use it as a diagnostic tool as opposed to a real world test, but it does lend credit to the fact that when pushed to its limits 280X is still a high TDP part."

Actually my PC uses 580 W when pushing Prime95 and FurMark at the same time; getting a 280X would have pushed my aging 650 W PSU too far. Even now I need to buy a new PSU. But I will be sticking to efficient parts in the future.
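To put the PSU concern in numbers, a quick sketch (580 W and 650 W are the figures from the post above; the extra worst-case draw a 280X might add is purely an assumption, based on the FurMark comment quoted earlier):

    # PSU headroom check: 580 W and 650 W come from the post above;
    # the ~80 W worst-case delta for a 280X is an illustrative assumption.
    measured_load = 580       # W, Prime95 + FurMark with the current card
    psu_rating = 650          # W
    assumed_extra_280x = 80   # W, assumed worst-case additional draw

    print(f"Current headroom: {psu_rating - measured_load} W "
          f"({measured_load / psu_rating:.0%} of the rating)")
    with_280x = measured_load + assumed_extra_280x
    print(f"With a 280X: ~{with_280x} W ({with_280x / psu_rating:.0%} of the rating)")
    # 580/650 is already ~89%; an extra ~80 W in a stress test would exceed
    # the rating, so the worry about an aging 650 W unit is understandable.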
 

jdwii

Splendid
I'm happy to have no bias for AMD, Nvidia, or Intel. I will buy whatever is the best performance per dollar, as long as it's not too crazy on power consumption, and I will never let anyone claim a part is better than this or that without proof, or with fallacious arguments (arguing from small numbers is the common one here).
If someone recommends an FX-8350 for gaming over an unlocked Haswell i5 or a higher-clocked i5 (3.6 GHz+), I will say no and point to benchmarks, even in next-gen games such as Watch Dogs, unless the budget is tight. If someone claims a 280X is a better deal than a 970, I would once again tell them no. And if someone claims an i3 is better for gaming than an FX-8320, I will once again show them the majority of next-gen benchmarks.

Never have a bias for any company; they do not care about you, and yes, I know some of them are more "evil" than others. Tom's Hardware is about giving new people advice and helping them put together great builds. That is why I'm on here, really, and of course I'm in this section to hear about new things I have never heard of, or expert opinions on things.
 
Well, the R9 390X has something very interesting going for it: the new compression format. I have a lot of faith in that making a noticeable difference once the drivers have proper support and the actual card shows up. They went from 384-bit to 256-bit with the R9 285? I mean, for all the crappy naming conventions, the ballpark they hit thanks to that is very encouraging.

So, in short, power efficiency might see a good improvement thanks to the bus-width downsize, just like the 970/980 managed on the same 28nm. They've done it before as well: from the 5800 to the 6900 series the performance jump was very good, IIRC.
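As a rough back-of-the-envelope on that bus-width point (a minimal Python sketch; the memory clocks are the commonly listed reference specs, quoted from memory, so treat them as approximate):

    # Raw memory bandwidth = (bus width / 8) bytes * effective memory clock.
    def bandwidth_gbs(bus_bits, effective_gbps):
        return bus_bits / 8 * effective_gbps  # GB/s

    tahiti_280x = bandwidth_gbs(384, 6.0)   # R9 280X: 384-bit @ ~6 Gbps -> ~288 GB/s
    tonga_285   = bandwidth_gbs(256, 5.5)   # R9 285:  256-bit @ ~5.5 Gbps -> ~176 GB/s
    print(tahiti_280x, tonga_285)
    # The 285 lands in the same performance ballpark despite roughly 40% less
    # raw bandwidth, which is why the color compression matters so much.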

Anyway, my 7970 is still going strong, no need for a replacement just yet. I wonder when AMD is going to say something about the unified platform. 8350rocks, do you know when the NDA lifts? Or is that under NDA as well? (Hey, I thought you liked NDAs, so I put an NDA on your NDA! lol)

Cheers!
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Yes, 20-30 W at those levels is pretty insignificant.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


WCCFTECH says that Peregrine Falcon only supports dual-channel DDR3, but it also supports dual-channel DDR4 and LPDDR3. I didn't check whether there are more mistakes in their piece.
 

jdwii

Splendid
http://www.kitguru.net/components/graphic-cards/anton-shilov/amd-and-synopsys-to-co-design-14nm-10nm-apu-gpu-products/

"AMD to use Synopsys IP to design 14nm/16nm, 10nm APU, GPU products"

http://wccftech.com/amd-announcing-generation-product-25th-septemeber-runs-pills-required-marketing-campaign/

Perhaps a 970 killer is coming already; that would be amazing to see.
 

jdwii

Splendid


Oh, it was you; that's why I missed it.
 

anxiousinfusion

Distinguished
Jul 1, 2011
1,035
0
19,360
Nvidia has, hilariously, done what we never thought they would do and is supporting the VESA standard for adaptive sync. http://wccftech.com/nvidia-promises-support-freesync/ I can only think of all the posts made here along the lines of "that will never happen".
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Excellent news!
 

This may not be related to wecopycrediblefactsoftech, but when parrots parrot credible sources, they turn out to be right. Except... they take the credit for themselves, and often stoop to slandering the real source whenever the source doesn't agree with the parrots' claims/biases/delusions, etc. You should be okay as long as you know that the real source was someone or somewhere else who did the due diligence, and the parrot is simply taking credit for someone else's hard work.
 

colinp

Honorable
Jun 27, 2012
217
0
10,680


I see what you did there ;)
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780


If GTA5 supports Mantle, then I see Nvidia also supporting it; that game is just too huge.
 

8350rocks

Distinguished


Not yet, that is about all I can say...
;)

:na:
 
Status
Not open for further replies.