AMD CPU speculation... and expert conjecture

Page 723
Status
Not open for further replies.

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
I have explained my point about 4K/8K several times, and several of you continue ignoring what I said. I will not explain my point again, nor will I reply to your further misunderstandings of what I said, but I will note the following:

First, the 980 is faster at resolutions from 1080p up to 4K. Not sure why some people insist on denying that.

Second, thanks to its efficient architecture, the 980 can be overclocked at the factory while power is still kept under control. It is funny that some people pretend we cannot discuss benchmarks of the GB card. That would be as ridiculous as pretending one cannot discuss benchmarks of the FX-9590 used in your review, because that CPU is essentially an 8350 overclocked at the factory.

Third, it is also funny that some people keep fighting against Nvidia's emphasis on efficient cards, when the new cards of the AMD 300 series will precisely improve efficiency over the 200 series. What do you think replacing GDDR5 with HBM does but improve performance and reduce power?

Fourth, performance is not a linear function of cost. You can spend $100 million on increasing performance by 20%, yet need $400 million (not $200M) to increase it by 40%. AMD achieving 80% of the performance at less than 80% of the cost doesn't automatically deserve merit. Moreover, one-dimensional performance-per-cost charts don't account for other people's needs, such as reduced power consumption (electricity bill) or reduced noise/heat. There are people who want to pay extra for silent/cool hardware, for instance.
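As it happens, the two figures above fit a quadratic cost curve exactly; here is a toy sketch of that nonlinearity (the quadratic form and the constant are illustrative assumptions fitted to those two numbers, not real R&D data):

```python
# toy model of the nonlinearity: cost grows with the square of the
# performance gain. k=0.25 is fitted to the two figures quoted above
# ($100M for +20%, $400M for +40%) and is purely illustrative.
def cost_musd(gain_pct, k=0.25):
    """hypothetical development cost in $M for a given % performance gain."""
    return k * gain_pct ** 2

print(cost_musd(20))  # 100.0
print(cost_musd(40))  # 400.0  -- doubling the gain quadruples the cost
```

Under this toy curve, each extra percentage point of performance costs more than the last, which is the whole point being argued.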

Fifth, it is funny that some people are moving the discussion from performance to performance per cost, because nobody mentioned costs when the original 8K article was introduced into the discussion. The 8GB version of the 290X costs $500, for instance. I suppose cost is not an issue when playing at resolutions used by 0.00% of gamers.

Sixth, I find it very funny that some people think two 5.6 TFLOPS 8GB cards can be used for 8K gaming, but a 20 TFLOPS 64GB APU (like the one described in the APU silicon article) would be limited to 1080p. LOL!

Seventh, to all those who have ignored my point about 4K/8K, my question is as follows: what will the excuse be when the 300 series is released and suffers the fate predicted in the article linked?
 

con635

Honorable
Oct 3, 2013
644
0
11,010



I'd be careful on the PSU wattage for the 970/980; I'd go just as beefy for either brand. Maxwell's average draw is impressive, but look at the Toms review. I think AMD will be using similar tech soon, or at least they had a slide describing it. The 970 can pull 240W and the 980 300W. Toms are very good with power consumption tests imo.
http://www.tomshardware.co.uk/nvidia-geforce-gtx-980-970-maxwell,review-33038-11.html

 

truegenius

Distinguished
BANNED
it looks like nvidia's published maxwell tdp numbers are misleading by a huge margin, well that's not good
i was thinking of recommending a 970/980 with a cx430 psu, glad i didn't
looks like it's time to bring out the old discussion of "amd uses tdp numbers, intel/nvidia use sdp numbers" :whistle:
if amd is also going to use numbers like this then it will become hard to recommend a gpu without looking at reviews

btw, hey, look at the hd7950 numbers: the max gpgpu power consumption is 179w, which is under amd's specified 200w tdp, while gaming power consumption is just 146w (so that's why my 4-year-old 500w fsp smps is able to handle an overclocked hd7950 and 1090t)

a 64GB apu! is it all "on die"? how much will it cost us?
 
had to color-code some of the parts (red means red herring, geddit?). let's see:

that's false. no one said the gtx 980 isn't faster. however, your own posted link shows that the r9 290x 4GB, an older, cheaper (also more power-consuming) card, is faster in some benches. check the tpu article you linked earlier.
i only pointed out the straw man arguments, other fallacies and inconsistencies in your claims.

straw man argument, again. overclockability has nothing to do with the point i was making. i mentioned the gigabyte card's overclock to separate the ref. card's performance and other data from gigabyte's. if you're gonna stick with the gigabyte one, you should realize that it had even worse perf/price than the ref. gtx 980 (from your own posted link). i picked the ref. gtx 980 data to maintain consistency. amusingly, this bit is also a red herring, since maxwell gpus' efficiency and the gigabyte gtx 980 g1 have nothing to do with the original argument.

didn't ignore what you said:
"Thanks to using 8GB, AMD card is better on a resolution used by the zero percent of gamers, whereas the card is poor on resolutions used by gamers. I already suspected that when the 390X was released the marketing dept will emphasize how good is the new card at 4K or 8K."
there is no mention of price or efficiency there. perf/price came up because you cherry-picked a skewed relative performance advantage while disregarding undeniable perf/price differences. no one said the gtx 980 is not efficient or is less efficient than the r9 290x 4GB/8GB. please read the earlier posts. your own info shows that the r9 290x doesn't perform poorly; you didn't understand why TT ran those tests or why anyone would attempt 8K gaming.


the herring is even redder in this bit. in case you didn't know, the TT article mentioned why they chose an fx-9xxx-based test platform. and if you don't know why highly oc'ed cpus and other such parts are used in test benches, here's why: to remove various points of bottlenecking and measure the gfx card's performance only.

another straw man argument. perf/price only came up because you tried (and failed) to build an argument using perf/watt while disregarding the significant difference in perf/price, and the fact that you were arguing gamers' preference instead of data. and you've failed once again by attempting to use the r9 290x 8GB's price. the r9 290xs in cfx would cost $1000 (according to your posted price) whereas your picked gigabyte gtx 980 g1 gaming 4GB in sli would cost $1260 (price taken from the tpu link; toms mentions $550 apiece for the gtx 980). the base test platform stays constant, i.e. i am isolating the gfx part. so in TT's 8K tests, the r9 290x 8GBs have not just better overall performance but better perf/price too, according to your own posted information.

but you suppose right (for the first time in this ongoing argument): cost is moot here. cost wasn't discussed simply because right now a full 8K-capable config (incl. displays) would cost a lot more than the $1000-1500 that most d.i.y. pc builders spend. perf/price hardly matters for the kind of people who'd demand or attempt 8K gaming now. same with perf/watt. i already mentioned that only high-paying early adopters would spend for 8K gaming capability or benching. using the stupid blanket statement "playing at resolutions used by 0.00% of gamers" doesn't strengthen your claim though. that's as far as i can go discussing preferences.

as for r9 290x 8GB prices, i am looking at newegg's, and the cheapest one is selling for as low as $430 excl. shipping. the one in the TT article is at $460, $440 after m.i.r., free shipping. the gtx 980 starts in the $540-550 range and goes to $600+ (the out-of-stock kingpin version was $799). keep in mind that prices may vary at different places. and check your info before posting; in the worst case it might refute your own claim(s). :D

the competition/comparison isn't over yet. best to wait till nvidia and partners release a gtx 980 or higher-end GM200 gpu with 8GB or more vram, or a dual GM204/200 gfx card with 12-16GB vram, and have reviewers pit those against amd's counterparts. right now the available flagships are amd's r9 290x and nvidia's gtx 980. you failed to argue the data and succeeded only in posting more fallacies.

multiple edits: fixed typos and other errors
 


You can hardly call the 290 / 290X bad performers at 1080p when they can max pretty much every game out there at high frame rates.

They were the best cards on the market imo up until the release of the 970 / 980, and these new cards don't really change that. Yes, the 980 is more efficient (however nowhere *near* as efficient as nVidia would have you believe when you look at actual power usage). It gives a bit of a performance boost, but nothing that spectacular. This advantage diminishes as you go higher in resolution, and at the end of the day, buying a card like this you're not likely to be gaming solely at 1080p, as anything from an R9 280 or GTX 760 upwards will keep you fine for that. You buy a top end card for higher resolution screens (1440p, 1600p, 4k) or multi monitor setups using multiple cheap 1080p displays.

As the resolution goes up the 290 / 290X improve their position against the 980, and the 970 has proven problems at very high resolutions due to its bizarre memory config. The 290, despite being a cut down chip like the 970, has no such problems and is a much better bet if you intend on buying a higher resolution screen down the road.

I really cannot understand the current praise nVidia is getting; it's released one badly crippled card, and another rather under-specced (specifically in memory bandwidth) card that adds lots of nice 'theoretical' fps at a resolution that was already exceeded by the previous generation of cards (of which the 290 / 290X are members). I'd sooner take a 290 series or GTX 780 / 780Ti over either of these 'new' cards, as frankly I can see them ageing terribly.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


The first question is answered in the article. Note that the 64GB are only DRAM. Total memory per node (both DRAM+NVRAM) is 1TB.

The answer to the second is: "no more than if the memory stacks were added to a dGPU".
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


From 20% faster at 4K to 25% faster at 1080p, and 69-79% more efficient, as shown in the review given. This efficiency means Nvidia can increase power consumption and match the 390X in performance, as leaked benchmarks show.
 
One argument in favor of the "power consumption" thing is PSU price. Remember one of the most important components (if not *the* most important) is the PSU, so you know going from 550W-ish to 850W-ish is a big jump. That could eat the price difference between two 290Xs and two 980s. Now, that is assuming the 980/970 combo *is* as efficient as nVidia claims. I have a mixed bag of information here.

In any case... I'm sure Juan will say in 5 more years: "but no one uses Virtual Reality rooms in Ultra Reality resolution!" when 8K is mainstream, hahaha.

And just for the sake of argument and reading Juan's funny rhetoric: what about 1920x1080@144Hz? Are there 1900x1600@144Hz screens?

Cheers! :p
 

yes.
let's see what $200-300 of change gets us, given my bias towards seasonic psus (newegg prices):
SeaSonic Platinum-1200(SS-1200XP3) 1200W 80+ platinum at $250, $230 after rebate excl shipping.
SeaSonic Platinum-1000 1000W ATX12V / EPS12V 80 PLUS PLATINUM at $220 disregarding rebate.
assuming the savings go entirely towards psu purchase.

using toms numbers as source, a pc with a sapphire r9 290x 8GB has roughly a 333w power consumption delta over idle at gaming load. meanwhile, a ref. gtx 980 4GB uses 185w at gaming load. however, it is possible to get gm204 to eat much moar power if the load negates maxwell's boost optimizations. using this as the base test platform, and my newbie math:
http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977-2.html
assuming the platform running either card idles at 85-87w.
assuming the ref. gtx 980 uses 185w at gaming load, the system's consumption while gaming (bf4) would be roughly 85+(185*2) = 455w total.
assuming the sapphire r9 290x uses 333w at gaming load, 87+(333*2) = 753w total.

the lowest-priced 80+ gold/platinum psu (i am biased) for ~450w: rosewill capstone 550w, $70, $50 after rebate.
2x gtx 980 would go for as low as $1100 to as high as $1600 (evga kingpin).

2x sapphire/msi r9 290x 8GB would go for roughly $900 (the actual price is lower than that). you'll have $200 spare for the psu. maybe a SeaSonic Platinum SS-860XP2 860W at $170? and 30 bucks for a corsair 100R case. i am kidding, make that an antec solo. i am kidding again, the price is really $175 sans rebate, so you only get enough for a tabletop fan to cool those space heaters.

p.s. do not take my calculations as accurate. i took a lot of liberty adding and rounding up.
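for anyone who wants to poke at the numbers, the back-of-envelope math above can be sketched as a quick script. it uses the gaming-load deltas quoted above; the single 86w idle figure (midway between the 85-87w quoted) and the ~30% psu headroom factor are my own assumptions, so totals differ by a watt or so from the post's numbers:

```python
# rough sli system power estimate from the per-card gaming deltas above.
# the 86w idle and the 30% headroom factor are assumptions, not measurements.
IDLE_W = 86  # platform idle, roughly 85-87w in the toms review
CARDS = {
    "ref gtx 980 4GB": 185,        # gaming-load delta per card, watts
    "sapphire r9 290x 8GB": 333,   # gaming-load delta per card, watts
}

def sli_system_load(per_card_w, n_cards=2, idle_w=IDLE_W):
    """total system draw with n cards at gaming load."""
    return idle_w + per_card_w * n_cards

for name, watts in CARDS.items():
    total = sli_system_load(watts)
    # size the psu with ~30% headroom, rounded up to the next 50w tier
    psu = -(-int(total * 1.3) // 50) * 50
    print(f"{name}: ~{total}w system load, ~{psu}w psu suggested")
```

same caveat as above: treat the output as ballpark only, since boost behavior and the rest of the build shift these numbers.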

edit: fixed typos.
but not the math mistakes!
 

Embra

Distinguished


Most serious gamers, whom these cards are meant for, will have a good-quality 750-850W PSU.
If one were to upgrade their GPU every 1-2 years, it would be silly not to invest in a good PSU.

 


So, a "reliable" (I filtered 80 Plus Gold) 550W PSU is around $100 USD and an 850W PSU is $180 USD. That's an $80 delta for 300 extra watts. I think the delta is not enough to eclipse the price difference between a 980 and a 290X, even less on dual cards. It is important to take that into the equation, but it's not a big enough amount to justify the price delta.
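that delta argument can be sanity-checked in a couple of lines; the psu prices are the rough figures above, and the card street prices ($550 for a 980 per toms, $430 for the cheapest 290x 8GB mentioned earlier) are assumptions pulled from this thread, not authoritative quotes:

```python
# sanity check: does the psu upgrade cost eclipse the gpu price gap?
# prices are rough thread figures, for illustration only.
psu_550w_gold = 100  # usd, "reliable" 550w 80 plus gold
psu_850w_gold = 180  # usd, 850w 80 plus gold
psu_delta = psu_850w_gold - psu_550w_gold  # extra cost for ~300 extra watts

gtx_980 = 550   # usd apiece, toms' quoted price
r9_290x = 430   # usd, cheapest newegg 8GB listing mentioned earlier
card_delta_dual = 2 * (gtx_980 - r9_290x)  # price gap for a dual-card build

print(f"psu upgrade delta: ${psu_delta}, dual-card price gap: ${card_delta_dual}")
# the psu delta stays well under the card gap, so it doesn't
# eclipse the 980-vs-290X price difference even with two cards
```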



There's that as well. More to your point, most gamers, unless on a budget, usually say "yes, I will definitely upgrade to a second VGA next year, so I need a beefy PSU!" and then we never do get the second one, and instead get a direct upgrade of the current one we have, hahaha.

Cheers! :p
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Tomshardware must use a different methodology, but the results are very similar. Techpowerup gives an efficiency gap of about 75%; Toms gives about 85%. The verdict of the Toms review is also interesting to quote:

If you've skipped to the last page to see our conclusion, I'm not going to waste your time: both the £260 GeForce GTX 970 and £450 GeForce GTX 980 represent tremendous, earth shattering value from a price/performance perspective.
[...]
The GeForce GTX 980 not only sets a high-water mark for single GPU performance, but at £450 it does the trick for less than the GeForce GTX 780 Ti costs at e-tail. That's a mere £50 more than the Radeon R9 290X, by the way.
[...]
Nvidia, you had us at the price/performance ratio. But you didn't stop there: the power draw is very low when you consider the performance these cards deliver in return.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
Unsurprising news:

Back when it was announced, AMD were very bullish on the future of their Mantle API. They were confident it could help unleash the full power of gaming PCs, and they thought that developers would adopt it because of the advantages it offered.

But today, in a statement on their website, AMD explained that their plans for Mantle are changing rather drastically, and even recommended developers focus their efforts on other APIs, like DirectX 12.

http://www.pcgamesn.com/is-amd-walking-away-from-mantle

With Mantle being locked to AMD GCN hardware and Windows, with Apple rejecting it, with Valve rejecting it, with many announced games being canceled/delayed, with SDK in perennial beta stage, with DX12 being faster than Mantle, and with OGLNext coming, Mantle couldn't have any future...
 


Yet they haven't... not even close. Remember that the GTX 980 has a pitiful 256-bit memory bus. That is a significant part of where that efficiency comes from as well. There is very little room for them to push the memory bandwidth higher given they are already running their RAM at previously unheard-of speeds.

The 980 is going to hit a bottleneck very quickly. Will an overclocked 980 be able to match a stock 390X? Quite possibly. Will an overclocked 980 match an overclocked 390X, given its advantages (based on the same leaked info you refer to)? Imo, no chance.

If nVidia want to best the 390X they're going to have to release a much bigger GPU to match it. I'm not saying they can't do it, but I don't see a 980 managing it if the leaked specifications prove to be true. To achieve the efficiency of the 980, nVidia have had to sacrifice too much for it to scale up that far; it's already running at very high clocks, with very high memory clocks to compensate for the lack of fundamental hardware. As I said, I don't see the 980 faring that well moving into the future; like the 680, it's more of a mid range card, and as the software actually pushes it properly this will become more apparent. In contrast, cards like the 7970 and the 290X will age better (the 7970 was bested by the 680 when it was released, only for that position to be reversed as time moved on; if anything, I think AMD aren't that clever at getting good early drivers out for their cards).
 

jdwii

Splendid


I actually tested this myself, and I always watch the power consumption. I wonder if it was some driver issue they had; my card never spikes like that.

Don't even compare the 980 to a 290X; compare it to a 970 (which is $80 more than a 290X).
The 970 and 290X perform similarly.
 


It would be really sad if they end up saying "but hey! MANTLE lives on in spirit inside DX12 and OGLN!".

There's nothing AMD can really do about it, I guess. There's not much extra MANTLE can give AMD to spark that extra edge they need, right? Anyone know? Well, they said at GDC they'll let us know more.

Edit: I found this: http://www.phoronix.com/scan.php?page=news_item&px=AMD-Mantle-Docs-Coming

Looks like AMD is going to give more information on MANTLE instead of sweeping it under the rug.



Holy Cow... Take a look at the Tablet / Phone market. It's pretty much equal to PC now!

Cheers!

EDIT: Added extra quote XD
EDIT2: Added link.
 

when i started looking into the data, i wasn't expecting to fit a high end psu within the price difference from gtx 980 sli, because those cards use so much less power than the hawaii cards. i was halfway sure i could fit a 750w, 80+ bronze psu. then a few things happened: i could fit in an 80+ platinum rated 850w psu, and it was indeed possible to run gtx 980 sli on a good quality 550w psu. here are some things to consider:
■ in multigpu configs any price delta will double, so will power consumption delta.
■ i've already mentioned that these amd cards are flagships, but they're last generation. amd has dropped prices since launch. by how much? r9 290x 4GB gfx cards launched at $700 (i remember mocking them). comparing two newly launched counterparts would be moot, as both flagships would be much closer in price. the gtx 980s are newer and haven't come down in price (must be selling pretty well :)). comparing r9 290x 4GB and gtx 980 at launch prices gives the 980s an overwhelming advantage ($1400+ vs $1100+) in terms of price. viva competition!
■ we're discussing gaming at 4K and 8K using hardware available today. this is where the latest gtx 980 4GB does not have a clear and decisive performance advantage in every single gaming benchmark.
■ power supplies also came down in price.

thanks to amd's price drops, the price delta turned out to be so big that one can actually afford two of these plus a higher rated, beefier psu. i assumed that the r9 290x 8GB's cost would be low enough to justify choosing them over the gtx 980. :) the psu choice isn't that important, since one can use a 550w 80+ gold psu as easily as a 900w 80+ platinum one. the point was to see if the price delta would allow adding a whole psu. to my surprise it did, despite purposefully biasing towards pricier psus and a lowered price delta. i don't think this will last long though (so you can darn sure expect a troll to come back and stupidly gloat :pt1cable:); as new launches get closer, the older cards will typically gain price (the 290x is $10 higher this month) before phasing out.

edit: if you seriously consider a full build, other parts will influence the total as much as the psu.


well yeah, the 980 is one price tier above the r9 290x after all. but check out the points i mentioned above: the 970 is less powerful than the 980, so any advantage the r9 290x has over the gtx 980 would carry over. it'd certainly be interesting if reviewers tested 970s at 4K and 8K. 970s might have a much better chance to undercut the r9 290xs.

edit:
amd is changing direction about mantle
http://www.anandtech.com/show/9036/amd-lays-out-future-of-mantle-changing-direction-in-face-of-dx12-and-glnext
http://www.pcworld.com/article/2891672/amds-mantle-10-is-dead-long-live-directx.html
resistance is (was) futile!

edit2:
Confirmed: Vulkan Is The Next-Gen Graphics API
http://www.phoronix.com/scan.php?page=news_item&px=Khronos-Vulkan-Graphics-API
psyche!
 

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860
first off, it's only 20% when you're comparing a factory-oc 980 to a stock 290x, but juan wants to cry because no one wants to listen to him.

(chart: 04-Performance.png)


but hey, let's ignore everything else and focus on one single aspect to win an argument. ERMAGO, electricity costs $200 per watt, it takes 1 hr to cover the extra cost of the 980. yes, NVidia did some impressive work on the power consumption of the 900 series; that doesn't mean 100% of any discussion should automatically center on that one aspect. after all, the 290x is a 2013 card and the 980 is 2014, just shy of 1 year apart to be specific. the simultaneous amd/NVidia releases are going to be absent for a while, with the 390x coming soon and nothing from NVidia till next year.
 

Reepca

Honorable
Dec 5, 2012
156
0
10,680


There's only so much to talk/read about, and this is a LONG thread.

If you further narrow it to AMD CPU speculation and conjecture there's even less that is relevant.

There just isn't really much to say about potential AMD CPUs that sticks to reason. And when there is anything to say, it typically follows some new information posted somewhere - then some genuine speculation happens, which tends to devolve into argument, which devolves into insults, and so on... the actual "meaningful conversation" stage of a discussion's life cycle is relatively short.

 
AMD Releases Mantle Programming Guide and Reference API
http://www.techpowerup.com/210328/amd-releases-mantle-programming-guide-and-reference-api.html

moar vulkan
http://www.anandtech.com/show/9038/next-generation-opengl-becomes-vulkan-additional-details-released
open cl 2.1
http://www.anandtech.com/show/9039/khronos-announces-opencl-21-c-comes-to-opencl

i heard you liek conspiracy theories
http://semiaccurate.com/2015/03/02/behind-fake-qualcomm-snapdragon-810-overheating-rumors/
i told ya long ago (in internet time) that arm chip competitors wouldn't play nice, back when deluded people were singing hARMony.
another prediction confirmed!
 