AMD CPU speculation... and expert conjecture

Page 702
Or less than about $800 (mustn't forget they still have the fastest card out there, and it's half the price of nVidia's equivalent) :)



 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


What is so special about that? It is the same turbo frequency as one month ago.


http://www.tomshardware.co.uk/forum/352312-28-steamroller-speculation-expert-conjecture/page-350#15089095

Let me draw your attention to this old post of mine. I have bolded the parts relevant to the turbos:


http://www.tomshardware.co.uk/forum/352312-28-steamroller-speculation-expert-conjecture/page-351#15105195
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


An ES is not even guaranteed to be the same stepping as the release model. Usually with AMD part numbers the trailing characters are the stepping, and production models have been coming out with letters at the end, IIRC. The stepping code usually starts with numbers, then moves to A, and eventually on to Z.

If the sisoft benches are legitimate, I'd imagine they're really early engineering samples from China. China has a way of leaking things.
 

jdwii

Splendid


What, the 970 has been out for 4 months now? A $330 card that uses half the power of a 290X and performs similarly.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Top Carrizo is 35W.
 

logainofhades

Titan
Moderator


I own an AMD card, but it is in my file server. My HD 7970 was having issues, and I couldn't turn down a GTX 770 for $120. :D
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


GPU Scaling

For four cores, DX12 provides virtually the same performance increase as Mantle (95%).
For two cores, DX12 can bring up to 14% more performance than Mantle on AMD GPUs!!!
980+DX12 is waaaaaay faster than 290X+DX12 or 290X+Mantle

CPU Scaling

As expected, no scaling beyond 4 cores.

Conclusion

Mantle is no longer relevant. It would have been a nice approach if it had been developed earlier, had been much faster than the competition, and had not been limited to Windows.
 

8350rocks

Distinguished


Mantle is completely relevant. DX12 is still not available off the MS reservation...so how could something like Mantle be irrelevant at all?
 

jdwii

Splendid


WHAT, that's crazy; no wonder Mantle support dropped. They probably saw DX12 coming. Then again, let's wait for real game performance numbers from independent reviewers.
 
You guys realize DX12 is *based* on what MANTLE did, right? For MANTLE to still keep a constant advantage over DX12 shows it is a good API. Plus, the Demo being used was created FOR MANTLE.

So yeah, let's wait for more demos; especially something from 3DMark.

Cheers!
 


One tidbit:

Having effectively ruled out the need for 6 core CPUs for Star Swarm, let’s take a look at a breakdown across all of our cards for performance with 2 and 4 cores. What we find is that Star Swarm and DirectX 12 are so efficient that only our most powerful card, the GTX 980, finds itself CPU-bound with just 2 cores. For the AMD cards and other NVIDIA cards we can get GPU bound with the equivalent of an Intel Core i3 processor, showcasing just how effective DirectX 12’s improved batch submission process can be. In fact it’s so efficient that Oxide is running both batch submission and a complete AI simulation over just 2 cores.

Oh boy, the i3 is going to look SOOOOO good with DX12. AMD is going to regret Mantle if that's the case, since they competed themselves out of the low-end CPU race as a result.
 

con635

Honorable
Oct 3, 2013
644
0
11,010
Why did they not test any AMD CPUs? I don't think so, gamerk; Intel CPUs were already brute-forcing DX11. It's AMD that is going to look a hell of a lot better with DX12: 860K vs i3 and 8320 vs i5. Also, can we now call the GTX 980 a space heater?
[Image: 71452.png — system power consumption chart]

 

8350rocks

Distinguished


LOL @ GTX980 is a "power sipping", "super efficient", GPU...so much better than 290X...."Think of your electric bill!? You will save thousands!!!"

People drink too much Kool-Aid from NVidia and Intel; honestly, the performance gap is not as big as many would have you think.

*inb4juanrgawithmarketingslidesandpropagandaaboutnvidia.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I explained why, in my opinion, it is no longer relevant. Read it.



AMD's Huddy promised us lots of new Mantle games and a final API before the end of 2014. It is February 2015, Mantle is still in beta, and there is no word on the new games. I agree that it looks as if Mantle support dropped abruptly.

Yes, we must wait for more reviews and DX12 games and all that, but the first results put Mantle in bad shape.



I see it differently. Mantle is a lower-level API than DX12. Being closer to the metal, Mantle should be much faster than DX12, and yet we see DX12 even beating Mantle on GCN hardware!!! This is very weird. And the demo originally being created for Mantle adds to the general disappointment, because it runs neck and neck and in some cases runs significantly better on top of another API.

Honestly, I was expecting DX12 to land somewhere between DX11 and Mantle.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


People, in fact the entire industry, have been praising Maxwell's efficiency. But you are confusing efficiency with power consumption.

(performance) = (efficiency) * (power consumption)

People don't care about the 290X consuming more power than the 980; they care about the 290X consuming more power while delivering only a fraction of the performance of the 980.

[Image: 71450.png — Star Swarm GPU performance chart]


The 980 is 56% faster on DX12 and 222% faster on DX11, while consuming less power thanks to an efficient architecture.
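As a rough sketch of that perf-per-watt point: the 56% figure is the DX12 lead quoted above, the 184 W / 294 W gaming power numbers are the TechPowerUp figures quoted later in the thread, and the 100 fps baseline is just a placeholder, since only the ratios matter.

```python
# Rough perf-per-watt comparison, following performance = efficiency * power,
# so efficiency = performance / power.
baseline_fps = 100.0           # placeholder performance for the 290X; only ratios matter
power_290x = 294.0             # W while gaming (TechPowerUp figure quoted later in the thread)
power_980 = 184.0              # W while gaming

fps_290x = baseline_fps
fps_980 = baseline_fps * 1.56  # "56% faster on DX12"

eff_290x = fps_290x / power_290x
eff_980 = fps_980 / power_980

print(f"290X: {eff_290x:.2f} fps/W")
print(f"980:  {eff_980:.2f} fps/W ({eff_980 / eff_290x:.2f}x the perf per watt)")
```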
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780


Sometimes you sound as if you're trying to sell us AMD hardware.
 


The GTX 980 is so efficient it only uses 40W at full load; the rest is the CPU, that's why it's half the power use of the 290X. In Mantle mode, it actually creates power.
 

jdwii

Splendid


http://s23.postimg.org/6a0r4bcff/IMG_20150206_200815.jpg

That is a picture of my rig doing basic things such as Chrome and typing this to you. When I had the 8350+770 I used 100 watts more doing the same thing. It's not about saving power, it's about efficiency; I like to have efficient parts just to have them. Sorry, but according to TechPowerUp the card, a 980 that is, uses 190 watts or so. If AMD makes a part that is that great I will say it's bad@ss, and I'll buy it too. I'm not an AMD hater or anything like that, I want them to come back so badly, but I'm not going to buy a card that uses twice the power consumption just to have them. Just like I'm not going to buy a CPU that has inconsistent performance and keep it for long.
 

jdwii

Splendid


That is actually quite interesting, and it's outside of what other sites show, but I'm not claiming it's fake. I'd like to test that myself, actually. It seems like it's using the same amount of power, which makes this efficiency stuff less noticeable. Let's keep in mind we are cherry-picking results, but Tom's Hardware also saw spikes or something as well, and I don't think they ever really went into it after that like they were going to.

http://tpucdn.com/reviews/MSI/GTX_980_Gaming/images/power_maximum.gif

In this test the 980 uses 190 watts while the 290X uses 328 watts. During gameplay the 980 uses 184 watts while the 290X uses 294.
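Running the ratios on those numbers as a quick sketch (the figures are the TechPowerUp ones just quoted):

```python
# Ratio and delta check on the TechPowerUp card-only power figures quoted above.
readings = {
    "maximum": (190, 328),   # W: (GTX 980, R9 290X)
    "gaming":  (184, 294),
}

for scenario, (w_980, w_290x) in readings.items():
    ratio = w_980 / w_290x
    delta = w_290x - w_980
    print(f"{scenario}: 980 draws {ratio:.0%} of the 290X's power ({delta} W less)")
```

By those figures the 980 draws roughly 58-63% of the 290X's power, rather than half.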
 

jdwii

Splendid


Absolutely not; future products are always coming, and it's a bit sad to always say such things. Even more so when both are built on the same 28nm process.
 

Max power means next to nothing. It's only looking at the spikes, and it really isn't giving you any information. Anyway, Maxwell's power saving comes from good power gating, and it can generally save a decent amount of power in average gaming situations, but when the GPU is actually stressed, like in the Star Swarm demo, it will use just as much as any 250W GPU. I would say the 980 uses about 70W less than the 290X when gaming, based on reviews I've seen. Sometimes more, sometimes less. It's not that much of a difference really, unless power is extremely expensive for you.
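To put that "unless power is extremely expensive to you" point in numbers, a back-of-envelope sketch: the 70 W delta is the figure from above, while the hours per day and the price per kWh are assumptions picked purely for illustration.

```python
# Back-of-envelope yearly cost of a ~70 W power difference while gaming.
delta_watts = 70         # W, 980 vs 290X while gaming (figure from the post above)
hours_per_day = 3        # assumed gaming hours per day
price_per_kwh = 0.15     # assumed electricity price, $/kWh

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year, roughly ${cost_per_year:.2f}/year")
```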
 

jdwii

Splendid


I always joke with my friend about tech and say we live in a capitalist country: what matters most is what's out right now. Now I hear AMD might be doing stacked memory, and even people who actually know about things seem to hint at this, cough "Tek Syndicate" cough cough, but I really want a 200 watt card, not a 300 watt one. It would be nice to have my whole rig continue to use 300 watts or less when gaming. Funny, the PS4 uses 140 watts or so when gaming.

Like I said, I expect I'm in the small minority; I just like efficient stuff. I just replaced my 100 watt monitor with a 22 watt one that is superior in quality.

I'll say this for the 101st time: AMD needs efficient GPUs (more so than Nvidia or Intel) for their APU series. Intel isn't joking around with this; I used their HD 4600 graphics and it is around Llano level, something I never thought was possible. I was going to make videos, but 720p medium on 2012 games and earlier was very possible on it. I read comments all the time and most claim the Intel graphics are good enough now. AMD honestly has nothing else to give to this market if Intel beats them in this.

Edit: when I say this market I mean mainstream products, not actually the game market.

Edit again: when you say bad DirectX 11 optimization, gamer hinted at this 100 pages ago (or more). They might care more about Mantle in these games and possibly optimize more for that for marketing purposes. It's something I'd suspect Intel or Nvidia would do as well.
 

truegenius

Distinguished
BANNED


It's more like a Star Swarm review, and a test of which vendor supports it better.
This bench is not showing fair results.

[Image: 71450.png — Star Swarm GPU performance chart]

So, is the 260X equivalent to the 290X in terms of performance? Simply no, but the pic above shows them as equal in DX11.
The 290X is about 3x as powerful as the 260X, yet the bench above shows it as 2x in DX12 and equal in DX11.
So either they really are equal, or the 290X is not getting utilized to the max (or the bench is not utilizing it fully).
[Image: 71452.png — system power consumption chart]

But we know the former is not true, so it must be the latter, and looking at the power figures we can see that the latter is the cause (or AMD's 290W "power sipper" is actually a 180W GPU).
If we assume the 980 is efficient and consuming at most its 165W, that leaves about 106W for the CPU and the rest of the system.
That in turn means the 290X is only consuming about 179W (when it should be using 290W at full performance), so it is not working flat out.
So this preview wasn't fair enough.
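Spelling out that subtraction as a quick sketch: the 165 W figure is the 980's rated TDP, while the two system-level readings (~271 W and ~285 W) are reconstructed from the arithmetic in the post above (165 + 106 and 179 + 106), not read directly off the chart, so treat them as assumptions.

```python
# Sketch of the subtraction above: assume the 980 stays within its 165 W rating,
# attribute the rest of its system reading to CPU + platform, then subtract that
# platform share from the 290X system reading.
system_power_980 = 271    # W, assumed full-system reading with the GTX 980
system_power_290x = 285   # W, assumed full-system reading with the R9 290X
assumed_gpu_980 = 165     # W, GTX 980 rated TDP

platform_power = system_power_980 - assumed_gpu_980     # ~106 W for CPU + rest
implied_gpu_290x = system_power_290x - platform_power   # ~179 W

print(f"Platform (CPU + rest): ~{platform_power} W")
print(f"Implied 290X draw: ~{implied_gpu_290x} W, vs. the ~290 W it should pull flat out")
```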
 