AMD CPU speculation... and expert conjecture


truegenius

Distinguished
BANNED


Because anything above 7 $ucks and looks horribly ugly, ugliness level over 9000, reaching 10k. Who uses plastic sheets in their windows? In the current pics/videos of 10 I see that MS didn't change this plastic feel at all (I may try the developer preview to actually feel its ugliness).
Support from MS? They ditched DX12 support for 8, so what else can you expect from them?
At this level of ugliness I won't take it even for free. Many games are still using DX9, and GTA5 is (probably) using DX10, so there's no reason to go for DX12 now and thus no reason to upgrade. Maybe after 2-3 years, but for now even paid 7 is still much better than free 10.
 

jdwii

Splendid


But we both know 12 is coming and it will support many games in the future. The past means nothing, it's only the future, genius. :) Let's hold off on the DX12 statements until this time next year. If I had more money I'd bet money with you guys.
 


Notice I didn't say anything about theft.

It's extremely common for one brand of a product to add popular features from another brand. Graphical mouse support is a popular feature in practically every application in the world. Each product vendor didn't "invent" mouse support, nor did they each "invent" a graphical user interface; they are all supporting a popular feature created a long a$$ time ago. This is no different. AMD Mantle had support for certain features, and those features were received positively by software vendors. Microsoft decided to support those features in their own implementation of a hardware graphics API, and OpenGL has also decided to support these features.

The line of "well we have no proof ..." would hold some weight if the timeline didn't blatantly suggest otherwise and we hadn't had the exact conversation of "well, we hope AMD's Mantle pushes Microsoft to support similar features in DirectX". It did and MS did, which is actually a really good thing. I remember having this discussion with Gamer about standards wars and how it's good to occasionally have one so that newer features can be tossed around and stagnation is prevented. Eventually different entities adopt good features from other entities until you're left with one standing that ends up having what most people want/need.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


You are committing a logical fallacy (post hoc ergo propter hoc): B followed A, thus A must be the cause of B. But reality is more complex than that, and this is the reason why in the sciences we have much more rigorous methodologies for determining whether A is the cause of B, beyond a simple temporal sequence.

jdwii is entirely right. We can suspect that DX12 is a sub-product of AMD Mantle, we can suspect that AMD and Microsoft have collaborated on DX12, we can suspect that DX12 is based in part on Mantle, but his point is that we have no strong data to make final, rigorous statements beyond personal suspicion.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Ugliness is a subjective term. To me W7 is terribly ugly!

That said, if ugliness were the problem, M$ would devote 15 minutes to changing the decoration of W10 instead of giving it away for free and losing millions. The reason M$ is giving W10 away for free is to try to recover market share from Linux and Unix (Apple).
 


Standards wars are only good when there's only one left standing. Which in this case, it looks like we're getting. There's no reason to support a vendor specific API if the generalized one is almost as good.

It should also be noted for historical context: Both NVIDIA and ATI came up with Pixel Shaders on their own. They were unified in the DX spec as SM 2.0 back in DX9. So this wouldn't be the first time MSFT added something the vendors were using into the mainline API.
 

truegenius

Distinguished
BANNED

The future (DX12) will take time to show a significant presence, maybe 2-3 years. DX12 looks great performance-wise, but its being W10-exclusive is not good. I hope to see DX12 mods for 7 (or MS forced to support DX12 on 7) :pt1cable: so that we can drag W7 along till 2020 :whistle:

Ugliness is subjective, but many people like Aero, and MS ditched it like it was an unneeded and unused feature. This is why people didn't switch to 8 or 8.1 even for as low as $10. Some like 8 because Metro is light on their iGPU and thus causes less performance or battery loss, and using a new thing is always fashionable, even if this new thing is taken from the past (pic below :p).
If they can, then they should devote 15 minutes to providing an option to choose Metro or Aero as the default look, and then they would have an option for everyone.
The reason the Classic Shell software exists and people are using it is that people love Aero, and MS is losing those people with Metro.
BTW, you like Metro? Memories of the 90's refreshed :whistle:
[Image: AOL 1996 vs. Microsoft Windows 8]


1 fresh and 3 refreshes :pfff:
A re-branded 290X to tackle the 980 :rofl:
 

cemerian

Honorable
Jul 29, 2013
1,011
0
11,660

Neither Windows 8.1/8 nor 7 will ever have DX12; there isn't even the smallest chance that would ever happen.
Quoting from the AnandTech DX12 preview: "Because DirectX 12 and WDDM 2.0 are tied at the hip, and by extension tied to Windows 10, DirectX 12 will only be available on Windows 10. Windows 8/8.1 and Windows 7 will not be receiving DirectX 12 support.
Backporting DirectX 12 to earlier OSes would require backporting WDDM 2.0 as well, which brings with it several issues due to the fact that WDDM 2.0 is a kernel component. Microsoft would either have to compromise on WDDM 2.0 features in order to make it work on these older kernels, or alternatively would have to more radically overhaul these kernels to accommodate the full WDDM 2.0 feature set, the latter of which is a significant engineering task and carries a significant risk of breaking earlier Windows installations. Microsoft has already tried this once before in backporting parts of Direct3D 11.1 and WDDM 1.2 to Windows 7, only to discover that even that smaller-scale project had compatibility problems. A backport of DirectX 12 would in turn be even more problematic."
 

8350rocks

Distinguished


At $0.11/kWh, I am buying a product I want over something that uses less electricity... "just because".

A while back I tried the "super efficient" light bulbs that are supposed to last 2 years or something like that. You know what happened? In the same 6 months my $3 pack of light bulbs would have died, I was already replacing the $12 light bulbs.

Did it save me money? Honestly, I replaced 14 light bulbs in my house, and my electric bill was perhaps $3/mo. cheaper, if even that... and that was a ~50-60% reduction in power consumption for all the bulbs in my house, from 60W to ~18-22W per bulb. Now, in that 6 months, I saved ~$18; we will call it $20 to be generous.

However, the bulbs cost me ~$10-12 per 3 bulbs instead of ~$3 per 4 bulbs, so the cost for the bulbs was so high it actually *cost* me money to "save money" on electric through efficiency.
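
For what it's worth, the math is easy to put side by side (a rough sketch using only the figures quoted above; the ~$11 per 3-pack is my rounding of his $10-12, and bulb lifetimes are ignored):

```python
# Back-of-the-envelope check of the bulb numbers above.
# Every figure comes from the post itself; nothing here is measured data.

bulbs = 14

electric_savings = 3.0 * 6              # ~$3/month lower bill, over 6 months

cheap_cost = (3.0 / 4) * bulbs          # ~$3 per 4-pack     -> ~$10.50 for the house
efficient_cost = (11.0 / 3) * bulbs     # ~$10-12 per 3-pack -> ~$51 for the house
extra_spent = efficient_cost - cheap_cost

print(f"electricity saved over 6 months: ${electric_savings:.2f}")               # ~$18
print(f"extra spent on bulbs:            ${extra_spent:.2f}")                     # ~$41
print(f"net after 6 months:              ${electric_savings - extra_spent:.2f}")  # ~-$23
```

Which lines up with his conclusion: the up-front cost of the bulbs eats the electricity savings.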

The reality of all this "more efficiency" garbage is simply this: You might save money somewhere, however, you will not come out ahead overall because of initial costs. This is the same thing with hybrid cars. Would you spend an extra $6,000.00 to buy a car that would save you money on gasoline and be marginally better for the environment? What if you knew the fuel savings in dollars would require you to drive that car until it had 160,000 miles on it just to break even? Or, if I told you that the difference in emissions was ~10 ppm between a PZEV and a ZEV?

So, I am a power user. In the end, I run rigs maxed out, straining components. What it does under "average load" or "consumer load" is irrelevant to me. I want to know what it does when utilization stays at 90%+ for hours on end.

In that scenario...there is no NVidia card that makes enough difference for me to not buy AMD.
 

8350rocks

Distinguished


I cannot confirm or deny that a certain 3 letter company acronym was working on their API before a certain 2 letter company acronym was working on their API.

I cannot confirm or deny it.
 

8350rocks

Distinguished


AMD may, or may not, be doing the same thing with Khronos that they did with MS regarding APIs. Since they would otherwise have to finance the legwork to do it themselves, they can instead have it implemented in OGL-Next.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


I agree. So let's see Mantle running on Intel and Imagination GPUs as well as AMD's, so we can get Mantle on other platforms besides Windows. :^)

As palladin said, some of you either have very short term memories or are being deceptive. It just takes a few simple searches of the internet to see that Mantle came first.

If we had no standards wars, we would all be using DX11 for the foreseeable future. You are acting like competition between vendors is a bad thing.

Fiji looks like a nice upgrade from a 280X or 7970. It's basically a doubling of 7970 resources (minus VRAM and clocks) without having to deal with CrossFire. I just hope it can hit the $400 to $500 price bracket. I am due for an upgrade; 3GB isn't enough VRAM for me anymore.
 
As palladin said, some of you either have very short term memories or are being deceptive. It just takes a few simple searches of the internet to see that Mantle came first.

To be fair, NVIDIA/MSFT claimed DX12 had been in development for over a year before Mantle was released. Whether that's political BS or not, I can't say, but I will say that I doubt MSFT could have come up with a brand-new API and WDDM model, integrated them with the next version of its OS, and gotten a working demo running on old hardware using a subset of the new API, all in just a few months.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Right, you cannot confirm or deny anything, because at best you only have info that one company decided to give you.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


And OGL-Next will be presented next month, with "live demos of real-world applications running on glNext drivers and hardware."
 

jdwii

Splendid


Actually, Unreal Engine 4 has support for 12 already, and you can make full games pretty quickly on that engine. Oh, and by the way, that is one funny picture, thanks for sharing it. Microsoft... sometimes I think they smoke crack half the day. Lately, however, I've been quite proud of them.
 

jdwii

Splendid


I said I suspected I was in the small minority, but I just like efficient stuff. Kinda like solar panels: I like them, they're green, call me a hippie, I don't care. Nothing against the brute-force method, I just think it's sloppy engineering.
 

8350rocks

Distinguished
You know, the funny thing is, if you want to support green companies, AMD is actually a Gold rated green company in terms of environmental footprint. NVidia and Intel are not even silver rated.

Keep buying efficiency though...
 

jdwii

Splendid


Yeah, I do know that.

If AMD makes something competitive in performance per watt in the 200-watt range, I'll buy it if it's priced around $350 or so.

Until then they can keep their re-branded, slightly improved design (mostly the same since 2011). How much will the 390X cost anyway? Probably priced against the 980, and both cards are too much money for most in PC gaming, where the sweet spot is $200-350.

If one day gaming were actually doable on Linux (it's already getting there) and most if not all games worked on it with very little effort (not lame Wine), I'd switch and get off this Windows crap.

If one day AMD makes a 100-watt CPU that is competitive in single-core and multi-core performance with the comparably priced Intel, I'd switch.

I'm not tied to one brand, and I'm happy I'm not. If I were, I would either have horrible single-core performance with a high TDP and a GPU with a high TDP for the same performance as the competition at half the TDP,
or worse, I'd have great single-core performance but horrible graphics with the iGPU (Intel),
or I'd just have an ARM CPU with a few CUDA cores.

AMD, Intel, and Nvidia are all out to get your money; that's the point. Just don't let them get to ya over a brand sticker.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790
AT has a nice update. They researched why the 290X ran slower under Mantle than under DX12 in the dual-core test, and traced the cause to the higher batch submission time of the Mantle implementation. Oxide explains that the Mantle implementation includes an optimization routine for small batches that can increase the CPU bottleneck on the lowest-end CPUs. The interesting part is that the routine was giving Mantle an artificial performance benefit over DX12 when using more than 2 cores.

Thus my former words "DX12 was neck and neck with Mantle" must be replaced by "DX12 is faster than Mantle".

If we turn off the small batch optimization feature, what we find is that Mantle's batch submission time drops nearly in half, to an average of 4.4ms. With the second pass removed, Mantle and DirectX 12 take roughly the same amount of time to submit batches in a single pass. However as Oxide noted, there is a performance hit; the Mantle rendering path's performance goes from being ahead of DirectX 12 to trailing it.

This feature is enabled by default in our build, and by combining those small batches this is the likely reason that the Mantle path holds a slight performance edge over the DX12 path on our AMD cards. The tradeoff is that in a 2 core configuration, the extra CPU workload from the optimization pass is just enough to cause Star Swarm to start bottlenecking at the CPU again.

Now Windows game developers will have to choose between an API (Mantle) that works only on AMD hardware or a faster API (DX12) that works on AMD, Nvidia, and Intel.
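
For anyone who hasn't read the AT piece, the kind of routine being described looks roughly like this (a toy sketch of small-batch combining, not Oxide's or Mantle's actual code; the function names and the threshold are invented for illustration):

```python
# Toy illustration of the tradeoff described above, not real engine code.
# Folding many small batches into fewer large ones reduces submission
# overhead, but the merge pass itself is extra CPU work -- which, per
# Oxide, is enough to re-bottleneck a 2-core machine.

SMALL_THRESHOLD = 32   # invented cutoff for what counts as a "small" batch

def combine_small_batches(batches):
    """Extra CPU pass: fold small batches together into larger ones."""
    merged, pending = [], []
    for batch in batches:
        if len(batch) < SMALL_THRESHOLD:
            pending.extend(batch)
            if len(pending) >= SMALL_THRESHOLD:
                merged.append(pending)
                pending = []
        else:
            merged.append(batch)
    if pending:
        merged.append(pending)
    return merged

def submit_to_gpu(batch):
    pass  # stand-in for the actual API submission call

def submit_frame(batches, small_batch_optimization=True):
    if small_batch_optimization:
        batches = combine_small_batches(batches)   # costs CPU time up front...
    for batch in batches:
        submit_to_gpu(batch)                       # ...in exchange for fewer, larger submissions
```

The point of the AT update is simply that this extra pass pays off with more than 2 cores and backfires on 2 cores.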
 

8350rocks

Distinguished


LOL @ your doom and gloom.

Want to know the great thing about software? You can release a new driver update and optimize code to fix it.

Want to know the bad thing about hardware like the GTX 970? You cannot fix it with software, only lessen the impact...
 