News AMD Reaches Highest Overall x86 Chip Market Share Since 2013


filipberlin36
Too bad AMD can't compete with Nvidia. The "real" detailed benchmark results should arrive today, I think (in Germany it's already September 15, ~4 a.m.), along with sales of the Founders Edition and probably some first board partner cards for the 3080. That's the real flagship, as it offers what we waited for in the 20-series: what could have been a much cheaper 1080 Ti in the Pascal era. Even with the new "Big Navi" I don't think AMD will be able to beat it; they will probably land between the 3070 and 3080 at a lower price. And Nvidia made a clever move with "only" 10GB of GDDR6X.

Damn, a bit more than two years ago (May 2018) I bought a 1070, which was still from the current series, and nobody knew the 20-series cards would come only a few months later. Still, the price of the 1080 Ti stayed over 1000€ for new cards, and even used ones sold for over 1000€ for a long time; the much better partner cards with three fans instead of the single one on the Founders Edition (AMP!, Asus ROG and a few others) were, even without ray tracing, still damn good opponents for the 20-series cards. Already in 2018 we were waiting for AMD and its new card offensive. I also waited a while, and I got lucky: Zen+ had just been released and I got an R7 2700X very shortly after launch for a damn good price at the shop where I bought the system. The prices I checked in online stores months later were sometimes even higher than what I had paid.

Now Nvidia is releasing the 30-series, where one card has twice as many tensor cores (or so they said), and the process node in nm tells you less and less about a GPU's quality, as do the core counts (they doubled them with a trick?!). I remember the Zen+ chip I bought used a 12nm "FinFET" process, and most people said that was a joke and Intel was doing even better on, I think, 14nm?! Then I saw a review from March this year about a "Skylake-S" CPU which Intel said would be the new (affordable?!) king of gaming, the old one being the i7-8700K, or so I think?

The CPU has a 125W TDP they call PL1, I think, and a PL2 mode which for a maximum of 56 seconds(!) allows a boost up to 250W. Many wrote in the comments that this was meant to game the benchmarks that often run about one minute, and that Skylake-S and everything released lately is based on the same technology with very small improvements. The reviewer I watched tested it in great detail (in German) and found a +2.6% performance gain; it's a bit higher at lower resolutions, because the higher the resolution the more the GPU becomes the bottleneck (well, the RTX 3000-series will surely change that at 4K, with cards supposedly twice as fast as the 2080). But that gain needed about +40% more power. Overclocking to 5.1GHz on all cores was the most he could manage, he said (it needed 1.4 Vcore?); maybe he got a bad sample. With +2.6% it's technically "right" that it's better than the previous model. As for the PL2 mode (250W for 56 seconds), he also said it's not really important for gamers, since while gaming the CPU mostly runs far below 75% load, more like 30 to 40% on average at FHD without ray tracing.
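If anyone wants to see how those two limits interact, here is a rough Python sketch of the mechanism as I understand it: the chip may draw up to PL2 as long as a running average of package power stays under PL1, and tau (the 56 seconds) is the time constant of that average. This is my own simplification, not Intel's exact algorithm, and the numbers are just the ones from the review above. It also redoes the reviewer's efficiency math at the end.

```python
# Rough sketch of PL1/PL2/tau behaviour (my simplification, not Intel's
# exact algorithm): the CPU may draw up to PL2 while an exponentially
# weighted moving average of package power stays under PL1; tau is the
# time constant of that average. Numbers are from the review above.

PL1, PL2, TAU = 125.0, 250.0, 56.0   # watts, watts, seconds
DT = 1.0                             # simulation step, seconds

def simulate(requested_watts, duration_s, idle_watts=20.0):
    """Yield (time, allowed power) for a constant requested load."""
    ewma = idle_watts                # assume the CPU was idling before
    for step in range(int(duration_s / DT)):
        # Boost to PL2 while the running average is under PL1,
        # otherwise clamp back to the sustained PL1 limit.
        allowed = PL2 if ewma < PL1 else PL1
        power = min(requested_watts, allowed)
        ewma += (DT / TAU) * (power - ewma)   # update the average
        yield step * DT, power

# An all-core load that wants 250 W: the boost collapses to 125 W once
# the running average catches up, on the order of tau.
for t, p in simulate(250.0, 120.0):
    if t % 20 == 0:
        print(f"t = {t:5.0f} s   allowed = {p:6.1f} W")

# The reviewer's efficiency math: +2.6 % performance for +40 % power
# means performance per watt drops to about 0.73x, i.e. ~27 % worse.
print(f"perf/W vs. stock: {1.026 / 1.40:.2f}x")
```

With these numbers the boost holds for roughly half a minute before the average catches up, which would fit the complaint that a benchmark run of about one minute spends most of its time at boosted power.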


Why is it so hard for Intel to make progress here?! And why the hell do they enter the GPU market only now?! A bit late... I love PCs and hope they will exist for a long time, and I guess the desktop GPU production is also planned to be reused for mobiles or consoles as a complete package (like the SoC on mobiles, and I think most if not all consoles also put everything together on one chip?). So far I think Nvidia is leading the market for console graphics? With Microsoft being (the only?) user of AMD graphics for the Xbox 360 and Xbox One; maybe they want to bring out the generation which comes after the next one with their own graphics technology?!
 
filipberlin36 said:
So far I think Nvidia is leading the market for console graphics? With Microsoft being (the only?) user of AMD graphics for the Xbox 360 and Xbox One; maybe they want to bring out the generation which comes after the next one with their own graphics technology?!
Both the Xbox One and PlayStation 4 use AMD CPUs and GPUs, and so will the next generation of consoles launching in the coming months. Among current mainstream gaming devices, only the Nintendo Switch uses Nvidia hardware, which probably comes down to Nvidia offering solutions better geared toward smaller portable devices like that.