AMD CPU speculation... and expert conjecture

Page 306
Status
Not open for further replies.

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


In most programs, a 3570K vs. an 8350 gets roughly the same performance, and where they differ, the Intel processor wins. The only place AMD benefits is OpenCL/GL usage. And it's not as if PC gamers don't use Windows: at least 90% of them run Windows on their main machine, which is optimized for Intel. As a side note, it wouldn't make sense for developers to start optimizing their programs for AMD when they have better business prospects with Intel.

With that being said, just because Mantle uses software to accelerate things doesn't mean it'll perform that much better. If that's the case, wouldn't it also make sense to unlock the full potential of Intel processors? Keep in mind that BF doesn't just support AMD on more than 4 cores, but Intel too. This new software may give Hyper-Threading on Intel a chance to deliver much better performance. Better yet, who's to say there won't be a group of geeks that modifies Mantle to work better on Intel processors than on AMD?

The fact is that more and more people are using Hyper-Threaded 4-core processors, and also getting 6-core processors from Intel. For games not to take advantage of this would be brutal. If they start losing performance for Intel/Nvidia users, they could lose no less than 20-30% of their sales and respect. No one on a normal budget (which seems to be most of the market) would trade in their PC, or build a whole new PC with AMD parts, just to play one game better.

This all does sound exaggerated, however, just keep that in mind.

We all know that if one business fails, we'll all be screwed. If AMD or Intel goes down, there's a chance we'll be paying $1300 for a 3570K-like processor. As long as the market stays balanced, good for them.

The fact that Battlefield, usually an Nvidia-exclusive franchise, is starting to build these things for AMD is astounding. Considering that AMD's best single-GPU card can't beat Nvidia's third-best card at the same price without excessive overclocking, the point stands. Honestly, I can't wait to see what the 9XXX series will be like. If their best card isn't on par with, if not better than, the 780, then they'll have major problems.
 

griptwister

Distinguished
Oct 7, 2012
1,437
0
19,460
Lol, Gomer, pls, you won't be paying $1300 for an i5 3570K. You'll be paying $1300 for the quad-core Celeron.

Also, you CLEARLY haven't been reading. The R9 290X is the new-series GPU, and it beats the Titan in most applications. I suspect it'll be even faster once it gets better drivers.

Also, Memory cubes... sounds good to me :D Can't wait to see what happens with this...

http://www.overclockarena.com/micron-hybrid-memory-cube-engineering-samples-start-to-ship/
 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


A GPU that's better than a Titan for only ~$600? Something seems too good about that. I bet it'll easily rise to $800 or more in no time. Just look at the GTX 780: the ASUS GTX 780 DCUII was $580 right when it came out, and it stayed like that for about a week. Now it's passing $700, and it's only been a few months.

There's going to be at least one problem with it. Mind if I share possibilities?
- You can only use it with certain motherboards/CPUs/etc., something of that nature.
- The price will definitely rise.
- Heat? Anyone think of that?
- It's probably overclocked well past its stock clock.
- It will only work with PCIe 3.0, or only at x16 or no less than x8.
- There will be limited cooling solutions.

I can make this list go on forever. It's just too good to be true. The same thing happened with Bitcoin. There's a $500 bitcoin machine that can supposedly do 1 gigahash a day; in other words, you'd have enough to buy 5 more $500 machines by the next day. The problem, though, is that it can only run for a certain amount of time a day, you need to pay an expensive monthly plan to keep it running, and many other problems make it worth much less than it appears.

Add to that the fact that the bitcoin market is always changing, and can swing a lot when one person sells or buys a large number of coins.

We'll have to wait and see.
 


AMD has stated they won't pull that B$. After all, the 780 with the smallest OC in the world is literally a Titan; Nvidia is just doing the "price raise" B$. Remember the 7970?
 

jdwii

Splendid


Where the i5 and FX-8350 are even is when the FX-8350 uses only about 2/3 of its capacity, or about 6 cores. If they can get all 8 cores used efficiently (which I'll have to see before I believe it), then yes, the 8-core will be faster than the 4-core.


It's important to remember that the Piledriver design is probably half as strong as Intel per clock, or 40% as strong at worst. AMD has their processor clocked 25% higher to overcome this, but at the end of the day it's probably 15-20% slower per core. It does have twice the cores, though.
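This kind of back-of-the-envelope comparison is just per-clock throughput times clock speed. A minimal sketch, with purely illustrative inputs (not measured figures for any real chip):

```python
def relative_per_core(per_clock_ratio, clock_ratio):
    """Per-core speed of chip A relative to chip B:
    (A's per-clock throughput / B's) * (A's clock / B's clock)."""
    return per_clock_ratio * clock_ratio

# Illustrative: a 20% per-clock deficit (ratio 0.8) is exactly
# offset by a 25% higher clock (ratio 1.25).
print(relative_per_core(0.8, 1.25))  # 1.0
```

Whether the higher clock actually closes the gap depends entirely on which per-clock ratio you believe.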
 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


It is also important to remember hardware acceleration. When a stock (plug and play) 3570K is used to play ArmA 2 with a 7970 (a very CPU-intensive game), it only uses ~50% of the CPU while achieving ~140 FPS on lowest settings. An 8350, on the other hand, also plug and play with the same settings (keep in mind ArmA utilizes all cores no matter the processor), used all its cores at ~70% while achieving only ~110 FPS. Tell me that again?
 


>Hafijur IPC logic
>IPC changing since Lynnfield
>Hafijur
>Turbo boost to 4.2GHz
 

Sigmanick

Honorable
Sep 28, 2013
26
0
10,530
First of all, this is a nice bulletin board. There are a lot of options.
In the absence of Steamroller news: Mantle.
Don't feed the troll. Don't feed the troll.
Too late.



So... what you are saying is: when software is compiled with GCC, Intel and AMD have parity, including Piledriver et al. (that means they are equal). But when software is compiled with ICC, automagically Intel becomes the better solution. Others have mentioned how GCC can and has produced AMD-favorable results.

Since AMD licenses x86 from INTEL, I feel Intel should fairly support that license.

What I'm saying is that IF Intel wishes ICC to remain RELEVANT, they will need to code ICC fairly, so it looks up supported instruction sets, not manufacturer names. These predatory practices have led others to invest in alternative compilers. ICC will begin to lose favor as other compilers mature and finally surpass it. Once that happens, Intel will lose influence in the benchmarking market, and if Intel doesn't have marketing, what do they have left? So it behooves Intel to make a product that customers want: coders, game developers, and ultimately users like you and me.
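The distinction being argued here can be sketched in a few lines. This is an illustrative model of the two dispatch policies, not ICC's actual code; the function and flag names are made up for the example:

```python
# Fair dispatch: pick a code path from the CPU's advertised
# instruction sets, regardless of who made the chip.
def pick_code_path(features):
    """Return the fastest code path the CPU actually supports."""
    if "avx" in features:
        return "avx"
    if "sse2" in features:
        return "sse2"
    return "generic"

# The criticized behavior: non-Intel chips get the slow path
# no matter what instruction sets they report.
def vendor_based_dispatch(vendor, features):
    if vendor == "GenuineIntel":
        return pick_code_path(features)
    return "generic"

fx8350 = {"vendor": "AuthenticAMD", "features": {"sse2", "avx"}}
print(pick_code_path(fx8350["features"]))                           # avx
print(vendor_based_dispatch(fx8350["vendor"], fx8350["features"]))  # generic
```

Same chip, same feature flags, two very different code paths; that is the whole complaint in miniature.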

Let's think of the good things AMD has done. Thank goodness AMD's x64 extensions won out over IA64; otherwise, how would we still run our 32-bit software except by holding onto older, inefficient hardware and OSes, negating any advances in performance per watt and total watts consumed?
Thank goodness for the integrated memory controller; Intel sucked hind teat until they adopted that little gem. There are others, as AMD's intellectual property may be worth more than the physical assets of the company itself.

So. Mantle. A masterful stroke. AMD IP is in 3 different consoles (and I wonder if Mantle will make it to the Nintendo Wii U, as that needs some desperate help) and the PC. Of course developers will jump on that to simplify cross-platform development. As adoption of Mantle continues, I wonder what the chances are that AMD will continue to refine this API for older generations of APUs and GPUs.
 

jdwii

Splendid


I doubt that game scales to 8 cores efficiently when I hear comments stating otherwise.
 

jdwii

Splendid
The only program I can think of that I use which uses 8 cores, if not more, is HandBrake; no game I know uses 8 cores perfectly like the game you stated. We shall see how BF4 turns out, but again, I'm with gamer: I doubt you will see too much work being done on 3+ cores.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


There are several ARM desktops. I gave you a link to one of them. I also told you that more ARM desktops would appear as ARM moved toward the desktop.

Ubuntu is the base for SteamOS, which will be used by lots of people.

The first time that you asked me about Android I wrote:



The second time that you asked me about Android I replied



Now you ask again about Android on the desktop. LOL. How many times do I need to repeat something before you stop asking the same thing again and again?

The argument about AMD not shipping today is irrelevant because the battle for the desktop/server/supercomputer is not happening today.

The argument about Nvidia and the Shield is irrelevant, because Tegra 4 is not aimed at this battle. Moreover, Nvidia is already shipping dev kits for the future ARM supercomputer.

Samsung seems to be preparing an upgrade of its Chromebook and will probably release a custom ARM chip for servers in 2014.

Apple was an important enough client for Intel to steal it from IBM. Apple is also an important enough client for Intel to design and fabricate a chip specifically for Apple: Haswell with Iris Pro. Of course, nobody said that iOS is a desktop OS. When Apple turns entirely to ARM, they will use OSX for the desktop.

We also know Qualcomm plans:

Long term goals include offering optimised platforms to customers starting with smartphones, expanding to tablets and later going after notebook and PC. From a hardware point of view Qualcomm is ready, now it is up to operating systems and vendors to do their part.

As you well say, Windows is now a minority OS, with less than one quarter of the total market. Moreover, you forget to mention that most of that share comes from old Windows XP/7; Windows 8 is about one tenth of that. People who were using W7 have the option to be locked to 7 forever, downgrade to 8, or upgrade to Linux. This last option was selected by Valve and other game developers.

http://www.zdnet.com/valve-ceo-why-linux-is-the-future-of-gaming-7000020735/

Microsoft doesn't push the industry anymore. It seems you missed AMD's talks early this year, when they announced that they were breaking exclusivity with Windows. Intel made a similar announcement a couple of weeks ago:

http://www.dailytech.com/IDF+2013+Intel+Distances+Itself+From+Windows+8+Microsoft/article33363.htm

Regarding the GFLOPs, I used several sources: details about the ARM architecture, AMD's official presentation of Seattle, and ARM's presentation of the A57.



Maybe that news was hidden in Intel-land, but the rest of the world knows that the sabotage was proven; Intel was obligated to add a disclaimer to the documentation of its ICC and had to pay lots of money to AMD. It is not the first time Intel has been found engaging in illegal and dishonest anti-competitive practices. In fact, Intel has been prosecuted in many countries.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Hmm, I know JML... from reading his "AMD DIIIIIIIIEEEEEE!!!!!" ALL-CAPS nonsense posts in other places.

ICC has an unfair CPU dispatcher that makes the chip run a fast, optimized version of the code when it detects a "GenuineIntel" CPUID, and slow, unoptimized code when it detects otherwise.

GCC has nothing like that and runs optimal code on either AMD or Intel. The reason AMD runs better under software compiled with GCC is that GCC allows the chip to exploit its true potential, unlike ICC, which castrates AMD.
 

anxiousinfusion

Distinguished
Jul 1, 2011
1,035
0
19,360


I know exactly who you're talking about! JML is a parody account that comments on other tech sites (the names escape me). Is hafijur really JML? I've been following this thread since before the Piledriver release, and I'm amazed that you guys keep responding to him.
 

A few words: WHAT HAPPENED?

 

griptwister

Distinguished
Oct 7, 2012
1,437
0
19,460
Lol, Gomer. It sounds too good to be true, but it is true whether you like it or not. You just can't accept the fact that Nvidia zipped up too early and AMD looked over at Nvidia's face and started laughing (urinal stall jokes ftw). Hence Nvidia planning to release a Titan Ultra. But unless it's as cheap as, or has a better cooler than, AMD's new GPUs, they're screwed.

hafijur is JML now, huh? I'll buy it. Considering he thinks Intel is more powerful than IBM, I buy it.
 

Sigmanick

Honorable
Sep 28, 2013
26
0
10,530


A little of my inner troll came out. Looks like some others' did too.

I've been reading this thread with interest, and watching in disgust at times as it has been hijacked into unrelated Intel-land. I just could not believe that the first person to reply to my first post on Tom's was Haj.

I will be more professional in my responses.

So, Steamroller. http://. Looks like AMDFXBLOGSPOT has borrowed data from WCCFtech and posted some comparative numbers between the Kaveri part and Iris Pro. All values given are GPGPU, and they are astounding if true. I would like to see some numbers for the CPU alone, but that's not the goal of Fusion HSA.

I'm hoping that the drivers/codes used are not finished and that we'll see even better performance when measured as PPW/ PPD$/ IPC /FTW!

Sorry, that might be enough.

 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


It DOES; that's the problem. It scales like it should, and like any other game should. Per-core performance also really plays a role here.
 

jdwii

Splendid


When I hear comments stating the game uses 2 cores heavily and the others at only 50% or less, I start to think otherwise.
 

jdwii

Splendid
To make sure that game really does use 8 cores efficiently, someone with an 8-core processor could force the game to use fewer cores (set the CPU affinity in Task Manager) while running Fraps, to make sure there is a difference. I did this with my 6-core and noticed GTA 4 only used 3 cores, and that Sims 3 only used 3 cores.
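The same core-limiting experiment can be done programmatically instead of through Task Manager. A minimal Linux-only sketch using Python's `os.sched_setaffinity` (on Windows the equivalent is Task Manager's "Set affinity" or the `SetProcessAffinityMask` API); the helper name is just for the example:

```python
import os

def limit_to_cores(cores):
    """Restrict the current process (and any game it then launches)
    to the given set of CPU cores. Linux-only sketch."""
    os.sched_setaffinity(0, set(cores))   # pid 0 = the current process
    return os.sched_getaffinity(0)

# Example: pin to a single core, then restore the original set.
original = os.sched_getaffinity(0)
print(limit_to_cores({min(original)}))   # now confined to one core
os.sched_setaffinity(0, original)        # restore the full core set
```

Launch the game from the restricted process, watch the Fraps counter, repeat with more cores allowed, and compare.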


 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


If Kaveri does have a 13-CU version, that will be a pleasant surprise. Even if those CUs are clocked slower (600MHz vs 844MHz), it would still be a good 16% faster than the 8-CU part. Slower clocks mean better thermals and power consumption too.
 