AMD CPU speculation... and expert conjecture

Status
Not open for further replies.

jdwii

Splendid



Just because you cut your workforce doesn't mean anything besides hard times. Nvidia is worth 4 times more than AMD, and Intel is worth 56 times more.

But the 8350 still delivers i5-level performance, and sometimes matches the i7.

The A10 most likely has better graphics than even Haswell.

The 7970 GHz Edition is very competitive with the 680.

I'd say AMD has great products; they just need to push them to OEMs. That is the issue. In fact, that was always the issue and most likely always will be.
 


Intel cannot buy out AMD as a going concern, or ten kinds of doggy doodoo will hit the fan.

Hector Ruiz and the former board of directors sold off GF but profited massively from it, even making themselves CEOs of GF and signing new agreements that saw AMD paying GF on a fixed-contract basis. That, along with paying double for ATI, is basically why I hold that the entire former board must be held jointly and severally liable for breach of fiduciary duty and fraud.
 


The HD 7660D is roughly 15-20% faster than the fastest GT3 in aggregate. GT3 is impressive, but it doesn't play Skyrim maxed out at 1080p. This I say without even breaching trust agreements, but then common sense would have said so.
 

viridiancrystal

Distinguished
Jul 27, 2011

Basic economics says there is a point of diminishing returns. Meaning, as you add more workers to your company, the product or idea that comes from those workers will not scale linearly with the number of workers.

Workers (1-10) → Product/idea quality (scale of 1-10)
1 → 1
2 → 3
3 → 5
4 → 7
5 → 8
6 → 8.5
7 → 9
etc.
Having more workers makes a better product, but it isn't as efficient (think of it like performance per watt). When you're in a situation like AMD's, you have to be efficient.

EDIT: added etc.
 

m32

Honorable
Apr 15, 2012



Unless no one wants to buy AMD at all, I doubt the governments of the world would let it happen. It would be the biggest monopoly in the known universe!

 

BeastLeeX

Distinguished
Dec 13, 2011


I like your enthusiasm! No doubt the 8350 is a great processor, especially in video processing. In gaming it performs fine, but in Blizzard games AMD just sucks. It's also a great overclocker. AMD needs more people like you!
 

m32

Honorable
Apr 15, 2012
I hate people who bring up lower-res games. Who plays at 720p with an i7, i5, or FX-8320/8350? "Show some 1080p benchmarks!" is the line I tell them. ;)

@iceclock How many cores does StarCraft II use? That could explain it.
 

BeastLeeX

Distinguished
Dec 13, 2011



The lower the resolution and settings, the more visible a CPU bottleneck becomes; that's why they test that way.
 

noob2222

Distinguished
Nov 19, 2007

SC2 was made before Intel was court-ordered to stop shipping compilers that cripple AMD CPUs.

The Intel C++ compiler has various options that allow the programmer to generate code for a specific instruction set, or to make multiple versions of the code for different instruction sets with automatic CPU dispatching. Non-Intel processors will always get the generic version of the code if CPU dispatching is used. The default level for the generic code is SSE2 for versions 11 and 12 of the compiler, and 386 for version 10 and earlier in 32-bit mode.


There is an option for setting the generic level higher or lower. For example, the options /arch:SSE3 /QaxSSE4.1,AVX will set the generic level to SSE3 and generate three versions of the code, for the SSE3, SSE4.1 and AVX instruction sets. Non-Intel processors can only get the generic version, which will be SSE3 in this example. Code compiled with the /Qx option, for example /QxSSE4.1, will fail to run on non-Intel processors and on processors without the specified instruction set.
http://www.agner.org/optimize/blog/read.php?i=49

In short, if you use the Intel compiler even today without specifically telling it what code to generate, AMD CPUs get SSE2 code at most: no SSE3, no SSE4, no AVX. Prior to the court order, AMD got generic 386 code. If you specify AVX code, the program will not run on an AMD Phenom CPU; SSE4-specific code won't run on an AMD Athlon 64, etc. Intel did it this way to guarantee that its compiler keeps pushing faster code on "GenuineIntel" processors, or refuses to run on older AMD processors when told to use a newer extension.

http://software.intel.com/en-us/articles/optimization-notice#opt-en

Intel's other solution to placate the courts was to post their notice in GIF format, so you can't find it with a Google search unless you know exactly how it's worded on the web page itself.

With this in mind, SC2 was plastered all over Intel's websites for being optimized for Genuine Intel processors. Certain people call this a conspiracy theory, but simple logic should tell you what happened to SC2. http://software.intel.com/sites/billboard/article/blizzard-entertainment-re-imagines-starcraft-intels-help

Is SC2 simply CPU-bound, or do AMD processors run generic 386 code, representing the peak of Intel's screw-AMD-over-at-all-costs business model?

[Image: CPU cores benchmark chart]


Blizzard games still favor Intel, but nowhere near to the same degree as SC2; they are in fact more in line with most other games out there, where Intel is very slightly faster.
 

mayankleoboy1

Distinguished
Aug 11, 2010



Because they use older game engines that run one main thread and one or two light threads, effectively making them mostly single-threaded?
And this is one area where AMD is behind Intel.
 

mayankleoboy1

Distinguished
Aug 11, 2010



All this is very good, but in previous pages gamerk316 and I have already given ample evidence that less than 5% of Windows devs use the Intel compiler. 95% of devs use Visual Studio, which is vendor-agnostic, mainly because it produces slower code for all CPUs.
The Intel compiler produces faster code for AMD processors than AMD's own compiler does (which, BTW, works only on Linux).


So I ask you: if a dev is an AMD fanboi and wants to produce the fastest possible AMD-only code for Windows, what is he to do?
1. Use Visual Studio, which makes compatible code for all processors and is considered the standard?
2. Use the Intel compiler, which 'fcuks up' AMD procs but still makes faster code than Visual Studio?
3. Use a non-existent AMD compiler?
4. Compare the performance of a non-existent compiler to an existing one?
5. Write in assembly? Should only take about 100 years for a million lines of code.


And if you are on Linux, why the hell would you use any compiler other than GCC, which produces the fastest code?



In short, if you use the Intel compiler even today without specifically telling it what code to generate, AMD CPUs get SSE2 code at most: no SSE3, no SSE4, no AVX. Prior to the court order, AMD got generic 386 code. Intel did it this way to guarantee that its compiler keeps pushing faster code on "GenuineIntel" processors, or refuses to run on older AMD processors when told to use a newer extension.

I don't see what the problem is. Say you have a Piledriver processor. You use GCC, which is the most neutral. Naturally you want the best performance possible, right? So while compiling, you use the option "-march=native". This produces the fastest possible code for the Piledriver processor.
Now take that code and run it on a Bulldozer processor. Big surprise: that code won't run on Bulldozer. Even though Bulldozer has almost the same architecture and instruction set, it won't run code built for Piledriver.

If you specify AVX code, the program will not run on an AMD Phenom CPU; SSE4-specific code won't run on an AMD Athlon 64, etc.

Does the Phenom II support AVX? Does the Athlon 64 support SSE4?


Frothing at the mouth is OK, but get your facts correct first.
 

noob2222

Distinguished
Nov 19, 2007

Read it again.

Does the Phenom II support AVX? No. Will the Phenom II run code with AVX specifically used? No.

What was it I said again?

As for your "ample evidence", does it even matter? It doesn't change the fact that SC2 is plastered with Intel branding and is widely cited as the sole evidence that AMD sucks. Who had a hand in the game's development to make sure it came out that way?

Do you really believe that Intel is going to help develop a game and make sure it runs at 100% efficiency on an AMD CPU?
 

mayankleoboy1

Distinguished
Aug 11, 2010


That's not what I am saying.
My point is: HOW many game developers use Intel compilers for development on Windows? Less than 1% is the figure.

Also, how can a piece of code be consciously optimized for BOTH Intel and AMD? It can be optimized either for Intel OR for AMD.
Optimized for both means optimized for neither: generic code, which Visual Studio produces and which 99.9% of game devs use.
 

mayankleoboy1

Distinguished
Aug 11, 2010
@noob2222:

I think you are confusing "active game development support" with "using the Intel compiler".
The former does not imply the latter. It usually means helping the game devs write code in a way that accounts for the quirks of Intel's processor architecture, which usually means the game won't run as well on AMD hardware. It may also include giving them some money in return for using the game in "Intel wins" promotion.

The 'discrimination' of the Intel compiler has ceased to be of concern in most cases. The cases that remain are mostly in the HPC sector, and the people there do extensive testing before buying any hardware or software.
 