AMD CPU speculation... and expert conjecture

Status
Not open for further replies.


Out of curiosity, what program are you using to scan? Still not really satisfied with the scanner I've been using...

Actually MOST developers use the compiler defaults. Perhaps not a game developer... but probably 90% of the rest of all developers.

(I have had to fight with lead programmers to get them to adopt more optimal compiler switches in the past.)

Ok, GAME developers typically won't use default switches, but just about everyone else will. You're right on that point (sadly enough).
 

noob2222 (Distinguished, Nov 19, 2007)

Let me ask you something. Did Piledriver improve on BD? If so, roughly how much would you say?
 
Also (in an "I told you so" moment), AMD has announced they are going to release a series of GPU driver updates in an attempt to fix their poor average frame latencies compared to NVIDIA:

http://techreport.com/news/24136/driver-software-to-be-tweaked-to-reduce-radeon-frame-latencies-in-series-of-updates

In addition, they've "discovered" that part of their driver was sub-par in terms of performance:



Boy, isn't it great when companies are actually held to account? THIS is exactly why companies need to be pressed, and the results investigated and understood. Tech Report's investigation looks like it is going to result in a significantly better Radeon driver going forward.

So yeah, FPS is dead.

EDIT:

Also, a fun read on that topic: http://techreport.com/blog/24133/as-the-second-turns-the-web-digests-our-game-testing-methods
 

Chad Boga (Distinguished, Dec 30, 2009)


The evidence doesn't stack up.

Intel agreed to stop doing what they claimed they were never doing in the first place, and rather than AMD improving, they instead went further backwards.

That suggests to me that AMD's problems have always been their own level of competence, or lack thereof in recent years.

The damage Intel has done to AMD is to produce products that appeal to more people than AMD's products do.

That is how the system is meant to work. Understanding this doesn't make one an Intel fanboy, it just means one can connect the dots and make the correct attributions.

Irrational AMD fanboys always assume that when people point out that they are irrational, the person must be an Intel fanboy.

Not so, they are simply someone with a low tolerance for nonsensical whining.
 
Having fewer options in any market isn't a wise idea.
Having an exclusive business model like the one Intel used isn't good for any of us, as eventually progress will fade.
I don't see what Intel did as being wrong as it pertains to Intel, but as for the rest of the semi industry, and particularly AMD, I do.
 
^^ ARM is doing quite well. And programs like VMware are making it kinda trivial to use any CPU architecture to run programs from any other architecture.

Again, it's not the '90s. You can no longer treat x86 as the lone CPU architecture in town. The times have changed.
 


Legacy and Enterprise says hi.

And good for AMD to recognize the frame latencies at last!

I'll change the GTX670 to a 7970 in a few weeks, so it's good news for me, hahaha.

Cheers! :p
 

keithlm (Distinguished, Dec 26, 2007)


You can use the Intel Compiler Patcher program.

However, you can use just about any hex editor or search program that allows you to search for the string "GenuineIntel".

It is good that Intel stopped pretending that their GenuineIntel check wasn't there to discriminate against non-Intel chips. But it would have been better if they had fixed the faulty logic itself rather than just posting a note on their website mentioning that their compiler does this.

 

The '90s had more competition than now...

 

noob2222 (Distinguished, Nov 19, 2007)

I would agree

51138.png


most games improved pretty much equally or close to it across the board.

BUT ...

51141.png


WHAT THE F .... 0.2% on the game that is claimed to be solely about IPC and clock speed? There is a reason AMD can't compete in SC2, and it's not just IPC and clock speed. A 0.2% gain from BD to PD means there is something else holding AMD back. SC2 is the only game where overclocking an AMD CPU doesn't even speed the game up.

Considering SC2 was designed BEFORE the court order against Intel, and Intel plastered the game all over their website, what could possibly be the problem? Is SC2 performance all AMD's fault?

 

griptwister (Distinguished, Oct 7, 2012)
It's just CPU bound. Considering it's almost 50 FPS, that's pretty good for AMD. But the fact that there is almost no improvement is bad. I've heard it's the socket in general and the pin count, though I don't know why the pin count would make a difference.
 

Cazalan (Distinguished, Sep 4, 2011)


A good question for Blizzard and AMD. No speedup from higher clock speeds is a major issue, but it doesn't sound compiler-based to me. I could see the compiler reducing the effective IPC, but that would be across the board, not frequency-specific.

You can't fault Intel for wanting to make a top tier game run great on their hardware. For all we know the game could have been running at 30fps before Intel assisted them.

From Blizzard's perspective, they care less about top performance than about getting the game running on every piece of hardware out there, including Apple's. Look at the Blizzard forums: there are 100+ posts a day from people who can't even get the game to run on their PC.

Post this on the AMD forums and see if they care. They may know about it and not want to release new errata. Look how long it took the driver team to acknowledge the average frame latency issue with their video cards.


 

m32 (Honorable, Apr 15, 2012)



You play at 768p? We all know clock-for-clock Intel beats AMD, so why won't you show a benchmark with them at stock rather than at the same clock speed? It's amazing that you're using benchmarks that favor Intel... I wonder why?

We know Intel is better. :hello:
 

mayankleoboy1 (Distinguished, Aug 11, 2010)


But according to noob2222, frame latency is just another term for "minimum FPS measured in 1-frame intervals".
So WTF is AMD actually improving? I see a conspiracy by AMD here.. :whistle:

"Minimum FPS measured in 1-frame intervals" is like saying "minimum kilometres per hour measured in 1-metre intervals". Which means absolutely nothing.
 

@2nd chart: the Intel quads perform similarly, and the AMD FX CPUs perform similarly. Conspiracies aside... at 1024x768 the CPUs are doing a lot of work, so if there are more factors than IPC (IIRC the AMD FX has a higher clock rate than Intel's quads), then I'd assume it's the memory subsystem inside the CPUs. Could be cache as well... Another factor could be that since StarCraft is an older game, it possibly treats the FX in a way that leaves its architecture underutilised, e.g. only one module gets used, poorer core use, etc. I am interested to see whether upcoming StarCraft games display the same behavior as the current one.
Or maybe Blizzard didn't care to use the compiler switch, wanting instead to reach a larger userbase. :whistle: :sol:
 
I agree 100%. Steamroller has great potential. About time they added an additional decoder set per module; four decoders didn't make any sense at all for "two cores".

Wait, are we talking about something else? I swear I clicked on the Steamroller thread.
 

noob2222 (Distinguished, Nov 19, 2007)

And the prime example, as I stated before: it can't at all be anything Intel did when they assisted in programming the game itself. It must be something with AMD; anything Intel could have done is just a conspiracy.

Intel has no reason to force AMD CPUs, for example, to not use SSE instructions.
 