AMD CPU speculation... and expert conjecture


amdfangirl

Expert
Ambassador
What irks me about Haswell is the price rise for the Xeon E3-1230 series SKU:

SNB: http://ark.intel.com/products/52271/
TRAY: $215.00
BOX: $230.00
IVB: http://ark.intel.com/products/65732
TRAY: $215.00
BOX: $230.00
Haswell: http://ark.intel.com/products/75054/Intel-Xeon-Processor-E3-1230-v3-8M-Cache-3_30-GHz
TRAY: $240.00
BOX: $250.00

Right now the Haswell version sells for $270 whilst the IVB version sells for $235 on Newegg.

This was my favourite CPU series because you got Hyper-Threading like a Core i7 but paid only a little more than a Core i5.

(But lost the use of the IGP for a lower TDP)

Intel figured it out. :p

At $270, I might as well buy a Core i7 for QuickSync. :/
 
@hcl123: I got the idea of the HD 4000 at 128 shader ALUs and Haswell GT2 at 160 shader ALUs (et al.) from TechReport's GT3e review; I compared them with the Trinity and Kabini specs and thought they could be comparable.
Thanks for the extra insight. :)

Now I'm waiting for analysis from the English-language websites. Hopefully Tom's will publish more on Haswell (and Richland/Kabini too...). Those guys are strangely quiet. Usually the OC/UC analyses, mobile comparisons and Intel/AMD comparisons come out within a week or two of a launch... :whistle:
 


Both wrong. The 360 has a tri-core, PowerPC-based CPU capable of running two threads per core.

http://en.wikipedia.org/wiki/Xenon_(processor)

Xenon is a CPU used in the Xbox 360 game console. The processor, internally codenamed "Waternoose" (named after Henry J. Waternoose III in Monsters, Inc.) by IBM[1] and XCPU by Microsoft, is based on the IBM PowerPC instruction set architecture, consisting of three independent processor cores on a single die. These cores are slightly modified versions of the PPE in the Cell processor used on the PlayStation 3.[2][3] Each core has two symmetric hardware threads (SMT), for a total of six hardware threads available to games. Each individual core also includes 32 KiB of L1 instruction cache and 32 KiB of L1 data cache.

The PS3's Cell has 7 functional SPEs, one of them reserved for the OS, leaving 6 hardware threads available to games.

So consoles have supported 6 hardware threads since 2006, making your entire argument wrong.

The reason PCs don't use the same amount of CPU resources is that the underlying architecture is so significantly different. Different memory and threading subsystems, plus the consoles' lower absolute performance, force developers to use REALLY low-level code in some places, and a lot of the memory that games use is static; e.g. you know ahead of time where almost every variable is located in RAM, which allows for very fine-tuned performance. You can't do this on a PC, where the entire memory subsystem is virtualized. You also don't have as large an OS, so you lose a lot less performance to context switches locking up the CPU every time a kernel thread needs to run.
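
To make the "static memory" point concrete, here's a toy sketch (my own illustration, not code from any console SDK): one fixed arena whose layout is decided at build time, so every subsystem knows exactly where its data lives and no runtime allocator is involved.

Code:
// Toy example of console-style static memory layout (illustrative only).
#include <cstddef>
#include <cstdint>
#include <cstdio>

// One fixed arena; on a console its location is effectively the same every run.
alignas(64) static uint8_t g_arena[64 * 1024];

// Hand-assigned offsets, decided up front rather than by malloc at runtime.
constexpr std::size_t kPhysicsOffset = 0;
constexpr std::size_t kAudioOffset   = 32 * 1024;

int main() {
    // "Allocation" is just pointer math into known locations.
    float*   physics = reinterpret_cast<float*>(g_arena + kPhysicsOffset);
    int16_t* audio   = reinterpret_cast<int16_t*>(g_arena + kAudioOffset);
    physics[0] = 1.0f;
    audio[0]   = 0;
    std::printf("physics at %p, audio at %p\n",
                static_cast<void*>(physics), static_cast<void*>(audio));
}

On a PC, ASLR and the virtualized address space mean that arena lands somewhere different every run, which is exactly why this style of fine-tuning doesn't carry over.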
 

tuklap

Honorable
Well, let's put it this way: in a world full of Intel-optimized software, the fact that AMD CPUs still do their job while working around those optimizations already means AMD is doing a great job, right?
 

Can someone please explain, in simple terms, what this quote from the link means, and whether it matters that much?

"Paying the extra for a K-series product also means giving up support for one of Haswell's key features, the TSX extensions that enable transactional memory. Intel has stripped out the VT-d device virtualization and vPro management features in the K series, as well."
 

Cazalan

Distinguished


Gamers on a tight budget would likely get the Athlon X4 750K for $85. Native 3.4/4.0 and unlocked.

The fused-off GPU adds area to the die, which helps spread heat. Theoretically that should let it overclock past an FX-4350 much more easily.

 


I got an MSI FM2A85XA-GD65 + 750K + ARES Blue 1600s for the equivalent of $225, paired it with an MSI Twin Frozr GTX 650 Ti Boost 2GB card, and it games like a champion.

 

juanrga

Distinguished
BANNED


I suspect something similar happens with x264 (Windows). It is not compiled with ICC, but it seems to check CPUID and run the optimal path only for GenuineIntel.
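
The vendor-check pattern I'm suspecting looks roughly like this (a hand-rolled sketch for illustration, not x264's actual source):

Code:
// Dispatch-by-vendor-string sketch: the feature flags in CPUID leaf 1 are
// vendor-neutral, so branching on the vendor string instead is the suspect part.
#include <cpuid.h>   // GCC/Clang; MSVC has __cpuid in <intrin.h>
#include <cstdio>
#include <cstring>

bool is_genuine_intel() {
    unsigned eax, ebx, ecx, edx;
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx)) return false;
    char vendor[13];
    std::memcpy(vendor + 0, &ebx, 4);  // vendor string is spread across
    std::memcpy(vendor + 4, &edx, 4);  // EBX, EDX, ECX, in that order
    std::memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

int main() {
    std::printf(is_genuine_intel() ? "optimized path\n" : "generic path\n");
}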



I am interested. I could prepare graphics with the before/after scores, tweet about all this, and use part of my server space to host the graphics, although I will be very busy next week.
 
Looking at the Linux benchmarks on Phoronix for the FX-8350, it still doesn't really beat Ivy Bridge. It doesn't seem much better than in Windows for single-threaded programs. ICC may be there to throw off AMD performance in a lot of synthetics, but the real-world software difference probably isn't that big.
 

juanrga

Distinguished
BANNED


On average, the FX-8350 was about 10% behind the i7-3770K in the old benchmarks, while being up to 42% faster than the i7 in some tests. Those benchmarks used a version of GCC (4.7) that lacked some optimizations for anything beyond Bulldozer.

There is some recent benchmarking using GCC 4.8 where the FX-8350 (8 threads) is competitive with the i7-3960X (a 12-thread, $1000 chip):

http://www.phoronix.com/scan.php?page=article&item=llvm_clang33_3way&num=1

You can even see the cheap AMD chip beating the eXtreme chip in a few tests:

http://openbenchmarking.org/embed.php?i=1305170-UT-LLVMCLANG75&sha=b3c948c&p=2

http://openbenchmarking.org/embed.php?i=1305170-UT-LLVMCLANG75&sha=1593a32&p=2

http://openbenchmarking.org/embed.php?i=1305170-UT-LLVMCLANG75&sha=82ca41f&p=2

I don't know whether GCC 4.8 already extracts all the performance of the FX chips, but seeing one beat a 5x more expensive chip shows, I think, that the FX is a much better chip than usually considered.
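
For anyone who wants to poke at this themselves, the difference in those runs largely comes down to the -march target. A trivial example of my own (not from the Phoronix suite): a hot loop like this can compile to AVX/FMA code under -march=bdver2 (Piledriver) but to plain SSE2 under the generic x86-64 baseline.

Code:
//   g++ -O3 -march=bdver2 saxpy.cpp   # tuned for the FX-8350
//   g++ -O3 saxpy.cpp                 # generic baseline
#include <cstddef>

void saxpy(float a, const float* x, float* y, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];  // candidate for vectorization and fused multiply-add
}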
 

Kulasko

Honorable
I just tested Cinebench 11.5 with and without the Intel Compiler Patcher...
To my surprise, it made absolutely no difference in performance...
 


It won't. That patch is for older versions of ICC; the newer ones add a second check for the family of the CPU. If the dispatcher doesn't recognize the family, it sets the CPU options variable to 080H, which is basically just SSE2. They talk about it more on those sites I linked, as does Agner. Mostly the ICC patcher will let you know that there is screwy code involved; it's hit and miss on being able to override it.
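
The family check being described works off CPUID leaf 1. Roughly (my paraphrase of the behaviour Agner documents, not ICC's actual source):

Code:
// Two-stage dispatch sketch: after the vendor check, read the CPU family
// and fall back to a baseline code path for anything unrecognized.
#include <cpuid.h>

unsigned effective_family() {
    unsigned eax, ebx, ecx, edx;
    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    unsigned family = (eax >> 8) & 0xF;  // base family, EAX bits 8-11
    if (family == 0xF)
        family += (eax >> 20) & 0xFF;    // plus extended family, bits 20-27
    return family;
}

unsigned pick_cpu_options() {
    switch (effective_family()) {
        case 0x06: return 0x400;  // hypothetical "recognized family" value, for illustration
        default:   return 0x080;  // unrecognized family -> the SSE2-ish baseline ("080H")
    }
}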
 

juanrga

Distinguished
BANNED


Those are for Cinebench 10

[Image: cine10.bmp]


 

GOM3RPLY3R

Honorable


Well, let's see. Take Battlefield 3 on the PS3: the console version is forced to run at 720p, since that's the highest resolution the console version supports. Yes, the system can output 1080p, but if BF3 ran at 1080p on a PS3 it would most likely get no more than 20 frames if it's lucky, hence why games don't always run at the best resolution. On the other end of the spectrum, (hypothetically) if PC games only allowed 1080p minimum, but you loved a game and couldn't run it because your PC wasn't good enough and you didn't have the money for a better one, wouldn't you be just a little angry?

I'll also give you a PC example. My friend just showed me a new game that is just now hopping out of its alpha stage into beta, called Planetary Annihilation. By Steam's account, the absolute minimum requirements to run the game on low settings are a dual-core CPU and 4GB of RAM. My old PC, which I happen to be using right now, has almost the minimum requirements (Core 2 Quad @ 2.5), along with an AMD 5570 that is overclocked to almost the max on stock voltage. I bet I'd be lucky to hold 30 frames on the lowest settings with that.

Speculating further, the recommended spec is a quad-core CPU at 3.0 GHz and at least 8GB of RAM. Computers sold within about the past year have really started to hit this mark of power, and some surpass it. But for those people who want to keep their older computers from, say, more than a year ago: good luck running the game.

Keep in mind, this isn't even talking about resolutions.

This is all just to prove the point that many consoles are breaking into a very high-end level of power; HOWEVER, you always need to keep in mind the intensity of the game itself, not just your machine. I think it's great that the PS4 has an 8-core AMD processor and graphics roughly on par with a ~GTX 650. But all in all, those consoles are really for people who like to game but aren't about that total gamer/nerd life. If you are a true gamer, you need to invest in a PC for the sake of performance and overall quality.

This is also why I have the Xbox. Most Xbox users are little kids who can't afford crap and think they're all high and mighty with their system when they don't even know what a hard drive is. By the way, the Xbox is the most money-sucking thing I have ever seen. Why do you need to pay for their internet when you are already paying for your own? And with this Xbox One business, it's basically a death wish.
 

GOM3RPLY3R

Honorable


Nice to see you again, juanrga. I would like to point out that your argument is both valid and invalid. Did they point out the clock speeds, voltages, and heat? I think you know this already, but for an 8350 to come close to a stock i7-3770K at best, it would need some serious tuning (overclocking), which also results in more heat (burns computers). Yes, I know AMD can run hotter, but that's because they need more raw material to make more cores, which have to be clocked higher so they can keep pace.

Food for thought: if I cut an FX-8350 in half (not literally, but on a core basis), ran only 4 cores, and underclocked it to 3.4 to compare it to an i5, how would it run? Well, I can tell you now: it would do worse than probably a Pentium 4, and would have less than half of its original performance. Yes, I think it would run much cooler, but it would be absolutely terrible. The main reason Intel costs more is really that the cores are stronger (mostly from the material, which happens to not be as heat-resistant, but does win on per-core power).

Now what if we did the reverse and made the i5 an 8-core processor overclocked to 4.0? I would say it would easily be an Extreme-series CPU, and maybe the best on the market for performance.

It all comes down to price. The main argument is between the (no offence) less wealthy people and the more well-off. If you have the money, Intel is the better choice (unless you're on a budget, or video editing, or doing things with more OpenCL, or just want to make your computer run very hot). Really, for gaming these days, if you have a Core 2 Quad Q8300 @ 2.5 (my old CPU), then as long as you have something like a 680 or 7970 or better for a GPU, you're fine. ^_^
 
Food for thought: if I cut an FX-8350 in half (not literally, but on a core basis), ran only 4 cores, and underclocked it to 3.4 to compare it to an i5, how would it run? Well, I can tell you now: it would do worse than probably a Pentium 4, and would have less than half of its original performance. Yes, I think it would run much cooler, but it would be absolutely terrible. The main reason Intel costs more is really that the cores are stronger (mostly from the material, which happens to not be as heat-resistant, but does win on per-core power).

This is logically incorrect.

Clock speed means absolutely squat. The only thing that matters is performance vs. cost, and for some folks, energy usage. That "cut in half" 8350 already exists: it's the FX-4xxx series, and they're significantly cheaper than an i5. The FX-8350 is $199; its nearest competitor is the i5-3570K @ $214. The FX will win in any benchmark that actually uses its resources. If the application is heavily restricted to 1-2 threads, then the i5 will be ahead. The best indication of this is the i3 scores: the i3 has exactly 50% of the CPU resources the i5 has. If the i3 is scoring anywhere near the i5 (at the same clock speed), then the application's coding is what's limiting your system.
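
An easy way to see that limit yourself: split a fixed amount of work across more and more threads and watch where the wall time stops dropping. A quick-and-dirty probe (illustrative only, numbers arbitrary):

Code:
// Same total work, more workers: if the time stops improving past 2 threads,
// the workload (not the CPU) is the bottleneck.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

static void burn(long iters) {
    volatile double x = 1.0;
    for (long i = 0; i < iters; ++i) x = x * 1.0000001 + 0.0000001;
}

int main() {
    const long total_work = 200000000L;
    for (int n : {1, 2, 4, 8}) {
        auto t0 = std::chrono::steady_clock::now();
        std::vector<std::thread> pool;
        for (int i = 0; i < n; ++i)
            pool.emplace_back(burn, total_work / n);  // same total, split n ways
        for (auto& t : pool) t.join();
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        std::printf("%d thread(s): %lld ms\n", n, static_cast<long long>(ms));
    }
}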
 

8350rocks

Distinguished


How about trying this one on for size...

Your "good PC" with a 3570k can run 4 threads at once...if you went and bought the best compute GPU available to a hardcore gamer for gaming...the HD 7990...you would be able to run 8 threads.

You can run 8 threads...the CPU on the PS4 can run 8 threads by itself...now...that's not the key part though. The GPU on the PS4 can run 64 threads at once.

So let's do some math for a second so you understand what is being discussed about consoles:

PS4 = 72 threads at once (max, likely closer to 65-70 including middleware and OS)

Your "EXCELLENT GAMING PC" = 8 threads.

My question to you is: How do you propose to keep up with 60+ threads with your current hardware when maximum potential from the consoles is achieved?

The answer (whether you agree or not), is simple: You can't.

That's why consoles are generations ahead of current PC technology.
 

juanrga

Distinguished
BANNED


Hi. The above figures are for the 8350 @ stock.



Very interesting link. Thanks!
 

juanrga

Distinguished
BANNED


Right, only to add that the i5 is faster (will it continue to be after the FX fix reported above?) if you run one of those applications, wait, close it, then run another... Most people run several applications at once, and then they find that their FX chip is faster than their i5/i7.

 


Uh, no. For one, even though there are a couple thousand threads running at a time on a PC system, only a handful are in a "ready to run" state at any given time. You don't NEED to run that many threads at once. Secondly, most tasks aren't time-sensitive; if your UI is delayed by 50ns because you have to wait for the thread to get swapped in, guess what? You don't care.

Thirdly, most modern GPUs with some form of compute [everything since the 8000 series from NVIDIA] can offload work from the CPU in some fashion; the PS4 is hardly unique in that regard. Whether you see a performance increase is largely dependent on scale, though; I wouldn't bother to offload anything unless it scales to AT LEAST 32 GPU compute units, given how relatively weak a single GPU compute resource is compared to a single CPU core.

So no, if you installed a full version of Win 7 64 on it, the PS4 would have roughly half the performance of a medium-grade gaming PC.
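
If you want to see the "ready to run" point in action, park a pile of blocked threads next to one busy one and watch CPU usage: the sleepers cost essentially nothing (toy example, counts arbitrary):

Code:
// 500 sleeping (blocked) threads barely register; only runnable threads
// actually compete for cores.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    std::atomic<bool> quit{false};
    std::vector<std::thread> sleepers;
    for (int i = 0; i < 500; ++i)
        sleepers.emplace_back([&] {
            while (!quit.load())                      // blocked nearly all the time
                std::this_thread::sleep_for(std::chrono::milliseconds(100));
        });

    // One busy thread gets a core to itself; check Task Manager/top meanwhile.
    auto end = std::chrono::steady_clock::now() + std::chrono::seconds(5);
    volatile double x = 1.0;
    while (std::chrono::steady_clock::now() < end) x = x * 1.0000001;

    quit = true;
    for (auto& t : sleepers) t.join();
    std::printf("done\n");
}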
 