AMD CPU speculation... and expert conjecture

Page 126

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810


You can't possibly say "Haswell will do better than a SR chip" unless you are the first person in history to figure out time travel. So do we have Dr. Who on this topic, then, or what? Neither chip has been released, there are no reviews of the actual hardware, and there is no way of knowing which one will be better. It is Intel fanboyism and ego at its most extreme.

I personally love history, do you? If there is one thing history shows us, it's that you can't possibly predict the future. If you were to ask a military strategist in 1776 whether the colonies of the new world could ever best the British Empire, how hard do you think the know-it-alls would have laughed? Yet we won, and became the most powerful military might the world has ever known. You can't accurately predict or know how Steamroller or Haswell will perform. Steamroller may be great, or it may be only marginally better than Piledriver. Haswell could be so great that NASA uses it to colonize Mars, or it could be Intel's version of Bulldozer and be a big dud; no one knows.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


The Gartner estimate is 352 million PCs sold in 2012.

http://www.gartner.com/newsroom/id/2301715

Xbox 360 sales were 77 million total (from launch through early 2013)

http://www.gamespot.com/news/77-million-xbox-360s-sold-6407243

PS3 sales were 77 million total (from launch through early 2013)

http://www.gamesindustry.biz/articles/2013-01-09-idc-game-consoles-discs-to-remain-revenue-mainstays-for-years-to-come

Annual PC sales far exceed the combined lifetime sales of both consoles.
 


That's what I am on about: it's the nonchalant attitude towards the PS4/Xbox, and the fact that the APU brings hardware and software into uniformity, unlocking untapped resources. By owning all the platforms, AMD will now get optimizations, as developers will want to achieve the full potential of the PS4/Xbox, which is said to be near limitless relative to the old Cell-based systems. There is now parity between hardware and software maximizing potential, to the point that the hardware itself will become less pertinent.



That's 3x the score, and faster than an HD 6670/GTX 260; I am calling BS on that story. When the HD 4600 leaks came out, the HD 4000 scored around 762 marks, the prototype i5 with the 4600 scored 732, and the 4770K was rumored around 800 marks. Since the configuration is not scaled up that much, I can't see the HD 5000+ achieving anything like the 7660D's 1300-1500 marks; possibly around the 1100-mark range.

I am also getting tired of that muppet from Anandtech saying GT3 is a GTX 650M beater; it is lunacy of the highest level. For all I know those charts could be on the liberal side: never has 3x performance ever remotely been achieved, and there is no reason to believe it will be now unless the clock speed is in overdrive.

 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810


Agreed. What I was talking about with consoles outselling PCs is the gaming market. One more thing to take into consideration, though: how far has ARM/Android come in the last year? How many people who bought PCs last year are going to get a new Android tablet and figure they no longer need the shackles of a bulky desktop or laptop, when the handheld tablet can do nearly everything the conventional PC can? Most "light" users want to surf the internet, tweet, use Facebook, Skype, do light word processing, etc. A tablet can do all those things and take pictures too. Really, for home PCs the only advantage of a real laptop, or better yet a desktop, is gaming. A tablet simply can't compete in the gaming world. But for people who don't game on their computers, I think the trend towards smartphones and tablets has already cut into PC sales and will only get bigger going into 2014. People who opted for a tablet over a desktop but still want to play video games are going to turn even more to the console systems.

It sucks, as I personally love my desktop computer, and would surely break a tablet within a week, but it is the current trend. Thankfully people who still want to game on their desktop PCs make up a good enough market to keep desktops around for a little while longer at least.
 

8350rocks

Distinguished


Who was saying Nvidia was superior because of AMD's bad drivers? Whoever it was, there's your newest version of Catalyst... looks like a winner to me.
 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810
I have to go and get some work done now, but I just want to send out a thank you to everyone who has posted in this discussion. I hope that I have not outright offended anyone, as that was not my intent. It's no secret that I don't like Intel or the fanboy rhetoric, but I've got to give Intel credit: they make fine chips. I just like all the posts, thoughts, and ideas that show that AMD also makes some fine chips!! Thank you all for the great posts, links, information, and ideas. This is one of my all-time favorite discussions; please keep the great thoughts coming!!
 
First of all, take the whole console market of 144 million units spread over x number of years, with most sold early on; let's say 20 million a year early in the cycle, and that is exclusive, covering both GPU and CPU.
Now look at AMD's current marketshare: of the 352 million PCs sold, generously giving AMD 20% puts them around 70 million units, so the consoles are approximately 30% on top of their sales, like what's already been said by AMD.
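Spelled out with the post's own numbers: 0.20 × 352 million ≈ 70 million AMD-based PCs a year, and 20 million console chips against that is 20/70 ≈ 29%, call it roughly 30%.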

As far as lazy devs go: if the indies don't get shut out, it would be a crime if some indie came along and smashed the big houses' norm for quality, wouldn't it?
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810
cowboy44mag said:
You can't possibly say "Haswell will do better than a SR chip" unless you are the first person in history to figure out time travel. So do we have Dr. Who on this topic, then, or what? Neither chip has been released, there are no reviews of the actual hardware, and there is no way of knowing which one will be better. It is Intel fanboyism and ego at its most extreme.

Well, Tom's Hardware leaks showed the perf of a 4770K, and the most realistic expectation for SR is ~15-20% over PD.
Based on that, I would say that Haswell > SR in CPU.

And neither AMD nor Intel has much of a presence in the tablet market, so the trend towards mobility hurts both of them.
 


There were people lingering on AMD's past, but in theory AMD sandbagged GCN long enough for Nvidia to pander its parts as hero parts, only for AMD to release 7-8 driver updates in a year, culminating in Nvidia having no recommendable part anymore. The GTX 670 is probably it, but the 7870 XT delivers within 5-10% of its performance at $150 less, and the 650 Ti Boost is slower than the 7850 and only now marginally faster than the 7790, all while the Nvidia card draws more power. If Nvidia's drivers were so amazing, why has there been no real progression? This round of graphics cards was a convincing AMD success on performance and value. New drivers improve CFX substantially, and AMD has already said a new batch of drivers will be released in succession; there goes the microstutter argument.

 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810


Just have a minute here... "Tom's Hardware leaks showed the perf of a 4770K" is more speculation made by experts about what may come in the future. Bulldozer was also supposed to be awesome before its release; it was supposed to bulldoze Sandy Bridge under... didn't happen, but experts were speculating it would. Until they are actually released, there is no way to know which will be better. It may be that the 4770K will perform well in certain niches (especially programs where single-core execution is important) but will give ground to, or be beaten by, Steamroller in other niches (new games, etc.) where multi-threading is most important.

"And neither AMD nor intel have much of a presence in the Tablet market. So the trend towards mobility hurts both of them" Indeed, however what I was trying to point out is the people who opted for mobility in a tablet may still want to video game, have a child or grandchild who wants to video game. Those people are going to go for a console system, even if they use to game on their PC. The trend towards Android tablets is going to drive the sales of console systems up as you can't game on a tablet (not the high end games on PC and console). Most people will upgrade a tablet every two or so years but will consider a console (life span ~7 years) a good gaming solution.
 

anxiousinfusion

Distinguished
Jul 1, 2011
1,035
0
19,360


I'm just entertaining the idea of an APU: 22nm, Haswell cores + Radeon GPU. It would never happen, but it would be a theoretical killer product.

I remember when I first got into all of this, I was reading console-vs-PC articles, "the desktop is dying" articles, and "Intel will buy AMD" or "AMD is doomed" articles all the time. That was over 7 years ago, and now I'm reading them again. Lol. Maybe the technology changes but the questionable writers don't.

The technology changes, the people don't.

Also, when HSA is introduced with Kaveri, how will hybrid CrossFire with a dGPU affect HSA in 3D applications? Will it completely nullify the gains? I can only imagine the CrossFired card would force the APU to handle graphics loads traditionally.
 

8350rocks

Distinguished




I should think that when the drivers come out and you can pair it with higher-end HD 7XXX series cards, you'd nearly be a fool not to use a CF setup, which would only magnify the gains from HSA.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810



Well, you have to read the fine print. It was a single benchmark from the 3DMark 11 suite. Until we know more about the inner workings of Crystalwell (64/128MB eDRAM), it's not a stretch. It's like the difference between a benchmark that fits entirely in the L3 cache vs. one that is too large and needs to keep going to main memory. The bandwidth and latency difference is significant.

I have no interest in buying Crystalwell, but I am definitely curious about the performance tradeoffs they can get by having a large embedded memory.

If you go way back, the difference between a Pentium II and the original Celeron was that the Celeron had no L2 cache. The performance difference was like night and day. Now a 4th layer of cache is being introduced, and some operations will definitely see a big improvement from it. IBM has been using eDRAM in its server chips for years, and they wouldn't do that without a good reason.
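To make that concrete, here's a rough, minimal sketch in plain C (illustration only; the sizes, strides, and counts are arbitrary values I picked, not anything from a review) of the effect being described: time accesses over working sets of increasing size, and watch throughput drop each time the set outgrows a cache level. A big eDRAM L4 would push the final drop, the one to main memory, much further out.

```c
/* Rough cache-cliff sketch: touch one cache line at a time across
   working sets of increasing size and time it. Times jump whenever
   the working set no longer fits in a cache level. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void) {
    const size_t max_bytes = 256u * 1024 * 1024;   /* largest working set */
    char *buf = malloc(max_bytes);
    if (!buf) return 1;
    memset(buf, 1, max_bytes);                     /* fault the pages in */

    for (size_t size = 32 * 1024; size <= max_bytes; size *= 4) {
        const size_t stride = 64;                  /* ~one cache line */
        const size_t touches = 16 * 1024 * 1024;   /* accesses per test */
        volatile char sink = 0;

        clock_t t0 = clock();
        for (size_t i = 0, off = 0; i < touches; i++) {
            sink += buf[off];
            off += stride;
            if (off >= size) off = 0;              /* wrap inside the set */
        }
        clock_t t1 = clock();
        printf("%9zu KB working set: %.0f ms\n",
               size / 1024, 1000.0 * (double)(t1 - t0) / CLOCKS_PER_SEC);
    }
    free(buf);
    return 0;
}
```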
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


In one sense, Bulldozer/Piledriver suffered from software not using all of their real potential. It is only now that we are starting to see recent games, such as Crysis 3, show how an FX-8350 can outperform an i7-3770K.

Just curious: was what the respected reviewer told you about the potential of the APU alone, or of the APU plus HSA? Because I agree that the latter is a kind of quantum leap.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


[image: KitGuru chart, "Haswell GT3 graphics vs. AMD A10-6800K: analysis of manufacturer expectations at CES"]

 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Eight cores are accessible to game developers, and they will be programming for eight, unlike with previous consoles, where they essentially programmed for a single core.

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen

 
Now, you could dedicate a core or two to the OS, but it doesn't need them, and if it doesn't fully use them, that would be a waste of resources.
As others pointed out, the OS determines everything, so there's no need for a dedicated core, unless it's the ARM core, which could be as powerful as those old cores or more so, borrowing another core if needed when not in standby/low power, etc., using HSA and hUMA.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


That "VERY low level" coding was needed to compete with 10x more powerful PCs.

The first generation of next-gen games will probably be fast ports from PC games/code. But as PC hardware advances, console programmers will start to use optimization to compete with much faster PCs.

The advantages now are that (i) the PC-like hardware simplifies optimization a lot when compared with previous consoles, and (ii) some of those optimizations could be reused on PCs with 'compatible' hardware, e.g. PCs with AMD APUs.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Yes. A game like Skyrim sold about 4x more copies on consoles than on PC.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Do you mean if you count the vast number of older games, which were single-threaded, with recent ones being dual- or quad-threaded? That is not true for the next generation of six- to eight-threaded games.

You seem unaware, but the triple-A game developers polled by Eurogamer selected the AMD eight-core FX-8350 chip as the better choice for a future gaming rig.



Do you mean some expect so? Others expect the contrary:

http://news.softpedia.com/news/AMD-s-Steamroller-To-Be-Faster-than-Intel-Haswell-289980.shtml

http://www.overclock.net/t/1356581/amd-steamroller-the-secret-weapon-that-could-decimate-intel
 


HSA itself is a good idea, but understand that it clashes with the idea of a GPU with its own cache, so something has to give on that front. It's possible that Intel/AMD eventually make discrete GPUs unnecessary, or GPUs lose the cache and access main memory. But until then, HSA is only valid for integrated GPUs. And from a coding perspective? No change necessary, as it's all hardware driven.

You also have major downsides. Currently, GPU data is independent of the OS (CPU) address space; e.g., only about 512MB is reserved at any one time, despite the GPU's large (2GB) memory. So it's quite possible to have a memory footprint of 3GB or so while still having large texture caches. One address space, though, and guess what? 4GB limit, period. Think textures are going to cut into that quite a bit? [Granted, this is a non-issue once 64-bit becomes standard, but don't be shocked if this causes 8GB+ systems to become mandatory]
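To put a number on that squeeze, here's a minimal sketch, assuming it's compiled as a 32-bit binary (the 256MB chunk size is just an illustration value): it allocates until the address space runs out, which typically happens around 2-3GB, long before a machine with 8GB of RAM is anywhere near full.

```c
/* Assumes a 32-bit build: exhaust the process address space.
   Allocation fails after ~2-3GB of virtual space is used, no matter
   how much physical RAM is installed, which is the 4GB-limit problem
   a single shared CPU/GPU address space runs into. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const size_t chunk = 256u * 1024 * 1024;   /* 256MB per allocation */
    size_t total = 0;
    while (malloc(chunk) != NULL)              /* leaked on purpose; */
        total += chunk;                        /* the OS reclaims at exit */
    printf("Address space exhausted after ~%zu MB\n", total >> 20);
    return 0;
}
```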

We are now moving on to new Unreal engines and Frostbite engines that are legitimately using 4-8 cores effectively and the scaling is showing.

[image: Battlefield benchmark chart, 2560 resolution]


It doesn't help FPS though. Which is the entire point, isn't it?

I've actually started a thread analysis for BF3 using GPUView and PerfMon. Preliminary stuff right now, but it looks like three threads do the majority of the work: two render threads (DX11) and the main game thread. A handful of others do minor background work, and a bunch do <1% workloads. So you have a game that scales to about 3-4 cores; past that point, more cores won't improve performance, just lower the overall usage statistics somewhat.
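For anyone who wants to reproduce the CPU side of that without GPUView, here's a minimal Windows-only sketch (my own illustration, not the poster's actual tooling) that snapshots a process's threads and prints each one's accumulated user/kernel CPU time; a couple of busy threads plus a tail of near-idle ones is exactly the pattern described above.

```c
/* Sketch: list every thread in a process (by PID) with its accumulated
   CPU time, the same per-thread data PerfMon exposes. Windows-only. */
#include <windows.h>
#include <tlhelp32.h>
#include <stdio.h>
#include <stdlib.h>

static double filetime_ms(const FILETIME *ft) {
    ULARGE_INTEGER u;
    u.LowPart  = ft->dwLowDateTime;
    u.HighPart = ft->dwHighDateTime;
    return u.QuadPart / 10000.0;               /* 100ns ticks -> ms */
}

int main(int argc, char **argv) {
    DWORD pid = (argc > 1) ? (DWORD)atoi(argv[1]) : GetCurrentProcessId();
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
    if (snap == INVALID_HANDLE_VALUE) return 1;

    THREADENTRY32 te;
    te.dwSize = sizeof(te);
    for (BOOL ok = Thread32First(snap, &te); ok; ok = Thread32Next(snap, &te)) {
        if (te.th32OwnerProcessID != pid) continue;
        HANDLE h = OpenThread(THREAD_QUERY_INFORMATION, FALSE, te.th32ThreadID);
        if (!h) continue;
        FILETIME ft_create, ft_exit, ft_kernel, ft_user;
        if (GetThreadTimes(h, &ft_create, &ft_exit, &ft_kernel, &ft_user))
            printf("TID %6lu  user %9.1f ms  kernel %9.1f ms\n",
                   te.th32ThreadID, filetime_ms(&ft_user), filetime_ms(&ft_kernel));
        CloseHandle(h);
    }
    CloseHandle(snap);
    return 0;
}
```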

Crytek's own engine was capable of using more than 4 cores effectively, and this leveraged performance, with the 8350 capable of drawing up to the 3770.

But you missed the discussion on how they did that; it wasn't any special property of the game engine. They instead moved work from the GPU onto the CPU. Which sounds all nice and dandy, until you remember two facts:

1: Parallel workloads execute faster on the GPU.
2: GPU performance is increasing several times faster than CPU performance.

On non-GPU-bottlenecked systems, what Crytek did for Crysis 3 will end up reducing performance due to a CPU bottleneck. Hence why I consider what they did idiotic.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


They could go broke. Now Intel is hyped as the new GPU company that competes in "performance mainstream"...

Intel’s performance target for the highest end configuration (GT3e) is designed to go up against NVIDIA’s GeForce GT 650M, a performance target it will hit and miss depending on the benchmark.

Regardless of whether or not it wins every benchmark against the GT 650M, the fact that an Intel made GPU can be talked about in the same sentence as a performance mainstream part from NVIDIA is a big step forward. Under no circumstances could Intel compete with NVIDIA on performance and still do so under the Intel HD Graphics brand. Haswell is the beginning of a new era for Intel. The company is no longer a CPU company forced into graphics, but with Haswell Intel begins its life as a GPU company as well. As a GPU company, Intel needs a strong GPU brand. AMD has Radeon, NVIDIA has GeForce, and now Intel has Iris.
 


Uhh... no. The second part of your statement is simply BS.

There seems to be a general lack of understanding of the differences between coding for a dedicated piece of hardware with a very light OS and coding for a general-purpose PC running Windows.



Here's the thinking on that one: by putting the OS on a single core, you know NOTHING will be running on the other seven. Now, if thread management/core loading is still handled in userland (rather than by the OS, as in Windows), then your program is essentially deterministic: at any period of time, I know EXACTLY what is running, where every piece of data is stored in memory, and whatnot. This allows for VERY powerful performance optimizations.

If the thread scheduling is up to the OS, though, then you cannot guarantee which threads are being run at any point in time. You have a non-deterministic program, like PCs have. This basically forbids any form of assembly optimization that makes assumptions about the contents of any other thread. Granted, it's a workload off the developer, but you lose some performance as a result.

So the question becomes: would the loss of one core be offset by the optimizations that could be made for the other seven?
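For contrast, the closest a PC program gets to that determinism is pinning threads to cores itself. Here's a minimal Windows sketch (the core number is an arbitrary illustration value); a console runtime would do something equivalent across all eight cores at once:

```c
/* Sketch: pin the current thread to one logical core, so the scheduler
   never migrates it. Dedicating core 0 to the OS and one game thread
   per remaining core is the console-style deterministic layout. */
#include <windows.h>
#include <stdio.h>

int main(void) {
    DWORD_PTR mask = (DWORD_PTR)1 << 1;        /* logical core 1 only */
    DWORD_PTR prev = SetThreadAffinityMask(GetCurrentThread(), mask);
    if (prev == 0) {
        printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    printf("Pinned to core 1 (previous mask 0x%llx)\n",
           (unsigned long long)prev);
    /* ... work placed here now always runs on core 1 ... */
    return 0;
}
```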

As others pointed out, the OS determines everything, so there's no need for a dedicated core, unless it's the ARM core, which could be as powerful as those old cores or more so, borrowing another core if needed when not in standby/low power, etc., using HSA and hUMA.

Remember, you are still thinking in PC/Windows terms. Console OSes up until now have been VERY light.
 