haswell or piledriver

Solution
If you are planning to upgrade later on, Piledriver is on an architecture AMD is committed to until 2015 (the AM3+ socket), and that socket will get Steamroller and eventually Excavator. If you go with something by Intel, Haswell will be their new architecture, so they will commit to that for at least two years.

If money is at all a concern, go with AMD: for the money you would spend on an Intel system, a comparably built AMD system will have more processor and more GPU...

Also, the PS4 and Xbox 720 are both running AMD hardware, so future games will be optimized for AMD architecture anyway.


Serious news sites and hardware sites got the facts right. If Anandtech fooled you, then blame Anandtech, not AMD.
 


1) You continue approaching the PS4's new hardware from an outdated PC viewpoint. You still don't get the point. There are no separate CPU and GPU 'portions' working independently like in an old-school gaming PC. The unified design allows the CPU to offload certain tasks to the GPU and vice versa; that is the main reason behind the unified memory subsystem.

2) It is well known that APUs such as Trinity can increase performance by using faster memory. Trinity has shown it can generate up to 2x more FPS in games when you feed the APU with about 5 GB/s more bandwidth. The PS4 has about 150 GB/s more bandwidth...

3) Sony has already shown at GDC 2013 that the PS4 is able to play high-quality games at the level of a high-end gaming PC. The comparison showed how the PS4 competes with an Intel i7 with 16 GB RAM paired with a GTX 680, both running an Unreal Engine 4 demo at the same AA, resolution, meshes, textures, DOF, motion blur... And the demo was not using all the potential of the AMD APU, of course. Please don't cry too loud 🙂
 


They are already paring the Unreal Engine 4 demo down for the PS4. http://www.youtube.com/watch?feature=player_embedded&v=gtfCWYjOsvI

Trinity has ~763 GFLOPS; the 7750 has ~819 GFLOPS. The two really aren't close performance-wise: GFLOPS on an APU scale much worse than on a discrete GPU (probably due to bandwidth). (This comparison is for Kaveri vs the 7750; the PS4 will have enough bandwidth that it should scale with GFLOPS very well.)
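For reference, peak figures like these come from a simple formula: shader count x clock x 2 ops per cycle (multiply-add). A quick sketch, assuming the commonly cited specs (512 shaders at 800 MHz for the 7750; 1152 shaders at 800 MHz for the PS4 GPU):

```python
# Peak single-precision throughput: shaders * clock (GHz) * 2 ops/cycle (FMA).
def peak_gflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2

# HD 7750 (assumed: 512 shaders at 800 MHz)
print(peak_gflops(512, 0.8))   # ~819 GFLOPS, matching the figure above
# PS4 GPU (assumed: 1152 shaders at 800 MHz)
print(peak_gflops(1152, 0.8))  # ~1843 GFLOPS, i.e. ~1.84 TFLOPS
```

The point about scaling stands either way: the formula only gives a theoretical peak, and an APU starved for memory bandwidth won't reach it.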

Yes, the PS4 has more bandwidth; however, that is shared between the CPU and GPU. How much bandwidth does an i5 or FX + 7850 have? (20 + 153 GB/s.)

Many other manufacturers are implementing HSA-like systems.

http://www.tomshardware.com/news/HD-4000-OpenCL-Drivers-Power-Consumption-Increased-Performance,21763.html

Another new extension is called InstantAccess which allows physical memory to be written and read from either the CPU or from the built-in Intel HD Graphics.

Needless to say, even in Vantage and 3DMark, Anandtech did not see a 56% increase (approximately 35% and 44%; games averaged 19%). You keep talking about real-world performance (Sandra isn't a good indicator of memory bandwidth), then defend AMD, who almost completely delivered on theoretical gains and not real-world gains.

Am I really expected to believe that trinity is that much better than the 6630m?

[Chart: 3DMark Vantage graphics scores]


Trinity hangs with it in almost every case. Which is fairly amazing.

Curiously, an 8-core Jaguar chip is not available yet, making the 8350 the only chip AMD could supply to devs at this point in time that would be similar to the 8-core Jaguar, in that it has 8 cores (the multithreading aspect).

Mark Rein also said the majority of gamers are using a 32-bit OS. And seriously, look at his job: would he really have said anything different?

 


Well, you can continue denying the facts presented to you if it makes you happy... let me add more info.

Even Nvidia has already accepted that the PS4 performs like their high-end GTX 680, and now they are trying to save themselves from further embarrassment with ridiculous claims such as "our GTX Titan is better than the PS4". LOL!

On paper, the fastest PC gaming card ever released by Nvidia must be better, but nobody will be developing games optimized for it, for the simple reason that only a very tiny subset of PC gamers own the ultra-expensive, hot, power-hungry GTX Titan released one month ago. However, all owners of a cheap, cool, power-saving PS4 will have the same powerful high-end hardware from AMD, and games will be optimized for it.

Or said concisely: 1 TFLOP-CONSOLE > 1 TFLOP-PC


 


What's your source that Nvidia said a PS4 is more powerful than a GTX 680?

I keep asking for benchmarks or respectable sources from you guys to back up your claims, but no one wants to provide any data/evidence.

I deal with facts, not speculation.

edit: And I expect direct quotes with source provided. I'm not reading through forum posts or random webpages to try to find what you're talking about.
 


Do you even read the messages before answering them?
 


Very true that console games often perform better than their PC equivalents, but also remember that the PS4 is going to be much easier to code for and port to the PC than the PS3 or Xbox 360. Many of the optimizations will easily carry over (possibly not the HSA ones). Also, console magic sauce exists mainly because AMD and Nvidia stop releasing drivers that improve performance on older cards, not because the cards are incapable of playing the game.

In strict GFLOP terms, a 630M (~310 GFLOPS) easily hangs with an Xbox 360 (often better) in straight console ports such as Dishonored, Mass Effect 3, etc. (often quite a bit better; the 630M gets about 45-55 fps in Dishonored at high settings, 768p). It's more the drivers that cause older cards to lose performance (look at games like Crysis 3 and Tomb Raider on Nvidia before performance-improving drivers were released). Older cards don't get these drivers.
 


I read it. Are you and 8350 going to start backing up your claims or continue to spew nonsense?

I'll use a little logic: in order to match something like an 8350 + 7970 in performance (AMD's own current flagship parts), they're going to have to use roughly the same amount of energy. In order to do work, you have to use energy.

Let's assume they made an AMAZING 100% increase in performance per watt in both their CPU and GPU technology, and that they're strangely saving it for console APUs while their CPU division struggles. Since there are claims a PS4 matches a high-end gaming PC, that would equate to an 8350 + 7970 (in terms of performance) from AMD's own hardware portfolio. An 8350 is a 125W part and a 7970 is a 250W part. That's 375W, but a 100% increase in performance per watt brings it down to 188W. What are they going to do with 188W of heat from an APU in a small enclosure? Keep in mind their current APUs are 100W. Again, that is assuming a 100% (double) increase in one generation, something that will never happen. A more realistic estimate would be 10-20% over their existing APU technology... which again would put it at entry-level gaming PC performance.

If you guys just thought about this scientifically for one second, you'd realize your claims are ridiculous. If you want to claim otherwise, I need something credible. Extraordinary claims require extraordinary evidence.
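The back-of-envelope power math in that argument can be written out explicitly (TDP numbers as cited above; the 2x perf-per-watt gain is the deliberately generous assumption):

```python
# TDPs cited in the argument above.
fx_8350_tdp_w = 125   # FX-8350 CPU
hd_7970_tdp_w = 250   # HD 7970 GPU

combined_w = fx_8350_tdp_w + hd_7970_tdp_w
print(combined_w)  # 375 W for the discrete CPU + GPU pair

# Even granting an unrealistic 2x perf-per-watt improvement in one
# generation, matching that performance would still dissipate:
apu_equivalent_w = combined_w / 2
print(apu_equivalent_w)  # 187.5 W, vs ~100 W for current desktop APUs
```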
 


1) Using energy efficiently is different from wasting energy due to inefficiencies. AMD has already explained why your logic is faulty:

For us, really by looking at that APU that we designed, you can’t pull out individual components off it and hold it up and say, ‘Yeah, this compares to X or Y.’ It’s that integration of the two, and especially with the amount of shared memory that Sony has chosen to put on that machine, then you’re going to be able to do so much more moving and sharing that data that you can address by both sides.

Or in more concise terms: 1 TFLOP-CONSOLE > 1 TFLOP-PC.

2) Several game developers praised the hardware on Sony PS4. Avalanche Studios said:

The PS4 will not only be a very powerful gaming machine from a hardware perspective, but it will also be a social tool and integrated marketplace more akin to the successful mobile devices. It's the best of all worlds in a way; great performance for demanding high-end gaming, good social ecosystem and connectivity, and integrated business marketplace. For Avalanche Studios as an open-world games developer this is super exciting and opens up many new opportunities.

It's a perfect fit for the types of games we do, and we are confident that we’ll bring open-world gaming to a whole new level because of it. I’m glad Sony decided to go with 8gb RAM because it means that the PS4 will out-power most PC’s for years to come.

Then Nvidia started its attack on the PS4 with:

If the PS4 ships in December as Sony indicated, it will only offer about half the performance of a GTX 680 GPU

Epic showed at GDC (Game Developers Conference) a demo of the PS4 running against a top high-end gaming PC: an i7 (Ivy Bridge) + 16 GB RAM (DDR3) + GTX 680.

Nvidia no longer mentions the GTX 680 these days and has now changed its attack to:

PS4 & X720 are 3x slower than the GeForce Titan.

I would like to know how Nvidia obtains the "3x slower", because (4.5 TFLOPS / 1.84 TFLOPS) is not 3x.
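A quick division with the TFLOPS figures above (assuming ~4.5 TFLOPS for the Titan and 1.84 TFLOPS for the PS4) shows the actual ratio:

```python
titan_tflops = 4.5   # GTX Titan, as cited above
ps4_tflops = 1.84    # PS4 APU, as cited above

ratio = titan_tflops / ps4_tflops
print(round(ratio, 2))  # ~2.45x, not the claimed 3x
```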
 
You still need transistors to do work, and you still need to power them with electricity. More transistors = more computational ability = more electricity usage = more waste heat. That heat doesn't get "used up": it has to be dealt with.

The CPU and GPU being on the same chip only affects their communication between each other, not their respective speeds. And yes, an APU is CPU cores and GPU cores on the same die. It's EXACTLY like combining a CPU and GPU. You increase the speed they communicate at, nothing more.

Still waiting on your link from the Nvidia rep....

As far as nonsensical quotes, John Carmack said this last summer:
"When you look for the best graphics available in the whole game industry today, you look at Xbox 360 and PlayStation 3..."
http://www.tomshardware.com/news/Kepler-Nvidia-Tim-Sweeney-Unreal-Engine-UE4,15571.html

Console developers say consoles are faster than PCs because they sell more games if people believe that. It doesn't make it true. Or are you also going to argue the Xbox 360 and PS3 are the pinnacle of gaming right now?

Are we still talking about measuring a GPU by TFLOPS? In that case, I have 4.18 TFLOPS, 2.2x more than the PS4. That debunks your whole argument about "next-gen" consoles being faster than gaming PCs.
 


http://en.wikipedia.org/wiki/John_D._Carmack
 


THANK YOU! For god's sake, this BS mantra of "The PS4 is using AMD, therefore AMD will do better in gaming" is pure speculative NONSENSE. Yet I see AMD fanboy trolls on this site spouting this complete BS as legitimate fact. I've seen this line repeated 6 times in 3 different threads already. I think AMD fanboys are living in an alternate reality from the rest of us.....
 


Some facts.

1) AMD's Gaming Evolved program.

2) Main game consoles will be using AMD chips.

3) Game development kits will be using AMD chips. Sony is selling a PS4 development kit based on an eight-core FX chip.

4) AMD will be selling a version of the PS4 APU for the PC market.

5) Games with the label "Optimized for AMD" are already launching now.

6) Common sense:

http://www.tgdaily.com/opinion-features/70487-amd-builds-a-game-console-monopoly

7) AMD's own elaboration on their relationship with the Sony PS4 (bold face mine):

In the case of the PS4, we leveraged the building blocks of our 2013 product roadmap [...]

This is going to be a very exciting year for gamers, especially for those with AMD hardware in their PCs and consoles, as we have even more game-changing (pun intended) announcements still to come.

Look for some more exciting things happening at the Game Developers Conference (GDC) in March when we will provide even more info on how we are working with game developers to make AMD the hardware of choice for running the best games!
 

1. Many Gaming Evolved titles are not performing as much in AMD's favour as expected (Crysis 3, Tomb Raider).

[Chart: benchmark results at 1920x1080, Very High]




After Nvidia released drivers for Tomb Raider:



2. True, AMD will be receiving optimizations.

3. True. However, there are no 8-core Jaguar chips out there, so an FX would be the closest chip to the one used in the PS4.

4. True; however, it will only be 4-core and cut down compared to the PS4 chip. There is some speculation about whether the PS4 CPU is really two 4-core Jaguar units (because Jaguar is essentially a four-core module) strapped together like a Core 2 Quad. Not sure about this. Look at point 7: AMD is probably reusing pieces of its existing technology as much as possible to save money.

5. True; however, the degree of optimization in most games is slight (generally not more than 30%).

7. Do you think they would honestly say anything else? Of course they are going to promote their products.

 
Ignoring all the comments above (too many for me to read), I believe it best to go with Intel. AMD has a worse track record, meaning their products are known to be less reliable. Also, with the new Haswell architecture arriving Q2 2013, the AMD CPUs will be much less power efficient and produce more heat. Taking one of the new Intel chips will let you save on cooling and on your electricity bill whilst giving you equal, or even better, performance than what AMD will offer for a similar price.
 
Less reliable how? I have never had an AMD chip fail on me that wasn't the fault of either myself or another component, like a power supply that decided to die and take the CPU and board with it. The difference in power consumption would be like the difference between a single incandescent bulb and a CFL. In other words, it isn't going to affect your power bill in any noticeable way unless you are doing something like F@H 24/7.
 


AMD chips are reliable and sound. They are used in servers and top supercomputers:

http://en.wikipedia.org/wiki/Jaguar_%28supercomputer%29

http://en.wikipedia.org/wiki/Titan_%28supercomputer%29

AMD chips hold the world record for overclocking thanks to their excellent fabrication. Intel does not hold the record due to a less reliable design (poor-quality thermals) in their Ivy Bridge.

The power consumption of high-end AMD CPUs is higher than an i7's (Ivy Bridge) at full load (because the chip is more powerful), but it is lower at idle.
 


The Pentium 4 had really high clock speeds and wasn't that good.

Any chip being sold in a server is very reliable. Generally speaking, all other things being equal, a chip using less power at lower temperatures is more reliable. However, even in consumer CPUs, the CPU is virtually never the failing part, assuming it is not being overvolted.

Idle power usage is similar enough that the choice of motherboard will affect the computer's overall power usage more. AMD is very competitive in this respect.

Under load, however, AMD uses significantly more power. Unfortunately for AMD, perf/watt is often very important for servers running CPU-intensive tasks (as opposed to I/O-bound work, where the CPU is generally not running at max).

Intel used a poor solder/heat-spreader interface on Ivy Bridge to save a few bucks. Shame on them.

World-record overclocking proves nothing, because it is out of the reach of virtually every consumer. People would rather have a chip with a much lower maximum overclock that can reliably get close to it than a chip that overclocks very high but is extremely difficult to push that far.
 
Interesting thread...

Just in case anyone's still there, I'll post my 2 cents on the original "Haswell vs Piledriver" question:

As always, it comes down to the BENCHMARKS. Just because the PS4 is using an AMD 8-core CPU does not necessarily mean the AMD desktop CPU is the better choice over the Haswell CPU (i5-4670K?).

It's POSSIBLE that upcoming games will suddenly benefit more from that AMD CPU, but that's pure speculation. Speculating that future AMD CPUs and graphics cards will be the better choice due to the PS4 using AMD is fine, but there's little evidence that this is actually the case.

If you're building a new PC and are looking at a CPU that's roughly $200+, the choice is probably the i5-4670K, though I'd really love to see an 8C/8T CPU from AMD that solves some of their current flagship's issues, such as single-core performance.

It will be fascinating to revisit this issue in a year to see if any games based on the Unreal Engine 4 that are designed for the PC and PS4 simultaneously start favoring the current AMD FX-8350, for example.

Until then, benchmarks, not crystal balls.
 
Well, that was quite a lot to read through. Now that I'm done, I will simply say: let us look at the past. Anyone attempting to deny that hardware optimization from a console to a PC port will matter is too far gone to be worth communicating with. It is pure logic: if software is highly threaded, it will perform better on multi-threaded CPUs. You cannot deny that or attempt to downplay it as mere speculation; it is well-guided, evidence-backed speculation, drawing heavily on past ports. I'm not staking a claim for Intel or AMD here, but wake up, people: stop attacking other posters about AMD when they are the only two to actually add some reference information. All any of you have done is argue with Juanrga and 8350rocks; why not present some information negating their facts and informed speculation? Argument is futile when they reproduce factual information from numerous sources.

Thank you, good day.
 


Except that the claims are being backed up by data and common sense. Eurogamer approached a number of AAA game developers asking whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. All of them opted for the AMD FX-8350 (eight threads) over the i5-3570K (four threads).

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen

As shown in the article, a new game such as Crysis 3 runs faster on an AMD FX-8350 than on an i7-3770K (both eight threads). An ultra-expensive twelve-thread i7-3930K beats the FX... but the hypothetical Haswell i5-4670K (four threads again) won't be that fast, and recall the rumour that AMD will be releasing the Centurion chip to beat both the 3930K and the extreme chips. And next come AMD's Steamroller chips...
 


I didn't say the FX-8350 wasn't the way to go; I said it's quite possible it is.

Also, when you buy a CPU, you want the best performance on AVERAGE, not just in a couple of games. Your own article states the i5-3570K is the better CPU for most games now:

"the Core i5 3570K - offers four cores at 3.4GHz. In a world where single-core performance still dominates, the Intel offering is still considered the better buy"

So it's not that simple.

We're talking about SOME games in the near future benefiting from the AMD CPU, not all, or necessarily even most. And of course, most people will buy older games through Steam (or have them already) that run best on the INTEL.

That's my "common sense" argument.
 


You have so many replies that I am not sure you will get to this one.

Ivy Bridge motherboards have been around long enough that they should have the bugs worked out. Also, the capabilities of Ivy Bridge are well known and can be found on the internet.

I still favor an Intel build for gaming, and I believe gaming was your concern. Hover your mouse over my build to see what I put in it. By the way, an SSD is not necessary; save the money for a faster video card or a high-quality power supply [never skimp on a PSU].

Actually, once you get past 4 GHz [which is very easy to attain] on an Intel i5-3570K or 3770K, the bottleneck in most games will be the video card.

Ivy Bridge and motherboards that support them are blazing fast. The vast majority of games are very happy with no overclock.

There are some NON-game apps that benefit from the highest-end AMD CPUs over Intel CPUs with fewer than 6 cores + 6 virtual cores.

For gaming you will not need any more than I put in my computer, although you may want to get a faster video card.

You can do as you please, and you have been given much advice. I do know that my computer runs very fast.