AMD Piledriver rumours ... and expert conjecture

Status
Not open for further replies.
We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, or green team and they will be deleted.

Enjoy ...
 



Read my post carefully.
At a CONSTANT 30 fps and a CONSTANT 50 fps you would NOT see a difference.
The human eye can't possibly tell the difference.
It is only fluctuations in FPS and drops in minimum frame rate that you would see.
But if each ran at a constant 30 or a constant 50 fps, you couldn't tell the difference.
Simple biological fact.
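For reference, what the two constant rates mean in raw frame intervals (a quick sketch; whether the gap is perceptible is the poster's claim, not a fact established here):

```python
# Time between frames, in milliseconds, at a given constant frame rate.
def frame_interval_ms(fps):
    return 1000.0 / fps

print(frame_interval_ms(30))  # ~33.3 ms between frames at 30 fps
print(frame_interval_ms(50))  # 20.0 ms between frames at 50 fps
```

So a constant 50 fps delivers a new frame roughly 13 ms sooner than a constant 30 fps does.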
 



I guess you forgot about China?
The largest communist country in the world.
Only 1.4 billion people, give or take a few million.

 


You would. My eyes don't lie and I game nearly daily. There is a large difference. You wouldn't know that difference with an old Phenom and a 6870.
 
"AMD’s upcoming desktop Trinity parts are expected to launch in a few weeks and a US retailer has already jumped the gun with some early listings. The listings do not reveal much, but they seem to confirm that Trinity chips will end up on the cheap side.
The pre-order price for the A4-5300 dual-core is $59.60, which sounds fair for a 3.4GHz/3.7GHz part with 1MB of L2 cache and HD 7480D graphics. The black edition A6-5400K comes in at $73.59. It is clocked at 3.6GHz and it can hit 3.8GHz on Turbo and it features beefier HD 7540D graphics. Both dual-core parts are rated at 65W.

In the quad-core world, AMD offers the A8-5500 for $109 and the A8-5600K is also listed at $109. The 5500 is clocked at 3.2GHz/3.7GHz and it has a 65W TDP, while the 5600K runs at 3.6GHz/3.9GHz, but it ends up with a 100W TDP rating. Both have 4MB of cache and HD 7560D graphics.

The A10-5700 and A10-5800K follow the same pattern and both cost $131. The 5700 is a 65W part, clocked at 3.4GHz base and 4.0GHz turbo, while the 100-watt 5800K runs at 3.8GHz/4.2GHz. Both have HD 7660D graphics and 4MB of L2 cache"

source - http://www.fudzilla.com/home/item/28599-desktop-trinity-listed-in-us-affordable-pricing-revealed
 
Yeah, there really is no news out there.

Did find this too:


"It appears that a forum member over at Coolaler forums managed to get his hands on an engineering sample of AMD's upcoming Piledriver FX-8300 series, codename Vishera, eight-core CPU and even run some benchmarks.
As it was rumored earlier, AMD plans to launch a total of three CPUs in the FX-8300 series, the FX-8300, FX-8320 and the FX-8350. The flagship FX-8350 is expected to work at 4.0GHz base and 4.2GHz Turbo clock. It is still not clear which CPU did Coolaler forum member managed to get, but it did work at 3.3GHz, according to the Cinebench R11.5 screenshot, so it all points to the FX-8300. On the other hand, the CPU-Z shows the TDP at 124W which is the rumored rated TDP for higher clocked FX-8320 and the flagship FX-8350.

As far as the benchmarks are concerned, the tested engineering sample scored 18705 points in 3DMark Vantage benchmark CPU test, 4986 points in 3DMark 06 CPU test, did SuperPi 32M in 24m 20.946s and scored 5.73 points in Cinebench R11.5 CPU test.

Of course, these are only early benchmarks done with an engineering sample CPU, but they do give a clearer picture of what we can expect from the upcoming Vishera-based FX-series CPUs"

source- http://www.fudzilla.com/home/item/28303-amds-vishera-es-fx-series-cpu-benchmarked

might be old news to some but interesting reading for those catching up
 



China is communist the same way America is a monarchy. :ange: Now North Korea, that is communism.
 


That leads to stuttering, which can happen if it takes too long to create a given frame. For example, if one frame takes 30ms to render and the next only 2ms, you have a "steady" 60fps on average (1 frame every ~16ms), but it won't feel smooth. Core 2s, even if they can spit out decent frame counts, are going to be far more jittery simply because of the FSB. TechReport did an examination of this a few weeks back, and it was QUITE enlightening.
 


There is a difference between being able to distinguish a single frame and noticing how much smoother the screen is between 30FPS and 50FPS. Trust me, I played for years on my 240Hz CRT; there's a difference. You won't be able to distinguish a particular frame (the eye isn't that sensitive), but you CAN distinguish the final result.
 


Capitalism IS dead, and has been for almost 200 years now. Every "Capitalist" country embraces some form of Socialism.

And before anyone complains, answer this: Would you support the removal of every worker protection ever passed (Minimum wage, all benefits, maternal leave, 40 hour work week, child labor, and so on and so forth)?

I also note that the 1% holding 90% of the wealth is unsustainable; money sitting in a bank, not being invested anywhere, contributes nothing to economic growth. Hence the Western world's problems right now. No surprise that as the few gain more wealth, the economies fall into sustained recession.

So yeah, Piledriver...
 
Hey AMD fans, do you think Haswell could help out AMD in the long run? I mean, in the short term it won't help profits much, but the fact that all models have at least 4 physical cores might create a shift towards programming for multiple cores, something that favours AMD's desktop business model.
 

+1. It is quite possible to distinguish the FPS differences.
Additionally, once you've played at higher FPS, it's harder to go back to playing at lower FPS.
 



You totally missed my point and just repeated yourself.
 

AMD is dead....
 

Well, the 8120 @ 3.1GHz scores about 5 points, which would make that about a 15% improvement. No guarantees until the chips are released, though.
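Checking that arithmetic against the quoted Cinebench numbers (the FX-8120 figure is the poster's estimate; the 5.73 comes from the engineering-sample article above):

```python
fx8120_score  = 5.0    # Cinebench R11.5, FX-8120 @ 3.1 GHz (poster's figure)
vishera_score = 5.73   # engineering-sample score from the quoted article

# Relative improvement of the Vishera sample over the FX-8120.
improvement = (vishera_score - fx8120_score) / fx8120_score * 100
print(f"{improvement:.1f}% faster")  # 14.6% faster, i.e. roughly 15%
```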
 
That is not quite accurate. The figure was 10-15% faster, from interviews with actual AMD engineering executives. Do not rely on trolling reports. There was a fraudulent benchmark reported by someone called OBR on his website. The test results were supposedly based on an engineering sample, but there were too many factual inconsistencies in his report to give it any credibility.





 


orly? Ever heard of Radeon? Yep, the GPU division is doing pretty well.



Hmmm, I would not be surprised if they pretty much rebranded the whole thing. The 5770, 6770, and 7770 weren't far from each other either. On a side note, the 7870 entered the market at a way higher price point than it should have. AMD could have sold so many more cards if they had priced them right from the beginning, but it's that EA Battlefield 3 mentality: "release sooner than the competitors and those stupid customers will jump on my product no matter what, just because it came out first."
 