AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet to be released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post questions or information relevant to the topic, or your post will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, or green team and they will be deleted.

Enjoy ...
 


Hey gamer, not being rude at all, I'm really not. I was just wondering what you do for a living; you seem to know a lot about programming. Is it a hobby or a professional job? It'd be cool if we're talking to an actual game programmer.
 


Really? Wow, I play AoE3 and it only uses 2 cores (even though Task Manager will show 6 in use :sol: )
 

The PS3 has 8 as far as I can remember. Consoles are set up differently, and most console games ported to PC run like crap.

There are still games today that eat up 1 core, like Guild Wars 2 and Mass Effect 3, but at least Valve made the Half-Life 2 games scale with cores, and BF3 clearly scales with cores too.
 

Again, single player. Most people don't buy a multiplayer-based game to only play single player.

I don't know of very many games that use more than 1.5GB of system RAM as is. It's sad to see so few developers pushing what the PC is capable of doing. I always thought that if you were going to buy a fairly low-end build for gaming, you may as well go console, because there won't be a big difference between them. This just seems like an endless loop: devs don't push what the PC can do because they want all the lower-end systems to be able to run their games, people with lower-end systems can run these games, so they feel no need to upgrade --> stagnation.

I really criticize other devs and praise Valve for their coding work. HL2 came out in 2004, and it scales to 4 cores almost perfectly. L4D2, Portal 2, and TF2 scale to 6 pretty well.
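If you want to eyeball that kind of core scaling yourself, here's a minimal sketch (assuming Python with the psutil package installed, nothing anyone in this thread actually posted) that samples per-core utilisation the way Task Manager does while the game is running:

```python
# Rough sketch: sample per-core CPU usage for ten seconds while a game runs.
# Assumes the psutil package is installed (pip install psutil).
import psutil

for _ in range(10):  # ten one-second samples
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{load:5.1f}%" for load in per_core))
    # A game that really "scales to 4 cores" should show sustained load on
    # four cores, not one core pegged at 100% while the rest sit idle.
```

Nothing fancy - it just prints the same per-core numbers Task Manager graphs, in a form you can log and compare between games.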
 

NO, do not get the FX-63xx or whatever they come out with...
*sigh*

that's the worst chip of the group, just like the FX-6100 and FX-6200 are/were.
 


Thank god Intel finally caught up to Xbox 360-level graphics with the HD 3000.

They'd better ramp up IGP performance more with Haswell... I don't want to see any more games developed with the GMA 950 in mind.
 


2 extra cores for $10. How is that the worst?

For gaming, you're probably better off disabling the even cores and having 3 full cores with no L2 thrashing.
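One way to try that idea without touching the BIOS is per-process CPU affinity. This is just a sketch, not something tested in this thread: it assumes Python with psutil installed, that the OS numbers the two cores of each Piledriver module as adjacent logical CPUs (0/1, 2/3, 4/5), and the process name is a made-up placeholder.

```python
# Sketch: pin a running game to one core per FX-6xxx module (the odd cores),
# which is the software equivalent of "disabling the even cores".
# Assumes psutil is installed and adjacent logical CPUs share a module.
import psutil

GAME_EXE = "game.exe"            # hypothetical process name - change to suit
ONE_CORE_PER_MODULE = [1, 3, 5]  # one core from each of the three modules

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == GAME_EXE:
        proc.cpu_affinity(ONE_CORE_PER_MODULE)
        print(f"Pinned PID {proc.pid} to cores {ONE_CORE_PER_MODULE}")
```

Same effect as ticking boxes in Task Manager's "Set affinity" dialog, just repeatable.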
 

Actually, where I live the 4300 and 6300 are both $129; which one would YOU buy?

I built 4100 and 4170 systems for a couple of people and they were terrible; they made my old Athlon II X4 look good. The 6300 in benchmarks isn't that far behind the 8320 (and in some cases passes the 8320), while the 4300 sat at the bottom.
 


LOL - maybe so..

Kinda old article, but explains somewhat how Intel chooses their code names: http://www.tomshardware.com/news/Intel-Sandy-Bridge-Code-Name,11875.html

A recent blog post over at Intel sheds some light on the code-naming process - the way names are selected from a limited list of options, and the legal as well as in-house challenges. Both AMD and Intel usually go after public places that cannot be trademarked and do not carry any risk of lawsuits. Until recently, AMD used the locations of Formula 1 race tracks (a choice made due to the company's involvement with Ferrari), and Intel has been using geographical places "that can be located on a map", thanks to Frank Zappa.

According to Intel, the company used "fun" and artist-related code names before, such as "Joplin" or "Morrison", but was legally challenged when the Frank Zappa Estate was not too happy when it heard about "Intel Zappa". Intel previously also used Disney names, planets and cartoon characters. "There've been all sorts of names," Intel's Russ Hampsten said. "We used planets, moons, cartoon characters - we even had a dinosaur series of codenames around the time 'Jurassic Park' came out. My all-time favorite was probably 'The Raptor.' It was interesting, fun and it sounded mean."

Sandy Bridge is a rather boring example. The chip was originally named "Gesher", which means "bridge" in Hebrew. However, Intel discarded the name when it was discussed that Gesher is also the name of a former unsuccessful political party in Israel. While Sandy Bridge sounds like a place on a map, it isn't: "Despite sharing its name with a bridge in Singapore and a historic town in West Tennessee, Sandy Bridge isn’t named for an actual place," Intel said. "Instead, it’s a result of a switch, and a suggestion from upper management following a meeting with analysts."

The successor to the current Sandy Bridge generation was originally supposed to be called Molalla, after a town in Oregon. However, that name isn't especially easy to pronounce in all languages, so the company looked for a different name. Eventually, Haswell was chosen. Haswell is "a town claiming the country's smallest jail," Intel said. "I had to go all the way to Eastern Colorado and a town of under 100 in population to get a tolerable name not taken or trademarked," Russ Hampsten said.

Wonder if the P4 shouldda been named "The (c)Raptor" 😛
 



for gaming....
the 4300 over the 6300
now multi-tasking is something different, so make clear what you're talking about.

and this disabling half of the core or module (whatever) to get better performance....
I read about that, but my thing is: why should you or anyone have to be bothered with that
if the chip wasn't a design flaw or crud in the first place...?
you don't hear about that disabling nonsense with Intel chips..
 


BF3 is one of the most CPU-heavy games in multiplayer; it hammers dual cores and even a Q6600 @ 3GHz any day. It also scales across cores well, so it's good for measuring total CPU juice.



That's pretty much what I keep saying. Next-gen consoles won't run 8- or 10-core processors. They will most likely have simple quad cores, and games will be optimized for them.
 


Well, I have the 2700K at 4.5GHz and it doesn't seem to be suffering. Doing desktop stuff, no difference between the 965 and the i7 so far; gonna do some benchies this afternoon.

And we need more info (benchies) on the 6300 and 4300 CPUs. Maybe they do replace the X4's legacy... Finally, haha.

Cheers!
 

for one, welcome back home from the US...
two, I was just making light of the HT and the 3570K, and I wouldn't disable HT if I had it either.

no difference in surfing and light gaming, OK I can see that.
do the benches and get back to me...

I'm more curious about the 980BE vs 4300 chips in particular; I have NO interest in the FX-6 series.
 

Read what I said about BF3... in single player, yes; in multiplayer, NO WAY. I have tested this with my 8120 cut down to 2 modules. IT'S NOT EVEN REMOTELY CLOSE.

latency IS fps

frames per second
seconds per frame

SAME THING, just inverted. Longer latency = lower fps. 16.7 milliseconds per frame = 60 frames per second. 32 milliseconds on one frame = an instantaneous FPS of roughly 31 for that one frame, i.e. minimum FPS is set by the longest time between frames. 99th percentile latency = minimum fps.

if you don't take my word for it, check out TR's own statement.

"... with our latency-focused 99th percentile frame time results converted to FPS for easy readability."

So the entire article they call it 99th percentile frame time, but for the chart, guess what ... MINIMUM FPS ...
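For anyone who'd rather see the arithmetic than argue about it, here's a tiny sketch of the conversion - the frame times in it are made-up sample numbers, not benchmark data:

```python
# Frame time <-> FPS: FPS is just 1000 divided by the frame time in ms,
# so a 99th-percentile frame time converts straight into a "minimum FPS" figure.
frame_times_ms = [16.7, 17.1, 16.9, 33.4, 16.8, 18.0, 16.7, 32.0, 17.2, 16.9]

def to_fps(ms):
    """Instantaneous FPS for one frame."""
    return 1000.0 / ms

sorted_times = sorted(frame_times_ms)
# 99th-percentile frame time: the time that 99% of frames come in under.
idx = min(len(sorted_times) - 1, int(round(0.99 * (len(sorted_times) - 1))))
p99_ms = sorted_times[idx]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
print(f"average frame time : {avg_ms:.1f} ms -> {to_fps(avg_ms):.1f} FPS")
print(f"99th pct frame time: {p99_ms:.1f} ms -> {to_fps(p99_ms):.1f} FPS")
# 16.7 ms/frame -> ~60 FPS; a 32 ms frame -> ~31 FPS for that one frame.
```

That's all TR is doing: reporting the slow-frame tail as an FPS number so it reads like a minimum-FPS chart.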
 