AMD Piledriver rumours ... and expert conjecture

Status
Not open for further replies.
We have had several requests for a sticky on AMD's yet to be released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 
Hell, let's be generous and say 256MB per memory stick at Windows XP's release.

256/8 = 32MB per chip, or 256Mbit DRAM. Now you're talking Windows XP on ... 32MB of memory.

There is no way around the fact that if you can cheaply put it on a CPU, then you can cheaply put eight of them onto a stick for $40. And thus system memory capacity will always be 8~16x greater than what's possible with a CPU. Our memory needs are growing, not shrinking.
 
I agree.

Taking into account that XP is WAY smaller than the Vista/7 family... what can we conclude on the software side?
 
"Minimum is running, recommended is smooth, beyond recommended is butter."

Can I use that quote JS?
I will give you credit LOL

Go for it. Same goes for games. I love when people get pissy because they have the minimum specs for a game and expect it to run smooth as butter. I always try to have beyond recommended for a game or OS. But then again, that's just me.

I had 2GB of RAM with my Pentium 4 system and XP in 2003 (that was WAY beyond what was normal back then), 4GB with Vista in 2007 (everyone was on XP and 1GB), and now 16GB with 7 while most stick with 4GB.

I always try to stay ahead of the curve. Now I need to get my wife's up to 8GB.

Running what exactly? An OS without applications is as useful as a car without wheels. You can look at it all day long, but it won't actually accomplish anything.

Also keep in mind the memory density available when Windows XP was published: 64MB was the standard stick size, that's 8MB per chip, or 64Mb fab technology. So this super CPU with local memory would be running Windows XP with 8MB.

I am not arguing that it's better to have more. Look at what I said: SP3 is the main reason you need more than recommended. Hell, you need 900MHz for SP3 while XP pre-SP was 266MHz. It changes.

But considering the line of work I am in, I can tell you how a PC runs with apps on any amount of RAM and any OS. I have dealt with XP machines with 128MB of RAM (crappy) and XP systems with 4GB (no different from 1GB unless the app can use it). I have messed with Vista with 512MB (holy slow PC, Batman!) and Vista with 2GB (smooth but still a bit meh) and more. Same with 7. We used to build 7 systems starting with 2GB, and I have worked on them. They ran fine, mainly due to Superfetch, but still it's not, as you seem to be implying, slow. At 1GB it is slow, even on 7 Starter with nothing loading. But at 2GB it is very nice, 4GB is smooth as butter, and anything else is just icing on the cake.

Windows ME = best OS ever...
:lol:

Yep. Had a friend who said it was great because he knew how to fix all the issues when they occurred.
 
Memory needs are growing, not shrinking. The biggest constraint over the past five years has been 32-bit code. On Windows NT, x86 applications are only given 31 bits worth of address space; the final 32nd bit is used as an easy way to distinguish between shared kernel memory and local application memory. Every application gets 2GB of its own virtual address space, but the kernel has a single 2GB of address space. This is a left-over from NT 4.0 days. The x64 NT kernel has no such limitation, and applications have a ridiculously large address space. If an application attempts to load more than 2GB worth of data into its memory space, it will fault as it attempts to touch the protected kernel memory space, and the application will crash. Anything over about 1.8GB presents the risk of inadvertently crossing that line by loading something slightly too big into memory. This is why you can have 16GB of memory in your system but Skyrim will only load a limited amount of data: the programmers didn't want to risk crossing that 2GB boundary.

As programs start to be compiled and released as 64-bit executables this limitation will go away, and you'll see programs start to load 4GB+ of data into memory. Games are already 4~10GB+ in size, so large that some of them take multiple DVDs to install. This, combined with system caching, points to more memory being required, not less.

Obsidian's developers went into detail about these problems after they made NWN2. NWN2 was a 32-bit executable, but the Aurora toolset they had made to create and develop NWN2 had big issues with crashing on their development machines. 2GB was simply not enough memory for their toolset to load all the required resources, and they were forced to develop a 64-bit version of it so they could finish developing the game. NWN2's last official patch was 2009; it's now 2012.
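The 2GB/2GB split described above can be sanity-checked with quick arithmetic. A minimal Python sketch (just the default 32-bit NT split as numbers, not output from any Windows API):

```python
# Default 32-bit Windows NT virtual address space split:
# the low half goes to each user process, the high half to the kernel.
GB = 2**30

total_32bit  = 2**32               # 4GB of virtual addresses in total
user_space   = 2**31               # low 2GB: per-process user space
kernel_space = total_32bit - user_space  # high 2GB: shared kernel space

print(user_space // GB)    # 2 (GB available to each 32-bit app)
print(kernel_space // GB)  # 2 (GB reserved for the kernel)
```

This is also why the "safe" working set for a 32-bit game ends up noticeably below 2GB: a single large allocation near the ceiling is enough to fail.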
 
And look very carefully at the timelines these happened on.

Or do you think 2GB will be "enough" two years from now?

That's the thing I'm trying to get across: we're not talking today or even tomorrow, we're talking two or more years from today for the sizes you're talking about.

Today's equivalent would be Windows 7 with 512MB of memory, which you and I both know isn't going to cut it.
 
With so many resources being stored, how does a processor handle that much?

I mean with 64-bit processing, up to 192GB of memory can be utilized... is that how far we will go in the next 20 years?

The answer to your question is a simple one, how do you eat an elephant? One bite at a time.

CPUs are capable of processing at a ridiculous rate; more often than not it's I/O that slows things down, as the CPU stalls waiting on a cache miss.

If you want an idea of what memory will look like, you just need to figure out what process will be standard at a specific period of time. The lion's share of memory tends to have eight chips per stick at ~$40~50 USD. Find the common chip density and multiply by 8 to find the stick size.

Today it's 4Gb, so 4 * 8 = 32Gb per stick; divide by 8 to convert to bytes, 4GB per stick.

If 8Gb chips become cheap then you get 8Gb * 8 = 64Gb / 8 = 8GB per stick.

So basically expect chip sizes to double every 2~4 years depending on technology advancements. Samsung's at 30nm now and working on 20nm as we speak.
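The rule of thumb above can be sketched as a tiny helper. A back-of-the-envelope calculator, assuming the eight-chips-per-stick layout described in the post (real sticks vary, e.g. double-sided modules):

```python
def stick_capacity_gb(chip_density_gbit, chips_per_stick=8):
    """Stick capacity in GB from per-chip density in Gbit.

    total bits = chip density * chip count; divide by 8 to get bytes.
    """
    total_gbit = chip_density_gbit * chips_per_stick
    return total_gbit / 8  # bits -> bytes

print(stick_capacity_gb(4))  # 4.0  (4Gb chips -> 4GB stick)
print(stick_capacity_gb(8))  # 8.0  (8Gb chips -> 8GB stick)
```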
 
You know guys... I really don't care about programs addressing more than 2GB of RAM or using all 2^64 address bits... I just want the damn devs to not leave nasty memory leaks all over the place. Valve has a bible on memory management from when they talked about Xbox development. It's a pretty nice read, BTW.

Not even enthusiasts use more than 4GB of RAM: since games don't address more than 2GB and the OS is happy with 2GB for day-to-day tasks, we don't even get close to 6GB for gaming. I was just playing RAGE and the system went up to 3.2GB combined; the game itself was chugging "only" about 900MB. I've seen CitiesXL's memory leak use 2GB, lol. Anyway, only workstation-like PCs use more than 4GB.

Oh, and AMD needs to beef up the bloody HT link. It should have been done with FM2, but oh well... Trinity won't be satisfied with that small and old thing, lol. Maybe using that on-die/on-layer/in-chip memory would really help the APUs; but they do get really hot, so I don't think it would be feasible.

Cheers!
 
Yep, games tend not to use more than 2GB due to 32-bit Windows. That's slowly changing as we speak; any 64-bit software will be able to consume as much memory as it needs. 32-bit is dying a slow painful death, and it will die off just like 16-bit DOS did before it. This limit doesn't exist on Linux / Unix systems.

Your system can definitely use more than 4GB of memory; Windows 7 x64 in particular has a good caching system. And while I'll admit I'm on the upper end of enthusiasts, I do have 16GB and I put it to use.
 
The 192GB limitation is only because of the OS, not 64-bit. 64-bit can actually address up to 17,179,869,184GB. Yes, that's 17 billion GB: 2^64 bytes.

Doubt we will ever see that much in our lifetime. Maybe servers, but even then that's a lot of memory to use.
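A quick check of that figure, expressing the full 2^64-byte address space in GB (note current x86-64 CPUs actually implement less than the full 64 bits of virtual address, but the arithmetic below is for the full space the instruction set allows):

```python
GB = 2**30

# 2^64 bytes expressed in GB: 2^64 / 2^30 = 2^34
full_64bit_gb = 2**64 // GB
print(full_64bit_gb)  # 17179869184, i.e. ~17 billion GB (16 exabytes)
```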

Skyrim can naturally use more than 2GB now (it was just patched a bit ago), but memory leaks suck.

As for HT, I agree. My 2500K is pushing almost 22GB/s and a friend who has his memory at 2200MHz is at like 27GB/s. Not sure why AMD hasn't pushed it higher.

Still more memory bandwidth than is needed for desktop, but servers sure don't mind the extra headroom.

And how is William 'Bill' Gates nowadays? :lol:

Probably very happy, since he does what he loves: philanthropy. The guy is a great person. Only wish I could meet him. I say Bill Gates for President.
 
Thanks for clearing that up.

And I want to be an IT Tech this summer? :/
 
Applications are slowly converting over.

Waterfox .... yes it's what you think it is.
http://www.makeuseof.com/tag/waterfox-speedy-64bit-version-firefox-windows/

Just an example.

Is there a 64-bit Skyrim client? If not, then the most you can hope for is 3GB using an NT kernel trick, something that's dangerous as it screws with kernel memory. Otherwise you'd have to have a separate process with its own address space and communicate via IPC ... eww.


Also, Windows 9 (or whatever its name ends up being) will be 64-bit only. MS wants to phase out 32-bit.

http://www.windows7news.com/2011/07/16/windows-9/

And I agree, memory leaks suck. Mostly bad programming practices: allocating memory space and never deallocating it.
 
@palladin- my system has 8gb of ram-does using IE9-64bit make a difference compared to the 32bit version?

Depends how many tabs you have open and what you're doing.

Security wise, definitely. The 32-bit memory model of Windows NT was designed back in NT 4.0 days; programs are allowed to make privileged calls to kernel memory, and 16-bit programs are allowed to write to kernel memory indiscriminately. The 64-bit NT kernel enforces a much stricter memory architecture: only kernel drivers and processes are allowed to write to kernel memory, and user-mode drivers are restricted from privileged access, even if you have administrative rights. 32-bit applications must run in a compatibility environment known as Windows on Windows 64 (WoW64), and this includes emulated system access. Thankfully a 32-bit program can never access kernel memory of the 64-bit kernel, but it can interfere with the WoW64 virtual kernel and screw up other 32-bit programs that might be running at the same time.

Performance wise, running a 64-bit client natively on a 64-bit OS would be faster than running a 32-bit client in an emulated environment on a 64-bit OS. Running a 32-bit client natively on a 32-bit OS would be faster than both of the above due to the reduction in administrative overhead.
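For what it's worth, a program can check at runtime whether it is a 32-bit or 64-bit process, which determines which of the address-space rules above apply to it. A small sketch in Python, using the size of a C pointer in the running process:

```python
import struct

# struct.calcsize("P") gives the size in bytes of a C pointer (void*)
# in this process: 4 bytes for a 32-bit build, 8 bytes for 64-bit.
pointer_bits = struct.calcsize("P") * 8
print(pointer_bits)
```

On a 64-bit build this prints 64; under a 32-bit interpreter (e.g. one running under WoW64) it prints 32.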
 
When talking 32 vs 64 for code you've got to understand how CPUs work. Before you start sending instructions to a CPU you must first tell it which mode to operate in (if it has multiple modes). So 64-bit Windows sets the x86 CPU into "long mode", where each register is treated as a 64-bit register instead of a 32-bit one, and the stack and MMU are also treated as 64 bits wide. A CPU can change between operating modes, but not without first clearing out all state information and doing a full reset (wiping stacks, pointers, registers, the register rename file, cache, etc.).

Now EM64T allows execution of 32-bit instructions natively, which means you don't need to emulate the HW or translate the instructions. What it doesn't allow is a 32-bit instruction receiving a 64-bit result from an operation. Executing 32-bit instructions on a 64-bit register can produce unpredictable results if measures are not taken beforehand. So there is always some overhead when executing native 32-bit code on a CPU running in long mode. Basically you can NEVER mix 32-bit and 64-bit machine code together without unpredictable results.

What is Windows' part in this? Well, since the NT kernel differs between 64-bit and 32-bit Windows, 32-bit programs expect the kernel to answer in certain ways and expect certain shared code libraries to be available, and Windows provides them with what they're looking for. There is a fake kernel environment called WoW64, which includes a bunch of 32-bit DLLs and system files. You can see them for yourself if you look under C:\Windows\SysWOW64; C:\Windows\System32 actually contains the 64-bit libraries and drivers. So Windows creates the fake 32-bit environment and then runs all your 32-bit programs inside of it. It does this so it can segregate the 32-bit sections of the OS from the native 64-bit ones, keeping compatibility while also using a significantly more secure memory and driver model.

Windows also provides a separate registry view and Program Files folder for the 32-bit programs to live in.
 
What drugs are you on? I want them, because 32-bit rules the land.
 