Will we ever have 128-bit processors?


Fixadent

Sep 22, 2016
We've had 64-bit processors for quite a long time now, but do you think we'll ever have 128-bit CPUs?

And what would be the advantage of that?
 
Solution
The number of "bits" a CPU supports has always referred to the address space the CPU is capable of addressing [e.g., how much memory it can use at any one time]. From that perspective, there's really no need to go beyond 64-bit, as we aren't ever going to saturate the full 64-bit address space [16 exabytes].
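As a quick sanity check on that 16-exabyte figure, here is a minimal sketch (my addition, not from the post) working out the size of a full 64-bit address space:

```python
# Size of a full 64-bit address space.
bytes_addressable = 2 ** 64
print(bytes_addressable)            # 18446744073709551616 bytes
print(bytes_addressable / 2 ** 60)  # 16.0 EiB (~18.4 decimal exabytes)
```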
I'm going to focus on floating-point numbers.
If you use a real-valued function in Excel, e.g., =RAND(), you will notice that the output is a number with 15 decimal places, i.e., a 64-bit double-precision number.
We will need a 128-bit processor when we have numerical problems that use numbers with around 30 decimal places.
Although we could pack two 64-bit numbers into a 128-bit processor, it is easier to "repeat" the 64-bit processor design N times on a chip (the number of cores) than to pack numbers into an Nx64-bit processor (that would imply redoing everything, including the instruction set).
Today we have the Xeon Processor E7-8894 v4 with 24 cores (matching that width in a single wide design would mean a 1536-bit processor), but within 2 years we will have more powerful processors simply by adding more cores (there is no theoretical limit to their number).
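To make the precision point concrete, here is a minimal sketch (mine, not the poster's) showing where a 64-bit double runs out of digits, and how 34-digit quad-style precision, which a 128-bit float would provide in hardware, keeps going. It uses Python's standard decimal module as a software stand-in:

```python
from decimal import Decimal, getcontext

x = 1.0 + 1e-16          # 64-bit double: ~15-17 significant digits
print(x == 1.0)          # True -- the 1e-16 is rounded away

getcontext().prec = 34   # ~IEEE 754 binary128 ("quad") precision
y = Decimal(1) + Decimal("1e-16")
print(y)                 # 1.0000000000000001 -- the digit survives
```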
 
Not in my lifetime, and maybe not even in my kids', there'll be an exodus. With war on a large scale about done, and medicines killing or preventing plagues and other mass nature-derived deaths, people in several generations will be seriously looking for real estate. China is only the first to cap births. So where do you go? The moon, to start with. What do you take with you? Whatever you can cram onto the shuttle. World-brain computers running terraforming plants, etc. You'll have PCs a hundred times faster and more powerful than now, smaller than today's cellphone. Warp drive calculations, craft like the Enterprise, it's coming. AI like in "I, Robot"? Not that far off.

Is there a need for 128-bit processors now? Nope. Tomorrow? Prolly not. But what about the day after? AMD dropped a bomb several years ago, the FX 8-core CPU, and was basically ignored as a serious contender since a dual-core Intel was faster and stronger per core. Yet here we are now: the i5 has 6 cores, the i7 has 12 threads, and Xeons are dropping 32 cores. Go figure progress. AMD's limitations on the FX were set not by the CPU, but by software development catering to Intel's $. Now there are games that'll actually use 8 threads.

Once CPUs hit silicon limits at 5nm, physical core size will be limited unless you start stacking like HBM, so while theoretically there are no limits, physically there are. So CPU architecture will have to change how it's used per core. Maybe then someone will figure a way to make use of 128-bit, same as people figured a way past the 16/32-bit limitations with 64-bit. Maybe by then a 1TB 3D NAND SSD will be no better off than a floppy disc, and PCs will be using 1,000TB SSDs, so there will be a need for larger-scale addressing. Hackers have already shown the limitations of 128-bit encryption, something supposed to be impossible, even though 16-bit encryption is beyond most normal ability. How long before 1024-bit encryption is the new standard 'impossibility', and just what kind of CPU power is that going to require to use effectively?

128-bit is coming. When? Dunno. But it's coming nevertheless.
 


Yes, because CPUs had so much trouble doing 4294967295 + 1 before 64-bit came around.

Oh wait, they could handle that fine. There's logic in there to handle numbers/decimals that overflow the bit-boundary.
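For the curious, here is a minimal sketch (my addition, not from the thread) of that logic: composing a 128-bit add out of two 64-bit "limbs" with a carry, the same trick 32-bit CPUs used for 64-bit math:

```python
# Software "add with carry": a 128-bit add built from 64-bit limbs.
MASK64 = (1 << 64) - 1

def add128(a_hi, a_lo, b_hi, b_lo):
    lo = (a_lo + b_lo) & MASK64
    carry = 1 if lo < a_lo else 0        # low word wrapped around
    hi = (a_hi + b_hi + carry) & MASK64
    return hi, lo

# 2**64 - 1 plus 1 carries into the high word:
print(add128(0, MASK64, 0, 1))           # (1, 0)
```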

The rest of your comment is utter nonsense.
 


That truly made me laugh! Considering we carry on our persons more computer technology than all of what was used to go to the moon, that is an extremely short-sighted statement. I remember when 256 and 512KB 3D cards were all the rage, and supposedly you would never need a new video card again because of their power, LOL. Fortunately for us, the people who design these things are not so short-sighted.
 


To be fair, there is a lot of bloat in current software. Eliminating all the bloat, PCs could be 10-100x faster. In the old golden epoch, programmers could do marvelous pieces of code in only 64KB. Now that amount is needed just to store a silly icon.
 


In terms of code space, yes, there's a TON of bloat. But memory use is generally fairly well optimized by going through APIs that hide all that from the developer.

Again, the root issue here is there is simply NOTHING we can think of that requires that godly an amount of RAM. Not DNA simulation, possibly not even fully dynamic multi-body physics interactions, not even AI. You don't need a 16EB dataset at the ready for computation at any one moment in time.
 
And you just said a mouthful...

"there is NOTHING we can think of"

And that's it in a nutshell. I'm quite sure when the Tandy 1000 came out, there was NOTHING they could think of that would require even 64-bit processors, 128-bit encryption, etc. I'm quite sure when Isaac Newton was thinking up all his theories, he wasn't thinking about splitting the atom either.

Give it a minute, and we will think of SOMETHING.
 


Actually, WE (and yes, I can say "we", speaking as a software engineer) knew of datasets that would expand well beyond the 32-bit space as far back as the late '70s, specifically multi-body physics interactions and some fields of particle physics.

So please, stop it. Barring a fully dynamic physics simulation of all of existence, there is literally NOTHING that needs that godly amount of data for computation at any one time.
 


So the takeaway is we won't need 128 bit until somebody tries to build a functional holodeck.
 
Pretty much. There are a lot of things sci-fi writers have imagined over the years that were physically impossible at the time, like cell phones in the early '60s, that later came to fruition. With today's abilities we still don't have what it takes to make 3D holos, even though they are on just about every TV show now, from CSI to Shield. It might be plausible with a 32-bit processor, it might take a 128-bit or maybe even a 256-bit; there's simply no way to gauge that, as it doesn't exist. Yet. As said by others, currently there's absolutely no need for that kind of ability. Tomorrow is a different story.
 
We now have 128-bit RAM controllers, so it isn't a leap to suggest that the technology will expand to CPUs in time.
https://www.anandtech.com/show/12146/intel-launches-gemini-lake-pentium-silver-and-celeron-socs-new-cpu-media-features

And we have a CPU whose L2 cache supports 128-bit.
http://www.electronicdesign.com/industrial-automation/ryzen-rise-amd-rounds-out-processor-family

So yeah, we are seeing some aspects using 128-bit. To say we won't ever see that in CPUs is a bit naive and ignores how rapidly technology develops and grows.
 
Those 128 bits in the cache and the memory controller are about something different from what is being discussed here.

The 128-bit memory controller simply means that the controller is dual-channel (64-bit x 2) and thus that the chip is able to read/write up to 128 bits at once.

We are here discussing 128-bit memory-addressing capacity, not data transmitted in packets of 128 bits, nor 128-bit vectors (AVX, SSE, NEON), nor any other 128-bit aspect of computation.
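To illustrate the distinction, here is a minimal sketch (my own, not from the thread) packing two 64-bit lanes into one 128-bit value, the way a SIMD register or a dual-channel transfer carries data, while the pointer width, which is what "64-bit CPU" refers to here, stays at 64 bits:

```python
import struct

# Two 64-bit integer "lanes" packed into one 128-bit value, like a
# SIMD register or a dual-channel memory transfer.
lanes = (3, 5)
vec = struct.pack("<QQ", *lanes)
print(len(vec) * 8, "bits of data")              # 128 bits of data

# Address width is a separate question: native pointer size.
print(struct.calcsize("P") * 8, "bit pointers")  # 64 on a 64-bit build
```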
 
Yes, I think there is going to be a 128-bit build. I hope so, because I'm working on a design, a logo, for this new technological advance in computing. It is going to be a new, more powerful, and very fast computing age.

Things are going to be truly greased lightning!

Right on, I agree with you, Bo Lee!