Discussion: Productivity Predictions

jnjnilson6

Distinguished
Anybody remember the Core i7-990X from 2011? It used to be a big thing.
Anybody remember the first Pentium 4s from 2001?
Anybody remember 486 processors from 1991?

Well, that's going quite a way back in time. We've currently got consumer CPUs with 24 cores / 32 threads and the potential to hit 6 GHz out of the box.
How many cores do you think CPUs will have between 2030 and 2033? What nanometer technology will we be running? Will the difference be greater than in times past, or comparable to a particular era?

Write out your most gripping thoughts and wildest predictions. That would constitute the beauty of symposium and the versatility of discussion.

Thank you!
 
I remember the Motorola 68000 @ 7 MHz on my first computer... an Amiga 500 in 1988. I was ecstatic over the 3 MB of RAM it had and was literally doing cartwheels when I made the jump from 2400 baud to 9600 baud. All the BBS surfing/file downloads really needed speed back in those days.

As for now... I'm building my 16/32 Ryzen 9 7950X3D rig tonight. Definitely the best CPU I've ever had... and I'm horrible at predictions. 😉
 
I remember the Motorola 68000 @ 7 MHz on my first computer... an Amiga 500 in 1988. I was ecstatic over the 3 MB of RAM it had and was literally doing cartwheels when I made the jump from 2400 baud to 9600 baud. All the BBS surfing/file downloads really needed speed back in those days.

As for now... I'm building my 16/32 Ryzen 9 7950X3D rig tonight. Definitely the best CPU I've ever had... and I'm horrible at predictions. 😉
How much did the Amiga cost you, if you can remember? 3 MB RAM in those days was an incalculable amount!
 
How much did the Amiga cost you, if you can remember? 3 MB RAM in those days was an incalculable amount!

Yes it was. Booting off the Amiga Workbench floppies... DF0: and DF1: drive names... I remember a lot. As for the cost, I don't remember because my Dad bought it for me for my 14th birthday... but Google says it was like $699 without a monitor... the 1084S was the monitor I had.

The Amiga was well ahead of its time... and Video Toaster presentations were incredible. I used one for many years... I didn't make the move to Windows till my Pentium 75 MHz machine in 1997.

I upgraded to a Pentium 200 MHz with MMX next... then it was the AMD Athlon XP 1800+ in 2001... then back to Intel a few years later... and I hadn't gone back to AMD until now. Nothing to do with brand dedication; it's just how it's worked out over the years. I'm jumping ship back to AMD now, at the early stages of AM5, which makes the most sense for upgradability over the next few years.
 
Yes it was. Booting off the Amiga Workbench floppies... DF0: and DF1: drive names... I remember a lot. As for the cost, I don't remember because my Dad bought it for me for my 14th birthday... but Google says it was like $699 without a monitor... the 1084S was the monitor I had.

The Amiga was well ahead of its time... and Video Toaster presentations were incredible. I used one for many years... I didn't make the move to Windows till my Pentium 75 MHz machine in 1997.

I upgraded to a Pentium 200 MHz with MMX next... then it was the AMD Athlon XP 1800+ in 2001... then back to Intel a few years later... and I hadn't gone back to AMD until now. Nothing to do with brand dedication; it's just how it's worked out over the years. I'm jumping ship back to AMD now, at the early stages of AM5, which makes the most sense for upgradability over the next few years.
I do think it would be a wonderful upgrade. How's the i9-11900K doing with the RTX 4090? I think it ought to suffice nicely. Of course, you should expect a good margin of extra performance from the upgrade to the 7950X3D. Perhaps at a higher resolution you'll be getting over a 15% increase in framerate, though that is purely speculative and will vary.

If you're using the machine for animation, you should see a very welcome reduction in rendering times. All in all, a very worthwhile upgrade, and definitely of reasonable proportions.

It's great you made it to the RTX 4090; that thing's a game-changer, much more powerful than any card of the previous generation.
 
How many cores will CPUs have in 2030-2033?

I'll make a prediction of at least 100 cores right now. And it could be 200 or more.

As far as the nanometer tech goes, they should be down to tenths of a nanometer by then.

Ah, the memories.

https://www.youtube.com/watch?v=O4QjFai8kqo
That vibrant, metallic voice brings inexhaustible memories of when proper elocution entangled fully the artistic world (cinematic, commercial, documentary). Those deeper undulations upon the vowels and the beauty of expression, vivid and grasping, coming through in a fine modulated state... Good old times!
 
Anybody remember the Core i7-990X from 2011? It used to be a big thing.
Anybody remember the first Pentium 4s from 2001?
Anybody remember 486 processors from 1991?

Well, that's going quite a way back in time. We've currently got consumer CPUs with 24 cores / 32 threads and the potential to hit 6 GHz out of the box.
How many cores do you think CPUs will have between 2030 and 2033? What nanometer technology will we be running? Will the difference be greater than in times past, or comparable to a particular era?

Write out your most gripping thoughts and wildest predictions. That would constitute the beauty of symposium and the versatility of discussion.

Thank you!
How many cores? NONE. It will be organic based. That is as valid a prediction as any other you will see.
 
How many cores? NONE. It will be organic based. That is as valid a prediction as any other you will see.
It would be a sarcasm tinted with the hues of the numbered generations, through the ominous moons and worlds of whirling frequencies and CPU cores ablaze with deviant electric currents, where this new predicament of yours would be born. :)
 
2033.

Extrapolate from 2013 to today (2023).

Make a straight trendline.

That is no less guesswork than anything else.
If we were to paint a beautiful, vivid picture, ceaselessly captivating the higher connoisseur with tinges and fainter nuances and lighter shades and lingering diamondlike glitters and aspects readily and surreally interpreted, we'd have to concentrate upon the moving, the impermanent, the ungraspable shadows and shades first. It would be the same within the sphere of discussion and symposium; to illuminate a point beautifully, we'd first have to stare into the darkness and by making guesses add up the deafened shadows, which, lingering and changing, in their cumulative effect, portray a relative, and ever beautiful truth. Pure facts never elicited anything movable, malleable and provoking; an exquisite delight would be the exquisite and delicate, and nevertheless disturbing portrayal of the commingling of communication, the avid self-expression, and the illumination of knowledge through freedom. Without that somber distress, we're left only with a vaguer pattern set in stone, unmovable, unalterable, undoubtable...
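Taking the straight-trendline suggestion quoted above literally, here's a minimal sketch of what that extrapolation gives. The two anchor points (roughly 4 mainstream desktop cores in 2013 and 24 in 2023) are my own rough assumptions for illustration, not official figures.

```python
# Toy linear extrapolation of mainstream desktop core counts.
# Anchor points are rough assumptions: ~4 cores in 2013, 24 cores in 2023.
def extrapolate_cores(year, y0=2013, c0=4, y1=2023, c1=24):
    slope = (c1 - c0) / (y1 - y0)  # about 2 extra cores per year
    return c0 + slope * (year - y0)

for year in (2030, 2033):
    print(year, round(extrapolate_cores(year)))
# 2030 -> 38, 2033 -> 44 cores on a straight line
```

A straight line lands well short of the 100+ core guesses elsewhere in the thread; fitting an exponential through the same two points instead gives roughly 85 cores by 2030 and around 145 by 2033, which shows how much the answer depends on the curve you assume.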
 
If we were to paint a beautiful, vivid picture, ceaselessly captivating the higher connoisseur with tinges and fainter nuances and lighter shades and lingering diamondlike glitters and aspects readily and surreally interpreted, we'd have to concentrate upon the moving, the impermanent, the ungraspable shadows and shades first. It would be the same within the sphere of discussion and symposium; to illuminate a point beautifully, we'd first have to stare into the darkness and by making guesses add up the deafened shadows, which, lingering and changing, in their cumulative effect, portray a relative, and ever beautiful truth. Pure facts never elicited anything movable, malleable and provoking; an exquisite delight would be the exquisite and delicate, and nevertheless disturbing portrayal of the commingling of communication, the avid self-expression, and the illumination of knowledge through freedom. Without that somber distress, we're left only with a vaguer pattern set in stone, unmovable, unalterable, undoubtable...
I'm seeing a lot of flowery buzzwords, and no depth of information.
 
I'm seeing a lot of flowery buzzwords, and no depth of information.
This is just the point. As Albert Einstein said, 'Imagination is more important than knowledge.' Poetry and prose are made of delightful words and wonderful sentences which carry on the lighter thread of meaning, interwoven, veritably, unmistakably, through tints and shades of beauty. If we had only the rough structure, harboring the basic and overt truths, the beauty would be gone and there would be no shadows and no fainter yearning, and knowledge would become an ugly thing. It is the same within the illumination of all subjects and spheres of knowledge. Take the beauty away and you have nothing left. Stick to the facts and soon enough they become boring and forgotten. To entertain, to enthrall the imagination with the acerbic keys of imaginative wisdom, to bring forth ideas and ideals, there must exist a rhythm, a notion, a pungent difference. Even in computer literature, the cursives of the words follow an unnamed, unnoted flow. The cursives and curves come about raspingly and portray the shaded illuminations of a higher meaning; even though they are not poetic or benignly interwoven, they still make you think deeply and are a brisk outlet for knowledge and its constitution.
 
I'm seeing a lot of flowery buzzwords, and no depth of information.
'You have to apportion the values when you look back. You finish up the portrait then—paint in the details and shadows.'

We're talking hardware, but I think that the scalloped and saturated and expressively diverting piece I wrote about beauty would be easily understood by those who've read philosophy or have marveled at masterpieces in museums. The meaning in the so-called 'buzzwords' (I believe Oscar Wilde would be accused of a similar crime) exists. It means that you cannot attain beauty and truth without diversity, without wrongness, without the melting of concrete truths into something graceful and elegant, like fainter shadows straining in the darkness. The shadows provide a picture of beauty the same way a little wrongness or excessiveness, a little diversity of opinion deviating from the pure facts, may grant a conversation grace. Surely, I've used 'fancy' and 'exquisite' words, but it was centered on a logical meaning from the first word and not the other way around. The thing is, you need to see just a little into the words to unravel the intended deeper correlations.

Thank you!
 
I do think it would be a wonderful upgrade. How's the i9-11900K doing with the RTX 4090? I think it ought to suffice nicely. Of course, you should expect a good margin of extra performance from the upgrade to the 7950X3D. Perhaps at a higher resolution you'll be getting over a 15% increase in framerate, though that is purely speculative and will vary.

If you're using the machine for animation, you should see a very welcome reduction in rendering times. All in all, a very worthwhile upgrade, and definitely of reasonable proportions.

It's great you made it to the RTX 4090; that thing's a game-changer, much more powerful than any card of the previous generation.

Yeah, the 4090 is an absolute beast... I don't think we've seen a generational leap in power like the one over the 3090 before.

As for the 11900K... it did fine; I was just thinking long term and decided to jump ship to AMD now, when AM5 is new and upgradability will be easy for a few years. I decided the 7950X3D was the best option given my mix of gaming and productivity. I passed on the 13900K due to thermals and the fact that Intel will be changing the CPU socket again next gen... so meh... first AMD processor since 2001.

Build went well tonight... she's up and running with no issues. Getting software set up and fine-tuning over the next couple of days.
 
Well, that's going quite a way back in time. We've currently got consumer CPUs with 24 cores / 32 threads and the potential to hit 6 GHz out of the box.
How many cores do you think CPUs will have between 2030 and 2033? What nanometer technology will we be running? Will the difference be greater than in times past, or comparable to a particular era?
Makers only have so much power to work with; the 250 W that both companies use right now is already too much for many consumers, and unless that changes, core count is not going to increase much.

Cores need power to work, and going to a smaller nm process doesn't change that unless you target the same amount of performance. But you want new cores to be more performant than the old ones, so they will end up using around the same amount of power, if not more, if they can be pushed (overclocked) harder than the old ones.

AMD loses a lot of single-core/per-core performance when all cores are loaded, and Intel can only avoid that for their P-cores by using a lot of E-cores for multithreading.

Going for a much higher number of cores will result in all of them running at very low clocks, turning it into a terrible CPU for desktop use.
AMD is currently avoiding this by running 2 or more cores at higher clocks than the rest (unless the whole chip is running a server-style workload), and, as already said, Intel avoids it by only having its 8 P-cores run fast.
https://www.youtube.com/watch?v=fBxtS9BpVWs&t=286s
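To put rough numbers on the power-budget point above, here's a quick back-of-envelope sketch. The 250 W package limit comes from the post; the per-core and uncore wattage figures are assumptions I've picked purely for illustration.

```python
# Back-of-envelope: how many cores fit in a fixed package power budget?
# Per-core and uncore wattages below are illustrative assumptions only.
PACKAGE_BUDGET_W = 250  # roughly today's high-end desktop limit

def max_cores(watts_per_core, uncore_w=40):
    """Cores that fit once an assumed 40 W of uncore/IO power is set aside."""
    return int((PACKAGE_BUDGET_W - uncore_w) // watts_per_core)

for w in (15, 10, 5):  # near boost clocks, mid clocks, efficiency clocks
    print(f"{w} W/core -> {max_cores(w)} cores")
# 15 W/core -> 14 cores, 10 W/core -> 21 cores, 5 W/core -> 42 cores
```

Which is the point being made: inside a fixed budget, every extra core pushes all of them toward lower clocks and lower per-core performance.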
 
Anybody remember the Core i7-990X from 2011? It used to be a big thing.
Anybody remember the first Pentium 4s from 2001?
Anybody remember 486 processors from 1991?

Well, that's going quite a way back in time. We've currently got consumer CPUs with 24 cores / 32 threads and the potential to hit 6 GHz out of the box.
How many cores do you think CPUs will have between 2030 and 2033? What nanometer technology will we be running? Will the difference be greater than in times past, or comparable to a particular era?

Write out your most gripping thoughts and wildest predictions. That would constitute the beauty of symposium and the versatility of discussion.

Thank you!
It is reasonable to predict that CPUs will continue to increase the number of cores over the next decade, as we have already seen a trend towards higher core counts in recent years. It is possible that CPUs with 48, 64, or even more cores may become commonplace by 2030-2033, especially in high-end consumer and workstation markets.

In terms of nanometer technology, it is expected that the industry will continue to shrink the size of transistors, which are the building blocks of CPUs. Currently, the most advanced CPUs are being produced on 7nm or 5nm processes. By 2030-2033, we may see the industry move to 3nm or even 1nm processes, allowing for even more transistors to be packed into a single CPU chip.

It is also possible that CPUs may become more specialized for specific tasks, such as machine learning or gaming, with dedicated hardware and software optimizations for these applications. We may also see advancements in quantum computing, which has the potential to revolutionize the entire computing industry.

Overall, the future of CPUs is exciting and unpredictable. With advancements in technology, we can expect more powerful and efficient CPUs with a variety of new features and capabilities.
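On the process-node point above, names like '5nm' or '3nm' are marketing labels rather than literal feature sizes. Still, as a toy illustration of why shrinks matter, ideal scaling says transistor density goes up with the inverse square of the linear feature size (treating the node names as literal dimensions here is my simplifying assumption):

```python
# Idealized density scaling: shrink every linear dimension by factor s and
# the area per transistor shrinks by s^2, so density rises by 1/s^2.
# Node names are treated as literal dimensions purely for illustration.
def density_gain(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

print(round(density_gain(5, 3), 1))  # ~2.8x more transistors per area
print(round(density_gain(5, 1), 1))  # ~25x, if a "1nm" node scaled ideally
```

Real node-to-node gains are well below this ideal, which is part of why chiplets, 3D stacking, and task-specific accelerators keep getting more attention.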
 
More cores means more power right now, so I'm not sure the high-frequency desktop chips are likely to see massive increases in core count.

Intel is targeting 6P-8E for laptops, and that is likely to continue. We might see 8P-32E on the desktop in a generation or two.

With the move to tiles for production, I think we are going to see additional capabilities included in processors: even more local GPU power, perhaps large DRAM caches, and we are already seeing large L3 and L4 cache pools. Feeding the cores with data may become more important than core count. PCIe standards seem to advance at a silly speed, already on to the next one before there is much adoption of the previous. I wouldn't be surprised if Zen 6 and Intel's 16th gen offer PCIe 6.0. There is also going to be a big increase in USB4 support over the next few generations, plus AI accelerators and so on.

As for process nodes, we are already seeing a split between logic and other functions on CPUs. If some features are already at the limit of diminishing returns or physics, I think stacking and layering are going to become the new norm, which puts core count and power requirements directly at odds. Hopefully a race for efficiency happens on the desktop soon; otherwise ARM has a decent chance to take over the desktop space.
 
