Has CPU technological development slowed down a lot?

internetswag

I ask because I built my current PC 3 years ago and bought the GPU 2 years ago. Since then I've been completely out of the loop on PC part developments, both graphics and CPU.

I was checking out new CPUs for fun and it seems there has not been all that much progress. As it stands, I see very little reason for anyone with an i5 2500K to upgrade their CPU (for gaming at least).

It's weird: when the i3 2120 and i5 2500K came out, I heard that CPU power and speed would increase by 25% each year. Perhaps that was more about tablets, though.
 
It hasn't slowed down, it's just changed direction.

To elaborate: Intel originally planned to keep increasing CPU clock speeds to unfathomable heights of 15GHz plus, but that never panned out.

When they hit the clock speed brick wall, they changed direction and decided instead to put multiple cores on a chip, and thus multi-core was born.

Multi-core does suffer from greater diminishing returns than single-core clock speed increases, for several reasons, though how much greater (if greater at all) is debatable; see the sketch below.
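
The thread doesn't name it, but the classic model for multi-core diminishing returns is Amdahl's law. Here is a minimal sketch; the 90%-parallel workload is an illustrative assumption, not a figure from the discussion:

```python
# Amdahl's law: overall speedup is limited by the serial fraction of
# the work. speedup = 1 / ((1 - p) + p / n), where p is the
# parallelizable fraction and n is the number of cores.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup of a workload on `cores` cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even a 90%-parallel workload can never exceed 10x, however many cores:
for n in (2, 4, 8, 64):
    print(f"{n:>2} cores: {amdahl_speedup(0.9, n):.2f}x")
# 2 cores: 1.82x, 4 cores: 3.08x, 8 cores: 4.71x, 64 cores: 8.77x
```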

In summary, CPU development has probably not slowed down at all, just changed direction, and if it has slowed down, the slowdown is not significant.

 
By the way, have you even seen CPU requirements for gaming?

Go on to Game Debate and check it out.

CPUs are definitely not the limiter for gaming; they have far outstripped even the best games, and will for many years to come.

You would be fine using a CPU from any company, tbh; you don't need an Intel one, or even a top-spec one, because they are all overkill for gaming.

The big limiter for gaming is the video card, so to a large extent all the people arguing about Intel vs AMD are doing so pointlessly; from a gaming perspective it doesn't really matter, as they are all more than good enough for games anyway.

The only real issue when selecting a CPU for gaming is whether it will bottleneck your graphics card, and in most cases it won't, since they are all overpowered for gaming anyway.
 
CPUs used to follow Moore's law: double the performance every 18 months.

To me it seems this stopped being relevant 5 years ago.
Five years ago I had a quad-core Phenom. That CPU would be better than my current, new CPU at the same clock speeds. I would say development on AMD's side has come to a halt or even gone backwards.

And therefore Intel can just sit back and release their 4th, 5th, and 6th generations of basically the same platform with slight adjustments.

I was actually shocked when I upgraded my system a couple of months back at how little progress and development there has been.


And to address the poster above me: game programmers use the GPU more and more, even for things other than graphics. Some even call the chips general-purpose processing units instead of graphics processing units. Could it be because there has been little or no increase in CPU performance, while GPUs still perform better and better, generation after generation?
 
Moore's law is speculated to die soon, but there was an article on this recently, and Moore's law has still continued to double the number of transistors at the same pace to the present day.

So if we are judging by Moore's law, CPU progress hasn't declined, although it is predicted to decline in the future once we reach the 5nm production process, according to Intel.
 
As for AMD, I would say things haven't gone backwards at all. AMD are using larger chips than Intel, but there is a good reason for that when it comes down to APUs.

GPUs require larger die sizes than CPUs at the moment; just look at the production process for top-end GPUs and compare the die size to CPUs.

Thus with APUs, AMD has to compromise between the size that's best for a GPU and the size that's best for a CPU, since both are on the same chip.

This doesn't mean things have gone backwards; rather, this is progression, as HSA and APUs are the future and produce efficiencies which can't be gained without merging the two. You can't compare APU die size to CPU die size and expect them to be the same when GPUs are made at larger die sizes at the moment.

And as for Moore's law, it's actually about doubling the number of transistors, which does not necessarily mean a directly proportional increase in performance, so you are slightly off there, no offence (rough numbers below).
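
To put rough numbers on the transistor-doubling version of Moore's law described above, here is a quick back-of-the-envelope sketch; the starting count and the 2-year doubling period are illustrative assumptions, not exact figures:

```python
# Moore's law as stated above: transistor count doubles roughly every
# 2 years. Note that doubling transistors does not double performance.

def transistors(start_count: float, years: float, doubling_years: float = 2.0) -> float:
    """Projected transistor count after `years` of steady doubling."""
    return start_count * 2 ** (years / doubling_years)

# e.g. starting from an assumed ~1 billion transistors, ten years of
# doubling every two years projects ~32 billion:
print(f"{transistors(1e9, 10):.2e}")  # 3.20e+10
```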

So far, though, Moore's law is still holding and is expected to keep holding until the 5nm process, according to Intel.
 
Regarding your question: yes, GPUs are better at general functions and are being made into more general processing units nowadays, rather than just graphics processors.

But the premise that GPUs keep improving while CPUs do not is a wrong assumption.

CPUs have progressed, as have GPUs; both have progressed.

BUT

CPUs have reached the point where they far exceed game requirements, so people can get mid-range or even low-to-mid-range CPUs and still run top-spec games at recommended settings.

The limiter with games is really the GPU, and you can easily find mid-range or low-to-mid-range CPUs that can run with top-end graphics cards without bottlenecking them, because the gaming requirement for CPUs is trivial compared to that for GPUs.
 
Solution


I should have replied to your comments earlier, lol. So here it is: read above for my reply, buddy.
 
As said above, CPU performance has started to plateau for a number of reasons, e.g. CPUs outstripping most users' needs by a large amount, and AMD no longer being competitive in the high-performance space, leading to Intel just sitting back and not making big performance improvements.

Another factor is that Intel and AMD are focusing more on improving their integrated graphics and power consumption to make their chips more suitable for tablets. Both companies see desktop PCs as a dying platform, at least for mainstream consumers, and they won't be able to make enough money catering solely to the server, workstation, and PC gaming markets.
 
I came across this article; it's a very interesting read. It also has a Part Two.

http://www.extremetech.com/computing/116561-the-death-of-cpu-scaling-from-one-core-to-many-and-why-were-still-stuck

"From 2007 to 2011, maximum CPU clock speed (with Turbo Mode enabled) rose from 2.93GHz to 3.9GHz, an increase of 33%. From 1994 to 1998, CPU clock speeds rose by 300%."

In terms of clock speed, a 33% increase. But double the number of cores, if I'm not mistaken (4 -> 8: Phenom quad-cores existed back in 2007, I had one, and the FX series was released in late 2011: http://en.wikipedia.org/wiki/List_of_AMD_FX_microprocessors). A rough combined estimate is below.
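
As a rough upper bound on what those two figures buy you (assuming, unrealistically, that software scales perfectly across all cores):

```python
# Best-case 2007-2011 throughput growth from the figures quoted above:
# +33% clock speed and 2x cores, assuming perfect parallel scaling.
clock_gain = 3.9 / 2.93   # ~1.33x from frequency
core_gain = 8 / 4         # 2x from core count
print(f"best case: {clock_gain * core_gain:.2f}x")  # ~2.66x over 4 years

# Compare 1994-1998: clock speeds rising by 300% meant ~4x in 4 years
# from frequency alone, with no parallel programming effort required.
```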

Still, performance is lagging behind.

I find this topic very interesting, so thanks for the information, eazy899.
 
I just wonder why people have become so flexible today.
It was a clear question from the guy.
But half of those who answered tried not to answer it, but to persuade him that he is thinking wrong.

"CPU progress has not stopped, just changed direction."
And other shit.

The question was very clear.
The guy wanted to know if any CPU that exists today is better than its analogue from 3 years ago.
And he did not ask about power consumption, tablets, or other shit.

HE JUST WANTS A NEW, BETTER CPU FOR HIS PC.
Why can't you just say whether that is possible, and if yes, list proposals?

And here comes my answer.
I agree with internetswag: there is very little performance boost when comparing today's CPUs with 3-year-old ones.
My opinion is that those advantages are not worth the increased price Intel wants for new CPUs.
I see a little CPU frequency growth and the same number of cores. Maybe servers have seen bigger progress, but I am focusing on the usual PC for now.

Summary.

So if I had an i5 or i7 bought 3-4 years ago, I would not buy an i5 or i7 to replace it today,
because it is not worth the money Intel wants for it.

It is a clear comparison. I think adding extra features like integrated graphics and other shit to CPUs exists to fool customers.
I have never heard from ordinary CPU users that they want higher-level integrated graphics.
Office workers are satisfied with the existing integrated graphics. Others want the ability to change CPU and graphics independently. And that is fair, given the price they pay for a good CPU and graphics.

And lastly, there is a very good old rule for judging a product:
a good market shows quality growth with the price staying the same or dropping.
But who cares? We would rather sit on our asses and blame those who want more, instead of standing with them and demanding it from the corporations that offer us nothing we want.
(They offer new features. But do we need them? Should we pay for them?)



PS
How many times, people, are you going to tell somebody that he/she is wrong instead of answering?
Do you have no one to blame at home, so you go looking for someone in a chat?
 
I am still using my desktop from 5 years ago and it's still very fast. I do keep it clean, though, and run shit in Sandboxie to prevent infections, which helps. I can say computers have seen some improvement since then; mobile computing especially has come a long way. You just have to compare the best MacBook Pro from 5 years ago with the best MacBook Pro now: lower power dissipation, better battery life, better overall performance, and smaller than ever.

However, development is slowing down. Besides the technological brick wall, there are the practical aspects of it all:

1) The average user (most users) doesn't really need any more computer power. That's why a lot of average users have ditched desktop and laptop computers in favour of iPads and cellphones, because all they were doing on their computers was chat, email, music, movies and porn. Further advances in computer processing speed will only benefit a minority of power users, which means a smaller market, which means less demand, which means less incentive for manufacturers to push the boundaries of development.
2) Display resolution greater than the resolution of the human eye is unnecessary. 4K is already pretty fucking amazing, and generally speaking most people are happy to trade quality for speed (e.g. when streaming videos). Again, demand for hi-res video is not there as far as the bulk of the market is concerned.
3) Devices are already nearly as small as it is practical for them to be. A 2mm thin iPad may be great but if it's too thin it might also be too easy to bend and damage.

Any type of technology can only be pushed so far. Once a brick wall is reached, new technologies must be explored.
 


Number 1: I can never get enough CPU power. Compiling stuff in Linux, encoding videos, etc. really benefits from faster CPUs. So do certain games. There are lots of areas where I could use more CPU power. And if that performance were readily available to everyone, I bet applications, games, etc. would introduce new things like higher quality, better AI, and so on. But that just hasn't happened. There's no point in programming something for consumers that requires a 32-core 4GHz computer when only companies have those.

Number 2:
4K resolution has only about 8 million pixels.
An eye has roughly 120 million rods and cones (quick sums below).
http://hyperphysics.phy-astr.gsu.edu/hbase/vision/rodcone.html
https://en.wikipedia.org/wiki/4K_resolution#Resolutions

It's not a fair comparison regardless: we don't see in terms of pixels, our cone of focus is very small, and our peripheral view is very wide (and generally black and white).

The demand isn't here because most people can't afford it yet. It's also a bit beta; it's still developing. Just like 1080p was for years and is now a de facto standard, I assume 4K will be a standard in our homes in 2-3 years.
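
For what it's worth, the quick sums behind the Number 2 figures (loose, as said above, since we don't see in pixels):

```python
# Rough comparison of the figures cited above.
uhd_pixels = 3840 * 2160      # "4K" UHD resolution: 8,294,400 (~8.3M) pixels
photoreceptors = 120_000_000  # rough count of rods + cones in one eye
print(uhd_pixels)                    # 8294400
print(photoreceptors / uhd_pixels)   # ~14.5x more receptors than pixels
```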


Number 3: I agree. We saw that with the mobile phone. It could be paper-thin, but the industry decided we should have more features like a camera/webcam, apps, etc. It's barely a phone anymore.