I just took a certification class, and my instructor (who works for IBM) said that in general CPU speeds double every 18 months for the same price point. Is this true about how fast they get updated, and how fast do GPUs get updated?
This sounds like a (mis)interpretation of "Moore's Law", which states that the number of transistors that can be economically placed on an integrated circuit doubles roughly every two years. More transistors can certainly help process more data, but the law covers only transistor count, not throughput. And definitely not clock speed, which has plateaued (and in some cases fallen) in recent years rather than risen. GPUs are built on the same fabrication processes as CPUs, so their transistor counts follow a similar cadence; their advertised performance gains also come from architectural changes, not just from transistors alone.
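To see how much the doubling period matters, here is a small illustrative sketch (the starting values and time spans are arbitrary assumptions, not real transistor data) comparing "doubles every 18 months" against the roughly-two-year cadence Moore actually described:

```python
# Illustrative only: compound growth implied by "doubling every N months".
def growth_factor(months_elapsed, doubling_period_months):
    """Overall multiplier: 2 raised to the number of doublings that fit
    into the elapsed time."""
    return 2 ** (months_elapsed / doubling_period_months)

# Over a decade (120 months), doubling every 24 months gives 2^5 = 32x,
# while doubling every 18 months would give 2^(120/18) ~= 101.6x.
print(growth_factor(120, 24))            # 32.0
print(round(growth_factor(120, 18), 1))  # 101.6
```

The gap between 32x and roughly 100x over a single decade shows why the exact doubling period (and what exactly is doubling) matters so much when people paraphrase the law.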