Sounds like a stupid question - but - I have a 7-year-old system I built with an i7-3930K, 32 GB of DDR3-1333 RAM, a 1 TB Samsung 850 EVO SSD for the OS, a Radeon HD 7900, plus a bunch of spinning hard drives for data. I also put in a Corsair 1200 W power supply for no good reason other than I liked the idea.
It has been on 24/7 since 2012, used for medical imaging applications (mostly CPU rather than graphics card intensive) plus Photoshop, Lightroom, HDR applications, and Excel. No video editing or gaming. For my photos I use fairly large RAW files that slow things right down.
No problems so far, but I've been thinking it should be replaced at some point.
But when I look at a new system, unless I spend at least $4k I am not sure it will be noticeably better - an i7-8700 isn't much faster (maybe 25% better CPU Mark score), an i9-9900 is about double the CPU Mark, DDR4-3200 RAM is a bit faster (but way more expensive than DDR3 was), and of course NVMe SSDs are faster, and an Optane accelerator can speed up my spinning data disks.
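As a rough sanity check on how much of those benchmark gains would actually show up, I sketched an Amdahl's-law estimate in Python - the assumption that a RAW/HDR merge is about 70% CPU-bound is just a guess on my part, not a measurement:

```python
# Back-of-envelope estimate: only the CPU-bound fraction of a task speeds up
# with a faster processor; the rest (disk I/O, serial steps) stays the same.
# This is Amdahl's law, using the CPU Mark ratios mentioned above.

def effective_speedup(benchmark_ratio, compute_fraction):
    """Overall task speedup when only `compute_fraction` of the time scales
    with CPU performance (Amdahl's law)."""
    return 1.0 / ((1.0 - compute_fraction) + compute_fraction / benchmark_ratio)

# Assumed, not measured: a merge that is ~70% CPU-bound.
for name, ratio in [("i7-8700 (~1.25x CPU Mark)", 1.25),
                    ("i9-9900 (~2x CPU Mark)", 2.0)]:
    print(f"{name}: whole task ~{effective_speedup(ratio, 0.70):.2f}x faster")
```

On those (guessed) numbers, even the 2x chip only gets the whole merge done about 1.5x faster, and the i7-8700 closer to 1.15x - which kind of matches my gut feeling that I wouldn't notice a huge difference.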
The rate-limiting step for my medical imaging application is still bandwidth - I have a 600 Mbps internet line, and that is really what limits things - wavelet decompression and image manipulation really aren't noticeably slow. Doing things like merging image files for HDR still takes a while, though.
So - even though I like the idea of building something new, I am not all that sure I would notice much difference - has the real-world effect of Moore's law kind of petered out for non-gamers? Would an image merge of large RAW files be THAT much faster, or still kind of a pain, more likely...
Maybe I should just wait until something blows up - maybe the motherboard will fail, or power supply or...
Thanks for your advice and comments.