"Future Proof" Law. Is there a constant?


DerrickWildcat

Dec 19, 2014
Hello, after many years of being out of the computer-building scene... I'm back. I've noticed a lot of "future proof" or "how long will this last" questions about new builds and hardware. I did a search for "future proof" on Tom's, and the same questions were being asked 10 years ago. Since I've been out of the scene for a while, I'm not up on hardware, specifically graphics cards. I'm wondering whether it's possible to determine a baseline for how long "future proof" really is. Is it relatively constant? Was "future proof" the same amount of time 10 years ago as it was 5 years ago, and could that time be predicted now? I know the term "future proof" is somewhat arbitrary, but is it possible to pin down what it means in this context? Perhaps one component is the rate-limiting step: the GPU, the CPU, the amount of RAM...
Perhaps a blanket answer can be established for this question.
Something like: if you want to be able to play the newest games every year on the highest settings, you have to have the fastest GPU, CPU, and so on, and that future will last 3 years.

I'm new to this, so I hope this goes to the right place.
It's just something I was thinking about.
Thanks for your time.
 
It's an aim, but not a realizable one. It only makes sense in conjunction with a time frame, and the longer the time frame, the less accurate the prediction will be.

So in the end it's just a fad phrase.
 
My take: these things are relative.
Some people are OK with medium settings, in which case a good setup will go a long way (longer than for others).
If you're someone who wants ultra settings every time, then the "future" is a bit short.

Also, another thing to consider: spend lots of money now, or save some and upgrade every 2 years or so.

GPUs tend to get old faster than CPUs.
 
Not if it's possible to establish what "future proof" means. I know, I know, that's the tricky part. But suppose in this context "future proof" means how many years you can run the newest games on their highest settings. Would that be the same amount of time as for a "future proof" computer of 5 or 10 years ago? Is advancement consistent?
 
For me, future proof means I can run games in 3-4 years on medium settings without having to lower my resolution. I guess future proof has a different meaning for everyone. Technology is developing so fast that even if you bought the fastest (single-chip) GPU now, you wouldn't be able to max out a game in 3 years either, so for real future proofing you'd have to go with a multi-GPU setup. CPUs, however, are a different matter. There are lots of people using the latest GPUs with i5 2xxx CPUs without experiencing bottlenecks. So long story short: it's easy to get yourself a good CPU that will last you a good long while, but GPUs and GPU requirements are developing so fast it's hard to keep track 😉
 
Photography is my primary usage, so I can use any average graphics card. My primary need is getting 250 camera raw images off the card and importing them full size into Lightroom as fast as I can. Lightroom and Photoshop are more RAM- and CPU-demanding than you'd think. USB 3 is a big improvement as well. However, I think most of the people here are more into the gaming aspect.
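
Just to put a rough number on why USB 3 matters for that kind of offload, here's a quick back-of-the-envelope sketch in Python; the file size and throughput figures are assumptions, not measurements:

# Rough, illustrative estimate of card-offload time (all numbers are assumptions)
files = 250
mb_per_raw = 25                      # assumed average raw file size in MB
total_mb = files * mb_per_raw        # ~6,250 MB for a full shoot

usb2_mb_per_s = 35                   # assumed sustained USB 2.0 throughput
usb3_mb_per_s = 200                  # assumed throughput for a decent USB 3.0 reader

print(f"USB 2.0: ~{total_mb / usb2_mb_per_s / 60:.1f} minutes")
print(f"USB 3.0: ~{total_mb / usb3_mb_per_s / 60:.1f} minutes")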
 


The problem isn't so much whether advancement is consistent; it's how much of it you actually see. The jump from 1k polygons to, say, 5k polygons was a big step in quality, whereas modern high-end GPUs handle so many polygons that a 10% jump in count hardly registers in the quality of the image.

Moore's Law, as far as I know, is still in effect. The problem with predicting the future isn't so much whether the hardware will be capable; it's whether the games will push the limit. A big limiting factor is resolution. In three years 4K will probably be the de facto standard for high-end monitors, and right now you need a pretty boss system to push 4K at playable framerates. Are you happy with medium settings and 1080p? If so, your future proofing may last longer than, for example, mine. I like the eye candy.

Where do you draw the line at good enough, and why is it such a big deal? No one wants to get ripped off, but the nature of the beast is that by the time your parts are delivered they're obsolete. I try to keep parts as long as they're in warranty; that's my rule of thumb. After that they get repurposed into family machines and such.

Asking even computer gurus to predict the future is futile. When it comes down to it, only the people working on the games, the new engines, and the new GPUs know what their time frames look like in reality, and it stays a big industry secret because of fair trade. Not only do the companies not want to share, they don't have to, and other companies respect that. A few years back someone stole the recipe for Coca-Cola and tried to sell it to Pepsi; Pepsi reported them to the FBI. They don't even want to know. It's not a concern; they do what they do and they're successful at it, so they'll keep doing it. Go ahead, try to predict what's next for Pepsi. Ask people who work for Pepsi. You may as well request lottery numbers while you're at it. Buy the best you can afford and enjoy it until it's no longer good enough. End of story.
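
For a sense of scale on the 4K point, the pixel math alone shows why it takes a boss system (a tiny illustrative calculation):

# 4K pushes roughly 4x the pixels of 1080p per frame
px_1080p = 1920 * 1080   # ~2.07 million pixels
px_4k    = 3840 * 2160   # ~8.29 million pixels
print(px_4k / px_1080p)  # 4.0 -> same framerate needs roughly 4x the fill rate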
 
But we kind of can predict. It's just a lot more work than I'd be willing to do, and I'm not knowledgeable enough. You can search Tom's for this same question being asked 10 years ago. People posted the top-of-the-line specs of their builds. You could then go back, find out which games were the most CPU- and GPU-demanding, and determine how well that original build would run the most demanding games 3, 4, or 5 years later. You could do the same thing with a build from, say, 5 years ago. Would "future proof" last the same amount of time? Or does "future proof" cover less time than it did 5 or 10 years ago? I think it might be possible to establish a rule of thumb.
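
A very rough sketch of what that kind of historical comparison could look like, if someone did the legwork. Every number below is invented purely to show the method, not real benchmark data:

# Hypothetical sketch: how long a flagship build from year Y stays ahead of
# the "recommended" bar of the heaviest game each later year.
# All scores and demands are made up for illustration.
build_score = {2004: 100, 2009: 100, 2014: 100}   # normalize each flagship build to 100
game_demand = {                                    # assumed demand of each year's heaviest game,
    2004: [60, 85, 110, 140],                      # relative to that build, year by year
    2009: [55, 75, 95, 120],
    2014: [50, 70, 90, 115],
}

for year, demands in game_demand.items():
    lasted = sum(1 for d in demands if d <= build_score[year])
    print(f"Build from {year}: stayed ahead of the heaviest game for ~{lasted} years")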
 
The term itself is oxymoronic, at least in the tech sector. If something is "future proof," then one has to assume it will be as good in the future as it is now, which in turn assumes that future progress will be very slow or nonexistent. That is never the case.
Top-of-the-line PCs from ten or even five years ago have laughable specs compared to mid-range PCs now.
So the better way to ask a "future proofing" question is: "If I buy a very good PC now, will it remain somewhat useful over the next three years?"
Coming to the more important part, the baseline: the most demanding games today can reasonably be assumed to place demands similar to those of mid-level games three years from now.
This is because the current generation's most demanding games are meant to stress the best hardware available while they are being developed (e.g. Assassin's Creed Unity demands the just-phased-out 780 for its "recommended" settings).
The next year, the hardware makers make a better chip and the games' demands are scaled up accordingly. By the third year, what was top-of-the-line hardware has been relegated to the mid level.
CPU demand will remain somewhat flat in the near future, since Intel enjoys a near-monopoly (AMD is no longer serious competition for top CPUs); Intel has become complacent and sees no urge to jack up performance significantly with successive generations.
RAM is the one component that is relatively cheap and easy to upgrade, so even though AC Unity demands 8 GB, most older PCs can still keep up with that.
The GPU sector, however, is still hotly contested (at least until just before Nvidia came out with Maxwell), and both Nvidia and AMD are showing significant performance boosts with each generation. This has in turn allowed game makers to raise their demands significantly each year, and the "future proof" seekers are left in the dust. Just look at this hilarious comparison between the GTX 580 and the GTX 980, with a three-year gap between them:
http://gpuboss.com/gpus/GeForce-GTX-980-vs-GeForce-GTX-580
Even the 970 heavily outperforms the 580.
I am expecting the mid range 960 (when it comes out) to be somewhat equivalent to the once mighty 580.
Hope this gives you what you were hoping for.
PS: I took games as the example because they are among the most widely tested and demanding software available.
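
If you wanted to turn that 580-to-980 gap into a yearly figure, treating it as roughly a 2x jump over three years (a ballpark assumption, not an exact benchmark ratio), the compound math is simple:

# Rough annualized growth implied by an assumed ~2x GPU gain over ~3 years
ratio, years = 2.0, 3
annual_gain = ratio ** (1 / years) - 1
print(f"~{annual_gain:.0%} per year")   # ~26% per year, if the 2x-in-3-years guess holds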
 
Well, I don't think those results would be very comparable, and this is pure speculation. I don't have the required knowledge of the technology from 10 years ago, so correct me if I'm wrong!
Nowadays GPUs are being improved at a higher rate, seeing as they are becoming more and more important, while CPUs are being improved at a slower rate than 10 years ago because they are already pretty powerful, at least for gaming rigs. Game engines will have to support more than 1-2 cores before commercial CPU upgrades become a big leap again.

that's my point of view at least!
 
OK, well, with that in mind, I wonder if graphics card performance increases have stayed relatively consistent every year for the last 10 years.
Meaning: is this year's new king-of-the-hill graphics card 10% faster than last year's king of the hill, and if so, is it usually a 10% increase from year to year? Or is it all over the map? By all over the map, I mean maybe one year it's a 3% increase and the next year it's a 25% increase.
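
One way to answer that would be to line up flagship benchmark scores by year and look at the year-over-year deltas. The scores below are placeholders just to show the calculation, not real results:

# Year-over-year % change in flagship GPU performance (scores are made-up placeholders)
flagship_scores = {2010: 100, 2011: 118, 2012: 130, 2013: 162, 2014: 190}

years = sorted(flagship_scores)
for prev, cur in zip(years, years[1:]):
    gain = flagship_scores[cur] / flagship_scores[prev] - 1
    print(f"{prev} -> {cur}: {gain:+.0%}")   # shows whether the increase is steady or all over the map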
 
Well, the 780 Ti should be compared with the 980 Ti when (if) it comes out. I'm guessing the performance increase from the 980 to the 980 Ti might be 10% already.
The GTX 670 has about the same performance as the GTX 760, and the 970 is somewhere in between the 780 and the 780 Ti, so I think we can see that trend continuing. It's more or less a steady increase, but the releases just follow each other faster,
meaning that over a given period of time the overall performance increase is bigger than it used to be, but the increase from product line to product line seems to stay quite stable.
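
The pattern being described, roughly "this generation's x70 lands around the previous generation's x80," could be sketched like this (the relative numbers are illustrative guesses, not benchmarks):

# Illustrative tier shift between generations (values are rough guesses, not measured scores)
gen_600 = {"660": 70, "670": 90, "680": 100}
gen_700 = {"760": 90, "770": 105, "780": 125, "780 Ti": 140}
gen_900 = {"960": 105, "970": 135, "980": 155}
# Reading across: a 760 ~ a 670, a 970 sits between a 780 and a 780 Ti, and so on.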
 
The thing is, technological evolution nowadays happens mostly in the details. Sure, newer GPU technology can add more and more detail, but it's still "detail." Adding 10 or even 25% more polygons to a character doesn't make a tremendous difference, especially when everything is moving on screen. Better textures do make a difference, but then again, if the texture resolution isn't drastically bigger, the difference isn't always obvious. There comes a point where more detail doesn't make that big of a difference. We may not be entirely there yet, but we're closing the gap.

What I do see, though, is that while there are still performance increases from one generation to the next, the biggest difference now is all about energy: making something just as powerful but requiring less power to run. Look at the GTX 750 Ti, no need for external power! Obviously not very powerful, but still, that's quite a feat for a card that lets you play most modern games at 1080p on medium settings, or near that. Newer processors are not necessarily more powerful, but more efficient. Or they can be more powerful, just not that much more powerful.

Right now, I can run Photoshop on a $200 8-inch tablet I recently bought. Even 5 years ago, this would be unimaginable.

Performance-wise, I think the evolution of technology is slower nowadays; it seems to happen more on the energy-saving front. Someone upgrading their GPU every year would probably need to skip a year to see a noticeable improvement now.

But there's no definite answer. It all depends on what we want out of our PCs... and what we expect... too many factors to call something truly, universally "future proof," whatever that means...
 
I personally like the term "future resistant." You can't make a build completely future proof; you couldn't 10 years ago, and you definitely can't now. But you can plan ahead for future upgrades by buying now with all the latest technologies in place in your current system. SATA 6 Gb/s and USB 3.0 are absolute must-haves. PCIe 3.0 is also a must-have, and having multiple PCIe lanes for future GPU purchases will also help ensure system longevity. Things like hex-core CPUs and DDR4 RAM are nice to have in the short run, but they won't see mainstream appeal for a long time because they're geared toward higher-end users. While you can't make a rig completely future proof, you can plan for upgrades ahead of time by having all the latest technologies in place.
 
The thing with future proofing attempts: they're feeble at best. No one can predict what will come in 18 months to two years. Speculation, even by the manufacturers, is all but useless, as we've seen multiple times over from just about everyone. Until it's physically in hand and run through a battery of testing to at least get an idea of its capabilities, it really means nothing. I can design the fastest, sleekest supercar you've ever seen on paper. It doesn't mean squat until it hits the pavement.

Future proof will be different for different people and their differing needs. Someone who does graphic design likely won't need to upgrade their GPU as often as someone who does mostly gaming. If it's a matter of basic light use (spreadsheets, email, social media, office tasks, etc.), then a PC that's 6-7 years old may be more than adequate.

I think what people are scared of, and attempt to fight, is the nature of the beast. They're worried they're making a major investment that loses value/performance faster than a new car driven off the lot. Welcome to technology; it's an ever-changing market. The most volatile is likely gaming, since any new game that comes out could render any system weak in the performance department. Then again, it may not. There's no reason for that to happen; it's solely up to the game makers. Nothing is a given, and a game released 2 years from now may not require as much horsepower as a game released next month. There were games that came out long after the original Far Cry or Crysis that didn't require anywhere near the specs those games did (hence the joke: "yeah, but will it play Crysis?").

No one can possibly buy/build anything better than the absolute best available on the market. If the questioning leads to "well, should I wait for the next big thing?" then what happens when the next big thing arrives? Wait for the next big thing after that? If so, one will be in a perpetual holding pattern. There will always be something newer, bigger, better, faster. Plan on it.

Ways to plan ahead (if you can call it that) include things like the power supply or motherboard. If you buy a bigger power supply than you need now, you won't have to upgrade it right away, and it will likely last through several upgrades (within reason). Planning ahead and buying a motherboard that supports more RAM than you currently need gives you the ability to make upgrades in that department. As for things that don't pay off: don't skimp on a video card thinking you'll just CrossFire or SLI it soon. Chances are you won't (most folks don't), and by the time you consider it, it's more about trying to play catch-up to something better that's already available. On top of that, if it was a decent mid-grade card to begin with, chances are a year or two down the road the price won't be low enough to warrant buying a second one, and you'll be better off coughing up the cash for a single better card (unless you have a specific circumstance, like multi-monitor, where SLI/CrossFire pays off).

Something else I've noticed, at least in my case: the best value is getting what you want up front. I don't mean buying the best of the best and blowing your life savings on cutting edge, but don't cut yourself short either. I see a lot of people who opt for a lower-performing part to save $20-30. It just doesn't make sense. If you keep your build for any length of time (say a year or more), you'll be stuck with the lack of performance for that whole time, and the $20-30 wouldn't have been missed. Most people probably have $20 worth of loose change they've misplaced in a year. And if it's a temporary solution with plans to upgrade when finances allow, say in 3 months, why pay $120 for a part (just using a random value here) only to turn right around and pay out another $150 or $160 in such a short time? Even if they're able to sell what they have, they may get $70-80 on a good day for that $120 part. In the end they've paid around $200 for a $160 part, plus the time and hassle of swapping things out so often, and add in the time and hassle of trying to sell off the old part(s). If they had just saved another week or two, they could have been done with it.
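
Putting that stopgap-then-upgrade math side by side, with the same rough numbers from the paragraph above:

# Cost of buying a stopgap part now vs. waiting and buying the better part once
# (uses the rough example figures from the paragraph above)
stopgap_cost = 120
better_cost  = 160
resale_value = 80    # optimistic resale on the stopgap a few months later

two_step_total = stopgap_cost + better_cost - resale_value
print(f"Stopgap then upgrade: ${two_step_total} net spent to end up with the ${better_cost} part")
print(f"Waiting a few weeks:  ${better_cost}")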

Just my $.02, only trying to put things into perspective for those who get all hung up on 'future proofing' anything, especially since most everyone has a budget, some more restrictive than others. Also trying to put things into perspective as far as cheaping out and cutting corners, where it may end up being a case of cutting off your nose to spite your face.
 
First, synphul, that was an amazing, well-thought-out post. You won a free beer. 😉 Cheers!



This made me think a bit. Was there any recent game after Crysis that got that sort of attention? I mean, something that needed hardware so beefy to run well at the time that it was the type of game you would upgrade your rig for? Not the type of upgrade you do to stay up to date every now and then, but a major upgrade because things were really kicked up a notch in a specific game? I don't recall any other game released after Crysis that couldn't be enjoyed with a decent mid-range GPU.

I think that for the past couple of years there hasn't been any drastic change in processing power for CPUs and GPUs. Sure, there is always better gear out there that will perform better; that's a given. But we're not seeing any revolution happening, at least on the performance front. Performance gains from one generation of CPUs or GPUs to the next are not major, let alone exponential. Some will say they never were, and I agree. But the gains we're seeing more and more nowadays are more often than not related to energy efficiency than anything else.

Sure, being more energy-efficient will often allow more performance (per watt) in the end too, but what we're seeing are CPUs and GPUs that are often about as powerful as the ones they replaced (or slightly better), while consuming less energy.

Case in point: I can currently play Crysis on my $200 8-inch Windows tablet!!! 😉 Can it play Crysis? YES IT CAN. lol! Not maxed out, of course, and with low (but somewhat playable) fps, but IT CAN!!!

I think, for better or worse, that the popularity of console gaming since the Xbox 360 has kind of slowed down the need for beefier GPUs and CPUs, at least on the gaming front. Most games coming from big studios are made to run on as many platforms as possible, thus never taxing beefier PCs. Nowadays there is no "need" to buy an expensive graphics card to play many of the latest games at the same level of detail as the consoles. Sure, you can get better graphics on a PC with a decent GPU, but it is not "essential" or required to run those games. And sure, many people feel the appeal of 4K, but still, you don't need 4K to play games. So basically, the reason for buying a better GPU or CPU nowadays is more "we want" than "we need."

I love to game. But since I don't have truckloads of cash, I usually try to buy parts that will last me some time, upgrading a thing here or there as I go along. My current CPU (Phenom II X6 1090T) is now 4 years old and still handles whatever I throw at it; it's only now starting to show its age a little (but just a little). My decision, when I bought my PC back then, was to get something that would last me a couple of years without breaking the bank, and I think I succeeded. I initially bought only 4 GB of RAM and eventually bought 4 GB more. I bought a 6850, upgraded two years later to a 7850 for about $100 (after selling my 6850), and bought an SSD too.

My Radeon 7850 is still giving me plenty of 1080p satisfaction so far, although I need to drop the details a bit in more recent titles to keep the framerate silky smooth. I'm currently looking to upgrade, but anything I do won't bring a significant improvement (I have limited resources). My 7850 cost me around $200 two years ago, and right now a $200 (Canadian) GPU is not significantly better than it. My CPU cost me a little more than $200 four years ago. Today, a $200 AMD CPU will be better, sure, but not THAT much better.

So... either technology isn't evolving as fast as it did some years ago, or the evolution happens more at the higher end of the spectrum and just takes longer than it used to to trickle down to a more "mainstream" level. Or maybe a combination of those two things.

I think we can always try to plan ahead by being smarter when buying. Like you said, if buying a lesser model of anything to save $20 will wreck your future upgrade path, well, that's just dumb.

But there will ALWAYS be a risk that things won't last as long as you planned to. And there will ALWAYS be something better coming along. Nothing is future proof. Although I think you don't need to do major upgrades as often now as we used to, let's say, 10 or 15 years ago. Which, from a monetary perspective, may be a good thing for us, the consumers.
 
I think to some degree things are hitting a wall. That's not to say improvements in both speed and efficiency aren't being made; they are. Back in the day, the easy route to performance was frequency: just keep pushing the clocks faster and faster, and we kept getting benefits release after release. Now frequency is hitting a ceiling, and chip developers are looking into shrinking dies, aiming for efficiency, and improving the structure of the processing pipelines so more can be processed per cycle.

I haven't seen anything mentioned about this in a long time, maybe due to architecture changes, but it used to be said that the end user would only perceive roughly half the percentage increase as real 'performance.' For example, take an older 100 MHz CPU. Upgrading to a 200 MHz CPU (a 100% increase) didn't net a processor that felt twice as fast to the user; rather, it felt only around 50% faster. To get the feeling of 'wow, this is twice as fast as my old one,' someone had to upgrade from 100 MHz to a 350-400 MHz CPU. By the same token, making a 400 MHz CPU feel 50% faster meant an 800 MHz CPU, which grew to a 1.6-1.8 GHz CPU, and so forth. Architecture improvements have made a bigger difference since then, and frequency isn't comparable between different generations, so it doesn't scale perfectly. But you can still imagine that once we hit 4 GHz, the next logical leap would be 8 GHz, and that just isn't happening. Silicon has its limitations, requiring engineers to find new ways to squeeze out performance. Eventually chips will likely move away from silicon to some other medium down the road (how far down the road is anyone's guess).
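
That old rule of thumb (the user only "feels" about half the raw percentage gain) works out like this, treated purely as folklore arithmetic rather than anything rigorous:

# Folklore rule of thumb: perceived speedup ~ half the raw % increase
def perceived_gain(old_mhz, new_mhz):
    raw_increase = new_mhz / old_mhz - 1
    return raw_increase / 2              # the "feels like" factor from the rule of thumb

print(perceived_gain(100, 200))   # 0.5  -> a 100% clock jump "feels" ~50% faster
print(perceived_gain(100, 350))   # 1.25 -> roughly the "feels twice as fast" jump from the post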

Die shrinks usually allow for more efficiency and lower costs (the less space a die requires on the same-sized wafer, the more chips can be produced per wafer, and so on). That's why GPUs have been stuck for quite some time now; even Maxwell didn't get a die shrink. It turned out to be too costly, so they stayed on the previous process.
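
The cost side of a die shrink is mostly simple area math. The die and wafer sizes below are arbitrary round numbers, and this ignores edge loss and yield:

# Crude dies-per-wafer estimate: smaller die area -> more chips per wafer
# (ignores edge loss and defects; figures are arbitrary round numbers)
import math

wafer_diameter_mm = 300
wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2   # ~70,686 mm^2

for die_area_mm2 in (400, 300):                        # before/after a hypothetical shrink
    print(f"{die_area_mm2} mm^2 die: ~{int(wafer_area // die_area_mm2)} dies per wafer")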

Factor in on top of that: other than in specific cases, for the majority of programs and operations, once the CPU starts processing faster than a human can keep up with, the user becomes the limiting factor. How fast can we make a program open and appear on the screen? Faster than the eye can blink? If so, what's the point of trying to go faster than that, beyond some tech gizmo giving us a benchmark that splits hairs over which one is faster? When the user becomes the bottleneck, that's it.

A keyboard capable of n-key rollover and sending enough signals to handle 400 wpm is no more effective than one that can handle 200 wpm, since I don't know of anyone capable of even reaching 160-200 words per minute. Once we've reached that point, all that can be done to improve a keyboard is change the color, add function keys for various things, add lights, make the keys click differently, etc.

A PC from 6 to 7 years ago can handle Facebook, email, and whatever else just as well as a PC bought today. Not everyone creates large graphics or edits video or audio or does serious CAD work; that's a fairly small community by comparison. Gaming likely places the most strain on a larger group of users, and the extra processing/rendering power comes in handy for more intense experiences: multi-monitor, larger screens, higher resolutions, 3D, etc.

Just speaking from a personal standpoint, until this past holiday season I didn't feel the need for an upgrade. My system was a good 6-7 years old and kept up just fine, other than with more recent games. I expect my current system will easily last me another 6 years, aside from worn parts (HDD, possibly PSU) and a GPU upgrade or two along the way. The only upside I can see to energy efficiency, beyond mobile devices (where battery life is important), is the cumulative effect: even if we only shave off a handful of watts per desktop for the same performance we had a year or two ago, that's that much less wasted power, and it all adds up to more efficient products.
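
On the cumulative-efficiency point, even a small per-machine saving adds up once you multiply it out. All the figures here are arbitrary, just to show the scale:

# Cumulative effect of shaving a few watts per desktop (all figures are arbitrary assumptions)
watts_saved_per_pc = 15
hours_per_day      = 8
num_pcs            = 1_000_000

kwh_per_year = watts_saved_per_pc * hours_per_day * 365 * num_pcs / 1000
print(f"~{kwh_per_year / 1e6:.0f} million kWh saved per year across {num_pcs:,} desktops")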
 


Yeah I agree, it's like designing a building - it's better to lay a solid foundation now and plan for upgrades in the future than it is to blow $5,000 on a cutting edge system. The people that I have seen do the latter wind up completely regretting it later on. The ones that take the time to do research and plan ahead, not necessarily buy top of the line stuff, are the ones that have the most satisfying experience.

Future proof will be different for different people and their differing needs. Someone who does graphic design likely won't need to upgrade their GPU as often as someone who does mostly gaming. If it's a matter of basic light use (spreadsheets, email, social media, office tasks, etc.), then a PC that's 6-7 years old may be more than adequate.

If we're talking light use, we're pretty much done there. Any processor whether it's an Intel Atom, Qualcomm Snapdragon, or i7-5960X can run MS Office, etc. It's the high end stuff that requires envelope pushing. But when the high end stuff breaks technology barriers we all benefit.

I think what people are scared of, and attempt to fight, is the nature of the beast. They're worried they're making a major investment that loses value/performance faster than a new car driven off the lot. Welcome to technology; it's an ever-changing market. The most volatile is likely gaming, since any new game that comes out could render any system weak in the performance department. Then again, it may not. There's no reason for that to happen; it's solely up to the game makers. Nothing is a given, and a game released 2 years from now may not require as much horsepower as a game released next month. There were games that came out long after the original Far Cry or Crysis that didn't require anywhere near the specs those games did (hence the joke: "yeah, but will it play Crysis?").

Yeah, of course nobody wants to pay $300 for a new GPU only to have another new GPU come out by year's end, but with gaming that's the nature of the beast.

No one can possibly buy/build anything better than the absolute best available on the market. If the questioning leads to "well, should I wait for the next big thing?" then what happens when the next big thing arrives? Wait for the next big thing after that? If so, one will be in a perpetual holding pattern. There will always be something newer, bigger, better, faster. Plan on it.

Those who go for the concept of "future proofing" will endure two things. The first is a system that greatly depreciates in value. I remember a thread where someone wanted to know the value of their 2006 top-of-the-line system (they had paid something like $5,500 for it), and after depreciation it was worth only about $900, so they lost a lot of money in that regard. The other thing I've noticed in my time as a moderator is that the people who spend way too much money thinking a system is going to last them forever find that their returns diminish extremely quickly! I'm sure a lot of people who post here also post, or at least lurk, over at PCPartPicker, and if you want to see a textbook definition of "a fool and his money are soon parted," take a look at some of the people who spent over $10K on builds. They're not getting their money's worth, and the people who spend that kind of coin on a build have bragging rights for about... 2 days. :lol:

And the sad thing about blowing $10K on a build? Despite spending as much money as you would on a Nissan Versa, your returns diminish very quickly, as does your bank account (or credit line). I say if you're going for a top-of-the-line rig, spend no more than $2,500 and plan for future upgrades. Spend anything over $3,500 (not including monitors) and you're what P.T. Barnum was talking about. :lol:
 


You know, my CPU (Phenom II X6 1090T) is still perfectly fine for what I do, even gaming. The reason I said my CPU is starting to show its age is not that it has become inefficient, but that I'm now asking it to do stuff that didn't even exist 4 years ago.

I started using Steam In-Home Streaming a lot, which requires that on top of running a game, my CPU also encode an HD video stream and send it over the network at the same time. It is still able to do so, but sometimes it's a little too much and I get "slow encode" warnings. Not that often, but often enough to be a bit annoying. Sure, I could probably overclock it a little, but I don't think that would make any significant difference.

And seriously, I don't think the AM3+ platform has a future anymore. My CPU upgrade path right now is an 8350 or an 8370, but that's pretty much it. And while that would improve performance, it's not going to be "OMG it's awesome" better, unless maybe I overclock the hell out of it (and I'm not really at ease doing so). And I don't think AMD will release anything significant on AM3+ anymore.

So, getting back to the "future proofing" idea, I don't think upgrading my CPU to an 8350 or 8370 would be a good idea. I know it would cost more, but switching to Intel (LGA1150) would probably be a better choice. Buying a nice i5 like the 4460 or 4690K would offer a bump in performance (especially in gaming), and I could always upgrade to an i7 down the line. I think my upgrade path would be better that way than staying on AM3+.

But yeah, I could probably save money right now by going with another AM3+ CPU, since I wouldn't need to change my motherboard. But then again, in two years or so I'll probably want to upgrade, and then I'll be stuck and will need to change it anyway.

So maybe the wiser decision is to think "ahead" and not always buy to minimize costs RIGHT NOW, but to minimize costs in the long run: to be aware that you'll probably want to upgrade a thing or two down the line, and to buy stuff that makes that as easy as possible.

 
I agree with most if not all of Synphul's points. I too use the term "future-resistance," believing that "future-proof" is a myth.
I used to think I was being frugal by getting just what I needed, but then I'd need some kind of upgrade fairly soon. I've since learned to buy a few steps ahead of where I need to be, and other than an occasional graphics card upgrade, I really haven't had to buy any more parts for my systems in years. I am finally contemplating an upgrade, because of an additional capability I want that wasn't available before, and it is coinciding with a time when some parts (like HDDs) may be physically wearing out.
 
Solution