junkeymonkey :
I've got to admit, upgrade cycles are stretched out a lot longer today. I can skip a lot of the new stuff because my old stuff is still going strong for my needs.
Isn't that great!?
I've been able to take on several side hobbies over the years because I no longer feel like I have to replace my CPU and motherboard every other year.
A new GPU every so often gets the job done.
none12345 :
"I miss those days too, but frankly we don't really need it."
Sure, if all you want to do with a computer is send text messages and email... well, the computer I had 20 years ago was just fine for that.
I've been running an i7 2600K since Intel first launched it in early 2011. It is 5 years old and still running strong, with no indication that it's holding back any software. It's currently paired with a GTX 980 Ti, which is also not held back.
That combination allows me to play games across three 1080p panels with comfortable frame rates and high levels of detail in practically any game. It can also handle 4k at reasonable frame rates. It's certainly not CPUs that hold us back in that regard.
none12345 :
1 or 2 orders of magnitude more computing power would open up a lot of possibilities. If things had kept going at the pace they were in the early to mid 2000s, we'd have something like 8-core 20GHz chips by now. On the graphics front, if they hadn't stalled on 28nm, GPUs would be 3 or so times faster than they are now, which means easily doing 4K content.
That's a mighty big if, man. The entire engineering world is only now getting past the 28nm problem, and Intel and AMD both admitted many years ago that dialing up the clock speed had hit a wall. That's why we see parallel computing taking over. Your theoretical world is simply a fantasy. Things haven't been going at the same pace as they were in the 90s, early 2000s, or mid 2000s.
There have been major advancements in other ways, though, most notably power consumption.
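As a back-of-envelope check on why that "20GHz by now" extrapolation is such a big if: here's a rough sketch of where clocks would sit if mid-2000s growth had continued. The ~30%/year rate and the 2005 baseline are my own rough assumptions for illustration, not measured figures.

```python
# Hypothetical: if clock speeds had kept growing ~30%/year (roughly the
# pace of the Pentium 4 era) from a ~3.8 GHz baseline in 2005 onward.
base_ghz, annual_growth, years = 3.8, 0.30, 11   # 2005 -> 2016
projected_ghz = base_ghz * (1 + annual_growth) ** years
print(f"Projected clock: {projected_ghz:.0f} GHz")
```

Even that modest compound rate lands far past 20GHz, which is exactly why the wall forced the industry toward more cores instead of faster ones.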
none12345 :
On other fronts, for instance autonomous vehicles, that's basically what they are going to need. Well, you don't need a 20GHz 8-core, but that would make the amount of power you do need cheap enough to put in every car.
Autonomous vehicles will have specialized SoCs designed for their specific needs. Take a look at the PowerVR demos that were revealed today for some context on what I mean. When a processor is designed for one specific task, it can do amazing things with much less energy, on what looks like a very weak processor on paper.
http://www.tomshardware.com/news/powervr-ray-tracing-console-graphics,31411.html
none12345 :
On the VR front, you'd need every bit of that and still be wanting more power to give a proper VR experience. You'd be talking about 8K VR instead of 1080p, heh. But really you need to be in 16K territory to get rid of the screen door effect for VR.
You have no idea what you are talking about in regards to VR. Go try a Vive or a consumer Rift, and then come back and tell me if you see any screen door. Unless you are specifically looking for it, you won't.
Will higher res be better? Sure, you bet it will. Do you need that for a "proper VR experience?" Absolutely not. Just ask anyone who's actually tried a Vive.
Or watch a youtuber who just had one sent to them, like Jacksepticeye, for example.
A proper VR experience is about being swept away from the real world, and it's the content that does that more than anything.
BTW, the resolution of both the Rift and Vive is 2160x1200.
None of that even speaks to the fact that 4K panels that run at 90Hz don't exist yet, and there's no video cable standard that could even handle that much data right now. 16K is literally years away from feasibility.
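To put the cable argument in rough numbers, here's a back-of-envelope sketch of raw pixel bandwidth at 90Hz for the actual Rift/Vive resolution versus hypothetical 4K and 16K headsets. The 24-bit color depth is an assumption, and blanking intervals and link-encoding overhead are ignored, so real cable requirements would be somewhat higher still.

```python
# Raw (uncompressed) video data rate, ignoring blanking and link encoding.
def raw_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

vive = raw_gbps(2160, 1200, 90)     # Rift/Vive combined panel resolution
uhd  = raw_gbps(3840, 2160, 90)     # hypothetical 4K @ 90 Hz headset
k16  = raw_gbps(15360, 8640, 90)    # hypothetical "16K" @ 90 Hz headset

print(f"Vive: {vive:.1f} Gbit/s, 4K@90: {uhd:.1f} Gbit/s, 16K@90: {k16:.0f} Gbit/s")
```

The 16K figure comes out hundreds of times larger than what today's headsets push over a cable, which is the point about feasibility.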
none12345 :
It would get gaming out of the rut of crappy low-poly graphics that we've been stuck with for the better part of the last decade. There were a lot of advances there, then it just stopped. Sure, the pixels look better, but things are still way too blocky (especially the landscapes/worlds). Just not enough polys. I would love a 10x increase in poly count.
Panel manufacturers had to catch up. Game graphics were held back for years because there were no advancements in LCD panel sizes and resolutions. Now there are, and games will catch up. It all happens as a trickle-down effect.
Now that people are buying 4K panels, GPUs will catch up next and be able to play games well on them, and then we'll see games get better graphical fidelity.
none12345 :
I haven't had to buy a new system since 2009 (other than one GPU upgrade, buying an SSD at some point, and larger hard drives).
It's just boring having a midrange system from 2009 (except the graphics card/SSD) that isn't all that much slower than what you can get today for the same money.
You aren't gaming at 4K then. Why are you complaining about poly count?
junkeymonkey :
I think they do. I would like an upgrade, but I can't see what's out there now over what I've already got. And if I had to, it wouldn't be the AMD cards, due to their lack of OS support now. At least with a 960 I still get drivers all the way back to XP on a recent release, and all Nvidia cards still have native analog support through DVI-I. See how AMD has me limited in what I can do or use? What's funny to me is how AMD chips and boards still support down to XP, but their cards don't [like your 380].
I want hardware that's got me covered for how I want to use it, not how they want me to, without limiting me or locking me into proprietary stuff [like that Skylake junk]. But that's just me.
It just seems upgrading hurts me more than it helps or benefits me nowadays.
You're using Windows XP still? How are you even browsing the internet? I didn't think modern browsers worked on XP.
You have bigger concerns than GPU driver support if you are on XP, though. I can't think of many games that even support that OS these days. It won't be long before you'll need Win10 to play some titles (DX12).