CaedenV :
We are definitely in the PC+ era, where we have a main computer that does the work and an assortment of devices (smartphones, media players, GPS units, tablets, netbooks, and laptops) that we use as extensions of our PC. I think what we will see is a continued emphasis on the personalization of the computer interface, meaning less direct contact with a "PC" and more contact with smaller devices that call on an external source for their computing. I think the PC will soon become the home server that handles all of the bulk processing and file storage for a household, and our devices will run off this central machine for their needs. Companies would love for this to be the cloud instead of a home server, but practically speaking the internet is not ubiquitous enough for that to happen any time soon.
In my own house I find myself planning for this trend. Next year I plan on doing my last major computer upgrade for myself (I just can't edit HD content on an old C2D), and as upgrades go for me, they tend to last about 5 years (unless something dies and I do an incremental upgrade). In fact, my current system is about 4 years old now, and I bet if I had gone with a C2Q I could probably do what I want with it and hold off upgrading for a few more years.
The point is: what will computing look like in the year 2017? Tablets will support multiple monitors and have capable dual- and quad-core processors for everyday work and play. WiDi will be readily available. So what would be the point of a dedicated PC? Sure, editing quad-HD content would take some horsepower, but that could be done on the server, with the software running on a tablet and the tablet driving 2+ monitors for the editing interface. You can do this today with a cheap PC instead of a tablet. In the not-too-distant future it would be entirely possible (if a little expensive) in a home setting.
Another fool who refuses to realise that once tablets get more capable, so will desktops. So you wanna carry around a tablet... for what? So that it can be connected to external screens and the server anyway in order to do anything serious?
I'll NEVER switch to stupid "server" mode, thin clients or not... each and every PC in my household will run a native setup, period. It doesn't have to be Windows; it can be Linux if it's not a gaming machine. Native setups are safer, faster, and don't depend on anything external. The only thing I can imagine myself doing is adding network storage to my home network so that I avoid copying stuff between computers, but I already have that covered by the external HDD where I store all my data, and I don't think there's even a need for network storage. I'll do it anyway, though - sounds fun.
In 2017, tablets will be long gone and there'll be something new that "threatens to kill the PC", as always. That's all that will happen. Meanwhile, we'll be playing games with overkill graphics (provided the consoles finally die or get upgraded, lol, you never know... MS said they wanna keep the crapBox around till 2015), possibly with surround 3D or some other crazy stuff, and nothing will be able to do it except the PC.
Sure, the average consumer doesn't need anything faster than an i3-2100 (which is a damn good chip for its price, I might add). That's because the average consumer is an idiot. Most of them don't even know how many cool things you can do on a PC. Many of them would upgrade to better hardware if there were a need, and it's very easy to create a need: stop advertising consoles, lure them to PC gaming instead, and unlock the full potential of the hardware.
Sure, I've seen enough fools who think that a Pentium Dual-Core is enough for them, but when I look at their PCs, all I can say is "You need an upgrade." They are blind; they think their Windows Media library takes 5 minutes to open because "it's big"... no, it's because their hardware is $h!t and their PCs are bloated. Seriously, I'll never cease to be amazed by the average user's stupidity and stubbornness.