My video card is a gtx 980 ti, and I plan on getting Windows 10 soon for DX12. Should I upgrade my CPU to the i7 6700k or stick with the i5 3570k? I plan on using some CPU intensive programs such as Dolphin and PCSX2.
Stick with the i5. If you feel you need to upgrade later, do so, but I would stick with the i5 for now. It can hold its own pretty well in games, even at 4K. Windows 10 is still buggy as well, so I would wait to upgrade if you can.
A) Hold off on Windows 10 for a while. Mainly because it's still relatively buggy, but we also keep finding out more and more about its aggressive data collection. They are making Google look like choir boys by comparison.
B) I would stick with the i5-3570K until you see a need to upgrade. I love new toys too, but if you are getting good frame rates with the current CPU, there is no real need to upgrade right away. Skylake is going to require a new motherboard, memory, and CPU at a minimum. Skylake or one of its successors is definitely something we are going to want at some point in the next 24 months, but I want to see a little more about the next 14nm chip that should be coming out around this time next year before I make the leap.
On the other hand, the GPUs coming out next year are going to be very, very tempting.
Is no one chiming in to say, "Dolphin and PCSX2 aren't really that CPU-demanding these days; you really just need a Core 2 Duo or better and you're golden"?
The Core 2 Duo is a roughly '06-'09 era CPU, and we're in '15; the 3570K should be tearing through a GameCube/Wii emulator and a PS2 emulator. I know because I can run PCSX2 on my 3570K and GTX 670 flawlessly.
Resolution can put more load on the CPU. Depends on the game, and what the game needs the CPU to process.
Most of the time, all the CPU is doing is pulling data off the hard drive and sending it to the video card... Trivial work for a CPU. But there are times when devs put more of a load on the CPU, and I expect that with DX12 in Windows 10, that is only going to increase. Devs are going to have far greater control over what does what in DX12 than they have ever had in Windows before.
For now, the i5-3570K should suffice, but do not be shocked if, in a year or three, the situation changes.
I don't think you need to change anything as of right now, unless you plan to SLI. In any case, if you monitor your GPU usage during any GPU-stressing synthetic test and it can hit 100% usage, then there's no bottleneck, and there won't be one at any resolution for your current setup.
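If you want a quick way to watch GPU utilization while a game or benchmark runs, here is a minimal sketch. It assumes an NVIDIA card with nvidia-smi on the PATH; the one-second interval and 60-sample duration are arbitrary choices.

```python
import subprocess
import time

# Poll GPU utilization once per second via nvidia-smi (assumes an NVIDIA GPU
# and that nvidia-smi is on the PATH). Run a game or synthetic benchmark
# while this loops; if utilization sits near 100%, the GPU, not the CPU,
# is the limiting factor.
SAMPLES = 60          # arbitrary: about one minute of samples
INTERVAL_SECONDS = 1  # arbitrary polling interval

for _ in range(SAMPLES):
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    print(f"GPU utilization: {result.stdout.strip()}%")
    time.sleep(INTERVAL_SECONDS)
```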
Higher resolutions often use higher resolution textures, and those are larger files. Not everything that was used at 1080p is the same at 4K resolution. When you reach 4K resolutions, there are almost no jaggies left on the screen. So you can put in new models that focus on the high-res graphics. And some devs do that.
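To put rough numbers on "larger files", here is a back-of-the-envelope sketch comparing uncompressed 32-bit RGBA texture sizes. The texture dimensions are just illustrative, and real games use compressed formats (e.g. BC1/BC7), so actual sizes are smaller, but the scaling is the same.

```python
# Back-of-the-envelope sizes for uncompressed 32-bit RGBA textures.
# Doubling each texture dimension quadruples the amount of data.
BYTES_PER_PIXEL = 4  # 8-bit R, G, B, A

for side in (1024, 2048, 4096):  # illustrative texture dimensions
    size_mib = side * side * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{side}x{side} RGBA texture: {size_mib:.0f} MiB uncompressed")
```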
And loading higher resolution textures is going to measurably increase load on the CPU? Any benchmarks to prove it?
Send me 4K monitors and video cards. I will test everything to prove it to you, and in about 5 years, I should be done and will send the hardware back to you.
In the meantime, any time you are moving more data, something has to work to move that data.
1920 x 1080 x 60 fps = 124,416,000 pixels per second @ 1080p
3840 x 2160 x 60 fps = 497,664,000 pixels per second @ 2160p
Something has to push that extra data to the video card...
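For what it's worth, here is the arithmetic behind those figures, assuming the 60 fps frame rate they imply:

```python
# Raw pixel throughput at a fixed 60 fps (the frame rate assumed by the
# figures above). This counts pixels only; multiply by bytes per pixel
# to get an actual bandwidth estimate.
FPS = 60

for name, width, height in [("1080p", 1920, 1080), ("2160p", 3840, 2160)]:
    pixels_per_second = width * height * FPS
    print(f"{name}: {pixels_per_second:,} pixels per second")
```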
OK, arguably tons of things can put small amounts of extra load on the CPU, and increasing the resolution is one of them. I highly doubt it's significant enough to be worth discussing. If it were measurable, I'm sure someone would have tested it already. In my experience, increasing the resolution has never noticeably increased my CPU load.