Will a quad-core i5-7500 still be good enough for gaming for the next few years?

Solution
Will it be enough for gaming for 3 years? Certainly.

If you plan to go for a super high-end GPU (GTX 1080 Ti or future equivalent) in that time, at least an unlocked i5 (preferably 8th gen, price permitting) would be more cost-effective; other than that, you'll be just fine. :)
 
Solution
Even a five-year-old i5-2500K is still going to be good enough for most foreseeable future games. Only the top 1% of games push CPUs hard enough for a faster CPU to matter, and even in those, you can usually trade eye candy and gimmicks for performance on older CPUs.
 
Depends on the games you want to play and your expectations. Someone who's fine playing at mid details, low/no physics, and dips to 40 fps in mostly non-demanding games will be fine with an i3 for many more years to come. Someone who wants 120+ fps at 4K resolution with everything maxed out may already struggle with an i7-8700K by the end of 2018.
 
The rate of CPU speed improvement has fallen to about 3% per year lately. So barring a technological breakthrough in CPUs, any i5 or i7 you buy today will be good for the next 15 years (at which point a "modern" CPU should be about 56% faster at the same clock speed). I'm still on a Sandy Bridge i5 and have no plans to upgrade.
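For the curious, that figure is just 3% compounded; a quick sketch (assuming the ~3% annual improvement holds):

```python
# Compounding a ~3% yearly CPU speed improvement.
# Relative speed after n years = 1.03 ** n.
for years in (5, 10, 15):
    gain = (1.03 ** years - 1) * 100
    print(f"After {years} years: ~{gain:.0f}% faster")
# After  5 years: ~16% faster
# After 10 years: ~34% faster
# After 15 years: ~56% faster
```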

The limiting factor is actually which other platform features the CPU supports - like USB 3.0/3.1, or an M.2 slot for an SSD. Or if Intel swings a deal with Hollywood that makes 4K video decryption dependent on a feature only found on Kaby Lake CPUs.

While an i7 gets you 8 threads via hyperthreading, the reality is that those 4 extra threads add very little computing capability in most cases. Hyperthreading works by letting a second thread use the portions of a core that the first thread leaves idle. But if your CPU is the bottleneck, it's usually because the same kind of task is already saturating all physical cores; any extra threads then need the parts of the cores that are already in use, so hyperthreading doesn't help at all (and can actually make things slower, as time-sensitive tasks get assigned to a thread that ends up CPU-starved). So a program needs a very eclectic mix of processing (doing lots of different things at the same time) to really take advantage of hyperthreading.
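If you want to see this effect on your own machine, here's a rough, purely illustrative sketch (the `burn` workload is hypothetical): throughput on a uniform CPU-bound task usually scales well up to the physical core count, then barely improves as hyperthreaded siblings fight over the same execution units.

```python
import time
from multiprocessing import Pool, cpu_count

def burn(n):
    # Purely arithmetic, CPU-bound work: the kind of task that
    # saturates a core's execution units and leaves little idle
    # capacity for a hyperthreaded sibling thread to exploit.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    work = [2_000_000] * 16  # fixed total amount of work
    for workers in (1, 2, 4, cpu_count()):
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(burn, work)
        elapsed = time.perf_counter() - start
        print(f"{workers} workers: {elapsed:.2f}s")
```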

In real-life use, the few tasks that do this and really benefit from hyperthreading are video encoding, encryption, and data compression. Those typically see about a 30% speedup from hyperthreading (so the 8 virtual cores on the i7 actually run more like 5.2 cores). Most tasks only see about a 5% speedup, so the 8 virtual cores run more like 4.2 cores - barely different from the i5's 4. If you're going to go for an i7, do it because it comes with extra L3 cache (8 MB vs 6 MB on the desktop Kaby Lake parts), although even that only helps much when processing large amounts of data (again, video encoding, encryption, and data compression).
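Those "effective core" numbers are just the 4 physical cores scaled by the hyperthreading speedup:

```python
# Effective cores = physical cores * (1 + HT speedup)
physical = 4
print(physical * (1 + 0.30))  # HT-friendly work (encode/encrypt/compress): 5.2
print(physical * (1 + 0.05))  # typical work: 4.2
```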

Basically, most tasks break down into being limited by single-threaded performance (so clock speed matters most) or being massively parallelizable like Bitcoin mining (in which case assigning the work to a GPU is better). Unless you're doing lots of video encoding, encryption, data compression, scientific data crunching, or running a large multi-user server, the benefit of adding more than 4 cores (physical or virtual) is extremely limited. Intel knows this, which is why it also gives the i7 extra cache and higher clock speeds - so you can't do an apples-to-apples comparison.
 

Simultaneous Multi-Threading (the generic term for Intel's HT) yields 30-40% more performance than running a single thread per core because superscalar, deeply out-of-order CPUs have tons of resources that cannot be efficiently and fully utilized by a typical single thread's instruction flow.

Also, with the newest mainstream CPUs having 50-100% more cores than previous years' models, the likelihood of more enthusiast-oriented games coming out that make use of the extra cores and threads will naturally increase.
 


^ This.

The GPU you select or own can be a good indication of what you want from your games in years to come. If you've done your research on graphics cards, you'll know what to expect, and you'll be shooting for a specific FPS/texture-settings (VRAM) range. Your CPU is there to complement that.

Now take note of what InvalidError says about games being tailored to make more effective use of multiple CPU cores in the future. The idea has been circulating for years, but it seems more likely than ever to happen within that 3-5 year time frame. Intel seems aware of this, having increased its mid-to-top-tier core counts. This may be all that was needed for game developers to start pushing up what's expected of a CPU for gaming.

Essentially, it means that if you own a high-end GPU, the mentality for future gaming upgrades will shift towards multi-core CPUs rather than a newer GPU.

Listing your current specs (or the ones you have in mind) and your plans for fps, resolution, etc. will help you get more specific opinions.