I think it depends on usage and needs. Dual-core CPUs are still more than enough for many basic tasks; gaming is only one thing PCs are used for. Not everyone is compiling code or editing video. For those who only need to check email, surf the web, and do a few other light office tasks, a dual core is more than enough. Same for those using one as an HTPC-type device in their living room connected to a TV.
The i3 is in fact a dual-core CPU, but it also has Hyper-Threading, which is a way to keep those 2 cores fed with data to process at a steadier pace than a plain dual core like a Pentium G series. In some situations, despite having 4 threads, it still shows the inherent performance limit of having only 2 processing cores, which is why an i5 will outperform it in those situations.
Threads and cores are not the same thing. An i3 can handle 4 threads simultaneously but still only has 2 cores to do the actual processing. An i5 also processes 4 threads simultaneously, but each thread gets processed on its own core. An i7 is capable of processing 8 threads simultaneously but, like the i5, still only has 4 physical cores to process them. At least concerning mainstream desktop i7s; I'm not referring to the 6/8-core enthusiast/extreme edition i7s.
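You can actually see this thread/core split on your own machine. Here's a minimal Python sketch; it assumes the third-party psutil package is installed (pip install psutil), since the standard library's os.cpu_count() only reports logical processors:

```python
# Minimal sketch: report logical threads vs. physical cores.
# Assumes the third-party psutil package is installed;
# os.cpu_count() by itself only reports logical processors.
import os

import psutil

logical = os.cpu_count()                    # threads the OS can schedule at once
physical = psutil.cpu_count(logical=False)  # actual processing cores

print(f"Logical processors (threads): {logical}")
print(f"Physical cores:               {physical}")
# On a Hyper-Threaded i3 this prints 4 threads but only 2 cores;
# on a non-HT quad-core i5, both numbers come out as 4.
```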
That's why, in spite of handling twice as many threads at a time, the i7 only averages around 15% more performance than an i5. Some say 30%, but that's a best-case scenario and not the norm. In other scenarios HT achieves less than 5% gains, and in a few cases it can even hurt performance, depending on the task.
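If you want to see that flattening yourself, here's a rough sketch that times a CPU-bound job with different worker counts. The hashing workload and the numbers are just stand-ins I picked for illustration; actual results vary heavily by task and machine:

```python
# Rough sketch: time a CPU-bound job with varying worker counts to see
# how scaling flattens once workers exceed physical cores. The hashing
# loop is a stand-in workload, not a real benchmark.
import hashlib
import time
from concurrent.futures import ProcessPoolExecutor


def busy_work(n: int) -> str:
    """Burn CPU by hashing repeatedly; return a digest so the work isn't skipped."""
    data = b"x" * 1024
    for _ in range(n):
        data = hashlib.sha256(data).digest()
    return data.hex()


if __name__ == "__main__":
    jobs = [200_000] * 8  # 8 equal chunks of work
    for workers in (1, 2, 4):
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(busy_work, jobs))
        print(f"{workers} worker(s): {time.perf_counter() - start:.2f}s")
    # On a 2-core/4-thread CPU, going from 1 to 2 workers roughly halves
    # the time, while going from 2 to 4 (leaning on Hyper-Threading)
    # typically gains far less, mirroring the modest HT averages above.
```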
As to the original question, I'd imagine if dual-core CPUs are ever phased out it will be a result of the demands of software, and that doesn't seem to be changing anytime soon. Software is often slow to adapt, in my opinion. If we look back at 64-bit processing, there were 64-bit CPUs long before there were programs with support for them; it was a feature without much need in the beginning. Even once software began moving to 64-bit, common software like web browsers still took years to offer 64-bit versions. Firefox didn't actually offer an official 64-bit version until December 2015, so it's been out less than a year (not counting Waterfox, which was a 64-bit offshoot project). 64-bit processors have been in mainstream PCs since 2003, so it took a major browser 12 years to finally make the move.
People have been discussing and speculating about things like DX12, Mantle, and Vulkan for years. It's been a long-awaited 'thing' that we're still largely waiting for. We have DX12 GPUs, Windows 10 finally gave everyone a DX12 OS, and yet only a very small handful of games actually take advantage of DX12, Mantle, or Vulkan; even fewer if you count only the ones natively coded for DX12 rather than those simply patched to sort of include DX12 features.
There are more in-depth reasons for those things, but it's more or less a broad example of how the 'need' for certain hardware changes just isn't there yet. There's a speculated 'need' for such things, the hardware materializes into actual products, and then it's a matter of hurry up and wait for the software to actually utilize it before everything is in sync: hardware, OS, and software.