So in layman's terms: since 60 fps is the goal, let's make it easy with a 1060 3GB running at 55 fps and a 1060 6GB running at 60 fps. The difference between the two cards is so negligible that arguing over the benchmark is pointless, because nobody watching the two screens side by side in real life could tell them apart.
Realistically, in a game that uses 2 GB of VRAM, there's no difference. The real difference shows up in a game that uses 5 GB of VRAM: the 1060 3GB would be struggling at 45 fps with adaptive V-sync bouncing all over the place, while the 1060 6GB holds 60 fps. That will be a noticeable difference.
Take, for instance, the 3GB variant of the 1050 vs the 1060 3GB. Cores and TMUs aside, the difference comes down to fps count, with the 1060 3GB getting a few fps more than the 1050 3GB. VRAM is the limiting factor for both. They scale in fps roughly in step at the same settings, up to the point where VRAM runs out, and then both plummet at about the same detail settings. The 1060 6GB won't hit that limit, so while it only gets a few more fps than the 1060 3GB, it delivers noticeably higher detail settings at the same fps. So if the 1050 3GB and 1060 3GB get around 60 fps on medium before throttling, the 1060 6GB will get around 60 fps on high to very high before it throttles. That's a considerable viewing difference.
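To make the argument concrete, here's a toy model (my own illustration, not a real benchmark; the VRAM-per-preset numbers and the swap penalty are made-up assumptions): fps stays roughly flat as detail settings rise, until the game's VRAM demand exceeds the card's capacity, at which point fps falls off a cliff because textures start swapping over the PCIe bus.

```python
def estimated_fps(base_fps, vram_needed_gb, card_vram_gb, swap_penalty=0.3):
    """Toy sketch: fps holds steady while VRAM demand fits on the card,
    then collapses once the card has to swap (hypothetical penalty factor)."""
    if vram_needed_gb <= card_vram_gb:
        return base_fps
    return base_fps * swap_penalty  # throttled: spilling over to system RAM

# Hypothetical detail presets and the VRAM each might demand (in GB).
presets = {"medium": 2.5, "high": 4.0, "very high": 5.5}

for name, vram in presets.items():
    fps_3gb = estimated_fps(60, vram, 3)
    fps_6gb = estimated_fps(60, vram, 6)
    print(f"{name:>9}: 1060-3GB ~{fps_3gb:.0f} fps | 1060-6GB ~{fps_6gb:.0f} fps")
```

The point of the sketch is the shape of the curve, not the exact numbers: both cards look identical on medium, but only the 6GB card survives the higher presets without hitting the cliff.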