Overall, the i5-3570K is the better gaming CPU: in nearly all of the gaming benchmarks published around the web, it beats the FX-8350. However, the actual performance you will see is limited by the rest of your hardware, mainly the graphics card and monitor.
Assuming you buy a powerful graphics card like the nVidia GTX 680, there are many games where you can get above 60 FPS, but the frame rate actually shown on the monitor is limited by its refresh rate. A 60Hz monitor can only display at most 60 FPS, and a 120Hz monitor can only display at most 120 FPS. All 3D monitors are 120Hz, but running one in 3D mode limits you to 60 FPS.
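To put that cap in numbers, here is a minimal sketch (plain Python, using the frame rates from the benchmarks discussed below purely for illustration) of how the frame rate you actually see is bounded by the monitor, assuming vsync is on:

    # Rough sketch: the displayed frame rate is bounded by the monitor's
    # refresh rate (with vsync), no matter how fast the GPU renders.
    def displayed_fps(rendered_fps, refresh_hz):
        return min(rendered_fps, refresh_hz)

    # Illustrative numbers only:
    print(displayed_fps(131, 60))   # 60  -> a 60Hz monitor shows 60 FPS at most
    print(displayed_fps(131, 120))  # 120 -> a 120Hz monitor caps at 120 FPS
    print(displayed_fps(114, 120))  # 114 -> below the cap, you see the full rate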
The link below benchmarks several CPUs with an nVidia GTX 680, all running at stock speeds:
http://www.xbitlabs.com/articles/cpu/display/fx-8350-8320-6300-4300_6.html
In Batman, the i5-3570K and FX-8350 basically perform the same at 1920x1080. The i5 is 3 FPS faster (73 vs. 70). However, if you are only using a 60Hz monitor, you will get identical performance because the monitor can only display 60 FPS.
Far Cry 2 is a game where both CPUs can get above 100 FPS at 1920x1080. Both CPUs will perform exactly the same on a 60Hz monitor. On a 120Hz monitor, both are pretty close in performance. Even though the i5 + GTX 680 combo can provide up to 131 FPS, the monitor limits it to 120 FPS; the FX + GTX 680 combo gives you 114 FPS. Unless you can visually tell the difference between 120 FPS and 114 FPS, both CPUs will seem to perform exactly the same.
The FX is slightly cheaper, and socket AM3+ is expected to last until at least Steamroller. But it does consume considerably more power than the i5 at full load, as seen in the link below (note that this is at 100% CPU load, which you will not hit while gaming). Depending on how much you pay for electricity, the 92 W difference may or may not matter much; a rough cost estimate is sketched after the link.
http://www.xbitlabs.com/articles/cpu/display/fx-8350-8320-6300-4300_8.html#sect0
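As a back-of-the-envelope check on what 92 W could cost, here is a short Python sketch; the hours of heavy load per day and the electricity rate are assumptions you should replace with your own numbers:

    # Rough yearly cost of an extra 92 W at full CPU load.
    # hours_per_day and price_per_kwh are assumptions, not measurements.
    extra_watts = 92            # full-load difference from the review linked above
    hours_per_day = 4           # assumed time at heavy load per day
    price_per_kwh = 0.12        # assumed roughly-US-average rate in $/kWh

    extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    extra_cost_per_year = extra_kwh_per_year * price_per_kwh
    print(f"{extra_kwh_per_year:.0f} kWh/year -> ${extra_cost_per_year:.2f}/year")
    # ~134 kWh/year -> about $16/year at these numbers; double the rate, double the cost.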
I pay about double the US national average rate for electricity, so I basically avoid AMD CPUs. Even though Intel sockets do not last more than two generations, I do not upgrade very often, so I do not mind switching sockets when I do.