I hate that word, bottleneck. In a PC, a bottleneck is anything that slows down the flow of data, in this case, fps.
The CPU is the source of fps. It gives you all the frames you're going to get, whatever that number might be. That number is decided by the game code and by the other resources available: HDD vs. SSD, fast or slow RAM, not enough RAM, etc. Being the source, it doesn't slow anything downstream of it; it puts out as many fps as it can, and that's all.
The GPU is all about eye candy. As detail complexity and resolution go up, fps goes down. If you have a GPU capable of higher fps, the point at which fps drops off is higher, so instead of hitting max fps only at low settings, you could run high settings and see no significant fps loss. With a less capable GPU, you might have to drop to medium settings or lose a bunch of fps.
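A simple way to picture the relationship: the fps you actually see is capped by whichever part finishes its share of each frame last. This is a rough mental model only, not a benchmark, and the numbers below are made up purely for illustration.

```python
# Rough mental model (not a benchmark): displayed fps can't exceed
# what either the CPU or the GPU can deliver per frame.
def effective_fps(cpu_fps, gpu_fps):
    """The slower of the two sets the fps you actually see."""
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: an older quad-core feeding frames at ~70 fps,
# paired with a GPU that could render 120 fps at current settings.
print(effective_fps(70, 120))  # CPU is the limit: you see ~70 fps
print(effective_fps(70, 55))   # heavier settings: now the GPU is the limit
```

Raising graphical settings only lowers the gpu_fps side of that min(); it never raises what the CPU can feed it, which is why a stronger GPU can't fix a slow CPU.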
So no, the 3570 isn't going to 'bottleneck' the RX 580. What you'll get is a lower amount of fps from an older CPU with (by now) low IPC and only 4 cores/4 threads, which takes a further hit from modern games wanting 6+ threads. It means in many games, if not most, you'll definitely need to choose a custom settings profile rather than the default 'High/Ultra': keep the graphical settings that high, but drop the CPU-bound settings to low/medium in order to get maximum fps from the CPU.
In many MMORPG-type games, there are floating damage numbers, often green for damage done, red for damage received. Those are highly CPU-bound and can kill fps. Stuff like that, along with lighting after-effects, shadows, bloom, etc., will kill fps, and the hit comes from the CPU, not from the GPU.
So tailoring your settings per game will be very important if you want to maintain higher fps while still having a good gaming experience and a great picture.