I would take that slide with a large grain of salt... or just chug down all the salt in the shaker.
While the Intel HD 4000 is more powerful than the Radeon HD 5450 (roughly the Intel HD 3000's equivalent), it is less powerful than a Radeon HD 5550; it probably delivers around 90% of the Radeon HD 5550's performance on average. The last I heard, the HD 4600 is estimated to be 20% more powerful than the HD 4000. However, that still leaves it less powerful than a Radeon HD 5570. I would place the Intel HD 4600 about halfway between the Radeon HD 5550 and HD 5570. Maybe a little closer to the Radeon HD 5570, but not by much.
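To make that placement concrete, here is a quick back-of-the-envelope sketch chaining the two estimates above. The index values are my own assumptions, normalized so the Radeon HD 5550 = 1.00:

```python
# Relative performance indices, Radeon HD 5550 = 1.00 (assumed baseline).
hd5550 = 1.00
hd4000 = 0.90 * hd5550   # HD 4000 ~ 90% of an HD 5550 (estimate above)
hd4600 = 1.20 * hd4000   # HD 4600 ~ 20% faster than the HD 4000 (estimate above)

# ~1.08x an HD 5550: a bit past the HD 5550, but short of an HD 5570.
print(f"HD 4600 vs HD 5550: {hd4600:.2f}x")
```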
The Radeon HD 6570 is roughly 35% more powerful than the Radeon HD 5570. Therefore, comparing the Intel HD 4600 to a Radeon HD 6570 is misleading, to say the least (assuming the estimated 20% performance increase holds). For the Intel HD 4600 to actually be comparable to the Radeon HD 6570, the increase in performance over the HD 4000 would have to be far more significant.
Let's just say, for argument's sake, that the Intel HD 4000 is equal to the Radeon HD 5550... The difference in performance between the Radeon HD 5550 and the Radeon HD 6570 is probably about 60% - 65%. Naturally, since the Intel HD 4000 is actually slower than the Radeon HD 5550, the required increase in performance would need to be more than 65%.
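Dropping the "for argument's sake" simplification and going back to the 90% estimate, the required gain works out even higher. A rough sketch, again using my assumed indices with the Radeon HD 5550 = 1.00 and the upper 65% estimate for the 5550-to-6570 gap:

```python
# Relative performance indices, Radeon HD 5550 = 1.00 (assumed baseline).
hd4000 = 0.90            # HD 4000 ~ 90% of an HD 5550
hd6570 = 1.65            # upper estimate: HD 6570 ~ 65% above an HD 5550

# Gain the HD 4000 would need just to match an HD 6570.
required_gain = hd6570 / hd4000 - 1
print(f"required gain over the HD 4000: {required_gain:.0%}")  # well over 65%
```

Under these assumptions the HD 4000 would need roughly an 83% uplift, not 65%, to reach Radeon HD 6570 territory.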
While not shown in this article, Intel had another slide stating that the iGPU in Haswell is up to 2x as powerful as Ivy Bridge's (meaning up to 100% more powerful than the Intel HD 4000). If that is the ceiling, then Intel's HD 5200 GT3e (HD 5200 + eDRAM = Crystalwell), with double the number of shaders of the Intel HD 4600 plus up to 128MB of eDRAM, would only be a minor improvement over the Intel HD 4600. If the Intel HD 4600 needs more than a 65% performance increase just to match a Radeon HD 6570, then there is not much room left for further gains if the Intel HD 5200 GT3e tops out at 100% more than Ivy Bridge's HD 4000.
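Extending the same rough sketch shows just how little headroom that "up to 2x" ceiling leaves. As before, the indices are my assumptions, normalized to the Radeon HD 5550 = 1.00:

```python
# Relative performance indices, Radeon HD 5550 = 1.00 (assumed baseline).
hd4000 = 0.90                  # HD 4000 ~ 90% of an HD 5550
hd6570 = 1.65                  # upper estimate: HD 6570 ~ 65% above an HD 5550
gt3e_ceiling = 2.0 * hd4000    # "up to 2x" the HD 4000 per Intel's slide

# How far past an HD 6570 the best-case GT3e could land.
headroom = gt3e_ceiling / hd6570 - 1
print(f"GT3e best case vs Radeon HD 6570: {headroom:+.0%}")
```

By this math the GT3e's best case lands only about 9% past a Radeon HD 6570, which is why the "up to 2x" claim leaves so little room between the HD 4600 and the HD 5200 GT3e.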