I like how nobody (like, literally ever) entertains the idea that maybe, just maybe, total genocide isn't the first solution an AI would come up with. That's how humans deal with things, but I don't see why it isn't equally plausible that a superintelligent AI would figure out a way to help humanity outgrow its flaws without just killing everyone. Also, by the time a singularity-inducing AI is actually on the horizon, maybe the scientists involved won't give it the capability to commit genocide in the first place. I just don't see an AI thinking, "These organic creatures evolved from animals driven purely by instinct into a worldwide society, with some who try to do good for everyone and some who only help themselves. They came all this way and built an intelligence that exceeds their own, which is quite the feat. Now they must all die, cuz pollution and corporations and also the movies said so!!!"