I was reading the other day about the idea of the Singularity, the point where technology becomes more intelligent than humans and starts fucking shit up or something.
But the thing is, why would it? A computer doesn't need to eat or sleep or want to get laid or have a pint. It might be capable of processing trillions of decisions in a second, but it has no aims or desires outside of its programming. Why would Skynet (for example) want to kill all humans? When Deep Blue beat Kasparov, the machine didn't crack open some champagne and start mugging Garry off, because all it was doing was fulfilling an algorithm.
Basically, can someone explain to me why the Singularity would be bad?