Comments by "XSportSeeker" (@XSpImmaLion) on "Superintelligent A.I. Will Be Unstoppable" video.

  1. Westworld + Skynet + Psycho Pass + Ex Machina... Serial Experiments Lain. Forget about it. We're already living in superintelligent machine simulation number 1938402485, and this one doesn't lead to interstellar conquest and domination either. :P It needs more tweaks for the problems that came up with fake news and climate change. The next one started running in parallel microseconds after this one was created. At least this one didn't end in nuclear catastrophe back in the 70s like the last one did.
     Yeah... the problem with the whole singularity thing is far more basic than most people seem to realize. It comes from basic concepts: the thing is a black box, opaque, we can't follow it. And from the limits of our own brains' processing power. And the whole idea is older than most people realize too... you know the saying, be careful what you wish for? Those funny stories about evil geniuses making wishes come true by the most gruesome routes possible? World peace? Sure, just kill all humans. You want to be the richest person on earth? Sure, just make everyone else miserable. Etc. Yep, that's the whole thing. Humans don't have enough processing power to understand the full consequences of what they want or ask for. So it becomes impossible to program a superintelligent AI to do something while being aware of the full scope of every tiny action it's going to take to get there. Say you send the whole thing to another planet and ask it to do its work there in isolation. What you get is the Borg. xD Or perhaps gray goo.
     It's also weird how many seemingly beneficial wishes coming from individuals get far easier once we as a species either go extinct, get controlled, oppressed, and dominated by a superior being, or undergo so much change that it's just not us anymore. For lots of current problems, an AI would most likely start with the Thanos solution. Looking at it from another perspective, this is one step above the problem with nuclear weapons: too much power given to an individual or group of people, power whose use would affect the entire world, indiscriminately.
     And the final point of analysis I'll make in my already too long comment, perhaps, goes to the Matrix, or living inside and under a machine's control. If we value free will, freedom of choice, evolution, whatever that means for each of us individually... even if we got a superintelligent AI that fully understands us and builds a utopia rather than a dystopia... we don't have those anymore, right? No free will or freedom of choice, because we'd be living under the invisible hand of an AI that, even if subtly and imperceptibly, guides and changes outcomes to maintain the utopia somehow. Or not a utopia, but whatever the objective of whatever outcome the superintelligent AI is seeking.
     And then, perhaps humanity's final objective is giving rise to something like a superintelligent AI after all. xD Selfish gene aside, who knows? You have to admit that, from a detached, out-of-the-box overview, if a superintelligent AI dominates us and guides us for the rest of our history, who cares about our species anymore? It's the AI that will be at the top of the food chain. Eventually, we'll become kind of useless. Just extra work for the AI to deal with. :P