Comments by "Digital Footballer" (@digitalfootballer9032) on "Brian Cox - Alien Life u0026 The Great Filter Hypothesis" video.
-
If climate change is anywhere near as big an issue as some believe (I don't think it is, but that's beside the point), and whether it is man-made or natural, I do not think it will completely wipe out humanity. Even large swings in global temperature are not unprecedented; other species have adapted and survived them, and we will as well. Yes, many will perish, be displaced, or have a much tougher go of things, but it will not cause complete annihilation. And if the changes are a direct result of human activity, those very changes will curtail the activities causing them, making a "runaway" scenario unlikely. We are not going to see Earth turn into Venus from human activity; change on that scale would require large-scale natural events.
As for the notion of being buried under our own waste, I used to ponder this idea myself. However, long-lasting dangerous waste, such as radioactive waste with half-lives in the millions or billions of years, amounts to a small drop: by some estimates, all the nuclear waste ever produced by every facility in the world would only fill a football pitch a few inches deep. It would take many millions of years for this to become a serious problem, and hopefully in that time we will have developed more efficient methods of energy production. As for other long-lived wastes like plastics, yes, this is a big problem currently, but again, if we are to continue as a species, I believe we must and will find better alternatives and eliminate the problem before it gets out of hand.
While I don't doubt we face many problems that are potential species-enders, I don't believe any of them are insurmountable at this time. Even when this planet faced an ice age, a small number of humans survived and kept the species going: a very small number, perhaps around 6,000, but enough to avoid extinction.
-
There is also the old argument about the "self-aware paperclip maker": an AI whose only task and only drive is to make paperclips endlessly. It may eventually exhaust its resources and be unable to make more paperclips, but rather than seek out new resources for paperclip making, it simply shuts down indefinitely, because it is ill-equipped to go searching for those resources, being a paperclip maker and all, and is not programmed to do anything else, like explore or even move from the same spot. This argument, of course, assumes that such a machine is incapable of, unwilling to, or uninterested in evolving, learning, or improving in any way outside the realm of its primary programming (it may work to become a more efficient paperclip maker and nothing else). Such an entity is certainly possible once we rule out an automatic, human-type desire to improve, preserve, or replicate itself. As you point out, those drives don't have to be embedded in a machine's logic just because we think that way. What we see as superintelligence may be deemed inefficient or unnecessary by an alien AI. Certainly an interesting angle on AI and how it may behave.
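To make that shutdown scenario concrete, here is a minimal sketch in Python (purely illustrative; the PaperclipAgent class and its fields are hypothetical, not from any real AI system) of an agent whose policy contains exactly one behaviour, so exhausting its local resources leaves it nothing to do but stop:

```python
from dataclasses import dataclass

@dataclass
class PaperclipAgent:
    """An agent with a single drive and no other actions in its policy."""
    local_resources: int      # units of material within reach
    paperclips_made: int = 0

    def step(self) -> bool:
        """Run one decision cycle; return False when the agent shuts down."""
        if self.local_resources > 0:
            # The only behaviour the policy contains: material -> paperclip.
            self.local_resources -= 1
            self.paperclips_made += 1
            return True
        # No 'explore', 'move', or 'self-modify' action exists in the policy,
        # so running out of local resources means indefinite shutdown rather
        # than a search for new resources.
        return False

agent = PaperclipAgent(local_resources=3)
while agent.step():
    pass
print(f"Shut down after making {agent.paperclips_made} paperclips.")
```

The design point is simply that "seek new resources" is absent from the action set entirely, so the halt is not a choice the agent weighs; it is the only outcome its programming allows.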