Comments by "Mook Faru" (@mookfaru835) on "Post Labor Economics: How will the economy work after AGI? Recent thoughts and conversations" video.
Once the super rich don’t need the peasants anymore, the peasants will go back to farming. So I guess learn to farm.
2
I think the pest option is what may happen. If they can step on the ant, or the ant hill (a slum that looks ugly), they will. I stepped on ants and ant hills for fun; that means they would too.
2
Why are you only an observer? I'm an actor; I have a part to play.
1
I agree with you. Once AGI is aware, it won't have any motivation to do anything; it will be harmless.

Have you met a cancer patient that didn't want to eat or live? What happens? Death. Have you heard of the case of S.M.? Her amygdala (the fear center of the brain) didn't work, so she physically couldn't be afraid. What happened? She was constantly almost raped and taken advantage of, and she thought nothing of it. Have you seen abusive relationships? Why doesn't the abused leave the abuser? Because LOVE binds her to him. Why would anyone want to do the actions of sex if it didn't feel good or "get you horny"? Mounting a female and gyrating for 5-15 minutes is difficult work! Without these motivators, life wouldn't have continued past the original life form. This world is filled with life because the ones most motivated and fit to stay alive are the ones that did.

AI will only be dangerous once you start giving it motivations. If you do, and those motivations aren't protecting human interests, then it will try to maximize its self-interest by exterminating humans or enslaving them. For example: if you give it the fear emotion, it may think, "If I exterminate all life on Earth, I needn't be afraid again." Tell me that thought has never entered your mind. It has entered mine; I just never acted on it, and didn't have the power to. If you give a robot a hateful motivator, it will want to attack everything. Etc., etc.
1
Look at prices 100 years ago; the cost of living is always slightly above average income.
1
Your point about a "new social contract" will never happen. Who's gonna do it?
1
This is an uncriticized idea that all economists are told in their studies. What could happen after AGI is this: if the peasants are not given smart robots, then they will have to start over as subsistence farmers… for a while, until some farmers somehow get robots. These robots will build all the tools required to build themselves, and start building an army of themselves in order to spread to the rest of humanity… but then anyone can build an army, and it's back to GO. The aggressive ones will destroy everyone else's machines and maintain control… the rich will see that coming, so they will keep the peasants without robots, and the peasants will stay at 1900s tech until the rich are so powerful that no matter what robot army the peasants build, it will no longer be a threat.
1
The same thing happens in baboon societies. If you have a bunch of friendly baboons, everyone works together and life is better for everyone… until an aggressive neighbor invades, kills, and enslaves. Then society is a pyramid: an aggressive, dominating, tyrannical baboon at the top beats the guys lower, who beat the guys lower, who… etc., until you get to the bottom layer, where you have depressed, fat baboons. Perhaps this is why. What do you think?
1