Comments by "Harry Mills" (@harrymills2770) on "Insider Explains the Real Reason Hollywood Is Collapsing | Dennis Quaid" video.
-
Hollywood hasn't really changed. It's always pushed the agendas of the donor class and the government. What has changed is that the messaging from the establishment is so alien to what people know and believe that it's created huge backlash.
This doesn't explain everything that's going on, but the liberal messaging that has dominated since the 1960s, with your John Wayne and Clint Eastwood outliers, resonated pretty well, and could even be argued to have helped move the culture in better, more open-minded directions.
Anybody born before 1970 knows how homophobic society used to be. I think the Tom Hanks movie "Philadelphia," where Hanks played a gay man who was sick with AIDS, marked a real turning point for society. Jesus teaches care for the sick and unconditional love. Christians are also instructed that departure from hetero norms is a sin. But when they put a face and a back-story to a good man who was an AIDS victim, the unconditional love, which is the Highest Level Teaching of Christianity, trumped the homophobia.
That was a turning point in our history, when AIDS had real potential to generate a huge backlash against gay people. Instead, the media complex hit us with "Philadelphia," and a lot of Christians couldn't bring themselves to hate Tom Hanks. Major culture shift towards tolerance and acceptance.