A majority of Americans describe Hollywood’s politics as liberal, and more than half of that group believe the industry has a negative influence on society.
The poll, conducted by Rasmussen Reports, found that 63% of Americans “describe the politics of Hollywood as liberal,” and that more than half of that number “feel that movies and the movie industry negatively impact society.”
The national phone and online survey of 1,000 U.S. adults was conducted on October 12 and 15, 2017, with a sampling error of +/- 3 percentage points.
A closer look at the data also reveals that almost half (48%) of the Democrats surveyed said Hollywood leans heavily to the left.
Interestingly, 60% of Americans also believe that most Hollywood celebrities “are not good role models.”
The survey comes as political fallout continues from the scandal surrounding disgraced Hollywood producer Harvey Weinstein, who has been accused of sexual misconduct by scores of women, including Gwyneth Paltrow, Rose McGowan, and Angelina Jolie.
The scandal is rocking Hollywood culture to its core, exposing the hypocrisy of an industry that preaches “women’s rights and empowerment” while protecting and facilitating sleazy “casting couch” behavior for decades.