The Hollywood elite spent the last year bashing Trump at every opportunity, speaking out specifically against his alleged sexual misconduct and the infamous “P***y Tapes.”

Fast-forward to October 2017, and Hollywood is eerily silent about the Harvey Weinstein rape accusations, proving it doesn’t actually care about the well-being of women in America.

