Only the Tea Party, Christians, Republicans, and Conservatives are allowed to be portrayed as everything Hollywood actually is
The Hollywood movie industry is not racist just because there are few blacks or other racial minorities in leading roles, on the set, or in the executive suite. Hollywood is not sexist even though the culture of the casting couch still lives, and young women are expected to perform...