Women in Hollywood have no male allies. There are some who pretend to be on our side, but not really. They may say the right thing because, after all, they're liberals, and that's a public image they'd like to keep up.