The history of American women is about the fight for freedom, but it's less a war against oppressive men than a struggle to straighten out the perpetually mixed message about women's role, a message accepted by almost everybody, of both genders.