How the First Women Doctors Changed the World of Medicine
Women in White Coats tells us that women had been healers since the dawn of civilization, yet once they tried to earn MD degrees they were turned away, ridiculed, and even physically attacked by their fellow students and male doctors.