“Every woman is born a doctor... [while] men have to study to become one,” declared American educator Ella Flagg Young around the turn of the 20th century. Looking around much of the country, it certainly must have seemed that way.
Long before marketers invented “Dr. Mom,” women had served as nurse, doctor, and pharmacist to their families and friends. Doctoring a family required a great deal of knowledge and skill, often passed down from woman to woman across generations. Even so, mainstream medicine generally barred women from pursuing medical careers until the late 19th and early 20th centuries.
Those women who did see doctors rarely received adequate treatment. Many physicians refused to examine women physically for fear of offending their modesty. Others dismissed women’s illnesses outright, contending that reproduction made women irrational and emotional. As a result, women often received a dangerous or inappropriate remedy, or no treatment at all, without the benefit of a thorough examination.
Despite these limitations—or maybe because of them—many women did break through the discrimination and gender assumptions to pursue a career in health, particularly women’s health.