Health & Wellness
Women have long been central to their own health and to the wellbeing of their families and communities. Before the modern medical era, women within each community served as midwives. In the late 19th century, the rise of formal medical training sparked a change: doctors encouraged women to give birth in hospitals rather than at home. Through much of the 20th century, medical authority was held largely by men, until Title IX of the Education Amendments of 1972 prohibited sex-based discrimination in federally funded education programs, opening medical schools to women.