In the West, the growth of the modern, androcentric medical establishment coincided with a decline in women’s authority and status in the healing arts. As science and medicine expanded rapidly, the understanding of the body changed through the studies of anatomy and...