How ‘Grey’s Anatomy’ is teaching women to speak up about their health
The longtime hit television show offers more than entertainment value – it educates its audience about medical issues and helps viewers become better advocates for their own health.