Are Female Doctors Better? Here's What to Know

This post was originally published on WebMD

A new study suggests female doctors may provide better care to their patients, especially when those patients are women. Here's what to know.