The World Health Organization recently rated America thirty-seventh in health outcomes, on par with Serbia. Tackling head-on the three major myths of American medicine, Dr. Weil shows how medical schools fail to give future doctors the education they need to care for patients, how insurance companies have destroyed our opportunity to get excellent care, and how pharmaceutical companies have come to rule our lives. The solution involves nothing less than the creation of a completely new culture of health and medicine in this country.

(From publisher description)