I would disagree with the statement that doctors destroy health. Health is the sole responsibility of the individual. Doctors are trained to use drugs and surgery; it is up to the individual to create and maintain their own health and to use doctors' services judiciously when needed, as is the case when you have broken bones sticking out of your body after a car crash, or when a cop beats your brains out and they need to be put back in. But only idiots go to doctors to get healthy. Health comes from good eating, exercise, and a peaceful mind. Stress, drugs, bad food, and sitting on the couch lead to bad health, and doctors don't make anyone do those things.