Why is it that every time someone goes to the doctor, they come back with drugs? I was talking to a friend the other day who had just come back from a visit to the doctor. Her ailment was painful, but one her body would naturally heal. Yet her doctor prescribed strong pain medication that left her unable to function effectively at work.
We seem to live in a society in which no one (not even doctors) accepts that sometimes the healthiest response to an ailment is to let your body heal itself. Sure, once you've invested the time and money, it's unsatisfying to discover that the visit was unnecessary in the first place. But isn't letting your body heal itself, even after a visit to the doctor, sometimes the best course of action?