Have American physicians become a huge disappointment? Not unlike lawyers, MDs (1) are often important in the overall scheme of things but (2) are certainly no longer among the brightest bulbs our higher education system produces. In recent years, doctors have not only let themselves be bullied by insurers but, worst of all, have let patient care and patient service wane. Care and service are now a cruel, comical, heartless ruse. Does anyone agree with me that American doctors have lost their way?
