Good post. This is exactly what the USA needs more of. It recently hit home for me when I realized that US medicine and pharma are a business. It sounds like a cliché, but I learned that doctors are not there to cure people; they are there to make money. It is a disgrace. The US of all countries (and I say 'of all countries' because it is the place that talks the loudest) should have a health care system that actually cares about its people's health. This dude proves that you can still make a living and do what is right.