The de-sciencing of American medicine and what it means for you

With all the talk about “evidence-based medicine,” you might think that doctors were becoming much more focused on rigorous science. But like the names attached to bills in Congress—such as the Affordable Care Act, which outlaws affordable insurance—the language used in the movement to fundamentally transform America and American medicine usually means the opposite of…