Most universities continue to follow a blueprint introduced in 1910, which called for two years of in-depth study of the basic sciences followed by two years of clinical experience. A cookie-cutter approach, it means that students spend two years sitting through long lectures and regurgitating facts on tests, followed by the shock treatment in their third year of suddenly dealing with patients in a hospital ward.
“It’s become pretty clear in the last couple of decades that this is probably not the best way to learn something as complex as medicine,” says Randolph Canterbury, the medical school’s senior associate dean for education. “The idea that physicians ought to learn the facts of all these various disciplines—anatomy, physiology, biochemistry and so forth—to the depth that we once thought they should doesn’t make much sense.”
About half of all medical knowledge becomes obsolete every five years. Every 15 years, the world’s body of scientific literature doubles. The pace of change has only accelerated. “The half-life of what I learned in medical school was much longer than what it is today,” adds Canterbury, a professor of psychiatric medicine and internal medicine.
So what happens