Kent Bottles: David Eddy Delivers Controversial Reinertsen Lecture About Future of Evidence-based Medicine

November 30, 2009 at 9:46 am 8 comments


At ICSI, we pride ourselves on our evidence-based medicine guidelines.  Starting in 1993 under the guidance of President Gordon Mosser, ICSI has developed a global reputation for excellence in providing guidelines that physicians follow and that help improve the care of patients.  We proudly check the statistics showing that ICSI guidelines are downloaded and used by clinicians all over the world.  We believe we know guidelines, and we fervently believe in the work we do.

And yet at times, I personally have had nagging doubts about the whole guideline business.  Our excellent guidelines almost always concentrate on best practices for treating one disease, like diabetes.  But patients with diabetes often come in with multiple conditions.  Our guidelines by design tell us to treat patients with blood pressures of 145/80 and not to treat those with blood pressures of 139/80.   But we can all identify patients with the latter reading who smoke, eat poorly, do not exercise, and have a family history of heart attacks and strokes.  And we encounter patients with the higher blood pressure reading who have none of those risk factors.  Does it make sense that our guidelines tell us to treat the lower-risk patient and not to treat the higher-risk patient?  After all, the whole idea of guidelines is to prevent heart attacks and strokes.
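The threshold-versus-risk tension above can be sketched in a few lines of code. Everything here is invented for illustration — the cut-off, the risk weights, and the function names are hypothetical, not ICSI guideline logic:

```python
# Hypothetical illustration: a rigid blood-pressure threshold can treat a
# low-risk patient while skipping a higher-risk one.

def treat_by_threshold(systolic: int) -> bool:
    """Guideline-style rule: treat if systolic pressure is 140 mm Hg or above."""
    return systolic >= 140

def treat_by_risk(systolic: int, smoker: bool, sedentary: bool,
                  poor_diet: bool, family_history: bool) -> bool:
    """Toy overall-risk score with made-up weights, for illustration only."""
    score = max(0, systolic - 120) * 0.5
    score += 20 * smoker + 10 * sedentary + 10 * poor_diet + 15 * family_history
    return score >= 25

# Patient A: 145/80, no other risk factors.
# Patient B: 139/80, smokes, eats poorly, doesn't exercise, family history.
a_threshold = treat_by_threshold(145)                          # treated
b_threshold = treat_by_threshold(139)                          # not treated
a_risk = treat_by_risk(145, False, False, False, False)        # not treated
b_risk = treat_by_risk(139, True, True, True, True)            # treated
```

With these made-up weights, the threshold rule treats the 145/80 patient and skips the 139/80 patient, while the risk-based rule does exactly the reverse — which is the nagging doubt in a nutshell.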

When I asked Dr. David Eddy to deliver the James L. Reinertsen Lecture in Bloomington, MN, on November 19, I did not think that he would challenge us to think about the future of ICSI and of evidence-based medicine.  I didn’t believe we would attract a crowd of more than 300 people, nor did I anticipate that some would leave the lecture angry that their conventional approach to treating patients had been forcefully called into question.

David Eddy has been controversial for a long time.  At dinner after the lecture, he told me he went into a cardiothoracic surgery residency at Stanford because in the 1970s heart surgery was the “biggest hill I could climb”; it was the time of the first heart transplant in South Africa, and Dr. Norman Shumway at Stanford was famous.   However, Eddy soon realized that doctors were “making important decisions without real scientific proof and in a way that I would not call rational.”  Deciding that he wanted to bring logic and scientific rigor to the medical decision-making process, Eddy completed a PhD in applied mathematics at Stanford.  The field of medicine would never be the same.

How can someone who states, “The problem is that we don’t know what we are doing,” not be controversial?  Eddy became famous for showing that annual chest X-rays were not useful in screening for lung cancer, that preventing women from having a vaginal delivery if they had had a previous C-section was based on the recommendation of one doctor, and that Pap smear screening for cervical cancer did not have to be done every year.

In this year’s Reinertsen Lecture, Eddy, who is credited with coining the term “evidence-based guidelines,” presented an evidence-based medicine lecture in three parts: the past, the present, and the future.

For the past, Eddy recalled the time in the 1990s when President George Herbert Walker Bush was being evaluated for treatment of increased ocular pressure that could lead to glaucoma and blindness.  The 1989 textbook said the treatment cut-off was 30 mm Hg, but Eddy presented a slide with the actual studies, and the results were all over the place.  I remember the controversy among ophthalmologists when he presented a similar analysis years ago in JAMA.  During that time, Dr. Eddy said he “felt like Salman Rushdie.”  Stanford ophthalmologist Kuldev Singh stated: “Dr. Eddy challenged the community to prove that we actually had evidence. He did a service by stimulating clinical trials.”

He also told the ICSI audience about an expert panel studying the problem of leakage of silicone breast implants; when Dr. Eddy asked the assembled experts to predict the likelihood of a patient experiencing such a rupture of the implant, the results ranged from 0% to 100%.

In addressing the present state of evidence-based medicine, Dr. Eddy outlined the same concerns that I listed in the second paragraph of this blog.

The real excitement of the evening occurred when Dr. Eddy presented his latest approach to the problem of lack of evidence for what we do clinically.  With funding from Kaiser, Eddy founded a company called Archimedes to develop a SimCity-like world where researchers conduct trials on virtual patients to figure out the best treatments for real patients.  With the expertise of a particle physicist named Len Schlessinger, the model uses mathematical equations to describe the complex interactions of physiology (the death of heart muscle after a heart attack and the effect of aspirin on that muscle) and what we know about the development of disease in different people (those with certain genes; those who smoke, exercise, and/or meditate; etc.) (Jennifer Kahn, The Body Synthetic, Wired, December 2009).
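The idea of running a trial on virtual patients can be conveyed with a deliberately tiny Monte Carlo sketch. This is not the Archimedes model — its physiology-level equations are far richer — and every number below (baseline risk, drug effect, cohort size) is invented for illustration:

```python
import random

# Toy virtual-trial sketch: simulate many virtual patients under two
# treatment arms and compare five-year event rates.

def five_year_event_risk(baseline: float, on_statin: bool) -> float:
    """Invented risk function: assume the drug cuts event risk by a third."""
    return baseline * (2 / 3 if on_statin else 1.0)

def simulate_arm(n: int, baseline: float, on_statin: bool, seed: int) -> float:
    """Return the fraction of n virtual patients who experience an event."""
    rng = random.Random(seed)  # seeded so the "trial" is reproducible
    risk = five_year_event_risk(baseline, on_statin)
    events = sum(rng.random() < risk for _ in range(n))
    return events / n

# Two virtual cohorts of 10,000 patients each, 9% baseline five-year risk.
control_rate = simulate_arm(10_000, baseline=0.09, on_statin=False, seed=1)
treated_rate = simulate_arm(10_000, baseline=0.09, on_statin=True, seed=2)
```

A real model would replace the one-line risk function with linked equations for physiology, disease progression, and behavior, but the workflow — encode assumptions, generate virtual patients, compare arms — is the same in spirit.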

A health plan or a state like Minnesota could use this approach to develop personalized treatment plans that identify the patients who could most benefit from a given medical treatment or intervention.  Such an approach could also solve some of the problems we run into by regarding the clinical trial as the gold standard of medical research.  Clinical trials are so expensive and so time-consuming that Eddy realized they would never fully meet the needs of a system like Medicare, which is projected to go bankrupt in eight years.  If the Archimedes simulation approach works, third-party payers could perhaps reimburse only for treatments that actually work and are based on science.

Eddy compared the computer simulation approach to the clinical trials approach by trying to predict the outcomes of the seven-year Collaborative Atorvastatin Diabetes Study (CARDS).  Of the four principal findings of the CARDS trial, Archimedes predicted two correctly, a third within the margin of error, and a fourth just below the margin of error.  Eddy estimates that the computer simulation took a few months and cost about one two-hundredth of what the CARDS trial cost (Jennifer Kahn, The Body Synthetic, Wired, December 2009).

At the Reinertsen lecture, one physician asked why he should rely on computer simulation any more than he should rely on weather forecasts that often don’t accurately predict rain or snow in Minnesota.  Zeke Emanuel, the bioethicist and President Obama health policy advisor, calls Archimedes “sophisticated but speculative. When you see the demo, there’s a certain ‘wow’ factor. And the fact that it has been able to predict some clinical trials is intriguing.  But most of us would want to say, ‘OK – let’s try it on this problem, which isn’t one that you picked personally.’ Like any good presenter, presumably the results that Eddy shows are selective.”

David Nathan, director of Diabetes at Massachusetts General Hospital, says, “All the calculations happen inside a black box. And that’s a problem because there’s no way to tell whether the model’s underlying assumptions are right” (Jennifer Kahn, The Body Synthetic, Wired, December 2009).

When asked about these reservations in an email exchange with me, Dr. Eddy wrote: “Specifically, we did not hand pick the trials that we used to validate the model. They were picked for us by an independent panel convened by the American Diabetes Association. They supervised the entire exercise. All of the results, not just a few hand picked by us, were peer reviewed and published in Diabetes Care. So Zeke Emanuel was wrong. And as for David Nathan’s comment about black box, I’m not sure that he has ever been to one of our technical presentations. In any event, on request, we provide a detailed description of how the model works including how we derive equations. Anyone who is seriously interested in working with us and wants to know the details of the inside of the model can just ask and we will set up the meetings.”

It is also worth noting that many experts are impressed by the Archimedes model.  Mark Roberts, chief of the Section of Decision Sciences and head of the Clinical Systems Modeling program at the University of Pittsburgh, has stated, “I’ve spent probably 25 years of my academic life trying to understand how to make these kinds of decision models be more clinically realistic. But David is so far ahead of anyone else in the field – it really is amazing” (Jennifer Kahn, The Body Synthetic, Wired, December 2009).   Dr. William Herman, director of the Michigan Diabetes Research & Training Center, has a competing computer model, but he says, “Dr. Eddy is one of my heroes.  He’s sort of the father of health economics.”

At the Reinertsen lecture, Eddy admitted that any model will, like the weatherman, make mistakes.  But he also noted that the real question is not whether the modeling is perfect; it is whether the model comes up with answers that are better than what we use today.

Eddy gives those of us in the evidence-based medicine world a lot to think about.  By making us at ICSI question how we develop guidelines, he is challenging us to make sure that we stay on the cutting edge.  He is also proving the point made by Tim Ferriss in a recent blog post entitled “The Benefits of Pissing People Off.”

Ferriss quotes a mentor of sports agent Scott Boras as saying, “If you are really effective at what you do, 95% of the things said about you will be negative. Keep your head on straight, don’t get emotional, take the heat, and just make sure your clients are smiling.”

Colin Powell makes a similar point about leadership: “Trying to get everyone to like you is a sign of mediocrity: you’ll avoid the tough decisions, you’ll avoid confronting the people who need to be confronted, and you’ll avoid offering differential rewards based on differential performance because some people might get upset. Ironically, by procrastinating on the difficult choices, by not trying to get anyone mad, and by treating everyone equally ‘nicely’ regardless of their contributions, you’ll simply ensure that the only people you’ll wind up angering are the most creative and productive people in the organization.”









Entry filed under: Evidence-Based Medicine.



  • 1. rvaughnmd  |  November 30, 2009 at 11:12 am

    Outstanding post! I think the comment along the lines of “It doesn’t have to be perfect, just better than what we are currently doing” jibes perfectly with the Brent James quote in the recent NY Times article:

    The crucial thing about the protocol was that it reduced the variation in what the doctors did. That, in turn, allowed Morris and James to isolate the aspects of treatment that made a difference. There was no way to do that when the doctors were treating patients in dozens of different ways. James has a provocative way of describing his method to doctors: “Guys, it’s more important that you do it the same way than what you think is the right way.”

  • 2. thomas  |  November 30, 2009 at 5:06 pm

    Nice post – sounds like a great speech! Here’s a link to the Wired article on Eddy’s Archimedes project referred to above:

  • 3. Alan Burgener  |  December 1, 2009 at 10:31 am

    Perhaps as a successor to evidence-based medicine, ICSI should launch a new medical movement entitled “Informed Judgment” — a concept that seems to have fallen by the wayside in all of the debate about the application of “guidelines” to clinical care. In the new model of Informed Judgment, the “informed” component would reflect, among other things, the best evidence-based science, the sort of mathematical modeling underlying the Archimedes software developed by David Eddy and colleagues, and some of the emerging “personalized medicine” that is an outgrowth of the sequencing of the human genome. The “judgment” component would reflect, among other things, the unique characteristics and circumstances of each individual patient, such as family history, tolerance of risk, lifestyle considerations, etc., as well as the shared application of the best available evidence-based science by an informed patient and an engaged clinician. The resulting Informed Judgment model represents a hybrid of the best of evidence-based medicine and the best of patient-centered care.

    Unfortunately, too much of the current debate centers around examples that lie at the extremes — either the blind application of evidence-based guidelines with little infusion of judgment (e.g., the examples given within the blog) or the application of “judgment” with near-complete disregard for the scientific evidence (e.g., the wide-spread inappropriate prescribing of antibiotics for the virus-driven common cold and numerous other examples that are well-chronicled in the literature). Good clinical practice has always resided somewhere between these two extremes and has always included a knowledge of and respect for the best scientific evidence, as well as a high level of regard for each individual patient’s circumstances and desires. Perhaps it’s time to draw attention back toward this optimal blend of science/evidence and clinical judgment by clearly articulating the component parts of “informed judgment” and making both parts of the formula a more explicit part of the way we think about attaining the highest standards of clinical practice?

  • 4. Inviting Controversy: David Eddy at ICSI |  |  December 1, 2009 at 1:32 pm

    […] doubts about evidence-based guidelines. David Eddy apparently did nothing to reassure him in his recent keynote, saying essentially: “The problem is that we don’t know what we are doing” (!!) […]

  • 5. Paul Roemer  |  December 1, 2009 at 3:48 pm

    Great write-up. Not knowing what we don’t know sounds trivial, but it’s a triviality that should command a lot of attention.

  • 6. Victor Montori  |  December 2, 2009 at 5:03 pm

    There is an article written by David Eddy and Bob Rizza using Archimedes simulations that predicts the savings the healthcare system would incur if we were just good at improving glycemic control. Unfortunately, the simulation failed to predict the findings from ALL the large contemporary randomized trials of glycemic control, one of which was stopped early because tight glycemic control increased mortality. So that bit is off.

    In all, simulations are quite intriguing, but most are based on limiting the stochastic notion of medicine based on deterministic assumptions (and the assumptions that are made within Archimedes ARE proprietary); what assumptions do these models make about adherence, a human behavior that is not fully predictable or determined? What about the fact that even the best prognostic models fail to account for most of the inter-individual variability?

    Just like observational studies, brief adaptive trials, and use of surrogate and composite endpoints, every time we try to find a shortcut for what we need to do (large randomized trials) we end up getting a bunch of instances right and a few fatally wrong — the real problem is that we do not know which one is which until we do the big trials.

    Thank you Kent for the stimulating post.

  • 7. Anne Marie Cunningham  |  December 3, 2009 at 8:00 pm

    A completely fascinating post. I’m glad you are here to keep me up to date. I wonder how much more could be gained by adding in routinely collected data from ehr systems. Do you know if Eddy is using any of this yet?
    Thanks again,
    Anne Marie

  • […] ICSI Health Care Blog, Kent Bottle remarks on the pioneer work of David Eddy, a contrarian health economist whose vision […]
