Imagine you're a doctor or researcher working with HIV/AIDS. You're taking a sample of blood from an HIV+ patient when you slip and, to your horror, jab yourself with a bloodied needle. What do you do?
In a 1997 study, researchers Cardo et al. examined hundreds of cases of this kind of accidental HIV exposure ("needlestick injuries") in medical and scientific workers. They wanted to find differences between the people who contracted the virus and the ones who didn't.
One factor they considered was post-exposure prophylaxis - taking HIV drugs as soon as possible after a suspected exposure. Now these drugs were still pretty new in 1997, and it wasn't clear how well they prevented infection, as opposed to just delaying symptoms. Many people with needlestick injuries were offered a course of drugs - but did they work?
Cardo et al.'s raw data found no significant benefit:
"By univariate analysis, there was no significant difference between case patients and controls in the use of zidovudine [AZT, the first HIV drug] after exposure."

But it turned out that this was due to confounding variables. When they corrected for other factors...
"Infected case patients were significantly less likely to have taken zidovudine than uninfected controls (odds ratio 0.19, P=0.003). This is a classic example of confounding, since the adjusted odds ratio differed from the crude odds ratio (0.7) because zidovudine use was more likely among both case patients and controls after exposure characterized by one or more of the four risk factors in the model."

So while, on the raw numbers, people who took zidovudine were just as likely to catch HIV as people who didn't, they had also suffered more severe exposures, e.g. a greater quantity of blood, or a deeper wound. People were more likely to decide to take the drug after severe exposures. Once that was accounted for, zidovudine turned out to dramatically reduce the risk.
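This kind of confounding is easy to reproduce in a quick simulation. The sketch below uses invented numbers for illustration (they are not Cardo et al.'s data): treatment cuts infection risk five-fold, but is mostly given after severe exposures. The crude odds ratio then comes out near 1, making the drug look useless, while pooling within severity strata (a Mantel-Haenszel odds ratio) recovers the real benefit:

```python
import random

random.seed(0)

N = 100_000  # simulated needlestick injuries

# counts[severe][treated][infected] -> number of people in that cell
counts = {s: {t: {i: 0 for i in (0, 1)} for t in (0, 1)} for s in (0, 1)}

for _ in range(N):
    severe = int(random.random() < 0.5)    # half the exposures are severe
    p_treat = 0.9 if severe else 0.1       # severe exposures get treated far more often
    treated = int(random.random() < p_treat)
    p_infect = 0.20 if severe else 0.02    # severity raises infection risk...
    if treated:
        p_infect *= 0.2                    # ...and treatment cuts it five-fold
    infected = int(random.random() < p_infect)
    counts[severe][treated][infected] += 1

# Crude 2x2 table: treatment vs infection, ignoring severity
a = sum(counts[s][1][1] for s in (0, 1))  # treated, infected
b = sum(counts[s][1][0] for s in (0, 1))  # treated, uninfected
c = sum(counts[s][0][1] for s in (0, 1))  # untreated, infected
d = sum(counts[s][0][0] for s in (0, 1))  # untreated, uninfected
crude = (a * d) / (b * c)

# Mantel-Haenszel odds ratio: pool the two severity strata
n = {s: sum(counts[s][t][i] for t in (0, 1) for i in (0, 1)) for s in (0, 1)}
adjusted = (
    sum(counts[s][1][1] * counts[s][0][0] / n[s] for s in (0, 1))
    / sum(counts[s][1][0] * counts[s][0][1] / n[s] for s in (0, 1))
)

print(f"crude OR:    {crude:.2f}")     # close to 1: drug looks useless
print(f"adjusted OR: {adjusted:.2f}")  # well below 1: drug clearly protective
```

Cardo et al. actually used multivariate logistic regression rather than Mantel-Haenszel pooling, but the logic is the same: compare like with like before judging the treatment.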
Post-exposure prophylaxis has since become standard procedure, and it has undoubtedly saved many lives. Without statistical correction, it might have taken longer for people to see the benefits.
In summary, I guess what I'm saying is, remember to correct for confounds - or die.

6 comments:
That is a great example of the power of statistics when done correctly (and the dangers of inappropriate analyses). Thanks!
Neuroskeptic,
Thanks for a very clear and educational example of the need to use stats with morals - and proper technical ability.
One of the sexiest things about your blog is the variety, not only of subjects, but also of the levels of difficulty for the "layperson" in your way of treating the same subjects.
Miss behavior,
Thanks for your comment. To my mind, stats is, like the law, a dangerous subject, because bad teachers - or personal hubris - can give any fool the dangerous delusion that he has mastered it.
Those who think that stats is an easy subject should ask for their money back from their college in the USA...
One of the troubles for psychiatry and psychology is that Big Pharma hires good brains with a good command of the dark side of stats.
Plus, at least in France, the maths academics often tend to undervalue stats as a subject - some question whether it even belongs to maths - or they work on theory without any interest in the applications and consequences of their work.
Correct for confounds? Sure, but let's not forget that any statistical manipulation of data creates another opportunity to apply the positive-effects filter. Don't like the results of your analysis? Do something to the data. Now do you like it? Yep. Publish.
DS,
I like it when I agree with your view!
Please explain more: your command of English, diplomatic skills and research experience are so much better than mine.
An "oeuvre de salut public" (a work of public service) is needed on that point of yours in the discussion. Pleeeeeeeeeeeeeease