Standard deviation, standard error, whatever!

Ivan Oransky points us to this amusing retraction of a meta-analysis. The problem: “Standard errors were used instead of standard deviations when using data from one of the studies”!
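To see why that swap matters, here's a toy numerical sketch (made-up numbers, nothing to do with the actual retracted paper): a standardized mean difference is supposed to be scaled by the standard deviation, and since the standard error is the SD divided by sqrt(n), plugging in the SE instead inflates the apparent effect by a factor of sqrt(n).

```python
import math

# Made-up summary data for one hypothetical study
n = 100                   # sample size
sd = 10.0                 # standard deviation of the outcome
se = sd / math.sqrt(n)    # standard error of the mean = 1.0

mean_diff = 2.0           # hypothetical difference in group means

# Standardized mean difference, done correctly (divide by the SD):
smd_correct = mean_diff / sd   # 0.2, a modest effect
# Same calculation with the SE mistakenly used in place of the SD:
smd_wrong = mean_diff / se     # 2.0, looks like a huge effect

print(smd_correct, smd_wrong)  # the mix-up inflates the effect by sqrt(n) = 10
```

So the study with the swapped numbers looks both bigger and more precise than it really is, which is plenty to throw off a pooled estimate.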

Actually, I saw something similar happen in a consulting case once. The other side had a report with estimates and standard errors . . . the standard errors were suspiciously low . . . I could see that the numbers were wrong right away, but it took me a couple hours to figure out that what they’d done was to divide by sqrt(N) rather than sqrt(n)—that is, they used the population size rather than the sample size when computing their standard errors.
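In code that bug is a one-character slip. Here's a minimal sketch (hypothetical numbers, and I'm assuming a plain mean-of-a-sample setting, not whatever their report actually computed):

```python
import math

N = 1_000_000    # population size
n = 400          # sample size actually observed
sd = 15.0        # sample standard deviation

se_correct = sd / math.sqrt(n)   # 0.75
se_wrong = sd / math.sqrt(N)     # 0.015, suspiciously low, off by sqrt(N/n) = 50

print(se_correct, se_wrong)
```

Divide by the wrong square root and the reported uncertainty shrinks by a factor of sqrt(N/n), which is why those standard errors looked too good to be true.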

As Bob Carpenter might say, it doesn’t help that statistics uses such confusing jargon. Standard deviation, standard error, variance, bla bla bla.

But what really amused me about this Retraction Watch article was this quote at the end:

As Ingram Olkin stated years ago, “Doing a meta-analysis is easy . . . Doing one well is hard.”

Whenever I see the name Ingram Olkin, I think of this story from the cigarette funding archives:

Much of the cancer-denial work was done after the 1964 Surgeon General’s report. For example,

The statistician George L. Saiger from Columbia University received [Council for Tobacco Research] Special Project funds “to seek to reduce the correlation of smoking and diseases by introduction of additional variables”; he also was paid $10,873 in 1966 to testify before Congress, denying the cigarette-cancer link.

. . .

Ingram Olkin, chairman of Stanford’s Department of Statistics, received $12,000 to do a similar job (SP-82) on the Framingham Heart Study . . . Lorillard’s chief of research okayed Olkin’s contract, commenting that he was to be funded using “considerations other than practical scientific merit.”

So maybe doing a meta-analysis badly is hard, too!