Fortuna Imperatrix Mundi

Olim lacus colueram,
olim pulcher extiteram,
dum cignus ego fueram.

(Once I dwelt on lakes, once I was beautiful, when I was a swan.)

Actually this is a post about statistics, but what the hell: I’ve been listening to Carmina Burana a lot recently, even if Miriam thinks it is bombastic. So anyway, several people have commented on this article, which (whilst it makes some points about statistics that are vaguely plausible) far, far overgeneralises its bounds of validity: any single scientific study alone is quite likely to be incorrect, thanks largely to the fact that the standard statistical system for drawing conclusions is, in essence, illogical. Tamino points out the obvious: that statistics works, and it isn’t stats’ fault if you don’t understand it. JA says similar, though unusually for him he is quite kind about it. Even Paul linked to it. And this post merely expands my comment there.

Which is: that much of science isn’t statistical at all. A simple example would be special relativity. You can read the underlying paper; there are no stats in it. You could sort-of argue that there is some stats underlying it (that the speed of light is a constant is sort-of a matter of observation, and observations always come with errors, so they need stats to analyse properly) but that isn’t really true. The sciencenews thing itself seems to be mostly thinking about medicine, where they use stats a lot because they don’t know what is really going on.
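To put a bit of flesh on that parenthesis, here is a minimal sketch of where the stats would come in, with entirely invented numbers (the baselines, the timing jitter and the use of Python/numpy are my own illustration, nothing to do with any real experiment): simulate some noisy time-of-flight measurements over known distances, fit distance against time, and read the speed of light off as the slope, with a standard error attached.

    import numpy as np

    # Invented time-of-flight "experiment": light crosses known baselines and the
    # timing carries simulated jitter -- that jitter is the only reason stats appears.
    rng = np.random.default_rng(0)
    distances = np.linspace(1e3, 3e5, 20)             # baselines, 1 km to 300 km (made up)
    true_c = 299_792_458.0                            # m/s
    times = distances / true_c + rng.normal(0.0, 5e-8, distances.size)  # ~50 ns jitter (made up)

    # Ordinary least-squares fit of distance = c * time + offset; the slope estimates c.
    slope, intercept = np.polyfit(times, distances, 1)

    # Standard error of the slope from the fit residuals.
    resid = distances - (slope * times + intercept)
    slope_err = np.sqrt(np.sum(resid**2) / (len(times) - 2) / np.sum((times - times.mean())**2))

    print(f"estimated c = {slope:.4e} +/- {slope_err:.1e} m/s")

The physics in there (distance = c × time) has no statistics in it at all; the stats only turn up because the clocks are imperfect, which is the limited sense in which the sort-of argument holds.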

8 thoughts on “Fortuna Imperatrix Mundi”

  1. >”You could sort-of argue that there is some stats underlying it (that the speed of light is a constant is sort-of a matter of observation, and observations always come with errors, so they need stats to analyse properly) but that isn’t really true.”

    It’s not true at all. The speed of light is exactly 299 792 458 m/s by definition. There is no uncertainty in it whatsoever.

    There is uncertainty, of course, in the length of a meter, which is defined from c.

    [Yes yes, I know, but that is true *now*; it wasn’t true in 1905. Nor does it matter -W]

  2. Special relativity has no statistics, yes. But how would we validate the theory without measurements? How do we evaluate measurements without statistics?

    [Oh dear, I hope I’m not going to get sucked into this… there is a difference between the sort of measurements that are way past the 4-SD limits and those in which you’re struggling to find significance lower down -W]

  3. >”It’s not true at all. The speed of light is exactly 299 792 458 m/s by definition. There is no uncertainty in it whatsoever.”

    Oh good. Does that mean I can persuade the committee who made this definition to change it to 3m/s? Just bribe them enough money, and we can have a totally different universe!

    Just to expand on 6EQUJ5’s point: on its own, special relativity is mathematics (or perhaps metaphysics?). It only becomes science when you apply it to the real world. If you want to do that, you have to collect data and fit models to them. The study of how to collect and process data so that they can be interpreted is statistics.

  4. >”The sciencenews thing itself seems to be mostly thinking about medicine, where they use stats a lot because they don’t know what is really going on.”

    But but but…. that is just begging someone to say that there is so much use of stats in climate science because climatologists don’t know what is really going on. How could you make such a gift to the septics? ;o)

  5. Read the article in question this weekend. Not as bad as you might expect, but still way too journalistic in intent and style.
    Anyway, if you have never read biological, sociological, or psychological science from, say, the 1950s or earlier, it is well worth the time: there was no need for numbers, and the word of the expert was enough. When data were presented at all, more often than not they were obviously ‘improved’. The history of the explosion of those sciences in the second half of the 20th century is in no small part the history of the ascent of statistics.
    On the other hand, I compile plant science and medical research literature for a living, and there isn’t a day, or even an hour, when I am not astounded by the shoddiness of the stats work. Everyone knows that some journals are especially susceptible, and editors are squarely to blame. Submitted papers are sent for review to two, maybe three referees, almost always all of them experts in the subject matter (say, toxicology, or plant physiology, or…), not statistics. These referees see it as their job to rate the quality of the subject matter work. They typically just fly over the quantitative aspects. Thankfully, the practitioners of some disciplines are more competent in statistics, and pay attention. Epidemiology comes to mind, or ecology. But in many disciplines, the stats are there, but make no sense.

  6. @P.J.B.
    “Not as bad as you might expect, but still way too journalistic in intent and style.”
    It reminds me of this:

  7. >”Does that mean I can persuade the committee who made this definition to change it to 3m/s?”

    Potentially, I suppose. The meter would just get about 100 million times longer.

    >”Just bribe them enough money, and we can have a totally different universe!”

    No, we’d just have a totally different meter.
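    A quick back-of-the-envelope check of that factor, assuming the second is left alone and only the meter is rescaled so that light covers 3 new meters per second:

    $$\frac{299\,792\,458\ \text{old m/s}}{3\ \text{new m/s}} \approx 9.99 \times 10^{7} \approx 10^{8},$$

    i.e. one new meter would be roughly 100 million old meters.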
