I came across a link posted by @urbandata on Twitter asking the question, “Does ‘big data’ make the scientific method obsolete?” My immediate response before clicking the link was, “I sure hope not.” After reading the article, I think it may be a bit more complex than that, but I stand by my original impression.
The article itself, “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete,” is the piece behind that link.
I think the author, Chris Anderson, rightly points out that correlation must not be confused with causation, but he moves on without exploring the full meaning of that statement. As a result, he builds an argument that, without intending to, rests on the wisdom of this traditional warning.
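To make the warning concrete, here is a minimal sketch in Python (my own invented example, with a hypothetical weather confounder, not anything from Anderson’s article) of two variables that correlate strongly only because a third variable drives both. A purely data-driven analysis would find the correlation; only a causal theory tells you what an intervention would actually do:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical confounder: hot weather drives both ice-cream sales
# and sunburn counts, which are otherwise unrelated.
weather = rng.normal(size=n)
ice_cream = weather + rng.normal(scale=0.5, size=n)
sunburns = weather + rng.normal(scale=0.5, size=n)

# The two variables correlate strongly (~0.8)...
print(np.corrcoef(ice_cream, sunburns)[0, 1])

# ...but intervening on one does nothing to the other: set ice-cream
# sales independently of the weather and the correlation vanishes (~0.0).
ice_cream_forced = rng.normal(size=n)
print(np.corrcoef(ice_cream_forced, sunburns)[0, 1])
```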
For example, Anderson cites Craig Venter’s successful “shotgun” approach to DNA sequencing, yet doesn’t acknowledge that the established theory that a species is uniquely identified by its genome is what makes the approach valid. More than that, this theory lends credence to the author’s later observation that we can learn about an organism’s characteristics without observing it directly. The author can make that claim for the same mechanistic reason the shotgun approach works.
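To see the point, consider a toy sketch (my own illustration, not Venter’s actual pipeline) of why shotgun sequencing can work at all: the fragments are assumed to come from a single underlying sequence, so overlaps alone suffice to reconstruct it. A greedy merge stands in here for the far more sophisticated assembly algorithms used in practice:

```python
def assemble(fragments, min_overlap=3):
    """Greedy toy assembler: repeatedly merge the pair of fragments
    with the longest suffix/prefix overlap."""
    frags = list(fragments)
    while len(frags) > 1:
        best = None  # (overlap_length, i, j)
        for i, a in enumerate(frags):
            for j, b in enumerate(frags):
                if i == j:
                    continue
                # Find the longest suffix of a that is a prefix of b.
                for k in range(min(len(a), len(b)), min_overlap - 1, -1):
                    if a.endswith(b[:k]):
                        if best is None or k > best[0]:
                            best = (k, i, j)
                        break
        if best is None:
            break  # no sufficient overlap remains
        k, i, j = best
        merged = frags[i] + frags[j][k:]
        frags = [f for n, f in enumerate(frags) if n not in (i, j)]
        frags.append(merged)
    return frags

# Fragments "shotgunned" from one underlying string:
reads = ["GATTACAT", "ACATTAGG", "TAGGCCGA"]
print(assemble(reads))  # ['GATTACATTAGGCCGA']
```

Remove the theoretical assumption that the fragments share one source sequence, and the merging step loses its justification; the data alone would not tell you the pieces belong together.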
This is not to say that the use of statistical and mathematical models to analyze the ubiquitous data around us cannot extend the scientific method in ways we have yet to imagine. It can. However, science provides not only the foundation for the mathematical theories underlying statistical methods; it also helps us interpret the data streams and statistical results. Yes, we should strive to change the way science works, but we should not abdicate responsibility for inquiry and investigation to a black box.