I think Anderson’s piece is an interesting thought experiment, and it forces us to think about how the sheer quantity of data now available to us changes how we do things. However, like many others who have responded to his article (check the comments on the article for more), I think it has a number of serious flaws, all of which are summed up in the title, which implies that having a lot of data and some smart algorithms to sift through it means “the end of the scientific method.” That’s just ridiculous. It reminds me of political scientist Francis Fukuyama writing a book in the early 1990s about “the end of history,” in which he argued that the clash of political ideologies was more or less over and that liberal democracy had effectively won. As we’ve seen since then, that was complete rubbish.
Anderson argues that “The Petabyte Age is different because more is different.” There’s no reason to believe this is true, however. Expanding the amount of data, even exponentially, doesn’t change the fundamental way the scientific method functions; it just makes it a lot easier to test a hypothesis. That’s definitely a good thing, and I’m sure scientists are happy to have huge databases and data-mining software and all those other good things; but that doesn’t change what they do, it simply changes how they do it. With all due credit to Craig Venter, who helped sequence the human genome, sifting through reams of data about base pairs and sequencing them can help tell us where to look, but not what to look for, or what it means.