
Can we just get something out of the way and admit that ‘Data-Driven’ is a nonsense term? There are a lot of companies out there now that claim to have a ‘Data-Driven’ philosophy, which, in practice, means that they collect a whole lot of information and then make decisions based on it.
Wait a minute. Isn’t that basically how decisions have been made since… always? Admittedly, the nature of that information has changed: Early information was collected via anecdote and trial and error, as in “Don’t eat that berry, it killed Phil.” Slightly later, decisions were made based on advice from our respected, venerable, frequently syphilitic and insane elders – and now, slightly later still, decisions are made based on vast collections of numbers produced by experimentation. Some of that information is a hell of a lot more reliable than the anecdotal and mythological precepts of former times, but it is still neither perfectly reliable nor free of bias.
Don’t get me wrong here. I’m all for the collection of as much information as possible to inform decisions – disregarding, for the moment, the ethical concerns behind mass data collection, which are plentiful – but thinking of this as either a fundamentally new approach or a necessarily superior one is naïve. And, considering the vast size and financial power of the institutions that believe in this techno-utopian method, this naïveté is more than a bit worrying.
No matter how much information you collect, there will always be limits to what can be collected. We only have one ocean, and even if we experiment on small pools of water we can never be quite sure how those experiments will affect the sea. Though we can see how the price of each product affects its own sales, we can only guess at what it does to the overall market. What fills those gaps in knowledge will always be just guesswork.
No matter how much information you collect, there’s still a human being making decisions on both ends. There’s someone deciding which information to collect, how and when to collect it, how to organize it, and what qualifies as information – and there’s someone, on the other end, determining what the information means, what courses of action it recommends or contradicts, and how to proceed. This human component is not removable: Even if you devise algorithms to parse the data, to make changes and recommendations and projections based on it, that just pushes the human decision back one more layer, to the person responsible for writing the algorithms. And, no matter how you try to tighten that loop, it will always be subject to human biases, because in the end even the assessment of which results are desirable or undesirable is derived from our own human desires.
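To make that concrete, here is a minimal, entirely hypothetical sketch (the records, field names, cutoff, and target below are all invented for illustration) of how human choices end up hard-coded into even a fully automated analysis:

```python
# Hypothetical example: every "objective" step below encodes a human choice.
from statistics import mean

# Choice 1: what counts as data at all (invented sample records).
sessions = [
    {"user": "a", "minutes": 34, "purchased": True},
    {"user": "b", "minutes": 2,  "purchased": False},
    {"user": "c", "minutes": 58, "purchased": False},
]

# Choice 2: which records count as "noise". Someone picked this cutoff.
MIN_MINUTES = 5
usable = [s for s in sessions if s["minutes"] >= MIN_MINUTES]

# Choice 3: what "success" means. Here it is purchases, not satisfaction or retention.
conversion_rate = mean(1.0 if s["purchased"] else 0.0 for s in usable)

# Choice 4: what the number is allowed to recommend. The 0.5 target is an opinion, not a fact.
if conversion_rate < 0.5:
    print("Recommendation: change the product page")
else:
    print("Recommendation: keep everything as it is")
```

The output looks like it came from the data, but the cutoff, the metric, and the target all came from whoever wrote the script.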
No matter how much information you collect, the end result of trying to systematize and data-drive everything is simply an abdication of responsibility. When something goes wrong, we can pretend it’s the fault of a machine, a faceless other, rather than the fault of the flawed, human assumptions and beliefs that went into creating the machine – or, more frequently, we can deny that anything is wrong at all, because the data doesn’t lie, and heck, it’s always worked fine for me.
Yeah, the machine that you devised, that was created with your own needs and those of others like you in mind, works fine as long as it serves solely that purpose. If it fails to serve others, well, that’s their fault for not fitting into your paradigm, isn’t it? The data doesn’t lie, the machine works, and if it doesn’t then it’s not really anyone’s fault – and, therefore, not really anyone’s responsibility – it’s just how things are, sure as rain and snow. Nothing can really be fixed, because everything already works as well as it possibly can (as far as you can tell). What a shame that the humans this machine was supposed to be built around, who were supposed to be so perfectly served, whose needs were so perfectly anticipated, fail so frequently to fit into the parameters conceived of by you, the machine’s designer.
But, really, it’s just their fault for not being like you, isn’t it?