This week’s newsletter is a brief response to an article published in the British political and cultural magazine The New Statesman. The article, “The political risks of Big Data dominance” by Firmin DeBrabander (Professor of Philosophy at the Maryland Institute College of Art), discusses Big Data’s hubristic claim to understand every aspect of humanity. It is good to see established publications such as The New Statesman take an interest in technology and politics and bring in philosophers to comment. The article provides a good summary of the issues and points out what is at stake. Yet I found no new insight in it, and it misses a few interconnected, multi-dimensional aspects. At the end of today’s newsletter, I suggest several works that provide a better contextual approach.
I agree with the point: “Big Data's hubristic claim that it understands humanity opens the door to dangerous manipulation.” However, it is important to contextualize this “hubristic claim”. From a philosophical and theoretical point of view, it is fine to deal with abstract ideas like Big Data and Humanity, but context matters.
The underlying impetus for understanding humanity through Big Data is fuelled primarily by a market-driven and profit-oriented focus. Even though it is not explicitly stated, to make the point Prof. DeBrabander cites Shoshana Zuboff’s work on surveillance capitalism. Big Data systems read us, then predict and shape our behaviours. Such ambitions, however, are not new. He asks,
Political leaders and researchers throughout human history have thought they cracked the human code and could program us at will. So, what is different now? Why should we believe Big Data has figured us out? And even if the data analysts are wrong, what should we make of their hopes and designs? And what should we fear?
Then he goes on to say, “Data analysis is an esoteric science whose methods and conclusions are inscrutable to us” and discusses the oft-cited Target example, where an algorithm figured out that a customer was pregnant by analysing her purchase habits (this example is 9 years old! time to find some new references!!).
Yes, there are “black box” aspects of data analysis, data science, and so on that are not evident to the general public. But I would not call this an “esoteric science”. Anyone with a rudimentary knowledge of statistics can understand a few basic premises of “data analysis”. Of course, there are complex systems and patterns involved, and certain expertise is needed, but it is not that esoteric.
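To give a flavour of why I would not call it esoteric, here is a toy sketch in Python. The purchase records and the labels are entirely made up for illustration; this is not how Target actually built its model, but the basic premise, comparing a conditional frequency against a base rate, is the same one that larger regression and machine-learning systems scale up.

```python
# A minimal sketch with hypothetical data (not Target's actual model).
# Each record: (bought_unscented_lotion, bought_supplements, is_pregnant)
purchases = [
    (1, 1, 1), (1, 0, 1), (1, 1, 1), (0, 0, 0),
    (0, 1, 0), (1, 1, 0), (0, 0, 0), (0, 0, 1),
]

def rate(records, condition):
    """Share of records satisfying `condition` that carry the label."""
    matching = [r for r in records if condition(r)]
    return sum(r[2] for r in matching) / len(matching) if matching else 0.0

base_rate = rate(purchases, lambda r: True)                     # P(label)
conditional = rate(purchases, lambda r: r[0] == 1 and r[1] == 1)  # P(label | purchases)

print(f"base rate: {base_rate:.2f}")
print(f"given both purchases: {conditional:.2f}")
# If the conditional rate is well above the base rate, the purchase pattern
# is treated as a predictive signal. Real systems add many more variables
# and fitted models, but the underlying premise is this comparison.
```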
He rightly points out the role of history:
History, however, has shown that the search for and supposed implementation of human perfection and uniformity is a recipe for bloodshed. As the eminent intellectual historian Isaiah Berlin famously put it, humanity is made of “crooked timber”. We diverge in countless ways, some remarkable, some minute, and so “forc[ing] people into the neat uniforms demanded by dogmatically believed-in schemes is almost always the road to inhumanity”.
However, the article gets a few things wrong.
Power of data analysts?
But if you are as predictable as the data analysts claim, and if data analysts are vested with great power, they may be tempted to use this technology for less admirable ends. In fact, one company now offers AI mental health technology to telemarketers, ostensibly so that they can better empathise with customers, but it could also be used to help lure them in. This would be a devious use of a technology designed to detect when people are at their most vulnerable.
Again, this “great power” is not something out there. If we follow the connections, we can see how these powers manifest in our current society. Data analysts and scientists are now trained at computer science and mathematics departments all over the United States and via online courses. Companies hire skilled personnel for different purposes. There is huge demand for these courses. At the same time, humanities and arts departments are struggling or closing down around the world. This imbalance, along with market demand and the extractive nature of the economy attached to Big Data and the AI ecosystem (mining for battery minerals, for example), is part of the impetus. The source of hubris is not Big Data alone (see Atlas of AI by Kate Crawford for such an analysis). I do agree that the use of AI “when people are at their most vulnerable” is devious, but this vulnerability is itself a data point that the data analysts (and the companies that hire them) are using. There will be more and more such data points. This is important in contextualizing the hubris. Big Data does not occur in thin air; it always exists within a particular social context. Rob Kitchin (Professor of Human Geography) calls this the “oligoptic views of the world”:
Indeed, all data provide oligoptic views of the world: views from certain vantage points, using particular tools, rather than an all-seeing, infallible God’s eye view (Amin and Thrift, 2002; Haraway, 1991). As such, data are not simply natural and essential elements that are abstracted from the world in neutral and objective ways and can be accepted at face value; data are created within a complex assemblage that actively shapes its constitution (Ribes and Jackson, 2013).
More from The New Statesman article.
“Data analysts enamoured by their own talents expand the bounds of experimentation.”
This enamoured status is part of a growing trend that will subside later. Data analyst and data scientist are now “hot” vocations. Remember the Y2K crisis: if you could spell COBOL, you could get a job. It is almost like that now. If this phase of data science gives way to something else, the job market will follow suit and these analysts will turn to that. Of course, these analysts are confident in their ability to be part of the system that tries to predict. But I doubt they are trying to “expand the bounds of experimentation”. The joke goes something like this: “A data scientist is a statistician who lives in San Francisco.” Again, market demand determines what such “expanding the bounds of experimentation” looks like. There are now more engineers and data analysts working on figuring out how to display cat pictures online than on, for instance, climate crisis data. If the analysts are so vested with power and enamoured of their talents, why are they not using them for the greater good?
These analysts play a supporting role in the decision-making aspects of Big Data systems. And these decisions vary in nature, from suggesting a product to checking credit scores. Some of these decisions are descriptive, while some are explorative. The predictive aspect of Big Data is often highlighted, but these systems can be prescriptive as well. These are complex decision-making processes, and the relations within them may differ by decision type and context. In order to hold Big Data’s ambition in check, we need to know and understand more about these relationships.
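To make that distinction concrete, here is a minimal sketch in Python of a toy credit-style decision. The payment history, the crude risk rule, and the approval threshold are all hypothetical stand-ins for what real systems do with far more data and fitted models.

```python
from statistics import mean

payment_history = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]  # 1 = on-time, 0 = missed

# Descriptive: summarise what already happened.
on_time_share = mean(payment_history)

# Predictive: extrapolate from the summary to a future outcome
# (a crude rule stands in here for a trained model).
predicted_default_risk = 1.0 - on_time_share

# Prescriptive: turn the prediction into an action the system takes.
action = "approve loan" if predicted_default_risk < 0.3 else "refer to review"

print(on_time_share, predicted_default_risk, action)
```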
I agree with the point about hubris made in the article by DeBrabander (his book Life After Privacy is worth checking out) and with the need for government regulation. However, the influence and the context of Big Data’s hubris should have been spelt out more clearly in the article (I understand that it probably had limited space to address everything).
There is no doubt that the type of analysis we can do now would have been unimaginable even 10-20 years ago. The data are not just social media and purchase histories; they range from satellite imagery and weather data to Twitter streams. We have built massive instrumentation in which the classification, representation, and quantification of human behaviour sit at the heart. We often re-use and re-purpose data for purposes very different from the original. No doubt, all of this presents conceptual and ethical challenges. We require new approaches to understand the political risks of Big Data dominance, and I applaud The New Statesman for the article, but they could do more.
I would suggest that The New Statesman look into some of the more recent and exciting work done by Safiya Noble, Jill Lepore, Timnit Gebru, Louise Amoore, Carissa Véliz, Sabina Leonelli, Manan Ahmed Asif, Ulises Mejias, and Nick Couldry.
If I were to invent a symbol for a new religion, or a new social movement, it would be the Bell Curve. As I understand things (a simple undergraduate course in statistics, nothing deeper), it is the Law of Large Numbers that lets observed frequencies stand in for probabilities. It's clear that Amazon "knows" I might be more likely to buy a popular physics book by, say, Brian Greene, if I bought a book by Carlo Rovelli. If I search for a new baseball bat online, I'm not surprised to see targeted ads for, say, a new baseball glove. It all gets more complicated than this, of course. If I do a search for far-Right groups in America, does that mean I want to join one, or that I want to know what they're up to? Miscalculations, as in international relations, are possible.
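To show what that Law of Large Numbers point means in practice, here is a small simulation in Python. The 30 per cent co-purchase rate is an assumption for illustration, not a real Amazon figure.

```python
# A minimal sketch of the Law of Large Numbers in a recommendation setting
# (simulated data; the 30% rate is assumed, not measured).
import random

random.seed(0)
TRUE_RATE = 0.30  # assumed P(buys a Greene book | bought a Rovelli book)

def estimate(n_customers):
    """Estimate the co-purchase rate from n simulated Rovelli buyers."""
    buys = sum(random.random() < TRUE_RATE for _ in range(n_customers))
    return buys / n_customers

for n in (10, 100, 10_000, 1_000_000):
    print(f"{n:>9} customers -> estimated rate {estimate(n):.3f}")
# With few customers the estimate swings wildly; with many, it settles near
# the true rate. That convergence, not any deep insight into any individual,
# is what lets the system "know" what I am likely to buy next.
```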