Big data has big plans for us. A growing industry that collects and analyzes our every digital emission, however small or mundane, believes it has found the key to reading us and predicting, if not steering, our behavior.
Such ambitions are not new. Throughout history, political leaders and researchers have believed they had cracked the human code and could program us at will. So what has changed? Why should we believe that big data has figured us out? And even if the data analysts are wrong, what should we make of their hopes and intentions? Why should we be afraid?
Data analysis is an esoteric science, its methods and conclusions opaque to most of us. To cite a famous example, data analysts working for Target in the US concluded that certain female shoppers were pregnant after analyzing their purchases of items including vitamins, lotions and cotton balls. Target’s analysts were shrewd enough to predict a woman’s due date to within a week.
Meanwhile, Facebook’s data analysts know when we fall in love or break up. Through careful study, they determined that “couples who are about to go ‘official’ will post … 1.67 times a day for the 12 days before they publicly change their profile to ‘in a relationship’. The number of posts then drops to 1.53 per day for the next 85 days … [While] the number of interactions decreases once the relationship begins, an increase in positivity is also observed. This includes the use of words like love, cute and happy, and … [a decline in] negative words such as hatred, resentment and evil.”
Another alarming example: the social scientist Shoshana Zuboff explains in The Age of Surveillance Capitalism how online lenders use data analysis to determine creditworthiness. Through “detailed analysis of a person’s smartphone and other online activities,” they extract telling data, including “the frequency with which you charge your phone battery, the number of incoming messages you receive, if and when you answer phone calls, how many contacts you have on your phone, how you fill out online forms, and how many miles you travel each day.” What do they make of this data? It is hard to say.
What is clear, however, is that data analysts are keen to expose our vulnerabilities. Why else would Facebook want to know when we are falling in love or breaking up? We are especially irrational, or malleable, in such states, and advertisers – Facebook’s real customers – would like to know about it. Armed with our personal information, analytically savvy advertisers can influence our behavior and turn us into the customers they have always wanted us to be.
Zuboff suggests that big data is enthralled by the thinking of the 20th-century behavioral psychologist B. F. Skinner. Skinner held controversial views, such as the idea that knowledge and freedom are opposed: our actions seem free only so long as their causes and motives are not understood; once we are fully understood, we will see that our behavior is entirely predictable and our freedom illusory. Indeed, Skinner believed that the concept of “autonomous man” stands in the way of a rational future and holds back our progress. That rational future is a technocracy, in which choice on key issues is taken out of the hands of error-prone individuals and left to experts who know us, can read us, and understand what we really need.
Skinner’s convictions and aspirations recall the particular kind of rationalism that the conservative philosopher Michael Oakeshott identified in 20th-century political thought. This rationalism, Oakeshott explains, combines “the politics of perfection” with “the politics of uniformity.” In particular, rationalists believe that political problems can be solved by bringing political institutions into line with an ideal form of government. And instead of drawing on history and experience to resolve political conflicts, rationalists rely on their technical understanding of human nature and society.
On the rationalist view, people must ultimately be purged of the habits that hold them back and then reprogrammed to realize an exemplary political community. Oakeshott’s account of rationalism captures the thinking behind Stalin’s industrialization drives and Mao’s Cultural Revolution. Characteristically, both the USSR and the Chinese Communist Party sought to erase tradition and radically remake society, forcing their citizens and institutions to conform to political ideals.
History has shown, however, that the pursuit and supposed achievement of human perfection and uniformity is a recipe for bloodshed. As the eminent intellectual historian Isaiah Berlin observed, humanity is made of “crooked timber”. We diverge in countless remarkable, minute and irrepressible ways, and “[forc]ing people into neat uniforms demanded by dogmatically believed-in schemes is almost always the road to inhumanity.”
Moreover, claiming to fully understand humanity is an act of violence; it is a kind of conquest, one that reveals both the arrogance and the danger of the political rationalists. To borrow from the late Donald Rumsfeld, Secretary of Defense under George W. Bush, there are “unknown unknowns” in the human psyche. And when leaders claim full knowledge of the human condition, freedom and diversity are easily sacrificed to a grander vision. Indeed, it was precisely such sacrifices that Stalin and Mao demanded, cloaked under the innocuous-sounding bureaucratic name of “central planning”, and their technocratic social experiments caused untold suffering.
What does this tell us about big data? What do its dubious intellectual origins portend? With artificial intelligence (AI), data analysts are laying claim to ever more of the soul. Researchers have applied AI to diagnosing mental illness, for example, by listening to a person’s voice and analyzing its tone, pitch and volume. This is ingenious, and could be very helpful for people who cannot get to a therapist.
But if we are as predictable as data analysts claim, and if those analysts are granted ever more power, they may be tempted to put this technology to less worthy uses. Indeed, one company now offers AI technology for telemarketing, ostensibly so that agents can better empathize with customers, though it could just as easily be used to reel them in. That would be a troubling application of a technology designed to detect when people are most vulnerable.
Data analysts, enamored of their own talents, keep pushing the boundaries of experimentation. Facebook, which knows when we fall in love, has developed methods of influencing our moods by selecting which posts and ads we see. It has also deployed algorithms to boost voter turnout, a feat later imitated by Cambridge Analytica in 2016.
At bottom, data analysis aims to identify all of our needs and desires, even before we know them ourselves. Analysts have become very good at this, and have thereby enabled advertisers to serve us better. But there is a danger in the science of big data.
Like the political rationalists, data analysts may come to think they know best. When we are treated as a set of data points for expert analysts to nudge and prod, we risk being objectified and having our autonomy undermined. In this way, big data opens the door to outright inhumanity.
Yet despite these unsettling similarities, there is a key difference between rationalism in politics and big data. Unlike the technocrats who served under Stalin and Mao, data analysts do not hold a monopoly on political power. This means we still have the chance to pass legislation that limits the reach of big data and reduces its potential for abuse.
We could, for example, restrict the uses to which analysts may put their insights. Sensitive information, such as whether we suffer from anxiety or depression, should be available only to healthcare professionals. Privacy policies that clearly inform consumers about what data is collected and how it will be used would help. Tech firms that meddle in elections should be severely punished. And antitrust action against the technology industry could break up big-data companies, shrinking them to a more manageable size.
One thing, however, is clear: we cannot expect big data to admit its own fallibility and rein in its ambitions.
Firmin DeBrabander is professor of philosophy at the Maryland Institute College of Art. He is the author of Life After Privacy (Cambridge University Press).
This article is part of the Agora series, a collaboration between the New Statesman and Aaron James Wendland. Wendland is Vision Fellow in Public Philosophy at King’s College London and a Senior Research Fellow at Massey College, Toronto. He tweets @aj_wendland.