It’s Privacy Week here at CoinDesk, and we’ve been diving into a variety of technological and legal angles on the consequences of digital surveillance. Anxiety about the rise of omnipresent snooping can often feel like an academic matter of principle, or a series of warnings about important but uncommon edge cases: the battered spouse being stalked with malware, the dissident tracked and murdered by a government, the consumer with legal but socially marginalized tastes. These scenarios of privacy compromise have serious implications, of course, for those who fall victim and for every single one of us.
But the most widespread use of digital surveillance can seem far more mundane than these headline examples, while being potentially vastly more insidious.
Algorithmic content targeting is the foundation of omnipresent information businesses like Google and Facebook, and it affects you every moment you’re online. It can make you less informed, less unique, less thoughtful and less interesting, so subtly you don’t even notice.
Harvard researcher Shoshana Zuboff describes the impact of algorithmic targeting as “the privatization of the division of learning.” We have increasingly handed over our decisions about everything to pattern-recognition software, she argues. It guides our interactions with social media, dating sites, search engines, programmatic advertising and content feeds – and it’s built almost entirely on models of past human behavior. At its structural root, it is hostile to novelty, innovation and independence. And its pioneers have benefited hugely from it – according to Zuboff, Google now has a “world-historical concentration of knowledge and power.”
I have a slightly snappier name for this than Zuboff: the Algorithmic Loop. Like most loops, it is easy to get trapped in because it harvests our preferences, then uses that data to keep us hooked – and to take control. Sure, it shows us prospective dates or movie titles or news blurbs that it knows we’re likely to click. But those suggestions in turn shape our desire for the next thing we consume.
The algorithmic loop, in short, doesn’t just predict our tastes, attitudes and beliefs; it creates them. And because it shapes them based only on what it already knows and can understand, it is making us less creative and less individual in ways we have barely begun to reckon with.
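To see why such a loop converges, consider a toy simulation – a sketch under deliberately crude assumptions (a one-dimensional “taste,” a recommender that always serves the crowd favorite, a fixed nudge per exposure), not a model of any real platform’s system:

```python
import random

# A toy model of the algorithmic loop (illustrative only - not any real
# platform's code). Each user starts with a taste somewhere in [0, 1]; the
# "recommender" always serves whatever the crowd currently favors, and each
# exposure nudges the user's own taste toward what they were shown.

random.seed(42)
NUDGE = 0.1  # assumed strength of each recommendation's pull on taste

tastes = [random.random() for _ in range(1000)]  # an initially diverse audience

def spread(values):
    """Crude diversity measure: the gap between the most extreme tastes."""
    return max(values) - min(values)

print(f"initial taste spread: {spread(tastes):.3f}")

for _ in range(50):
    crowd_favorite = sum(tastes) / len(tastes)  # what "people like you" liked
    # Serving everyone the same favorite pulls each individual toward the mean.
    tastes = [t + NUDGE * (crowd_favorite - t) for t in tastes]

print(f"taste spread after 50 rounds: {spread(tastes):.3f}")  # collapses toward 0
```

In this toy world the diversity of tastes decays geometrically: every recommendation is drawn from the average, and every average is drawn from the freshly nudged tastes. The assumptions are cartoonish, but the direction of the feedback is the point.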
Over time, the individual and collective effects may prove devastating.
Lowest common denominator
How is the algorithmic loop narrowing the range of human thought and creativity?
The dynamic varies, but consider the basics. Companies like Facebook, Amazon and Google ultimately make money by showing you things you might want to buy. One level up, social, search and streaming platforms keep your attention by showing you content you are most likely to find “engaging.” They accomplish these goals by observing your behavior, matching it to the behavior of similar people, then showing you the other things those people liked.
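That matching step – find people whose history looks like yours, then surface what they clicked – is, at its simplest, collaborative filtering. Here is a minimal user-based sketch with hypothetical users and ratings (real systems use vast matrices and learned embeddings, but the core logic is similar):

```python
from math import sqrt

# Hypothetical click history: user -> {item: rating}. Illustrative data only.
history = {
    "you":   {"blockbuster_a": 5, "blockbuster_b": 4},
    "peer1": {"blockbuster_a": 5, "blockbuster_b": 5, "blockbuster_c": 4},
    "peer2": {"blockbuster_a": 4, "indie_film": 2, "blockbuster_c": 5},
}

def similarity(a, b):
    """Cosine similarity over the items both users have rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    norm_a = sqrt(sum(a[i] ** 2 for i in shared))
    norm_b = sqrt(sum(b[i] ** 2 for i in shared))
    return dot / (norm_a * norm_b)

def recommend(user):
    """Score each unseen item by how much 'people like you' liked it."""
    scores = {}
    for peer, ratings in history.items():
        if peer == user:
            continue
        sim = similarity(history[user], ratings)
        for item, rating in ratings.items():
            if item not in history[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return max(scores, key=scores.get)

print(recommend("you"))  # -> "blockbuster_c": more of the same
```

Note what the sketch structurally cannot do: if nothing in your history resembles an item, no peer path ever leads you to it. Novelty is invisible to the matching step by construction.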
These systems are sometimes praised for their ability to help users with niche tastes find precisely what they’re looking for, and there is some truth to that. But the larger dynamic is easy to spot: The algorithmic loop operates on the fundamental assumption that your taste is interchangeable with other people’s. The algorithm can neither predict nor create personality, innovation or chance encounters – which means that it is ultimately hostile to personal empowerment and individuality.
As a thought experiment, imagine a truly average user of YouTube or Amazon Prime Video. What do you suggest to someone who has rented five mainstream Hollywood films because that’s all they’ve heard of? Well, you offer them more of the same mainstream, middlebrow, easygoing content. Even when content truly is tailored to a demographic niche, the creative process has become an exercise in box-checking: Netflix, famously, uses its algorithmic loop to “optimize” a piece of content for success before it is made. If art at its best is a process of self-discovery and learning, the algorithmic loop is turning us away from that and toward simply repeating ourselves endlessly.
That algorithmic bias toward banality, along with other forces, has already dumbed down our culture in measurable ways. In the 20-odd years since algorithmic recommendation engines first appeared in the wild – at online bookstores like Amazon, then at Netflix’s DVD service, then on streaming video and music platforms – global popular culture has undergone a radical contraction centered on the most popular and inoffensive blockbusters.
For example, Spotify, an algorithm-centered music platform, concentrates streams and earnings among a handful of top artists far more than the physical media-and-broadcasting system that preceded it. This is particularly striking because the terrestrial radio conglomerate Clear Channel was so often a bugaboo for music fans in the pre-streaming era, accused of silencing adventurous or controversial artists. We now live in the era of the “infinite jukebox,” with practically all the music ever recorded just a click away – yet welding that jukebox to the algorithmic loop seems to have made music consumption more monolithic, not less.
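Claims about concentration can be made precise. One standard measure is the share of all streams captured by the top sliver of artists; the sketch below uses hypothetical numbers (not actual Spotify data) to show how a short head and a long tail produce near-total concentration:

```python
# Measuring winner-take-all concentration: the share of plays captured by the
# top 1% of artists. The catalog below is hypothetical, not Spotify's data.

def top_share(streams: list[int], fraction: float = 0.01) -> float:
    """Fraction of total plays going to the top `fraction` of artists."""
    ranked = sorted(streams, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

# A few megastars, then a long tail of everyone else.
catalog = [10_000_000] * 10 + [1_000] * 990

print(f"top 1% of artists take {top_share(catalog):.0%} of all streams")  # ~99%
```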
Hollywood movie studios, major book publishers and music labels have all responded to this winner-take-all model. They have shifted en masse to focusing almost entirely on blockbusters and stars, committing resources only to artists who produce the most widely loved product – and even then only to their clearest hits. This broad sea change has made it vastly more difficult for even slightly unconventional musicians and filmmakers, those capable of introducing new and exciting ideas, to financially support their work (to say nothing of writers, who have always struggled). Instead, we get an endless string of Marvel movies.
In fairness, there are other major factors behind these changes. Hollywood, for instance, is grappling with a secular decline in theater attendance that creates pressure to make less-challenging content because it needs butts in seats. U.S. political culture was increasingly partisan well before the algorithmic loop made sorting people into opposing, equally single-minded hives a process as unconscious as breathing. At the very highest level, the trend toward a “winner-take-all economy” began with the invention of the telegraph: Improving communication technology allows the very best performers, businesses and products to dominate ever-larger shares of the global market for just about everything.
But the algorithmic loop is what allows the winner-take-all dynamic to infiltrate every aspect of our lives, online and, increasingly, off. It is what constantly tempts us with news or products or tweets that might not make us any more thoughtful or empathetic or well-informed – but which everyone else, as the algorithm knows, seems to be enjoying.
Reject tradition, embrace yourself
The algorithmic loop is the cybernetically enhanced version of a problem humans have been grappling with since before machine learning, the internet or computers even existed.
In ye olden times, the problem went under names like tradition, hierarchy, superstition, conventional wisdom or just “the way things are.” Three decades ago, legal scholar Spiros Simitis predicted just how powerful these systems could be for molding people’s behavior into acceptable forms, much like traditional hierarchies. In a passage cited by Zuboff, Simitis argued that predictive algorithms were “developing … into an essential element of long-term strategies of manipulation intended to mold and adjust individual conduct.”
Such forces have been viewed with suspicion for thousands of years. You’ve likely heard the phrase “The unexamined life is not worth living,” one of the most famous aphorisms of Socrates, the foundational philosopher of the Western world (as passed down by his student Plato – Socrates himself didn’t write, much less code). The general sentiment is clear enough: Spend some time reflecting on yourself. It’s good for you.
But Socrates also meant something much more specific: To truly examine yourself, you have to interrogate all of the social norms, unspoken assumptions and historical conditions that shaped you. Until then, you’re essentially the puppet of the people who came before you and established the norms, whether we’re talking about church doctrine or aesthetic judgment.
A couple of thousand years later, pioneering psychoanalyst Sigmund Freud restated this a bit more explicitly, in a slogan that also has the advantage of sounding totally badass in Freud’s native German: “Wo Es war, soll Ich werden.” Or in English: “Where it was, there I shall become.” The “it” Freud is referring to is the unconscious mind, which he saw as shaped by the traditions and social norms hammered into all of us from birth. By Freud’s time, modernity and technology had helped make those norms ever more widespread, uniform and rigid, particularly during the sex-repressing Victorian era of Freud’s youth.
Freud believed the conflict between social norms and individual desires was a source of mental health problems. He hoped that his “talking cure” could help patients who felt strangely out of place in their repressive society, by making visible both the norms that are so often unspoken, and the desires that people sometimes hide even from themselves. We might understand disturbing findings about the mental health impacts of social media in similar terms: A constant stream of the most popular content might sometimes amount to a psychically damaging erosion of individuality by the dominant social order.
The algorithmic loop may not seem quite as harsh a master as the social norms of Victorian Europe – but it is often more insidious. Repressive social norms that are visibly enforced by a policeman or priest may be easier to defy than the algorithmic loop, because now we’re the ones doing the clicking, the streaming, the scrolling. It certainly feels like we’re making individual choices, affirming our uniqueness, and expressing ourselves.
But that’s only because the curve toward groupthink is so subtle. Viewed as a total system, the algorithmic loop inevitably degrades the diversity and uniqueness of what most people see, learn, and enjoy. Even as the amount of “content” we consume skyrockets (a disturbing trend in its own right), it feels like less and less of actual consequence is on offer – less that can challenge you, help you grow, make you a better person.
No matter how much we scroll, tube or tweet, we may begin to suspect that our choices are illusory.
Escaping the loop
How, then, do you break free from a strangling vine that reads its future in your very struggle? How do you re-assert control over your own choices and your own brain?
Of course, there are individual practices requiring various degrees of commitment. A straightforward one, if not entirely easy, is to ditch Facebook and Google to whatever degree possible. Facebook especially – the company that now calls itself Meta is simply and uniformly not to be trusted. (And yes, Facebook can track you even when you’re not using Facebook.com. Here’s how to change that.)
Use DuckDuckGo for search. ProtonMail is a popular alternative to Gmail – which, yes, also spies on you. In fact, Gmail is learning how to write your emails for you, another instance of the seductive, narcotizing death loop we must somehow escape.
The benefits are likely marginal – in part because they already have so much data – but these moves will at least make it somewhat harder for the data hoarders to profile and entice you online.
Returning to physical media is another way to detach from the hive mind – CDs and vinyl instead of Spotify, DVDs and VHS tapes instead of YouTube or streaming services, physical books instead of (let’s be real) tweets. Learn to appreciate your local library. Using more physical media forces you to make considered choices and stay with each one for a while, instead of just riding the algorithmic loop (though MP3s and a Plex server aren’t a bad option either). Heck, if you really want to go buckwild, get a flip phone and subscribe to a print newspaper – you can disappear from social media and streaming like the One-Armed Man.
But these individual tweaks aren’t really The Solution, any more than you can fix the obesity epidemic by eating more quinoa yourself. Digital systems are immensely more convenient than what came before, and their downsides are abstract and collective. Even for someone deeply aware of the compromises they’re making every day, all of this is just too much to worry about.
For those folks – that is, most folks – a more systematic regulatory approach is needed, and good privacy regulation and practices are the linchpin. Careful limits on how much data we give advertising platforms like Google and Facebook, and how exactly they’re able to target us, create more space for individuality. There is already some precedent here – Facebook has recently been pressured to reduce advertisers’ ability to target by race, for instance (though because this is Facebook, of course there’s an easy workaround).
Then there’s the nuclear option: Make programmatic advertising illegal.
That won’t happen in the U.S., the home of the largest corporate data hoarders. U.S. legislators are too deeply biased toward profit to do anything that would hurt Facebook or Google or the thousands of ancillary adtech and marketing firms that feed on their plume of data chum.
But hypothetically, if programmatic ad targeting ended or was seriously curtailed, data about your habits and preferences would lose its value. Facebook would stop spying on you not because it was forced to, but because it would have no incentive. With your data and attention suddenly valueless, you would be free to learn and explore on your own terms.
Well, we can still dream … can’t we?
Source: https://www.coindesk.com/layer2/privacyweek/2022/01/27/the-algorithmic-life-is-not-worth-living/