Unravelling the Disruption of Personal Taste in the Algorithmic Filter Bubble
Can you still have an unpopular opinion?
The old adage is that you cannot teach taste - you either have it or you don’t.
Amidst the avalanche of choices confronting us daily, personal taste becomes increasingly homogenised. Keeping pace with TikTok trends leaves little room for introspection or the cultivation of individual viewpoints. The demand for content is incessant, leaving creators with little time for thoughtful reflection. In such a climate, everyone seems to have an immediate opinion (a "hot take"), and content is only deemed worthy if it aligns seamlessly with the algorithms dictating visibility and engagement.
How much would you pay someone to tell you how to have good taste? As someone who was a teenager in the 80s (the decade that style forgot), it tickles me that the colour analysis books I borrowed from my local library are enjoying a resurgence as viral TikTok videos, and that consumers are parting with their hard-earned money to get ‘their colours read’.
As social media algorithms flood our feeds and For You Pages with reheated fads, is there any space left to develop personal taste? Kyle Chayka outlines in his book ‘Filterworld’ how algorithmic recommendations influence everything from what we watch on Netflix to which music we listen to and where we get our coffee. This web of algorithms woven into all of our online experiences has disrupted the development of personal taste, leading to a phenomenon often described as the "algorithmic filter bubble."
This term was coined by internet activist Eli Pariser, who in 2011 “anticipated the dangers of a hyper-personalized Internet, and introduced the ‘filter bubble’ to the lexicon in his New York Times bestselling book of the same name.” Within this bubble, our tastes are shaped, curated, and sometimes even predetermined by algorithms, resulting in a cultural flattening that homogenises diverse expressions of style and identity.
Author Kevin Bloor defines ‘cultural flattening’ as “the process in which information, commodities and images produced in one part of the world enter into a global village. Cultural flattening is associated with a monocultural set of western values and therefore undermines cultural differences”. Algorithms, designed to maximise user engagement and consumption, operate on the principle of familiarity and predictability. They analyse vast amounts of data, including our past preferences, online behaviours, and demographic information, to tailor content recommendations and suggestions. While this customisation promises convenience and personalised experiences, it also creates echo chambers where our exposure to new ideas, cultures, and perspectives is limited. As a result, our personal tastes become trapped within the narrow bounds of algorithmically determined norms, eroding the richness of cultural diversity - culture becomes stuck.
At the centre of the ‘algorithmic filter bubble’ are the tastemakers that wield considerable influence in shaping our style and preferences. Traditionally, tastemakers were individuals or institutions with expertise and authority in cultural domains such as fashion, music, and art. Nowadays, algorithms have emerged as the primary tastemakers. Algorithmically-driven platforms such as social media, streaming services, and e-commerce sites dictate what content we consume and what products we buy, thereby exerting a profound influence on our tastes.
The algorithms employed by these platforms prioritise content that aligns with viral trends, relegating niche and alternative expressions of taste to the sidelines. Content creators follow these popular trends because doing so delivers the views and engagement that drive their ad revenue. Consequently, the power dynamics of tastemaking have shifted, with algorithms promoting what appeals to the masses and the masses aligning their taste with these trends.
Nostalgia plays a pivotal role in the disruption of personal taste. Algorithms leverage our nostalgia for past experiences, trends, and cultural movements to shape our present preferences. Echoing the remix culture Lawrence Lessig analysed in his 2008 book "Remix," algorithms meticulously scrutinise our digital footprints and interactions, discerning patterns of nostalgia to craft personalised content recommendations that trigger sentimental emotions and evoke deep resonance. This process perpetuates a relentless cycle of cultural recycling, wherein past trends are constantly rehashed, repackaged, and repurposed to capitalise on our nostalgic yearnings. Consequently, our tastes become trapped in a web of manufactured nostalgia.
The disruption of personal taste by algorithms represents a real shift in the way we engage with culture and identity. Cultural flattening, driven by algorithmic homogenisation, undermines the diversity and vibrancy of personal expression. Nostalgia further complicates this, as algorithms exploit our emotional attachments to the past to influence our present tastes.
To reclaim agency over our personal preferences, we must critically examine the role of algorithms in shaping our cultural experiences and strive to foster environments that celebrate diversity, creativity, and authenticity. Only then can we break free from the confines of the ‘algorithmic filter bubble’, discover our own personal taste, and get culture unstuck.