It’s a good question. Years ago I wrote about different types of content curation, positing that there were three main types – algorithmic (we see stuff because of algorithms), professional (people who are paid to curate, such as editors and commissioners) and social (we see stuff because friends/people we follow think it’s worth sharing). I added a layer of self-curation over this, reflecting the effect of the choices we make (such as who we connect to, or the news and other sources we turn to).
Since then, algorithms have largely swallowed social curation. The ‘For You’ feeds have either become the default entry point or are the entirety of our feeds. Substack may be an honourable exception (thank you for reading) but even here algorithms are playing a larger part in recommendation and Notes.
A platform like TikTok is a good example of the richness of data that can inform this algorithmic curation. Users spend an average of 58 minutes a day on the platform (there are differing stats on this but I’ve taken a middle ground), with the average session lasting approximately 9.4 minutes. Assuming an average video length of 35 to 55 seconds, users might watch about 10 to 16 videos per session, and the app is opened an average of 14 times daily. That means an estimated 140 to 224 videos viewed every day. That’s an awful lot of data signals that can be used to inform algorithmic curation (and where TikTok, of course, has a huge advantage over other platforms).
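The back-of-envelope arithmetic above can be sketched out explicitly (the figures are the averages quoted in the text; real usage varies widely by source and user):

```python
# Rough estimate of TikTok videos viewed per day, using the
# averages quoted above. These are illustrative figures only.
daily_minutes = 58            # average time on platform per day
session_minutes = 9.4         # average session length
video_seconds = (35, 55)      # assumed average video length range
sessions_per_day = 14         # average number of app opens per day

session_seconds = session_minutes * 60  # 564 seconds per session

# Longer videos give the lower bound, shorter videos the upper bound
videos_per_session = tuple(int(session_seconds // s) for s in (55, 35))
videos_per_day = tuple(v * sessions_per_day for v in videos_per_session)

print(videos_per_session)  # (10, 16)
print(videos_per_day)      # (140, 224)
```

Each of those viewed (or skipped, rewatched, liked, or shared) videos is a data signal, which is where the volume cited above comes from.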
So as algorithmic curation (and AI) plays an ever larger part in content discovery and consumption, the question of whether algorithms (and AI) do indeed flatten culture is all the more important. There was some noise around this idea earlier this year following the publication of Kyle Chayka’s book on the topic, Filterworld: How Algorithms Flattened Culture (there’s also a podcast on it here), in which he makes the case that algorithm-driven platforms like Instagram, TikTok, and Spotify have homogenised cultural experiences. He argues that these algorithms, designed to maximise engagement, often promote content that appeals to the broadest audience, leading to a uniformity in the art, music, and media we consume. The result is a culture characterised by its ambient flatness, where creators are incentivised to produce content that aligns with algorithmic preferences, potentially stifling innovation and diversity.
Cultural homogenisation is visible across a variety of fields. In architecture, for example, globalised cityscapes increasingly resemble one another; algorithmic design software often contributes to this by optimising for efficiency and cost, promoting standardised aesthetics over unique, culturally relevant designs. In music, the recommendation algorithms used by Spotify and other streaming platforms popularise genres that work as background listening (‘lo-fi beats’ anyone?), and artificially incentivise artists to shorten song intros and lengths, and to include so-called ‘pop-overtures’ – a hint of the chorus in the first 5 or 10 seconds to hook you into listening longer. In fashion, companies like Zara, H&M, and Shein often replicate styles that are algorithmically identified as trendy, based on user data from platforms like Instagram and Pinterest, resulting in a ‘global wardrobe’. In food, social algorithms can reinforce popular food aesthetics, sometimes at the expense of regional or traditional cuisine. In interior design, physical spaces such as coffee shops have all started to look the same. In film and TV, streaming services like Netflix create a feedback loop where popular shows become more visible, and therefore more watched, and therefore more visible.
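That visibility feedback loop can be made concrete with a toy ‘rich get richer’ simulation – a deliberately simplified sketch, not any platform’s actual recommendation system: if the chance of a show being recommended is proportional to its past views, early leads compound.

```python
import random

# Toy model of the visibility feedback loop: each recommendation
# goes to a show with probability proportional to its view count,
# so popular shows become more visible and therefore more popular.
# Illustrative only - not how any real platform actually works.
random.seed(42)  # fixed seed so the run is reproducible

views = [1] * 10  # ten shows, each seeded with a single view

for _ in range(10_000):
    # Pick a show weighted by its current view count
    show = random.choices(range(len(views)), weights=views)[0]
    views[show] += 1

views.sort(reverse=True)
top_two_share = sum(views[:2]) / sum(views)
print(f"Top 2 of 10 shows capture {top_two_share:.0%} of all views")
```

Under a uniform recommender, the top two shows would capture roughly 20% of views; with the feedback loop, the split typically ends up far more lopsided, which is the flattening dynamic the paragraph above describes.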
Whilst cultural homogenisation has long been under way, it seems that algorithmic social media feeds have served to accelerate the globalisation of aesthetics and culture. For all the talk of hyper-personalisation and individualised feeds, this has surely been an unintended consequence of the growing role of algorithms in our lives.
And yet… it doesn’t always feel like this. Thinking about my own Spotify usage, for example, I’d say that I now listen to a wider spectrum of artists and genres than I ever have before, driven not entirely, but in no small part, by the ‘Discover’ features. I’d also say that I follow a broad range of thinkers and doers on the various platforms, from artists to celebrities to friends and people with interesting perspectives (not all of whom I entirely agree with), so I like to believe that I’m not always existing within a filter bubble.
So in many ways it feels as though social media algorithms are a double-edged sword. This recent paper, for example, explores the dual role of social media in cultural homogenisation and cultural diversity. On the one hand it facilitates the rapid dissemination of global cultural phenomena, which can lead to the erosion of localised traditions; on the other, it can serve as ‘a potent tool for preserving and promoting cultural diversity, providing platforms for individuals and communities to document, share, and celebrate their unique cultural heritages’.
And I suspect that the real impact of social media algorithms on culture lies in this duality: in simultaneously promoting both cultural homogeneity and cultural diversity. If there’s one thing that’s certain, it’s that as AI and algorithms become ever more embedded in our lives, we should pay close attention to the balance between these two extremes.
A version of this post was also published on my weekly Substack – To join our community of thousands of subscribers you can sign up to that here.
To get posts like this delivered straight to your inbox, drop your email into the box below.