Digital Humanities, Digital Folklore

The idea of “Digital Folklore” has gained cachet in recent years, much as “Digital Humanities” did from the mid-2000s, and as other “Digital” compounds have more recently – Digital History, Digital Culture, Digital Art History, and so on. Given the intimate connection of folklore studies (especially via anthropology) with the humanities and the social and communications sciences, I am very pleased that we have the opportunity to host the Folklore Society’s annual conference at King’s, on the theme of “Digital Folklore”. The call for papers, which closes on 16th February 2024, is here.

Some, but certainly not all, of the main issues in Digital Folklore cluster around the impact of algorithms on the transmission of traditions, stories, motifs, beliefs and so on. These encourage us to look at those various “Digital” compounds in new ways, both semantically and substantively. In framing the idea of “algorithmic culture” in 2015, Ted Striphas noted that:

[T]he offloading of cultural work onto computers, databases and other types of digital technologies has prompted a reshuffling of some of the words most closely associated with culture, giving rise to new senses of the term that may be experientially available but have yet to be well named, documented or recorded.

One issue which this process of “offloading” work onto computers highlights is that of agency, and the role of the algorithms which regulate our myriad relationships with “the digital” as agentive actors. This is a theme explored in the current issue of the journal Folklore by the digital culture theorist Guro Flinterud, in a paper entitled “‘Folk’ in the Age of Algorithms: Theorizing Folklore on Social Media Platforms”. In this piece, Flinterud observes that the algorithms which sort content on platforms such as X (Twitter as was) shape the behaviour of those contributing content to them, by exerting a priori influence on which posts get widely circulated and which do not. She views the algorithm as

a form of traditional folklore connector, recording and archiving the stories we present, but only choosing a select few to present to the larger public (p. 451)

Her argument about how the agency of objects drives the “entanglements of connective culture and folk culture, as it alerts us to how algorithmic technologies are actively present in our texts” speaks to what I have alluded to elsewhere as both the “old” and “new” schools of Digital Humanities.

Personally, I am greatly attracted to this idea of the algorithm as part of the folkloristic process. However, I am not sure it fully accounts for what Striphas calls the “privatisation of process” – that is, black-boxification: the technical and programmatic structures of the algorithm, along with the intent behind them, hidden from view by the algorithm’s design and its presentation behind a seamless user interface. I fully agree with both Striphas and Flinterud when they assert that one does not need a full technical understanding of the algorithm or algorithms to appreciate this; but the whole point of this “privatisation” is that it is private, hidden in the commercial black boxes of Twitter, Facebook, TikTok and the rest – and serving those organisations’ commercial interests.
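To make the black box a little more concrete, here is a deliberately toy sketch, in Python, of what an “engagement-first” ranking function might look like. It is purely illustrative: the post fields, the scoring weights, and the extra weight given to content predicted to provoke a strong reaction are all invented for the example and describe no real platform. The point is that it is precisely these choices – what gets measured, and how heavily each signal counts – that remain privatised behind the interface.

```python
# Purely illustrative sketch of an "engagement-first" feed ranker.
# All field names and weights are invented for this example and
# describe no real platform's algorithm.

from dataclasses import dataclass


@dataclass
class Post:
    text: str
    predicted_clicks: float    # model's estimate of click-through
    predicted_replies: float   # model's estimate of replies and quote-posts
    predicted_outrage: float   # 0-1 estimate of provoking a strong reaction


def engagement_score(post: Post) -> float:
    """Score a post purely by expected engagement.

    The weights below are the 'privatised' part: they encode the
    operator's priorities, and the user never sees them.
    """
    return (
        1.0 * post.predicted_clicks
        + 2.0 * post.predicted_replies   # replies keep people on the platform
        + 1.5 * post.predicted_outrage   # provocation also drives engagement
    )


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts so that the most 'engaging' appear first."""
    return sorted(posts, key=engagement_score, reverse=True)
```

From the user’s side only the resulting ordering is visible; the scoring function itself, and the intent encoded in its weights, stays on the other side of the interface.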

Because this information is proprietary, it is hard to trace the precise outline of these interests for individual organisations. It is, however, widely recognised (to the point of self-evidence) that what they crave above all else is engagement. So long as the user-public is clicking, Tweeting, re-Tweeting, liking, friending and posting, then they are also looking at adverts. We are neurologically programmed to react more strongly to bad news than to good, so an algorithm in the service of multinationals will surely seek to expose its users to negative rather than positive stimuli.

X/Twitter’s embrace of this truth will surely be written about a great deal over the next few years, but in the meantime let me illustrate it with personal experience. I was active on X/Twitter for the best part of a decade, between 2012 and 2021. In January 2022 I had an irrational fit of FOMO, and re-joined. Elon Musk agreed to acquire Twitter three months later, in April (the deal closed that October). I never had a big following there, and never made, or sought to make, a splash, Tweeting mostly about work, and very occasionally about politics or other personal ephemera. Then in September 2023 I Tweeted a throwaway observation about a current political issue – I will not repeat the Tweet or recount the issue here, as to do so would defeat the purpose of reflecting on the episode dispassionately – linking to a BBC News report. The Tweet got, by my low bar, traction. I started getting replies vigorously supporting my position, mostly from strangers, using language that went far beyond my original mildly phrased observation. I also started getting – in much lower volume – abuse, all of which (I’m glad to say) was from strangers.

I realise that all this is entirely routine for many when navigating the digital landscape (including several friends and colleagues who have large online followings), and that my own gnat-bite of online controversy is as nothing in the context of the daily hellscape that X/Twitter has become, especially for women. However, it brought me to the abrupt realisation that I did not have the time, energy or temperament to carry on in this environment. More relevant than that one episode, however, is what it was emblematic of: X/Twitter had become angry, confrontational and, for me, a lot less useful, informative and fun than it had been before. In particular I noticed that the “videos for you” feature, which promoted video content into my timeline, had taken on a distinctive habit: it was constantly “recommending” soliloquising videos by a certain UK politician – again I will name no names – whose stance and philosophy are the opposite of my own. So far as I can remember I never Tweeted at, or about, or mentioned, or even named this person in any of my Tweets; however, one could probably tell from my own postings that we were of radically different outlooks. One can only conclude, therefore, that X/Twitter’s algorithm identified my viewpoint, however abstractly, and was pushing this individual’s videos at me solely in order to upset and/or anger me – and thus to ensure my continued engagement with the platform; and that, so engaged, I kept looking at their advertisers’ advertisements.

This anecdote points, in its small way, to another aspect of the “algorithm as folklore connector” model: not all of the humans the algorithm interacts with affect it equally in the age of the influencer or content creator. Like its mathematical model, the social media algorithm’s economic model remains black-boxed; but we can still follow the money. According to some estimates, then-President Donald Trump’s banishment from (then) Twitter following the Capitol riots in Washington DC in 2021 wiped $2.5bn off the company’s market value, mainly (I would guess) through the loss of the collective attention of his followers, and of the platform’s ability to direct it to advertisements. Social media minnows (such as myself) are individually buffeted by these currents, and we can shape them only through loosely-bound collective action. Whales, on the other hand, make the currents.

We can trace the impact of a more specific narrative motif by continuing the marine analogy. Another whale is Graham Hancock, the author and journalist who has written extensively about the supposed demise of a worldwide ice-age civilisation whose physical traces remain in the landscapes of Mesoamerica, Mesopotamia and Egypt; an idea he promoted in his 2022 Netflix series, Ancient Apocalypse. Hancock’s strategy has been to establish himself as a “connector” outwith the conventional structures of scholarly publication, verification and peer review – indeed, in direct opposition to them, portraying them as agents of conspiracy. “Archaeologists and their friends in the media are spitting nails about my Ancient Apocalypse series on Netflix and want me cancelled”, he Tweeted on 25th November. Without entering the debate as to the veracity of his ideas, there is no doubt he has played the role of “folklore connector” with great success, with Ancient Apocalypse garnering 25 million viewing hours in its first week. It is a powerful message, delivered in a framing more akin to gladiatorial combat than to the nuanced exchange of ideas that the academy is used to.

The algorithm brings another angle which I hope might be explored in June: the presence of algorithms as characters in popular narratives, as well as, for better or worse, propagators of them. One can reel off a list: The Terminator, Tron, The Matrix, WarGames, Dr Strangelove … all the way back to the founding classic, Kubrick’s 2001: A Space Odyssey, all of which deal with character-algorithms that take on agency in unexpected and unbenign ways. I strongly suspect that these films, and the dozens or hundreds like them, reflect existential fears about the world, political instability and, to one extent or another in the cases listed here, the Cold War and the prospect of nuclear Armageddon, and the role that machines might “rise up” to play in it. This fear is a human constant: the fear of machines in the twentieth century is the fear of the unknown and the unknowable, much like the fear of AI in the twenty-first; and, indeed, the fear of the darkness beyond the campfire. Folklore helps us to manage it, and to understand it. I believe that folklore and the digital have far more in common than they might at first seem to.