It is always a privilege to be given a special insight into a country’s cultural heritage, and such a privilege came last week on a brief visit to Riga, which included a talk on digital public spaces at the National Library of Latvia. This imposing building itself draws inspiration from folklore, being modelled on the mythical “Palace of Light”, a motif for wisdom that has been lost and reclaimed through triumph over adversity; a theme very much in tune with Latvia’s national identity. After the talk, I was fortunate to be given a tour of the Archives of Latvian Folklore of the Institute of Literature, Folklore and Art (University of Latvia), which is located in the National Library, by colleagues Sanita Reinsone and Ginta Pērle-Sīle.
At the centre of the archive is the Dainu skapis, or “cabinet of folksongs”. This remarkable, purpose-built cabinet was constructed to the design of Krišjānis Barons (1835–1923), a foundational figure in the study of Latvian folklore, who brought the collection together and edited it into its structured form between 1894 and 1915. It consists of over 260,000 paper slips which document folksongs, each of no more than four lines or so, which provide commentary on every aspect of daily existence. Through this lens can be seen a rich and powerful picture of rural life, its joys, heartbreaks and milestones, and the very powerful connection which many Latvians have to their home region.
The Dainu skapis gives us an insight into the methodology of folklore. It was a happy chance that, at the turn of the twentieth century, the folksongs came to be recorded in text, which enabled their preservation and ongoing availability in the archive. This further cemented their place as a core element of Latvian identity and heritage, and sustained them through the immense geopolitical challenges that the Baltic region faced in the nineteenth and twentieth centuries.
On my visit, I heard about the Archive’s exciting programme to digitise the Dainu skapis, which is available here. Many of the folksongs have been transcribed and are now available online. Scrolling through these often-enigmatic little texts (even through the not-always-perfect lens of Google Translate), I found them strangely compelling. As with all the most interesting examples of the folklore of any culture, they shine a light on vernacular stories that come up from the land, and which are often missed by history’s more mainstream voices.
I have just submitted the final proofs of a book chapter on the topic of “Spatializing the Humanities” to the forthcoming Bloomsbury Handbook to the Digital Humanities, edited by James O’Sullivan. This is an attempt to link the interpretive challenges of digitizing historical (or rather historically expressed) geodata such as maps or gazetteers, the kind of work in which gazetteer initiatives such as the Pelagios Network have excelled, with contemporary theory on the interpretation of “born digital” geodata on, for example, Google Maps or OpenStreetMap. The submission of this chapter coincided with a keynote talk I gave recently (and virtually) at the Meaning in Translation: Illusion of Precision conference at Riga Technical University in Latvia. “Illusion of precision” captures perfectly the challenges of “translating” historical place to the digital world, so I thought it was worth capturing some of the links between the chapter and the keynote.
There are some common threads that run between “heritage geodata” and “contemporary geodata” that are (or will be) worth exploring in the methodological frameworks of both history and archaeology (with which we explore place in the past); and science, technology and innovation studies, which has done so much to define place in the present. There are a number of links lurking beneath the surface of these areas as disciplines: they have more in common than one might think (this, by the way, gives me some hope in my oft-stated desire to integrate the Old and New schools of Digital Humanities in my home Department at King’s College London).
A great deal of archaeology, after all, is the story of technology in the longue durée: the supra-frameworks of Palaeolithic, Mesolithic, Neolithic, Bronze and Iron Ages are defined by the technological competences of fashioning lithic artefacts, and then of smelting naturally occurring ores into bronze and then iron. Whilst, for convenience, we often follow our nineteenth-century forebears in conceptualising these as massive, monolithic (no pun intended) blocks of time extending neatly back into the human past, the discovery of the processes of making bronze and iron must have led to enormous social, cultural and economic upheaval, the creation and cementing of elites, the generation of wealth for some and poverty for others, the refashioning of whole societies – and as these societies were preliterate, we simply lack the documentary sources with which to see these events. However, technology continues to have much the same sort of impact today, in the so-called Information Age. As my DDH colleague Jonathan Gray has written of “data work”: “[s]imilar moves will be familiar from approaches inspired by Science and Technology Studies which view data infrastructures as relations of people, machines, software, standards, processes, practices, and cultures of knowledge production”.
The idea of place represented in the digital world is a theme I hope to return to next year when, fingers crossed, I will have more time for research than I do now; but as noted, my aim here is to capture some of the links between the Bloomsbury chapter and the keynote: what we can learn about (digital) place in the present from (digitized) place in the past. One topic which comes up repeatedly is the role of authoritative institutions and corporations in the generation of contemporary geodata. The visibility of features, such as businesses, on major mapping platforms has much more to do with that platform’s algorithms (and the commercial interests they represent) than with any other factor. Something to bear in mind, perhaps, as yet more of the world’s social communication infrastructure, and thus contemporary geodata, passes into the hands of white American billionaires (or maybe not).
Keeping with the riff of archaeology and STS, however: the contemporary GeoWeb is the result – not the end product, because it continues to evolve – of a sequence of technological innovations, some rapid, some enacted over decades. Historians might argue as to how long this process goes back: some might say to the Victorian trans-Atlantic telegraph networks of the nineteenth century, some to the successful piloting of the four-node ARPAnet network in 1969, others to the establishment of the TCP/IP protocol which enabled “internetworking” between networks in 1978, others still to the invention of the World Wide Web in 1989. And so on. Collectively this process is a disorganised, yet fundamentally sequential and interdependent mishmash of ideas and innovations, not steered by any one individual, despite the disruption fantasies of Silicon Valley’s tech bro culture. The process certainly includes extraneous events such as the Soviet launch of the Sputnik satellite in 1957, a shock which sparked a state of near panic in the US military and government, and the consequent bedding in of ARPAnet into the US Department of Defense‘s Cold War information management and protocol programmes, which unlocked virtually unlimited access to government resources. There are developments that are more specific to geodata, such as the de-militarisation of the GPS system in 1983 as a result of a civilian airliner, Korean Airlines Flight 007 from New York to Seoul via Anchorage, being shot down by the Soviet air force for “violating Soviet airspace”. And there was the unscrambling of the GPS signal by the Clinton administration in 2000, which opened the way for its use in mainstream commercial applications.
Like any good archaeologist, the Science and Technology theorist or historian looking at these events must consider their context as well as their happening. As early as the mid twentieth century, visionary intellectuals such as Vannevar Bush and Paul Baran were thinking through the implications of dealing with unprecedented volumes of information. The body of military and related research produced during World War II alone transcended anything that the paper world of library, archive and information systems had been built to cope with. There was also the need to get information – such as commands in the event of a nuclear attack – from A to B securely and instantaneously. Bush’s hypothetical “Memex Machine”, described in his iconic 1945 article As We May Think, was a solution with which a researcher could retain all of his books, papers and resources in one place, and construct information and new insights from the unordered mass of knowledge therein:
[H]e names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps a single key, and the items are permanently joined. 
It takes no great leap of the imagination to connect this vision of nuggets of information defined by the user and “permanently joined” to the development of the World Wide Web by Sir Tim Berners-Lee some forty years later, to the concepts of the uniform resource identifier and hypertext transfer protocol. This is indeed a link which Berners-Lee and his colleagues made in 1992:
Since Vannevar Bush’s article (1945), men have dreamed of extending their intellect by making their collective knowledge available to each individual by using machines. Computers give us two practical techniques for human-knowledge interface. One is hypertext, in which links between pieces of text (or other media) mimic human association of ideas. The other is text retrieval, which allows associations to be deduced from the content of text. 
Where there is context, there is concatenation. The inexorable sequence of incidental innovation which connects Bush with Berners-Lee and the other twentieth-century Web visionaries can be seen as a process of capitalistic evolution or social movement – and it is here where, as always, history becomes political. It is probably both, but I find myself being more heavily influenced by authors who propound the latter. The feminist geographer Doreen Massey, writing in 1991 against the backdrop of the Web’s emergence and the IT revolution which at the time was seen as ushering in the “Global Village” (Wikipedia link – students, don’t do as I do, do as I say), states:
Imagine for a moment that you are on a satellite, further out and beyond all actual satellites; you can see ‘planet earth’ from a distance and, rarely for someone with only peaceful intentions, you are equipped with the kind of technology which allows you to see the colours of people’s eyes and the numbers on their number plates. … There are faxes, e-mail, film-distribution networks, financial flows and transactions. Look in closer and there are ships and trains, steam trains slogging laboriously up hills somewhere in Asia. Look in closer still and there are lorries and cars and buses, and on down further, somewhere in sub-Saharan Africa, there’s a woman on foot who still spends hours a day collecting water.
Leaving aside the fact that further steps in the chain of incidental innovation now mean that the technology to see the colour of people’s eyes and number plates – dystopian to Massey in 1991 – is now in widespread use by governments and corporations, this theorisation points to a constant series of makings and remakings, of conflict between standards and authorities, and the processes those standards regulate.
My point here is that we can only understand the “illusion of precision” in contemporary digital place in the context of the chain of innovation that created the environment in which digital place exists. This also means understanding the materiality of the media in which place is represented. In the chapter, I develop the idea that Abraham Ortelius was an innovator of publication method as much as cartography, a savvy media professional who understood the importance of bringing different innovations together and making them work in concert. A kind of sixteenth-century Steve Jobs. Hopefully there will be more to come on this subject in the near-ish future.
Gray, J., 2018. Three aspects of data worlds. Krisis: Journal for Contemporary Philosophy, (1), pp.5-17.
Bush, V., 1945. As we may think. The Atlantic Monthly, 176(1), pp.101-108.
Berners-Lee, T., Cailliau, R., Groff, J.F. and Pollermann, B., 1992. World-Wide Web: the information universe. Internet Research, 2(1), pp.52-58.
Massey, D., 2008. A global sense of place. In The cultural geography reader (pp. 269-275). Routledge.
I’ve been moved to circle back to this long-neglected blog by this news story from the BBC, which reports on calls by a local MP to route the easternmost section of the Hadrian’s Wall Path along the actual route of the Wall through Newcastle’s West End. The 84-mile Path is one of sixteen officially designated National Trails, connecting Wallsend in the east to Bowness-on-Solway in the west (which is not the actual end of the Wall’s infrastructure). Currently, the easternmost section follows the riverside path along the River Tyne through Newcastle’s urban landscape, rather than the actual line of the Wall through the city’s West End. This is an interesting argument, from the point of view of spatial history and archaeology, and of cultural heritage interaction with historic landscapes – all subjects of professional interest to me. I also have a personal interest, having walked the whole Path myself in 2011. That’s another reason for this post in fact – I am currently self-isolating with Covid-19, which means I am hankering wistfully after the winds, views and fells of Northumbria even more than normal.
It is also an apt excuse to dig out some of the photos I took on the walk back then:
My guidebook for the walk, Hadrian’s Wall Path, by Henry Stedman (Trailblazer, 2009) wasn’t terribly complimentary about this section. It states it is “not the most auspicious start to the trail”, that it is “hemmed in by warehouses and the back of housing estates, there’s little industry, or indeed anything to see”. It also warns that “one or two trekkers have been subjected to insults and threats from local kids on both this stretch, and the one that leads across Denton Dene”. I have to say this was not my experience at all. On the contrary, I had nothing but cheery greetings from my fellow walkers, including one who noted my backpack, asked if I was going the whole way, and wished me luck.
In the diary I kept of my walk along the Path in 2011, I noted of this section:
Mile after rolling mile of city riverside, and out of town greenfield land do not perhaps quite herald the soaring, monumental fells to come in the way they might. But nonetheless, I think it is good to begin the walk with a cross-section through the modern landscape of the North East, a region built on coal and ships, the ghosts of which may be seen in the derelict and dilapidated steel yards and industrial estates which line the Tyne to the east.
There is indeed a palimpsest of the modern story of Newcastle, stripped for sure of any heritage centre-type gentrification, but hinting at the spatial complexities of a 2,000-year-old human-made landscape:
The route itself is a mishmash of riverside paths, city Quayside, country footpaths and, in one stretch, the Wylam Waggonway, a dismantled former industrial mini-railway. It makes one wonder, when we plot ancient networks on maps, how much now-vanished complexity we must be overlooking in the process.
It is not a difficult walk, but the terrain throws you some curveballs as you leave the city:
For the last stretch, the path snakes up through a golf course… the trees are beautiful, all the more so in the dappled sunshine that made it through the hail bursts this afternoon. Shortly thereafter the landscape springs some pretty hairy – and rocky – gradients on you, as the path leaves the river and climbs towards Heddon-on-the-Wall. As if to say “you’ve had it easy for now matey, *this* is a taste of what’s coming”.
Academic scholarship acknowledges that the Wall in the past is also the Wall in the present. As Claire Nesbitt and Divya Tolia-Kelly noted in their 2009 paper:
The Wall is monumental to unravelling narrations of civility, barbarism and imperial strategy, and has resonance with modern accounts of Empire, borders and national identity. It is not just a material for archaeological study, but an organic landscape through which historical, quotidian, geological and affective encounters are made and remembered.
Nesbitt and Tolia-Kelly, 385 
So what do I think of the idea of re-routing the eastern section of the Path away from the Tyne? At first I thought it was a great idea: after all, there is a certain constructivist archaeological purity in routing the official trail along the monument, taking in sections exposed at roadsides and in back gardens. But then I had a better idea: keep both routes. This would help stress that the Wall has always been a landscape and not “just” a line. A key part of Nesbitt and Tolia-Kelly’s argument is that the very linearity of the Wall is a product of modern imperial perceptions of “inside” and “outside”, “civilized” versus “barbarian”, which did not apply in the northern Roman Empire. This is what the arbitrary line of a National Trail implies (the important arguments about conservation and curation of the monument aside). Keeping both routes would underline that the landscape is a zone of engagement, not a neat line on a map.
This is a much delayed (and very brief) write-up of another corpse path, one we walked late last summer (between lockdown restrictions) in Dorset. This one links the village of Plush, near Dorchester, with the thirteenth-century Church of the Holy Rood in Buckland Newton, about three miles to the north west. It features on my little database of corpse paths, being mentioned in Devereux’s anthology.
The path starts as a satisfyingly deep holloway, branching off from the road into Plush, and tracing a steepish path north, up the side of the West Hill towards Watcombe Plain. Opening out, and affording stunning views of field systems (Iron Age, I think) on the eastern side of the valley, the path follows the contours of the hill round to the west. Crossing White Way, it becomes Crowthorne Lane as it makes its descent towards Buckland, and then Hilling Lane.
As Hilling Lane enters Buckland itself, it branches south, away from the church. There appears to be no extant footpath or right of way linking this section with the church itself, but in one of those pleasing bits of historical jigsaw-fitting that occasionally comes along with these things, such a footpath clearly appears on the 1903 Ordnance Survey map (which marks much of the rest of the route as “B.R.”, i.e. Bridle Route), following the trajectory of Hilling Lane straight to the church.
“Desire lines”, or “desire paths” are, at one level, a relatively straightforward concept in the fields of planning, land management and architecture. The Wikipedia definition will serve:
[A] path created as a consequence of erosion caused by human or animal foot traffic. The path usually represents the shortest or most easily navigated route between an origin and destination. The width and severity of erosion are often indicators of the traffic level that a path receives. Desire paths emerge as shortcuts where constructed paths take a circuitous route, have gaps, or are non-existent.
However, the idea of the “desire path” becomes more complex when some of the concepts and assumptions in this definition are unpacked. How are “constructed paths” constructed? By whose authority? What motivators, apart from ease of access or shortness of route, define the course of a desire path? The idea of the desire path, as framed here, would encompass “holloways”, described by the writers Robert Macfarlane, Stanley Donwood and Dan Richards, in their book of that name, as “a hollow way, a sunken path. A route that centuries of foot-fall, hoof-hit, wheel roll & rain-run have harrowed deep down into bedrock”: how can we unpick the complex network of historical and archaeological drivers which shaped them over hundreds of years?
Desire paths are intimately linked with the fundamental motivators which govern human interaction with space and place, and they can be intensely political. A significant aspect of the Paris Situationist philosophy, for example, was resistance to the constructed path, and to the bourgeois banality of the planned urban space. The phrase “Vox Populi, Vox Dei” – the voice of the people as a kind of collective wisdom, channelled through a collective interpretation of the moral, cultural or political angle of a particular issue, and equated with the voice of God – is a well-worn political trope dating from at least the early eighteenth century, and probably a lot earlier. Might there be a similar trope for the “walkings of the people”? “Ambulari Populi” perhaps?
The ambulari populi changes pathways, whether constructed or not. In the example below, a tree has been effectively subsumed by a path as walkers have increasingly used both sides of it, rather than just one:
There are some big concepts here, but – as Macfarlane et al and others have observed – they are visible in the day to day. I offer here a small, literally pedestrian observation gleaned from 2020’s months of Covid-19 lockdown. Many of the desire paths we created, and continue to create, have undergone a subtle transformation, as a result of social distancing. The idea of the path has been rethought, as people internalized the need to keep two metres apart to prevent the spread of infection. Our behaviours have changed: when, on a path – whether constructed or not – two or more of us approach from opposite directions, the instinct has become to maximize the distance between us as we pass, treading the very outer edge of the path as we do so – and, in some cases, subtly altering it.
Since April or thereabouts, I have been documenting the way “Covid desire lines” have come to express the desire to socially distance in the fabric of the landscape, and how paths have developed as a result (many of these pictures date from the height of the national UK lockdown, and I hasten to add at this point that they were all taken during my permitted once-daily exercise).
In some cases, as below, where there is an existing track of some breadth, “sub-paths” develop on opposite sides, as walkers use the width of the track to avoid one another.
In others, the new desire lines have strong relationships with field boundaries. In the very clear example below, clear distance is maintained between the pre-existing path on the left and the new one on the right, until they are compressed together by the gap in the fence.
However, in the stretches where we have the freedom to socially distance without constraint, the evenness of the old and new paths, and the spacing between them, is striking:
Desire paths can be very persistent, and where imperatives such as social distancing are in play, they belie the simple notion that the physically easiest or shortest route must always prevail. Here the bole of a fallen tree is overrun by those socially distancing towards the left:
In other cases, they append themselves to metalled paths:
There is no real conclusion to draw here, beyond that even surface-level desire paths such as these, imposed only on the very top layer of the landscape by a few weeks of extraordinary measures, have different facets, and represent different responses at places where our relationship with the landscape is constrained in different ways. Cost-path reductiveness in GIS is all well and good, but it must always be qualified by many, many layers of human interpretation.
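To make that last point concrete: the least-cost-path routines in GIS packages are, at bottom, a shortest-path search over a cost raster, in which every cell is reduced to a single number of “effort”. The sketch below (the function, the toy terrain and its costs are my own illustration, not any particular package’s API) shows how such an algorithm mechanically skirts a high-cost ridge – and, by the same token, how it can know nothing of social distancing, taboo, or desire:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a 2D cost raster.

    Stepping onto a cell incurs that cell's cost; movement is
    4-connected (no diagonals). Returns (path, total_cost).
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk back from goal to start to recover the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# A toy "terrain": the 9s form a ridge the path detours around.
terrain = [
    [1, 1, 1, 1],
    [1, 9, 9, 1],
    [1, 9, 9, 1],
    [1, 1, 1, 1],
]
path, total = least_cost_path(terrain, (0, 0), (3, 3))
# total == 6: six unit-cost steps around the ridge, never over it.
```

The reductiveness lies in that single `cost` number per cell: real walkers weigh mud, views, ownership, fear and custom, none of which survive the flattening into a raster.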
My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!
Thus goes the inscription on the pedestal of the great statue of Pharaoh Rameses II, as described by the “traveller from an antique land” in Percy Shelley’s poem Ozymandias. The poem describes a great, ruined statue of the pharaoh, whose “shattered visage” gazes out over what was once his empire: “Nothing beside remains. Round the decay / Of that colossal wreck, boundless and bare / The lone and level sands stretch far away”. Like the pharaoh’s empire, the awesome majesty of the statue has been stripped away by the passage of time and the fall of empires, leaving only empty desert.
The power of Shelley’s sonnet lies in the absence of power that was once there, in this case the symbolic power of a statue. Statues matter. This last week, #BlackLivesMatter protests have swept many countries following the death of George Floyd in police custody in Minnesota. Among other things these protests have shown, very graphically, that statues have the potential to become focal points at times of heightened tension. This is not new: statues, be they religious iconography, political statements, or works of art, have a long history of drawing controversy and focalizing conflict. Often, this has as much to do with things that happen to them as with the objects themselves. The Parthenon Marbles are a very obvious example: originally framed as cultural religious/political symbols of power in Classical Athens, de-animated images of gods and mortal figures of contemporary history, it is the events that have swirled around them since their removal from the Parthenon, which began in 1801, that now dominate their narrative. Similarly, it is unlikely that Neil Simmons’s marble effigy of Margaret Thatcher would have gained the notoriety it did had it not been decapitated by Paul Kelleher as a protest against global capitalism in July 2002. “I haven’t really hurt anybody”, Kelleher said, “it’s just a statue, an idol we seem to be worshipping to a greater extent”.
I do not plan to wade too deeply into the ethical debates around the destruction of statues here. Rather my interest is in what the present protests tell us about public spaces, the nature of public space, and how that nature has been altered in literate, Western societies by the digital communications revolution. Like most of my best ideas, this line of thinking is inspired by my students: In 2015/6 I supervised an excellent MA dissertation entitled Reframing the Memorial Landscape: the Emancipation Memorial in Physical and Digital Space which examined the spatiality and public reception of the memorial of that name in Washington DC’s Lincoln Park. This monument, depicting Lincoln freeing African Americans from slavery, has attracted controversy in the past, supposedly for promulgating racist ideology. This controversy was the topic of the dissertation, which noted that:
[i]t is important to understand how visitors are physically able to interact with [the statue] and how it informs the space surrounding it. To grasp how that public sentiment is promulgated and turned in to shared memories, it is also useful to investigate how users interact with the monument in a mediated space: the digital space.
Drawing on this student’s very prescient observations, I am going to reflect a little on the present debate. I am also going to take the liberty of sketching out the bones of a solution.
The most obvious thing to say about a statue is that it is immutable, set quite literally in stone (or bronze, or some similar medium). However, the public spaces statues occupy are not. At one level, public spaces – such as Lincoln Park or, in the UK, Whitehall, Parliament Square, or Bristol’s The Centre – are regulated by legislation such as the Public Order Act of 1986 and other public nuisance laws, as well as the general slate of criminal statutes. In reality, they are also regulated in the day to day by complex swathes of negotiation and renegotiation, by social norms and expectations, and by the environmental parameters of what the anthropologist Tim Ingold calls “wayfaring”. All three of these places, and many others besides, were sites of conflict and protest this weekend, focusing on the Cenotaph in Whitehall, the Churchill statue in Parliament Square, and the statue of the slave trader and philanthropist (both of which are simplified distillations of the many different things he has been called in recent days) Edward Colston, which was ripped down and thrown into Bristol Harbour by Black Lives Matter protesters. All three spaces (and here I am giving isolated examples from the many such spaces which have been in the news in the last two weeks) are far more than the sites of statues: and that is the point. What happens in them are local instances of the wider norms, expectations, negotiations and parameters of current society. To put it another way, public spaces are zones of shared values and behaviours. If these are violated in certain ways, the law will step in; otherwise, taboos, prohibitions and standards regulate what happens in them. This is the root of the debate about statues that we have seen in recent days.
In the past, these currents of shared values have eddied and bumped around the statues as they flowed through the spaces, sometimes causing comment, sometimes vaguely outlandish reactions, such as Churchill’s statue being given a grass Mohican during May Day protests in 2000. In some very rare cases, such as the aftermath of invasion, liberation or revolution, statues are engulfed by those values, and physically torn down. The image of Saddam Hussein’s effigy being toppled and pelted with shoes in Baghdad is an enduring image of the Iraq War of 2003. What Shelleyan, ekphrastic power have historical statues acquired now that sees them physically targeted as never before? I believe – following my 2015 student’s very far-sighted lead – that much of this has to do with digital interaction. Not necessarily with the statues themselves, but because the increased blending of the physical public space and the digital public space leaves less and less room for nuance in either. The polarisation of politics and society in the last five years or more is well documented: as with most other political or social topics you might care to mention, the question of whether a statue of a controversial figure should or should not stand on a public plinth is now absolute and binary. In Parliament Square, as on Twitter, shades of grey flicker around great expanses of black and white.
I suggest, therefore, that the way public spaces deal with effigies of potentially controversial figures needs to evolve intelligently, just as the idea of a public space itself has evolved. Partly, this goes to the purpose of public statues. This, generally, is to commemorate their words and deeds – and the values they held (this, by the way, is why I believe it would be perfectly consistent morally to remove the statue of Cecil Rhodes from Oriel College, but keep the Rhodes Scholarships – the former was instituted with the purpose of publicly proclaiming values, the latter with the purpose of enabling outstandingly talented overseas students to study at Oxford). Why should such a purpose be enacted and embodied in a public space which is inherently more polarized, yet more multivocal and multicultural, than at any time in history? This is a perfectly reasonable test to apply to the ethical and responsible management of any public area.
Speaking of history: the key argument in favour of maintaining controversial statues is that they preserve history (very often, this is qualified as “preserving our history” with little unpacking of who “we” are). Setting aside the fact that generally, when I teach history, I tend to use books and articles rather than statues, the point above about purpose raises the further question of why statuary in public spaces is the best way to “preserve history” at all. I mean this purely as a dry, academic question, rather than a comment on current events. Museums are generally the places where historical objects are preserved, along with expert curation of their narratives (and counternarratives), and the meta-information that surrounds them. Clearly, in my view, one option is to remove statues of figures such as Edward Colston to museums, where their subjects’ sins and virtues could be properly contextualized – similar arguments have been made for some years in the US about the Confederate Flag. However, where this is not possible, could we not designate small areas of our urban public spaces to be open-air museums, with or without restricted access, where such statues could be displayed in a controlled way, linked (e.g. with QR codes) to a wealth of curated online information and context, and where debates could be managed sensibly? It wouldn’t necessarily be cheap for countries with Covid-ravaged finances, but the news this weekend suggests that the digital age has challenged us to try new ways of curating information in our public spaces. And these may well not be cheap.
Because museums are the best places to preserve and understand historic statues, and to encourage debate around them. That’s why the Parthenon Marbles are in the British Museum, isn’t it?
Our co-edited Routledge International Handbook of Research Methods in Digital Humanities will be published shortly. Like many such volumes, it has come into being gradually, and our own picture of what the volume is about, and what it should be for, has evolved as we have read and reviewed the 27 chapters of cutting-edge thinking. These represent many of the varied corners of scholarship that feed into Digital Humanities, and we hope the volume will similarly help a broader constituency of the field’s scholars re-evaluate their theory and practice, and how they go about it.
We have co-authored an introduction of some 5000 words, in which we set out our own view of what DH methodology is and what it is for. We plan to post this online under Routledge’s Green Open Access rules (of which more below) in due course, so we will not go into any detail on this aspect here. We are, however, very excited by the way the volume has shaped up, and by the emphasis we have tried to establish on what DH *does*, as opposed to (yet another) discussion of what it *is*. In particular, we feel a thread runs through the chapters which makes connections between long-established DH debates and newly emerging ones. We see this as a key aim of the volume.
There has been much discussion recently about the place of method in DH. Most recently, for example, there has been a renewed discussion of the dichotomy between quantitative and qualitative methods, highlighted in a link posted to the Humanist discussion list by Marinella Testori. This is certainly a key debate, but it’s only one of several. It is important to note in this regard that although we are both academics in the Department of Digital Humanities at King’s College London, and the volume does indeed contain contributions from current and former Departmental colleagues of ours, this book does not in any way reflect a “King’s” view of the field – if such a thing even exists (one of us has blogged on this recently). Rather, we have tried very hard to step outside our immediate institutional context and provide a bottom-up evaluation of the latest methodological developments in the field.
The volume comprises three sections: Computation and Connection; Convergence and Collaboration; and Remediation and Transmission. All three sections acknowledge that DH – both its subjects and its methods – exists in a world that is connected in new ways. We have tried to imagine these new kinds of connectivity and consider why they are important; this is a challenge all our authors have risen to magnificently. Under these headings, various sub-themes are explored, some of which have perhaps not had the profile in DH methods discussions that they should have. We are excited, for example, to have been able to include a three-chapter section on critical pedagogies in DH, a subject which will be essential as the 2020/21 academic year starts with much teaching online, and/or socially distanced, due to Covid-19. The same, of course, can be said for collaborative research, and the virtualization of most academic meetings and conferences. By establishing what is methodologically necessary for doing DH in a connected world, we can surely equip it better to weather storms like Covid, as well as to improve and evolve incrementally as network technologies evolve.
As noted, our own introduction will be posted in preprint form under Green Open Access in due course. Of course, whether individual authors follow suit or not will be up to them and will depend on a range of factors including requirements to deposit manuscripts with institutional repositories; tenure and promotion considerations and the norms of their “home” research domains. However, we hope that as many unprocessed drafts as possible will be available via this protocol.
This leads us to comment on the way we have attempted to address inclusivity and diversity in the volume. All chapters were contributed by invitation. Some authors were identified through our own networks and knowledge of the field, others through the process of “snowballing”, where authors already on board made recommendations to fill the gaps that emerged as the Table of Contents grew. In all cases, we have prioritized the excellence of the work involved – there has been no conscious attempt to socially engineer the author pool. We are, however, very proud of the fact that many of the contributors are early career researchers, blended with more established voices as well. We are also proud of the fact that well over half of the contributors use she/her pronouns. We acknowledge that there could be more representation from the Global South and non-Anglophone worlds. However, the volume nonetheless contains a great deal of critical self-reflection on Anglophone/Western DH (some of it quite hard-hitting) which we hope will enable such inclusive conversations going forward – starting with a recognition that “inclusivity” isn’t simply the admission of a particular group to a particular territory, but rather an equal intercultural conversation. We have tried to start such a conversation between the many different cultures of DH, and hope that it will expand in that spirit.
It will take a very long time for us to fully understand the long-term impact of the current COVID-19 crisis, and all the horrors it has brought to the world. By “us” I mean Higher Education, but of course this applies globally. Last month, in the space of a week, many universities (including of course my own) underwent the kinds of changes that would normally take five years or more to effect; and it is unclear when any kind of “normality”, as visible in the familiar processes of face to face Higher Education, will return. Given the great dependence of the global HE sector on academic and student mobility, and (some argue) the generally disorganized nature of many Western governments’ initial responses to suppressing the outbreak, some predictions estimate that it may be March 2021, or even later in Western Europe, before such normality can resume.
As the next academic year approaches – and its potential timing is discussed – we need to consider online teaching as a matter of resilience. After all, the proto-Internet itself emerged in the 1960s and 1970s partly as a response to the shadow of the Cold War, providing a means of channelling executive command decisions through “distributed networks” which could survive nuclear attack. Given that COVID-19 and/or other pandemics may well recur, we have a responsibility to our students, and to each other, to consider how we might weather such storms in the future.
More importantly though, it is a matter of pedagogy. One thing to say at the start, which is extremely obvious within the DH community, but which still perhaps needs re-stating, is that moving teaching normally done face to face online at a time of emergency is not the same thing as online pedagogy, never mind good online pedagogy. No one – academics, students, management – should expect it to be. Once this fundamental truth is acknowledged, there opens up a range of important and self-reflective questions that DH as a field needs to ask about what good online pedagogy is. This post attempts to pose – if not answer – some of these questions.
Most importantly, the COVID-19 crisis throws into relief the distinction between what we teach online (in DH, and everywhere else) and how we teach it. Flurries of discussion about the how of online education – the relative merits of Skype, MS Teams, Zoom, institutional VLE platforms – have proliferated. Against the background of crisis, our “how” has changed (literally) overnight, driven by the need to deliver what we had already promised to our students. Despite this, the creativity and innovation of DH has been much in evidence. It comes through in the always-excellent Digital Humanities Now’s roundup of COVID-19 think-pieces and other contributions here. I have seen stories of many colleagues in DH who have risen magnificently to the challenges involved (these abound in my own Department), and I have been truly inspired by the stories they have told me of compromise, improvisation, imagination, and the challenges of the digitalization of content and delivery (these are exactly the stories that I have also seen echoing all around academic trade and social media – we are most certainly, to employ another over-used phrase, all in this together).
However, in the longer term, the question of what we teach online, and how this differs from in-person degree programmes, will need to be addressed. What kinds of learning can best be imparted remotely? Thinking of this in terms of what, as well as how, allows us to think of online teaching in terms of its opportunities, and not just as a palliative for the pain that recovering from COVID-19 will cause us all (which we will have to address in other ways – that is another story entirely). This, I think, is really important. It will also take time, resources, effort and imagination beyond the teaching we already do, and the efforts that we have all made to salvage our existing teaching tasks.
We can begin by asking if it is even possible to deliver the same learning outcomes from our homes as we do from the lectern. Should we even try? If not, what should we be doing instead? These are fundamental questions that have been bubbling under the surface of DH pedagogy for years. Many current debates in the newer forms of DH embrace “the digital” as its own theoretical construct. They argue that “the digital” has its own modes of production and interpretation that are separate from (for example) printed materials or physical image media (this idea permeates much of our teaching and research in DDH at King’s, and one of our core aims remains to build and contribute to the global body of that theory, as driven by the humanities). It follows that “digital methods” should be seen as a body of methodology distinct from other types of method, particularly the discursive means used by humanities researchers to reach and understand the human record. If this is true, then we will have to accept that delivering “the digital” and “digital methods” online to students means that the fundamental building block of HE programmes, the learning outcome, will have to be re-thought for online delivery. What are learning outcomes even for in the digital age, when students are, as part of their everyday lives, connected with networks of knowledge, information, ways of doing things, cultures and economies that have only ever been “digital”?
Learning outcomes, defined as the skills and knowledge that a student has on completing a course that they did not have before, are inevitably tied to the types of material we teach. In the kind of humanities-driven learning of and about “the digital” that we pursue in DDH, the origins of such material may lie in the physical world (such as manuscripts, artworks, photographs etc) or the digital world (content created purely online). For reasons set out in more detail below, I believe that online teaching, in particular, gives us incredible opportunities to question this distinction in new ways, for all kinds of material in the Digital Humanities.
We can tease out these opportunities by taking an historical perspective: by looking back at assumptions which were common in the pre-digital world. In World Brain (1938), H. G. Wells predicted that in the future
[a]ny student, in any part of the world, will be able to sit with his projector in his own study at his or her convenience to examine any book, any document, in an exact replica.
This view of the world runs gloriously roughshod over any idea that “the digital” actually changes our interpretive relationship with our material in any way; rather, it asserts that an “exact replica” can be easily delivered to any student, anywhere. The medium will never be the message in such a world: rather it is value-free, lacking in any phenomenological significance, and contributing nothing to the interpretive process. If our studies of Digital Culture and its related fields have taught us anything, it is that this is manifestly not true. Of course digital transmission changes our perception and reception of cultural material. Try writing a tweet with a fountain pen and posting it through the mail, or opening a text file in MS Word 95. The digital is a prism through which we see and experience the human record past and present, not a window. Online teaching needs to embrace this, and this is very much a matter of “what”, as well as “how”.
Therefore, the challenge for DH pedagogical theory and practice, as it approaches both the how and the what of online learning, is to construct new forms of learning outcome which enable students to embrace that prism: the teaching of digital methods, digital citizenship and digital ways of being, rather than just digital content – not, as per Wells, simply an exact replica of what we get in the library or the archive. There is much one could draw on from other DH discourses: for example, much is made in library and archive studies of the truth that preservation (e.g. the creation of exact replicas of content) is not sustainability (the ability to use those replicas in some way). I make no claim that this is a new idea in DH pedagogy (it is certainly very present in DDH), but it shows that DH has many rich and deep seams to draw on in understanding the key “how” versus “what” difference for online teaching (and research).
What follows are some areas which I think we need to consider when building learning outcomes for online courses. I do not purport to offer any answers here; these are merely very initial and high-level ideas to act as way-markers, to help kick off conversations that many of us will be having over the next months and, probably, years. No doubt they will be changed, deleted, re-organized, re-ordered and added to; but for teaching which approaches “the digital” in a humanities-driven way – which, for me, is the essence of DH pedagogy – these represent the starting points as I see them.
Participation and placemaking
Teaching is not something that happens only in the classroom or instructor’s office. As I point out in one of the early lectures of my Maps, Apps and the GeoWeb module, the classroom or lecture hall is a “place” that we all contribute to by the medium of our presence. It is more than walls, floor and a ceiling; its function channels Heidegger’s Being of Dasein: of physical presence. Place is a human construct which we create collectively and socially through processes of actually being there and, as in the world outside the academy, this has been disrupted by the digital.
In DH we have – slowly – learned to teach and develop bodies of theory with our students in the framework of “traditional” face to face teaching in the classroom and the lecture hall. Consequently, the act of speaking in, and to, a group in the same physical location is a staple of the traditional seminar. However, for many of our students, physical place has already been collapsed. The channels of Instagram, Twitter, Snapchat etc may connect to the physical world through geolocation, but they “exist” aspatially. We must find ways to enable synchronous contribution to online discourse which encourages critical reflection on the nature of that “place”, and that meaningfully separates formal educational channels from social ones.
Embracing asynchronous conversation
Closely related is the need to embrace asynchronicity, especially given that the seminar – small-group teaching of students co-located in time and place – is one of the key planks of humanities pedagogy. Like many of my colleagues, I make use of group work in seminars to maintain focus, but also to ensure that students who may be more introverted, and thus less inclined to contribute to a larger group discussion (despite the value of any contributions they have to make), feel able to contribute. We will need mechanisms which facilitate such inclusivity online, and which support not only one-to-many conversations but also many-to-many discussions. These in turn need to respect, and work alongside, and not impinge on, students’ existing many-to-many digital lives.
New kinds of assessment
The essay is as much an artefact of conventional teaching in the humanities as the seminar; however, the limitations of the essay format for assessing what students have learned in DH, and how well they have learned it, have long been apparent. Whilst essays will continue to have a role in assessing discursive understanding of the core modules, there is a general assumption across the arts and humanities that assessment will always be by essay unless there is a reason for it to be otherwise. In DH, I would suggest the opposite should be true, especially for online teaching: essay-based assessment should have to be justified by the impracticability of shorter, practice-focused evaluation. For example, one of the learning outcomes of my own optional module is:
[The student should] be able to demonstrate knowledge of fundamental web standards for geospatial data, with a primary focus on KML, but with a broader appreciation of how these standards relate to generic frameworks, including most importantly the World Geodetic System. They will also be able to discuss the limitations these impose on the expression of information in the digital humanities, and discourses built around it.
Currently my assumption is that this outcome will be assessed discursively by a 4000 word essay, structured across 4-6 examples, or 4-6 arguments focused on a single case study. There is no reason at all why this assessment could not be broken down instead into 4-6 web-mounted exercises on real-world problems drawn from humanities materials (in the best of all worlds, students could be given a list of 10+ mini-cases to select from, and then explain the methodological link between them). I think this would, in any case, get them much closer to the technical core of the problem described.
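To give a concrete flavour of what such an exercise might involve, here is a minimal, hypothetical sketch (not taken from the module materials): producing and checking a single KML Placemark for a humanities site, with coordinates expressed in WGS 84 longitude/latitude order, as the KML 2.2 specification requires. The place name and coordinates are purely illustrative.

```python
# Minimal sketch: serialize one KML Placemark using only the Python
# standard library. KML coordinates are written "lon,lat[,alt]" and
# are referenced to WGS 84 - a common stumbling block for students.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def make_placemark(name: str, lon: float, lat: float) -> str:
    """Return a KML document containing a single named Point Placemark."""
    ET.register_namespace("", KML_NS)  # serialize with a default namespace
    kml = ET.Element(f"{{{KML_NS}}}kml")
    pm = ET.SubElement(kml, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # Note the lon,lat order - the reverse of the colloquial "lat/long".
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

# Illustrative example: a point near the Strand in central London.
doc = make_placemark("Somerset House", -0.1174, 51.5101)
```

An exercise built on this could then ask students to discuss what is lost or distorted when a complex historical site is reduced to a single coordinate pair.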
The importance of Open Access and Open Data
The COVID-19 crisis has prompted many publishers and content providers to make coronavirus-related research materials that would otherwise be paywalled freely available – for example Cambridge University Press, Wiley and Taylor and Francis. This is excellent news for sure, but we need to capture this opportunity to think in more detail about the place of Open Access and Open Data in our research and teaching.
In theory, of course, online teaching can continue to be done behind institutional VPNs, the authentication gateways of Shibboleth and Athens, and publisher subscriptions; although some such resources are not available to students accessing content from certain regulatory regimes, which is another key factor. A move to online teaching must include promoting critical assessment of, and reflection on, open data and open resources: extending the principle of encouraging students to explore further reading in trusted environments (i.e. libraries) to the “wild west” of the WWW. Teaching that happens in the online “place” (see the first point above) must include methodological skill-building in how the features of that “place” – datasets, articles inside and outside peer review, formal and informal research outputs, content produced by other students – function, and how they can best be evaluated and navigated.
To conclude: the what and the how of online teaching are the axes on which all these considerations need to be plotted. Reconciling them will require resources, imaginative thinking, a range of theories, ideas and resources that DH has been experimenting with already for years and, above all, skilful and creative people to put them into practice. In all these things, I think DH has a good start.
The story of Global Positioning Systems (GPS), like all phenomenal technological success stories, is one of unintended consequences. GPS is now with us everywhere. It guides our driving and finds us the restaurant nearest to us. It aids mountain rescuers, and it helps ships stay on course. The digital world would be a very different, and less interactive, place without it. A phenomenal success story it certainly is. But its origins are more complex.
Also like most phenomenal tech successes, GPS was not born of any Eureka moment. Rather, it came from a hotchpotch of technology, politics, fear and expedience. The launch of the Soviet Sputnik satellite on 4th October 1957 was a point of enormous disruption for the West in the Cold War. Not only had Russia pulled ahead in the space race, it was now clear that it had the rocket technology to strike the US homeland. Western experts sprang into action. MIT scientists found they could track Sputnik’s location using the Doppler effect, whereby the frequency of the satellite’s radio signal shifts according to its velocity relative to the observer (imagine standing by a long straight road and hearing the pitch of a car’s engine drop as it passes you at 80 mph). This in turn gave traction to the methods which would enable Earth-bound receivers to precisely triangulate their positions using satellites. For years this locative capacity was tightly guarded by the US military: under the policy of “Selective Availability”, the signal which allowed receivers to pinpoint their positions was deliberately degraded for all but military users, so that its accuracy was all but useless except for the smallest-scale navigational purposes.
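The geometry behind that triangulation can be sketched in a few lines. The following is an illustrative toy, not actual GPS processing: it works in two dimensions with perfect range measurements, whereas a real receiver must also solve for its own clock bias against at least four satellites. Subtracting the circle equations pairwise eliminates the quadratic terms and leaves a small linear system.

```python
# Toy 2-D trilateration: given three known anchor positions and the
# measured distance to each, recover the receiver's position by
# linearizing the circle equations and solving the 2x2 system.
import math

def trilaterate(anchors, dists):
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtracting circle i from circle 1 cancels the x^2 + y^2 terms,
    # leaving a linear equation a*x + b*y = c for each pair.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # non-zero when the anchors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Simulate ranges from three anchors to a receiver at (1.0, 2.0),
# then recover the position from the ranges alone.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(a, (1.0, 2.0)) for a in anchors]
pos = trilaterate(anchors, dists)  # approximately (1.0, 2.0)
```

Real GPS replaces these clean distances with pseudoranges derived from signal travel time, which is precisely why degrading the signal under Selective Availability degraded the position fix.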
It took another Cold War flashpoint to shift this thinking. On 1st September 1983, Soviet air defence mistakenly shot down Korean Airlines flight 007 to Seoul with the loss of 269 lives, taking it for a hostile aircraft after it strayed into Soviet airspace. This brought the need for real-time geolocation into sharp focus. In the grief and outrage that followed, then-President Ronald Reagan – amid the blood-curdling threats of retribution flying between East and West – accelerated the process of making GPS available for civilian use. And then on May 1st 2000, with the dust from the falling Berlin Wall long settled, his successor (bar one) Bill Clinton issued an Executive Order ending Selective Availability. Combined with the World Wide Web (invented 11 years earlier), this action helped set the Internet on its path from our desktops to our pockets.
We are marking this event’s 20th anniversary in DDH with a series of events. Last week, we hosted a quadruple-headed set of talks by myself, Cristina Kiminami and Claire Reddleman of DDH, and the GPS artist Jeremy Wood, who presented an overview of how he uses GPS receivers to craft linear sculptures of human motion through the world. In his words, “[W]e are the data. We are the map.” This raises a whole set of practice-led research questions for Digital Humanities: how does GPS help us explore “annotative” approaches to the world, where movement can be captured and imbued with further meaning through the process of associating and linking further information with it, versus “phenomenological” approaches, which stress the subjective lived experience of creating a trace (Cristina’s work); and how can an entire place, such as a penal colony, be reproduced (Claire’s work)?
The work we all presented last week showcased how a technology born of the unintended consequences of the struggles, crises and flashpoints of the last century’s Cold War is now open to new ways of exploring relationships between the human/physical and the digital/ephemeral. We are much looking forward to exploring these questions further throughout the 20th anniversary of the end of Selective Availability. Next in this series will be a conference on May 1st – the actual anniversary of Clinton’s order – organized by Claire and Mike Duggan of DDH, entitled “20 years of seeing with GPS: perspectives and future directions”. This conference will surely have a rich seam of theory and practice to build on.
There is a whole genre of writing out there on the subject of “What is Digital Humanities?”. For some, this is an existential question, fundamental to the basis of research, teaching and the environment of those parts of the academy which exist between computing and the humanities. For others, it is a semantic curiosity, part of an evolution of terminology from “computing in the humanities” to “humanities computing”, finally arriving at “digital humanities” when the instrumentalist implications of the first two no longer encompassed the field of activities described. For others still, it is a relic of 1990s angst over terminology as computing began to permeate the academic environment. Whichever camp one is in, it behoves people, like me, with Digital Humanities in their job title to revisit the question from time to time. This post is an attempt at this, with a particular emphasis on the Department of Digital Humanities (DDH) at King’s College London. The strap-line of the present-day DDH is “critical inquiry with and about the digital”. In what follows, I hope to unpack what I think this means for the field, and for DDH, which has been my institutional home since 2006. Those fourteen years have seen immense changes, both in the Department and in the field of Digital Humanities (hereafter DH) more broadly. Furthermore, tomorrow (1st February) marks six months since I took over as Head of Department of DDH. Therefore, this seems as good a moment as any for a moment of autobiographically driven reflection. I state, of course, the usual disclaimers. Like any healthy academic environment, (D)DH is marked by a diversity of views, a diversity we pride ourselves on embracing and celebrating; and despite being Head of Department, I speak only for myself, in a very personal capacity. Also, any errors of fact or interpretation in what follows are mine and mine alone.
Before I arrived at King’s, I worked for the AHRC’s ICT in Arts and Humanities Initiative at the University of Reading (to the great credit of Reading’s web support services, the AHRC ICT programme’s web pages, complete with the quintessentially 1990s banner I designed, are still available). At the time, I was no doubt suffering a colossal intellectual hangover from my efforts to apply GIS to Bronze Age Aegean volcanic tephrochronology and its archaeological/cultural contexts, and this may have coloured my view of things; but the purpose of this programme was to scope how computing might change the landscape of the humanities, and to funnel public money accordingly. This is the kind of thing that the National Endowment for the Humanities in the US has done, to great acclaim, with its Office for Digital Humanities.
What I was not, at this point, was any kind of Digital Humanist. Working outside the Digital Humanities/Humanities Computing (both appellations have been used over time, but this is another story still), I recall some push-back to the application of digital methods and “e-infrastructure” from some less engaged with technology in their work, who were concerned about reductivism, and the suborning of discursive curiosity to the tyranny of calculation. I particularly recall the debates about GIS in archaeology: GIS, we were told, encouraged over-quantification and processualism, thus stifling discursive, human-centred interpretation of the past. That, at least, is how I remember the landscape when I came to work for the AHRC programme.
Keeping this in mind, let us return to the “what is DH” question. This has become more nuanced and more complex as the “Information Age” has spread and developed over the last thirty years or so. It is well worth remembering that in the last 20 years (at least), “the Digital” has impacted on “the Humanities” far beyond the circle of those who self-identify as Digital Humanists in myriad ways (even recognizing that it is a highly permeable circle in the first place). For many, “the Digital” was once a convenient method of sending messages supported by university communication networks, which eventually gave way to suites of tools, and associated methods, which provoked new questions about the approach, methodology and even purpose of what we were doing. Historically, many of these questions were (and are) reflected in the preoccupations of wider society as “the Digital” seeped into the praxes of everyday life. Much of the debate in the bits of academia I inhabited in the early 2000s was couched in terms of if, or how, digital technology would enable research to be done faster, more efficiently and over ever larger distances. There were even questions of computers taking over human roles and functions: perish that thought now. Taking an historical view provides a bigger and contingent picture for this: my own generation was raised in the 1980s on movies such as Terminator, Tron, Lawnmower Man and War Games – scenarios, sometimes dystopian ones, in which semi-sentient machines take over the world. I have long argued to my students that it is no coincidence that the rise of “Internetworking”, and the communication protocols that enabled it, including Tim Berners-Lee’s invention of the WWW in 1989, coincided with a genre of Hollywood movies about computers becoming better at intelligence than humans.
The series of intellectual processes which led to the field of DH as we know it today thus unfolded against a backdrop of great change in society and culture, driven by computing technology. I come to the shape this gave to precursors of DH, such as “humanities computing”, a little later. But these were, and are, certainly factors which have shaped the development of DH at King’s. Across its several and various incarnations, “Humanities Computing” at King’s goes back to the early 1970s. There are traces of these times in the fabric of the environment today. If one walks from the present-day Department’s main premises on the 3rd floor of KCL’s Strand Building, down the main second floor corridor of the King’s Building towards the refectory, in the bookshelves on the left hand side – amid Sir Lawrence Freedman’s library on the history of war and a small display of Sir Charles Wheatstone’s scientific instruments – is a collection of volumes and conference proceedings that originate from CCH/DDH’s early activities (picture below).
DDH came into being, by that name, in 2010. Prior to that, it was known as the “Centre for Computing in the Humanities”, which was established as an academic department in its own right in 2002. Harold Short, formerly director of CCH and the first Head of Department of DDH, wrote that:
The Centre for Computing in the Humanities (CCH) at King’s College London is a teaching department and a research centre in the College’s School of Humanities. Its current status was reached during the course of the 2001-2002 academic year, and may be seen as the natural outcome of a process that began in the 1970s. ‘Humanities computing’ began at King’s in the early 1970s, with Computing Centre staff assisting humanities academics to generate concordances and create thesaurus listings. The arrival of Roy Wisbey as Professor of German gave the activity a particular boost. Wisbey had started the Centre for Literary and Linguistic Computing while at Cambridge (a Centre which is still in existence, with John Dawson as its Director). In 1973 the inaugural meeting of the Association for Literary and Linguistic Computing (ALLC) was held at King’s, and Wisbey was elected as its first Secretary. Although Wisbey did not feel it necessary to create a specific humanities computing center, he was Vice-Principal of the College in the late 1980s when a series of institutional mergers gave him the chance to propose the formation of a ‘Humanities and Information Management’ group in the restructured Computing Centre.
[Harold Short, July 2002 – reproduced with his permission]
My central argument is that DDH’s story is one of evolution, development, and even – perhaps contrary to some appearances – continuity. This impression is driven by the kind of question that we have always asked at King’s. When I joined the Department fourteen years ago, two far more interesting questions than “what is Digital Humanities?” were: what is “the digital” for in the humanities, and how has “the digital” changed through time? The last fifteen years have been a key period for “humanities done with the digital”. Digital tools allow humanists to interrogate data more deeply and more thoroughly, with greater attention to the distinction between qualitative and quantitative data (a distinction far less grounded in the humanities than in the social sciences). Those questions are as interesting now as they were then; and they have only become more acute as the wider landscapes of technology, the humanities, and connective research have changed.
To text or not to text
When I joined King’s, perhaps even before, one of the first things I learned about the heritage (with a small “h”) of humanities computing/computing in the humanities/Digital Humanities is that, particularly in the US, it had a long history of engagement with the world of text, and its natural home there, where it had one, lay in university English departments. Text was certainly low-hanging fruit for the kind of qualitative and quantitative research that computing enabled. Many DH scholars trace the origins of the field to the life and work of Roberto Busa (1913–2011), the Jesuit priest whose scholarship in the 1950s on lemmatizing the writings of Thomas Aquinas, using punch-card programming, resulted in the Index Thomisticus, widely regarded as the first major application of “computing in the humanities”. In the context of 1950s computing, the project was enabled by the epistemological inclination of text to lend itself to calculation: text, as a formal system of recording “data”, which is convertible into information through sentences and paragraphs, and thence into knowledge by the process of reading, can be (fairly) unproblematically transferred to punch cards, then the principal form of storing data. (This was a vast human undertaking, employing many skilled and unskilled operatives, many of them women, whose stories are now being retold through the work of Melissa Terras and Julianne Nyhan.) Matthew Kirschenbaum takes up this argument:
First, after numeric input, text has been by far the most tractable data type for computers to manipulate. Unlike images, audio, video, and so on, there is a long tradition of text based data processing that was within the capabilities of even some of the earliest computer systems and that has for decades fed research in fields like stylistics, linguistics, and author attribution studies, all heavily associated with English departments.
[Kirschenbaum, Matthew G. “What is digital humanities and what’s it doing in English departments?”, in M. Terras, J. Nyhan and E. Vanhoutte (eds.) 2016: Defining Digital Humanities. Routledge: 213.]
CCH did a lot of work on text, but it did many other things besides. Even before I joined, I remember seeing CCH as a dynamic crucible of new thinking, a forum in which classicists, archaeologists, textual scholars, literary researchers, visualization experts, theatre academics, and more, could come together and speak some kind of common language about what they did, and especially how they developed, critiqued and used digital tools. This was a view shared by much of the rest of the world. Most visibly to me, it was recognized by the AHRC’s award of the grant that enabled me to come to King’s, the AHRC ICT Methods Network, at that time the biggest single award the AHRC (and its predecessor, the Arts and Humanities Research Board) had ever made. It was a revelation to me that such a place could even exist. It was certainly nothing like the environment I had known as a lonely GIS/Archaeology PhD researcher, working in a Department full of experts on Plato. It was therefore with great excitement that I joined CCH in January 2006, as a research associate in the Arts and Humanities E-Science Support Centre, which was attached to the Methods Network (what AHeSSC was and what it did is another long story, which is partly told in a much earlier post on this blog). It was just as exciting an environment to work in as it had looked from the outside.
A delve into the Centre’s public communications at this time shows how this collaborative spirit brought it to rethink what the humanities might mean in the Information Age. The Internet Archive’s Wayback Machine [a venerable resource now, having been launched back in 2001, and an indispensable tool for much research on the history of the Web] took an imprint of CCH’s website on Saturday 14th January 2006, two days before my first day at work there. What it says about the Centre’s role seems to confirm the recollection above, that CCH was primarily an agent of collaborative research:
The Centre for Computing in the Humanities (CCH) is in the School of Humanities at King’s College London. The primary objective of the CCH is to foster awareness, understanding and skill in the scholarly applications of computing. It operates in three main areas: as a department with responsibility for its own academic programme; as a research centre promoting the appropriate application of computing in humanities research; and as a unit providing collegial support to its sister departments in the School of Humanities. As a research centre, CCH is a member of the Humanities Research Centres, the School’s umbrella grouping of its research activities with a specifically inter-disciplinary focus.
Despite the emphasis on collaborative research, one can see from this that CCH was also a place that did, very much, its own original thinking, grounded in the methods and thinking that were driving humanities computing at the time (see above). We can get a flavour of this by looking at the titles of the seminars it ran, which are still there for all to see in Wayback: “Choice in the digital domain – will copyright extend or stifle choice?”; “Adventures in Space and Time: Spatial and Temporal Information in the Digital Humanities”, “From hypermedia Georgian Cities to VR Jazz Age Montmartre: hyperlinks or seamlessness?”, and “The historian as aesthete: One scenario for the future of history and computing”. Some of these titles would not feel at all out of place in the seminar series of the DDH of today. Therefore, while the field, and the Department, have both (of course) changed significantly over the years, this suggests that there are some threads of continuity, as well as evolution, running through: innovation, responses to the challenges and opportunities of the digital, thinking through new approaches to the human record; and indeed what the “human record” is in the digital age. This, certainly, accounts for the first of the areas indicated above, in which CCH emerged as a department responsible for its own academic programme. What, I think, has changed most, is how the department collaborates.
The late 1990s/early 2000s were a time of great change and innovation in DH, both technologically and institutionally. A 2011 Ithaka research report noted that
In 2009 the Department of Digital Humanities (DDH), formerly known as the Centre for Computing in the Humanities (CCH), presented the model of a successful cross-disciplinary collective of digital practitioners engaged in teaching and research, with knowledge transfer activities and a significant number of research grants contributing to its ongoing revenue plan.
This highlights the fact that much of the Centre’s activity depended on income from externally funded research projects. Ground-breaking collaborations in which CCH was involved, and which in many cases it led, still resonate: the Henry III Fine Rolls, projects in musicology, and the Prosopography of Anglo-Saxon England all produced world-class original research, which contributed to a variety of areas, some directly in the associated humanities domains, some in CCH itself. Interdisciplinary collaboration was the lifeblood of the Centre at this time.
It’s all about the method
One thread of continuity is what I would call “methodological emplacement”. That is to say, CCH/DDH has always had an emphasis on what it means to do Digital Humanities: the practice of method as well as the implementation of theory (whose theory is a key question). This, in itself, challenges “conventional” views of the humanities, inherently rooted in the theory and epistemology of particular ways of looking at the world. Among other things, it results in a willingness to deconstruct the significance of the research output – the monograph, the journal article, the chapter in the august collection. Providing a space to think beyond this, to consider the process, the method, and other kinds of output, in itself lends itself to interdisciplinary digital work. The three strands of the mid-2000s CCH outlined above surely constituted such a space, and this remains a concern at the core of present-day DH.
In a forthcoming handbook volume on research methods for the Digital Humanities that we are co-editing, my DDH colleague Dr Kristen Schuster and I develop the idea of methodology drawing from multiple theories:
The fact that each concept illustrates a matter of process rather than output from different perspectives is, we argue, telling of how badly we need to discuss research methods at large instead of research outputs. Considering research as a process, rather than an amorphous mass of activity behind a scholarly output, makes room for identifying crosscurrents in theories, platforms, infrastructures and media used by academics and practitioners – both in and beyond the humanities.
One can hardly expect a field concerned with an academic research programme in digital methods and their “appropriate application” in the humanities not to change, and to expand the theoretical basis that underpins it, as the Information Age gallops on. My personal “year zero”, 2006, was only two years after Facebook was founded (February 2004), two months before the launch of Twitter (March 2006), a year after Google branched into mapping, and two years after the first widely-used adoption of the term “Web 2.0”. In our 2017 book, Academic Crowdsourcing in the Humanities: Crowds, Communities and Co-production, my colleague Mark Hedges and I argue that the mid-2000s were a transformative period in networked interactivity online, the time when movements like internet-enabled crowdsourcing had their origins (the word “crowdsourcing” itself was coined in 2006 in Wired magazine): DH was transformed, just like everything else.
Change in DH here, and everywhere else, continued apace. A key moment in the Department’s more recent history was 2015, when the King’s Digital Lab was established, providing an environment for much of the developer and analyst expertise that had previously resided in DDH, and before that CCH. Today the two have a closely symbiotic relationship, with KDL establishing a ground-breaking new agenda in the emerging field of Research Software Engineering for the humanities. This is far more than simply a new way of doing software engineering: rather, KDL’s work is providing new critical insights into the social and collaborative processes that underpin excellent DH research and teaching, establishing new ways of building both technology and method. The creation of KDL also underlines the fact that “DH at King’s” is no longer the preserve of a single department or centre; rather, DH is now a field which is receiving investment of time, energy, ideas and, yes, money across the institution.
With and About the Digital: from Busa to Facebook
Just as Busa’s work in the 1950s opened up text to new forms of interrogation by transferring it to the medium of punch cards where it could be automatically concordanced, so these developments open up the human cultural record, including its more recent manifestations, to new kinds of interrogation and analysis. I do not think this should be a particularly controversial view. After all, my former employer, the AHRC, fully embraced this post-millennial epistemological shift in the humanities very explicitly with its “Beyond Text” strategic initiative, which ran from 2007 to 2012. This was described as “a strategic programme to generate new understandings of, and research into, the impact and significance of the way we communicate”, a response to the “increased movement and cross-fertilization between countries and cultures, and the acceleration of global communications”. The reality that the humanities themselves were changing in the face of a newly technological society is writ large.
This truth is not changing. In the eight years since the Beyond Text initiative finished, global communications, and the kinds of digital culture and society they enable have grown more complex, more pervasive and less subject to the control of any individual human authority or agency (with the possible exception of the Silicon Valley multinational giants) and, with the emergence of phenomena such as Fake News, ever more problematic. The digitalization (as opposed to digitization) of culture, and heritage, and politics, and communications – all the things it means to be human – has opened up new arrays of research questions and subjects, just as the digitalization of text did in the twentieth century. To put it another way, the expansion of “the Digital” has given DH the space to evolve. I believe it must embrace this change, while at the same time retaining and enriching the humanities-driven critical groundwork upon which it has always rested.
Alan Turing himself said that “being digital should be of more interest than being electronic”. And so it has always been at DDH: Digital Humanists have always known this. The present strapline of the Department of Digital Humanities is “critical inquiry with and about the digital”. The prepositions “with” and “about” provide space for a multivocal approach, which includes both the work DDH (and CCH before it) has excelled at in the past and that which it does now; crucially, in my view, this enables them to learn from one another. “Critical research with the digital” is, I would argue, exactly what Busa did; it is exactly what the English Laws, Prosopography and Fine Rolls projects are. At the same time, “critical research about the digital” recognises the reality that “the digital” itself has become a subject of research – the elements of society and culture (increasingly all of them, at least in the West) which are mediated by digital technology and environments. As I finish my first six months as Head and look to the next, I want to see the Department continue to be a space which enables the co-equality of “with” and “about”.