This post is written as part of the Call for Papers over at ThenDig, looking at Zeitgeist in archaeological research and how to follow it, keep up with it, or create it. As will be clear from the previous posts on my blog, I am interested in using Mixed and Augmented Reality to aid in archaeological research. Augmented Reality (AR) is currently just over the ‘Peak of Inflated Expectations’ of the Gartner Hype Cycle: it has been hailed as the next Big Thing, but has not quite lived up to the hype, and now needs a lot of work to become a sustainable and useful technology. I have previously written about what this means for archaeology here.
As I have just finished editing the final version of my PhD thesis on the use of AR in archaeology I decided to write this post to give some brief reflections on what it has been like trying to surf the Hype Cycle, whilst still producing 85,000 words of scholarly research on the topic.
Twitter is your enemy
Perhaps a controversial statement, but for anyone attempting to sit down and write intelligently about something that is currently the zeitgeist, Twitter is not your friend. I don’t say this because of the many wasted hours of procrastination that go into reading and obsessively checking a million and one tweets (although this is certainly true); I say it because when working on something at the bleeding edge of tech, Twitter provides hundreds of teasing snippets of the amazing research that other people are doing. This isn’t just other researchers, but also companies and hackers who seem to have all the time (and money) in the world to make cool proof-of-concept videos. While initially amazing and a great source of early ideas and ways to give your research the ‘wow-factor’, it quickly becomes disheartening – seeing what other people are achieving whilst you are stuck still making sure your bibliography is formatted correctly. It provokes a constant need to be blogging/creating/making instead of writing.
Remember your roots
One of the key things to remember when using new tech is that no matter how deeply you immerse yourself in the tech world, when you emerge you need to convince other archaeologists that what you have been doing is useful. Archaeologists are notoriously wary of new technology and will be your biggest critics – and this is A Good Thing. Every new digital method or gadget should only be developed to further archaeological method/theory and our knowledge of the past – not simply for wow-factor or as a result of a ride on a Hypegeist bandwagon. If it won’t work outside in the rain, or you can’t convince a colleague of its usefulness without resorting to fancy videos or Prezis, then don’t bother.
Every surfer loses a wave
Be prepared to fall off the wave, and watch other people riding. It is going to happen anyway, and by being patient, sitting back and watching other people ride the wave, you can learn just as much as you can by constantly doing. It is less tiring and often far more rewarding. I have found that acknowledging you are always going to be behind the curve promotes a feeling of calm reflection that is vital for properly researching what you are doing, and gives you the knowledge to choose the right time to jump back on the crest.
Take your time
Whilst blogs are great for working through ideas, writing academically makes you consider every word and sentence, and forces you to find other research that backs up or challenges your claims. For someone who researches new technology every day, a digital detox is almost unheard of. However, taking the time to unplug everything, sit down and write the paper or thesis makes you critically examine everything you are saying or promoting with a clear, unhindered perspective.
I am convinced this is the reason that baking is so zeitgeist at the moment. People are craving time away from the digital world: watching your sourdough grow and savouring the time it takes for a loaf to prove and bake puts you back in the real world. Sadly, however, they are tech-ifying sourdough too.
PEER RESPONSE: James Stuart Taylor, University of York
Initially I wondered whether I might be the correct person to offer peer comment on Stuart’s Zeitgeist post. I do not blog (no time!) and I rarely tweet, maintaining a belligerent cynicism about the usefulness of this particular social medium (this may be softening, as I increasingly find I’m not averse to live-tweeting at conferences – which for me has the joyously irreverent feel of passing notes in class). But I am not a technophobe, and I do get it. As a field archaeologist I have always maintained a deep interest in applied computational technologies; amongst other things, they help us in our work as tools for data acquisition, analysis and dissemination – and in this sense I’m very open to the ‘bleeding edge’. I found myself smiling wryly at Stuart’s commentary on the problems of balancing popular hype and academic engagement with bleeding-edge technologies, and at the same time nodding in agreement, reflecting upon the deeper issue here.
My research focuses upon applied GIS as a tool for getting to grips with intra-site spatiotemporal data from archaeological excavations. Being a computing technology that had its genesis nearly 50 years ago, GIS is far from the bleeding edge. My tech is well over the crest of its hype wave in archaeology (clawing its way out of the trench of disillusionment I’d say!). Indeed GIS is now a well-established technology in archaeology. Yet I am researching a fundamental that has never been satisfactorily addressed in the development of GIS: the interconnection (or lack thereof) of space and time. Important for GIS you’d think – critical for archaeologists; but rarely considered academically. Surely people should have solved this problem a long time ago? – apparently not! Why? – because it’s complicated…
Complexities, and the challenge of overcoming them, are ultimately what drives research, but what is fundamental from an academic perspective is often a low priority from a technical one. As a discipline it is easy to lose focus upon the trickier issues as we ride the tide of technohype that Stuart is alluding to: lidar, laser scanning, 3D modelling, drones, hyperspectral cameras, ‘space archaeology’ are part of a seemingly endless list of technological applications which vie with augmented reality (and the now passé GIS) to take a turn on the crest of the hype-wave. And like the stereotypical surfer that the metaphor alludes to, these technologies tend to be trendy, fun, immediate, impressive, but ultimately can be shallow in their application – this is a shame.
Stuart’s final message is important – as a discipline we do need to take time to think academically about the technology we apply, to reflect upon the theory that drives these new methodologies (or which they recursively help generate), to look for meaningful and practical applications which will outlive the zeitgeist, so that they can stand the test of time and answer key research questions. Perhaps moving away from the endlessly cloned variants of that ‘pointless’ conference paper we’ve all seen a thousand times: “Look at the Size of my Point Cloud”.
As someone who wrote a PhD a few years back on the Semantic Web and archaeology, I feel Eve’s pain in trying to create a nuanced, scholarly understanding of a technology that has now entered the ‘Trough of Disillusionment’. Only now do I feel we are starting to face the reality of what it’s going to take to make Semantic Web technologies useful for archaeologists, and just how much hard slog will be involved in realising it. Those riding the ‘Peak of Inflated Expectations’ headed for the hills long ago, giving those of us who are left a quieter place to work, and that’s no bad thing.
The deeper tension I feel Eve has hit upon here is one I think is pervasive in the realm of digital archaeology, but seems to be rarely discussed: the fact that we sit between two very different disciplines and have to try to make sense of both. Archaeology is so interdisciplinary; I know we all feel the ‘Jack of all trades, master of none’ anxiety at one point or another. When this anxiety manifests in its negative form, archaeologists don’t engage with the wider external discipline they are incorporating into their work, lest they be proven less than competent practitioners. For example, if we ask a statistician to look at our use of statistics in archaeology, we might find out we aren’t very good at statistics, so best not. Most of us, however, do our best to stay abreast of current good practice, and make sure we consult or collaborate with those who know the realm better than we do.
As Eve has pointed out, for archaeologists working with digital technology, the pace and chatter of the technology realm feels unrelenting, and it’s easy to get caught up in the hype that creates, to the detriment of the archaeological research we are trying to serve. I would argue we need to spend more time understanding and articulating the value of choosing to sit in between these disciplines, to both sides, and to each other. Understanding how to bridge two very disparate things is as much an area of deep expertise as knowledge of Roman fortification or Linked Data. Virtually all of my colleagues here at the ADS are both archaeologists and digital practitioners. I go to work every day with a group of people (all with very different backgrounds), for whom sitting between these two disciplines is our profession, but I think that is rare. We tend to be scattered and embedded in other groups, and I think that makes it even more difficult to resist hype and find useful and sustainable balance, so all the more reason to explore ways to come together and value this work. We can also share our favourite bread recipes.