The idea that Digital Humanities practitioners might provide a translational capacity within and between the arts, humanities, information and computer science, easing collaboration between these disciplines and enhancing shared results, is not a new one: in fact, there is a long tradition of conceptualising at least some digital humanists as “intermediaries” (Edmond 2005), “translators” (Siemens et al., 2011) or “hybrid people” (Liu et al., 2007; Lutz et al., 2008, cited in Siemens et al., 2011). As the long-predicted mainstreaming of digital humanities and digital methods into arts and humanities research advances, we might expect this transformation of the digital humanities from a disruptive to a supportive force to continue. Furthermore, while some within the academy certainly view the potential industrial relevance of the digital humanities with suspicion (Allington et al., 2016), there are also many voices from industry itself calling for the development of a more humanistic, critical dimension in the work of the ICT industry (Hern, 2018; Madsbjerg, 2017; Hartley, 2017; Copenhagen Letter, 2017; Centre for Humane Technology, 2018).
While it may therefore seem timely to explore, as Liu (2012, 2016) has called for, how the digital humanities might deliver a linchpin set of critical competencies for and reflections on the techno-social interface, how this cultural intervention into technology development might resonate with the core tenets of DH remains unclear. This paper will introduce such a frame of reference by exploring the implications for digital humanities to be found in a corpus of 38 linked interviews about big data research. The project that developed this material, an EU-funded collaboration known as Knowledge Complexity, or KPLEX for short (www.kplex-project.eu), explored in depth the perspectives of and attitudes toward big data found among computer scientists, collections-holding institutions, and an interdisciplinary research community reaching from philosophy to fMRI-based emotion research. The project originally focussed on understanding unconscious bias in such research, but the interviews also expose the depth of the misalignment in how knowledge is generated and validated across the contributing disciplines.
The data the project produced therefore offers much food for thought to those of us who identify as digital humanists, as it points toward a number of key barriers commonly faced and ideally negotiated within our hybrid research space. When viewed from the perspective of the KPLEX project’s data, six distinct points of ‘aporia’ arise, places where the interviewees explicitly or tacitly exposed gulfs in epistemic culture that are clearly at the heart of the tensions between disciplines as they seek to collaborate. These gulfs in goals and understanding echo the work of digital humanists, but also expand upon and throw into relief the underlying tensions in their research. While none of these findings presents, strictly speaking, an insoluble problem, the KPLEX interviews clearly illustrate the embeddedness of these challenges in the foundations of the contributing disciplines. This entanglement with professional identities and values raises them above the level of mere barriers, to a status where a more fundamental reconsideration of the scholarship produced within such collaborations may be required. In these fundamentals we may find future avenues for DH to grow in its own right, but also to expand and reconsider its potential impact. This paper will focus its exposition on the nature of and evidence for these gaps given in the interviews, which can be briefly described as follows:
Language matters. In particular, the interviews with computer scientists showed a resistance to discussing what certain key terms might mean or imply, a lack of precision that would draw criticism in a purely humanities context. This resistance weakens the potential for self-reflection in computer science, but it also greatly impedes successful interdisciplinary work, which may progress for extended periods on a falsely constructed sense of common understanding. While this obscurity had already been observed by Borgman (2015), the KPLEX project results provide empirical evidence not only of the phenomenon, but also of its eventual negative consequences.
Context matters. Datafication implies decontextualisation, and this data/context trade-off is only rarely reflected upon in data-driven methodologies (for a notable exception see Nelson, 2017). In humanistic enterprises, however, context is indispensable: for a historian, for example, provenance is an all-important facet in the understanding of any source. Yet what is a potentially harmful data ‘modification’ for one community is a neutral, or in fact positive, process of data ‘cleansing’ for another.
Tools and standards are pharmaka, giving much but taking as well. In particular, information scientists can see how the availability of certain dominant tools (like keyword search and metadata standards) is liberating and limiting in equal measure. Data and metadata standards can be perceived by humanists as handcuffs, limiting the iterative adaptation of parameters, but the resulting variability and complexity stand in opposition to interoperability, aggregation, and scaling (Saklofske et al., 2015).
Data without theory is as problematic as theory without evidence. It has been popularly suggested that big data may have delivered us to the ‘end of theory’ (Anderson, 2008), but researchers actively working at the edges of big data can see clearly that this is not the case. Rather, the lack of a critical frame merely pushes the interpretation of complex phenomena into a black box whose authority rests on potentially flawed algorithms.
The power structures of technology inhibit the accommodation of analogue or hybrid narratives. Much of the humanistic source landscape is still measured in kilometres of shelving rather than terabytes of data. Because of this, digital humanities practices must be well adapted to resisting the Matthew Effect (Merton, 1968), by which research becomes concentrated on limited, potentially flawed data; this is not always the case outside of the humanities, however. Moreover, the struggle between ‘archival thinking’ and ‘computational thinking’ evidenced in the interviews, together with the conceit of routinisation, raises questions of who will control cultural heritage knowledge in the future.
Humanistic competences are not taught in conjunction with digital approaches. Critical, speculative, and hermeneutic thinking, the hallmarks of the humanities, are not taught alongside empirical methodologies, and critical approaches are not systematically implemented in computational studies. Jonathon Morgan’s analysis of the Alt-Right movement on Twitter (2016) and the Digital Humanities Now ‘Editors’ Choice’ project ‘Torn Apart / Separados’ (2018) are two rare and enlightening exceptions.
The paper will conclude with a series of reflections on how digital humanities researchers might move within their disciplines and beyond to become uniquely able to negotiate some of these critical conversations. It will also address crucial points DH shares with all interdisciplinary collaboration, such as shared data formats and structuring approaches, how misconceptions are surfaced and resolved, the place of self-reflection and methodological discussion, and the incommensurability of research questions and methodologies. In conclusion, it will offer recommendations for how each of the six aporias might be met and used to create a stronger digital humanities community and culture, fulfilling its potential as both a disruptive and a productive force.