Tool criticism: From digital methods to digital methodology

Cite as:
Van Es, Karin, Maranke Wieringa and Mirko Tobias Schäfer. 2018. “Tool criticism: From digital methods to digital methodology.” Datafied Society Working Paper Series. 28 May. Web. https://datafiedsociety.wp.hum.uu.nl/tool-criticism

NB: an updated version has been published by the ACM and can be downloaded here

In this blog post we wish to reflect on the fact that digital tools and data have changed the production of knowledge (Meyer & Schroeder, 2015). Although there has been attention to biases in digital tools, discussions have not only been scattered across monographs, articles and book chapters, lacking a proper label, but also tend to remain in their respective academic bubbles. In the digital humanities different methods have emerged, each embedded in its own field. In the wake of the computational turn in the humanities, literary studies, history, media studies, languages and other disciplines developed their own distinct methods and tools (e.g. Burdick et al. 2012; Gold 2012; Rogers 2013). Whether the wave of new tools for digital scholarship really warrants a new domain of enquiry is open for debate. Of course, we should be concerned with how our research tools – which for many, indeed, are relatively new types of tools – intervene in and shape research processes. At the Datafied Society we find it useful to have a term such as tool criticism around which to focus our efforts.

In this post, we briefly touch on ideas about the ‘neutrality’ of tools from the Digital Humanities, the Digital Methods Initiative and the Software Studies Initiative (concerned with cultural analytics). Here, then, the lineage to work done in New Media Studies is addressed, as opposed to literary and information studies. We find that quite a bit of attention has already been paid to the productive role of digital tools in research. Moreover, the idea that media and technologies are biased as a general point of departure is not new and has been studied for several decades within New Media Studies. Subsequently, we briefly sketch the work conducted at the Datafied Society to enhance the knowledge we have about our tools. In conclusion, we offer a working definition of tool criticism and make a plea for digital methodology.

On the ‘neutrality’ of tools

Within more traditional Digital Humanities research, Eric Meyer and Ralph Schroeder (2015), concerned with what they call ‘e-research’, write of “knowledge machines” and explore how digital tools and data transform both the consumption and production of knowledge. Johanna Drucker (2015) has been critical of data visualization tools, particularly because they often have their origins in the natural or social sciences. As such, they raise concerns similar to those raised earlier by scientific images on account of their perceived objectivity (Daston and Galison 2007; Bredekamp, Dünkel and Schneider 2015). Drucker argues that, in relation to visualizations, “We should ask the same basic questions we use to study any artifact: Who made it, how, when, where, and with what assumptions?” But Drucker also points to the need to understand the statistical models underlying images of data. She warns against the “reification of misinformation,” which refers to confusing the display with the source. To clarify this point she discusses the use of Google’s Ngram Viewer as an example:

once someone makes an Ngram, they present it as if it were the actual phenomena. “See, the term god is popular in this period and not in that.” Instead, they should say “The Google corpus indexed by their search algorithms shows this or that statistical increase in the sample set.” (np)
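
To make this distinction concrete, the sketch below (ours, not Drucker’s) computes a relative term frequency from locally downloaded Google Books Ngram count files; the file names and the simple tab-separated layout assumed here are illustrative placeholders. The number it prints is a statement about Google’s indexed sample set, not about the ‘popularity’ of a term as such.

# Minimal sketch (ours): relative frequency of one term in the Google Books Ngram data.
# The file names and the tab-separated layout assumed below are placeholders;
# adjust them to the actual files of the public Ngram data release.
import csv

NGRAM_FILE = "eng-1gram-sample.tsv"    # assumed columns: ngram, year, match_count, volume_count
TOTALS_FILE = "total-counts.tsv"       # assumed columns: year, match_count, page_count, volume_count

def term_counts(term, path):
    """Sum the yearly match counts for one term."""
    counts = {}
    with open(path, newline="", encoding="utf-8") as f:
        for ngram, year, match_count, _volumes in csv.reader(f, delimiter="\t"):
            if ngram == term:
                counts[int(year)] = counts.get(int(year), 0) + int(match_count)
    return counts

def total_counts(path):
    """Read the total token count per year for the whole sampled corpus."""
    totals = {}
    with open(path, newline="", encoding="utf-8") as f:
        for year, match_count, _pages, _volumes in csv.reader(f, delimiter="\t"):
            totals[int(year)] = int(match_count)
    return totals

counts = term_counts("god", NGRAM_FILE)
totals = total_counts(TOTALS_FILE)
for year in sorted(counts):
    if year in totals:
        # This ratio describes Google's indexed sample set, not 'popularity' as such.
        print(year, counts[year] / totals[year])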

Without labelling it tool criticism, she here clearly reflects on the impact of digital tools on knowledge production. Recently, Koolen, Van Gorp and van Ossenbruggen (2018) have made a plea for “digital tool criticism.” They find that more awareness of the biases in digital tools and how they shape one’s research is needed and have organized several workshops on the matter. They are involved with digital research in the heritage domain and build on ideas from (digital) source criticism.

Noortje Marres (2017), affiliated with the Digital Methods Initiative (DMI), recognizes how studies that make use of digital methods or data face the issue of digital bias. She explains, “once we start analysing online materials and data, researchers may easily find themselves studying not the social phenomenon they set out to investigate, but rather the peculiarities of digital platforms and digital practices themselves” (Marres 2017, 117). The ambiguity in this type of research centers on the question: are we researching society or technology? Directly related to this question, Venturini et al. (2018) offer eight practical precautions for dealing with the conflation of medium and message in digital methods research. Also connected to DMI, Bernhard Rieder and Theo Röhle (2017) zoom in on the type of knowledge researchers need: “Reflective practice requires much more than a critical attitude, it requires deeper involvement with the associated knowledge spaces to make sense of possibilities and limitations” (119). They claim that digital tools mobilize concepts and techniques which are not always sufficiently understood to assess their impact on research output. Gephi, for instance, gives a large audience the ability to produce network diagrams without understanding the “layers of mediation” involved in their production. Understanding these layers requires what they, drawing on a concept from David Berry, call Digital Bildung.

There is also a wave of research centered around cultural analytics. Lev Manovich introduced the term in 2005 to refer to the use of computational and visualization techniques to analyze cultural data sets and flows. Two years later he established the Software Studies Initiative to work on these types of research projects. Aside from these practical projects, the lab is also concerned with “the theoretical analysis of how software systems (including apps, algorithms, machine learning and big data analytics) shape contemporary cultural and social life.” This work has resulted in prominent publications such as Software Takes Command (Manovich 2013) and the MIT Press Software Studies book series. As a field, software studies is closely linked to theoretical approaches such as interface studies, platform studies and, more recently, algorithm and code studies.

Both DMI and the Software Studies Initiative are part of the academic discipline known as New Media Studies, which inquires into the qualities and use of new technologies and their social impact (Lister et al. 2003). The critical inquiry into the politics of artefacts (e.g. Winner 1980), the psychology of design (e.g. Norman 1988), and scientific tools and their epistemic impact (e.g. Latour and Woolgar 1979; Knorr-Cetina 1995) is an essential aspect of Science and Technology Studies and is informative for New Media Studies. The conceptual origins of New Media Studies can be traced to the work of Harold Innis and Marshall McLuhan, who focused on how the medium (rather than the message) shapes society (e.g. Innis 1950; McLuhan 1962). In light of this theoretical grounding it is not surprising that already in 2010, media scholar José van Dijck wrote about how search engines like Google Scholar produce academic knowledge through their ranking and profiling systems. Tool criticism can and should therefore not be considered a new domain. However, what makes present-day digital tools hard to unravel are – as Drucker, and Rieder and Röhle have identified – the concepts and methods (often imported from other disciplines) they put to use. While they share similar concerns, the traditional digital humanities look for answers more within literary and information studies. In the section that follows we consider our first contributions to tool criticism. We show how tool criticism can feed back into the scholar’s tool arsenal, in the form of the Gephi ‘field notes’ plugin, and demonstrate how tool criticism can also provide scholars with a lens to investigate their own working process, for instance by charting the life of a graph.

Our work on tool criticism
Gephi is a popular network visualization package and is used widely within scholarly research. As open source software, it is easy to access at no cost. A dynamic developer community contributes to the software with new features, updates and bug fixes. However, working with Gephi makes one realize the limitations of the software, some of which have not yet been addressed by the developers, and stimulates inquiry into the epistemic impact of the tool(s) one uses for research. Provoked by the tool, members of the Datafied Society (Utrecht University) have been developing, in collaboration with the Digital Humanities Lab Utrecht, a Gephi ‘field notes’ plugin (Wieringa et al. forthcoming). The plugin automatically logs selected choices and parameters to accompany produced network visualizations. Its goal is to facilitate collaborative projects and to open the “interpretive acts” (Drucker 2011) involved to scrutiny by others. At the moment such registration is not possible in the tool and an undo button is noticeably absent.
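
The actual plugin is built for Gephi’s own plugin architecture; as a rough illustration of the same principle, the Python sketch below (using networkx, with hypothetical file names) logs the dataset, the layout algorithm and its parameters to a ‘field notes’ file that travels with the exported graph.

# Rough illustration of the 'field notes' principle, not the actual Gephi plugin:
# log the choices and parameters that shape a network visualization in a file
# that travels with the exported graph, so the steps can be retraced by others.
import json
from datetime import datetime, timezone

import networkx as nx

G = nx.karate_club_graph()  # stand-in for a research dataset

layout_params = {"algorithm": "spring_layout", "k": 0.3, "iterations": 100, "seed": 42}
positions = nx.spring_layout(
    G, k=layout_params["k"], iterations=layout_params["iterations"], seed=layout_params["seed"]
)

field_notes = {
    "created": datetime.now(timezone.utc).isoformat(),
    "dataset": "karate_club_graph (example)",
    "nodes": G.number_of_nodes(),
    "edges": G.number_of_edges(),
    "layout": layout_params,  # the interpretive choices behind the image
    "positions": {str(n): p.tolist() for n, p in positions.items()},
}

nx.write_gexf(G, "network.gexf")  # graph file, e.g. for further work in Gephi
with open("network_fieldnotes.json", "w", encoding="utf-8") as f:
    json.dump(field_notes, f, indent=2)  # the log that accompanies the visualization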

The lack of documentation of the researcher’s actions in Gephi prevents one from retracing one’s own steps when necessary and makes it impossible for others to scrutinize and reproduce results. This opaqueness stands in stark contrast with traditional scholarly practices and codes of conduct. The plugin, as such, is deeply rooted in our tool criticism of Gephi. It furthermore makes evident that tool criticism as a label is somewhat misleading, as it focuses on how the tool, rather than the relation and interaction between researcher and tool, shapes research output. It is worth connecting all the disparate reflections on the influence of digital tools under the label “tool criticism,” and building on the foundation established these past decades in (new) media studies (which, in affordance theory, has already provoked us to think not about the tool alone, but about the relationship between tool and user).

We find that approaching our tools through the lens of fields such as Science and Technology Studies, Material Culture Studies, and Ethnomethodology can expose how (the affordances of) our tools influence our research. In an attempt to show how our tools play an active role in the research process, members of the Datafied Society wrote a biography of network visualization (Van Geenen and Wieringa, forthcoming), in which they deployed Kopytoff’s (1986) ‘cultural biography of a thing,’ in order to expose the particularly influential moments in the life of a graph, and how the graph in turn feeds back into the research process (i.e. in the context of Exploratory Data Analysis). In this chapter they show how network graphs (dis)allow particular engagement with different kinds of audiences. It clarifies how the relation between the user and a tool, or its ‘output,’ is contextual.

A working definition
The various illustrations above all paint a slightly different picture, which is why we believe formulating a working definition of tool criticism can be helpful. In our practice we see tool criticism as a reflexive and critical engagement with tools. In this reflexive and critical practice, the limitations and presuppositions built into the tool and its output need to be put under scrutiny, as well as the user’s interaction with the tool. For this conception we draw from a variety of fields (e.g. STS, feminist theory, software studies and critical data studies) to come to terms with how the tools themselves are non-neutral and afford particular kinds of use, and how output, such as visualization, is always already imbued with particular conventions and manipulations.

There are many scholars already arguing for such reflection and we are not claiming that tool criticism is a new phenomenon. Instead, we advocate an umbrella term for a wide variety of work that is already being done (e.g. Rieder 2013; Kennedy et al. 2016; Kennedy and Hill 2017; Drucker 2011). What the term offers to scholarly conduct is the unification of technical reflection on the tool (e.g. Rieder 2013; Van Dijck 2010) and the critique of the research output (e.g. Kennedy et al. 2016). Such tool criticism, which covers not merely the tool but also its influence on the research process, the results and their presentation, and the way in which the user interacts with it, paves the way to move from digital methods (cf. Rogers 2013; 2017) to ‘digital methodology’. In light of this we put forward the following working definition:

Tool criticism is the critical inquiry of knowledge technologies used for research purposes. It reviews the qualities of the tool in light of the research activities and reflects on how the tool (i.e. its working mechanisms, anticipated use, interface, and embedded assumptions) affects the research process and output.

Tool criticism is at the heart of methodology, which describes the rationale for selecting a specific method for the research. Our hope is that tool criticism catches on as a term that connects the efforts on the matter and that it creates more awareness of digital methodology.

 

References

Bredekamp, Horst, Vera Dünkel, and Birgit Schneider, eds. 2015. The Technical Image: A History of Styles in Scientific Imagery. Chicago: The University of Chicago Press.

Daston, Lorraine and Peter Galison. 2007. Objectivity. Cambridge, MA: The MIT Press.

Drucker, Johanna. 2015. “Humanizing Maps: An Interview with Johanna Drucker.” Interview by William Fenton. PC Magazine. October 1. Web. https://www.pcmag.com/article2/0,2817,2492337,00.asp

———. 2011. “Humanities Approaches to Graphical Display.” Digital Humanities Quarterly 5 (1). http://digitalhumanities.org:8081/dhq/vol/5/1/000091/000091.html

Haraway, Donna Jeanne. 1988. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies 14 (3): 575–99. https://doi.org/10.2307/3178066.

Kennedy, Helen, Rosemary Lucy Hill, Giorgia Aiello, and William Allen. 2016. “The Work That Visualisation Conventions Do.” Information Communication and Society 19 (6). Taylor & Francis: 715–35. https://doi.org/10.1080/1369118X.2016.1153126.

Kennedy, Helen, and Rosemary Lucy Hill. 2017. “The Feeling of Numbers: Emotions in Everyday Engagements with Data and Their Visualisation.” Sociology. https://doi.org/10.1177/0038038516674675.

Knorr-Cetina, Karin. 1995. “Laboratory Studies: The Cultural Approach to the Study of Science.” In Handbook of Science and Technology Studies, edited by Sheila Jasanoff, Gerald E. Markle, James C. Petersen, and Trevor Pinch. Thousand Oaks, CA: Sage.

Koolen, Marijn, Jasmijn Van Gorp, and Jacco van Ossenbruggen. 2018. “Lessons Learned from a Digital Tool Criticism Workshop.” DH Benelux 2018.

Kopytoff, Igor. 1986. “The Cultural Biography of Things.” In The Social Life of Things, 64–91. Cambridge: Cambridge University Press.

Manovich, Lev. 2013. Software Takes Command. New York: Bloomsbury Academic.

Marres, Noortje. 2017. Digital Sociology. Cambridge: Polity Press.

Meyer, Eric T. and Ralph Schroeder. 2015. Knowledge Machines: Digital Transformations of the Sciences and Humanities. Cambridge, MA: MIT Press.

Rieder, Bernhard. 2013. “Studying Facebook via Data Extraction: The Netvizz Application.” Proceedings of WebSci ’13, the 5th Annual ACM Web Science Conference, 346–55. https://doi.org/10.1145/2464464.2464475.

Rieder, Bernhard and Theo Röhle. 2017. “Digital Methods: From Challenges to Bildung.” In The Datafied Society: Studying Culture through Data. Eds. Mirko Tobias Schäfer and Karin van Es. Amsterdam: Amsterdam University Press. 109-124.

Rogers, Richard Allen. 2013. Digital Methods. Cambridge: MIT Press.

Rogers, Richard Allen. 2017. “Foundations of Digital Methods.” In The Datafied Society: Studying Culture through Data, edited by Mirko Tobias Schäfer and Karin van Es, 75–94. Amsterdam: Amsterdam University Press.

Van Dijck, José. 2010. “Search Engines and the Production of Academic Knowledge.” International Journal of Cultural Studies 13(6): 574–592.

Van Geenen, Daniela, and Maranke Wieringa. Forthcoming, expected 2019. “Approaching Data Visualisations as Interfaces: An Empirical Demonstration of How Data Are Imag(in)ed.” In Data Visualization in Society: The Relationships between Graphs, Charts, Maps and Meanings, Feelings, Engagements, edited by Helen Kennedy and Martin Engebretsen. Amsterdam: Amsterdam University Press.

Venturini, Tommaso, Liliana Bounegru, Jonathan Gray, and Richard Rogers. 2018. “A Reality Check(list) for Digital Methods.” New Media & Society. First published online April 20, 2018. https://doi.org/10.1177/1461444818769236

Wieringa, Maranke, Daniela Van Geenen, Karin Van Es, and Jelmer Van Nuss. Forthcoming, expected 2018. “The Field Notes Plugin: Making Network Visualization in Gephi Accountable.” In Good Data, edited by A. Daly, K. Devitt, and M. Mann. INC Theory on Demand.

Winner, Langdon. 1980. “Do Artifacts Have Politics?” Daedalus 109 (1), Winter 1980. Reprinted in The Social Shaping of Technology, edited by Donald A. MacKenzie and Judy Wajcman. London: Open University Press, 1985; second edition 1999.

 
