Our very own Dan Dohan, Ph.D., recently co-authored the scholarly article “The Promises of Computational Ethnography: Improving Transparency, Replicability and Validity for Realist Approaches to Ethnographic Analysis.”
The article argues that advances in computational methods for analyzing, visualizing, and disseminating social-scientific data provide substantial tools for ethnographers working within the broadly realist “normal-scientific tradition” (NST). While computation does not remove the fundamental challenges of method and measurement that are central to social research, new technologies provide resources for leveraging what NST researchers see as ethnography’s strengths (e.g., the production of in situ observations of people over time) while addressing what NST researchers see as its weaknesses (e.g., questions of sample size, generalizability, and analytical transparency).
Specifically, the authors argue that computational tools can help scale ethnography, improve transparency, enable basic replications, and ultimately address fundamental concerns about internal and external validity.
The article explores these issues by illustrating the utility of three forms of ethnographic visualization enabled by computational advances:
- ethnographic heatmaps (ethnoarrays)
- participant observation data combined with techniques from social network analysis (SNA)
- text mining
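To make the first of these concrete: an ethnoarray can be thought of as a matrix whose rows are participants, whose columns are analytic codes, and whose cells record how often each code was applied, values a heatmap would shade by intensity. The sketch below builds such a matrix from a few entirely hypothetical coded fieldnote excerpts; the participant IDs, codes, and data are illustrative assumptions, not material from the article.

```python
from collections import Counter

# Hypothetical coded fieldnote excerpts: each observation pairs a
# participant with the analyst-assigned codes for that excerpt.
# (Illustrative data only -- not drawn from the article.)
observations = [
    ("P1", ["treatment", "family"]),
    ("P1", ["treatment"]),
    ("P2", ["family", "work"]),
    ("P2", ["work"]),
    ("P3", ["treatment", "work"]),
]

codes = sorted({c for _, cs in observations for c in cs})
participants = sorted({p for p, _ in observations})

# Tally how often each code appears per participant.
counts = {p: Counter() for p in participants}
for p, cs in observations:
    counts[p].update(cs)

# The "ethnoarray": rows = participants, columns = codes; each cell
# holds a code frequency that a heatmap would render as color intensity.
ethnoarray = [[counts[p][c] for c in codes] for p in participants]

for p, row in zip(participants, ethnoarray):
    print(p, row)
```

A plotting library could then render `ethnoarray` as the heatmap itself; the point of the sketch is only the data structure that makes such visual, at-a-glance comparison across participants possible.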
In doing so, the authors speak to the potential uses and challenges of a nascent “computational ethnography.”