Claims to knowledge are increasingly being expressed in the language of data. However, as data has gotten “bigger,” a host of epistemological problems has emerged, such as failures to replicate scientific studies, misinterpretations of p-values, and opaque machine learning models. To better understand the roots of the present crisis, this talk returns to a moment in the mid-twentieth-century United States when statistical techniques were becoming widespread in the social sciences, focusing on a heterodox statistician: John W. Tukey. Tukey’s work not only cuts against existing historical interpretations of the social sciences during the Cold War period, but can also prompt us to think differently about epistemic (and perhaps even political) commitments regarding objectivity and judgment, uncertainty and expertise in worlds of data.
Meet the Speaker: Alexander Campolo, Postdoctoral Researcher and Instructor, Stevanovich Institute on the Formation of Knowledge, University of Chicago
Alex Campolo studies the history of media, technology, and science, with an interest in epistemologies of data. At SIFK he is working on a book project on the history of data visualization, showing how a group of scientists worked against the formal, algorithmic rationality of the Cold War period, turning instead to human perception and visual techniques to manage new scales of digital information. He received his Ph.D. from the Department of Media, Culture, and Communication at New York University and teaches widely in the history of science, philosophy, and visual culture.