Three renowned researchers in digital humanities and computer science are joining forces with the Library of Congress on three inaugural Computing Cultural Heritage in the Cloud projects, exploring how biblical quotations, photographic styles and “fuzzy searches” reveal more about the collections in the world’s largest Library than first meets the eye.
Supported by a $1 million grant from the Andrew W. Mellon Foundation awarded in 2019, the initiative combines cutting-edge technology with the Library's vast collections to support digital humanities research at scale. These three outside researchers will collaborate with subject matter experts and technology specialists at the Library of Congress to experiment in pursuit of answers that can only be achieved with collections and data at scale. These collaborations will enable research on questions previously difficult to address due to technical and data constraints.
The three experts beginning work this month are:
Lincoln Mullen, associate professor at George Mason University in the Department of History and Art History and director of computational history at the Roy Rosenzweig Center for History and New Media, will research "America's Public Bible: Machine-Learning Detection of Biblical Quotations Across LOC Collections via Cloud Computing." Mullen is no stranger to Library collections, having won first place in the 2016 Chronicling America Data Challenge.
Lauren Tilton is an assistant professor of digital humanities at the University of Richmond in the Department of Rhetoric & Communication Studies and is co-director of Photogrammar and the Distant Viewing Lab. Tilton's project, "Access & Discovery of Documentary Images," will examine approximately 250,000 images from five early 20th-century photography collections. The project will look for ways computer vision methods could be improved to better consider context and enhance discovery.
Andromeda Yelton, a software engineer and professionally trained librarian, will research "Situating Ourselves in Cultural Heritage: Using Neural Nets to Expand the Reach of Metadata and See Cultural Data on Our Own Terms." Yelton's project will create an interactive data visualization that clusters conceptually similar documents, helping users who have only a rough idea of the items they're looking for. The project will combine machine learning with "fuzzy search" to help users discover and navigate Library collections.
Direct to Complete Announcement
Direct to Computing Cultural Heritage in the Cloud Website
See Also: From 2019: Library Receives $1M Mellon Grant to Experiment with Digital Collections as Big Data