In April 2018 I attended an excellent NISO webinar entitled “Can there be neutrality in cataloging?”. Initially this struck me as a somewhat quaint title, as though there could be any answer other than ‘no’. Happily, the webinar came to pretty much the same conclusion, and I think it’s fair to say that at this point in time there is a broad understanding in the metadata and cataloging community that libraries are not neutral spaces, and therefore, neither is the description we create, manage, store, and display.
It takes intentionality and cultural humility to do descriptive work in a way that respects the diversity of our society and the multitudinous perspectives of our patrons. I think we’re now in a moment where practitioners are recognizing the importance of approaching our work with diversity, equity, and inclusion (DEI) values in mind.
But we must also reckon with the fact that there hasn’t always been this kind of focus on inclusivity in our descriptive practices, and so we are left with the task of deciding how best to manage existing metadata and legacy practices that don’t reflect our values as librarians and archivists. So, we have to figure out how to appropriately “decolonize” our description.
Over the past few years I’ve encountered a number of ideas and initiatives aimed at addressing this issue by both reexamining and remediating existing metadata as well as updating and improving descriptive practices.
Institutional work & messaging
We can leverage our institutional structures.
- The University of Alberta formed a ‘Decolonizing Description Working Group’ to investigate, document, and propose a plan for more accurately and respectfully representing Indigenous peoples and contexts through descriptive practices.
- Temple University’s Special Collections Research Center website hosts a statement on potentially harmful language in archival description and cataloging.
Activism
We can participate in activism to make broad changes.
- Students and librarians at Dartmouth College worked together to lobby the Library of Congress to stop using the term ‘illegal aliens’ to describe undocumented immigrants. The documentary ‘Change the Subject’ describes their campaign.
Programmatic Analysis
We can develop tools and techniques for analyzing our existing metadata.
- Noah Geraci, a librarian at the University of California, Riverside, presented at Code4Lib 2019 on their project to identify problematic metadata and remediate it programmatically.
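To make this concrete, here is a minimal sketch (my own illustration, not the UCR project’s actual code) of what a programmatic scan can look like: it reads a hypothetical CSV export of metadata records and reports every record whose subject field contains a term from a locally maintained watch list. The file names and column names are assumptions for the example.

```python
"""Scan exported metadata for terms flagged for review.

A minimal sketch of the general idea, not any particular project's code.
Assumes (hypothetically) a CSV export with 'identifier' and 'subject'
columns, and a plain-text watch list with one term per line.
"""
import csv


def load_watch_list(path="flagged_terms.txt"):
    """Read the locally maintained list of terms to flag."""
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}


def audit_metadata(export_path="metadata_export.csv", watch_list=None):
    """Return (identifier, flagged term) pairs found in the export."""
    watch_list = watch_list or load_watch_list()
    hits = []
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            subjects = (row.get("subject") or "").lower()
            for term in watch_list:
                if term in subjects:
                    hits.append((row.get("identifier"), term))
    return hits


if __name__ == "__main__":
    for identifier, term in audit_metadata():
        print(f"{identifier}: contains flagged term '{term}'")
```

A report like this is only a starting point for human review; simple string matching can’t tell a harmful use of a term from, say, a quoted historical title, so the output is a worklist, not an automatic fix.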
Implement Inclusive Vocabulary and Thesauri
We can identify and implement inclusive alternative vocabulary and thesauri in our systems.
- As part of the Hyrax project, developers and stakeholders have identified vocabularies and thesauri that are more inclusive and representative, listed here in a spreadsheet managed by Julie Hardesty.
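Some of these vocabularies are published as SKOS linked data, which makes them straightforward to work with programmatically. As a rough sketch (my own, assuming a hypothetical local Turtle download named vocabulary.ttl rather than any particular vocabulary’s actual distribution), the Python below uses rdflib to build a simple lookup from non-preferred labels to preferred labels:

```python
"""Build an altLabel -> prefLabel lookup from a SKOS vocabulary file.

A sketch only: 'vocabulary.ttl' is a hypothetical local download of an
inclusive vocabulary published as SKOS; swap in whatever serialization
your chosen vocabulary actually provides.
"""
from rdflib import Graph
from rdflib.namespace import SKOS


def build_preferred_label_lookup(path="vocabulary.ttl"):
    """Map each non-preferred (alt) label to its concept's preferred label."""
    g = Graph()
    g.parse(path, format="turtle")

    lookup = {}
    for concept, _, alt in g.triples((None, SKOS.altLabel, None)):
        pref = g.value(subject=concept, predicate=SKOS.prefLabel)
        if pref is not None:
            lookup[str(alt).lower()] = str(pref)
    return lookup


if __name__ == "__main__":
    lookup = build_preferred_label_lookup()
    print(f"{len(lookup)} alternative labels mapped to preferred labels")
```

A lookup like this can then feed cataloging interfaces or batch review scripts.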
Develop technical solutions
We can develop technical solutions for managing the presence of problematic metadata in our systems.
- And here’s something we’re working on locally! As part of TRLN Discovery (a recent and successful project to develop a shared Blacklight discovery interface for the Triangle Research Libraries Network consortium), developers incorporated code for remapping problematic subject headings to preferred terms. Problematic terms may still be searched, but only the preferred term will display in the record. We’re still working out how to implement this tool from a policy standpoint, however: who decides what counts as ‘problematic’, and how should those decisions be communicated across our organizations?
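To make the pattern concrete: the actual TRLN Discovery code lives in a Ruby/Blacklight indexing pipeline, but the Python sketch below (with a hypothetical mapping and hypothetical field names) shows the general idea of an index-time step that writes only the preferred term to the display field while keeping both the original and preferred headings searchable:

```python
"""Index-time remapping of problematic subject headings.

A sketch of the general pattern, not the TRLN Discovery implementation.
The mapping and the field names ('subject_display', 'subject_search')
are hypothetical.
"""

# Hypothetical local mapping of problematic headings to preferred terms.
SUBJECT_REMAP = {
    "Illegal aliens": "Undocumented immigrants",
}


def build_index_fields(subject_headings):
    """Split a record's subjects into display values and search values."""
    display = []
    search = set()
    for heading in subject_headings:
        preferred = SUBJECT_REMAP.get(heading, heading)
        display.append(preferred)   # only the preferred term is shown
        search.add(heading)         # the original heading stays searchable
        search.add(preferred)
    return {"subject_display": display, "subject_search": sorted(search)}


if __name__ == "__main__":
    print(build_index_fields(["Illegal aliens", "Citizenship"]))
```

Keeping the original heading in a search-only field means that searches using the old term still retrieve the record, while patrons never see the problematic heading displayed.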
This is but a smattering of the many projects and ideas metadata practitioners are engaged in. Eradicating inaccurate, insensitive, and potentially harmful description from our library systems is a heavy and entrenched problem to tackle, but lots of smart folks are on it. Together we can address and remediate our existing metadata, reexamine and improve current descriptive practices, and work toward creating an environment that is more inclusive and representative of our communities.