E-Resource Database Trial Accessibility Evaluations

Overview

Over the past few years, Duke University Libraries (DUL) have been exploring additional ways to incorporate DEI (Diversity, Equity, and Inclusion) principles into our daily work. In Technical Services, one example of this work is advocating for strong accessibility language in e-resource license negotiations. This is a high priority because accessibility compliance is inherent to complying with civil rights law. Prioritizing accessibility language in DUL licenses lets providers know that the library’s willingness to acquire e-resources is compromised when accessibility needs are not met or prioritized. Strong accessibility language also codifies content providers’ acknowledgment that the e-books, e-journals, and online databases they sell for inclusion in the library’s collection should be accessible according to national and international standards, and it ensures that the library has the right to modify material as needed when resources do not meet patrons’ accessibility needs.

To build on this existing accessibility work, the DUL E-Resources Management Team is piloting a database accessibility evaluation project to more closely assess e-resources under consideration for acquisition during trials. E-resource trials are a common way for subject selectors to review e-resources before committing to add the material to the collection, typically via a temporary gratis access period of around 30 days. By testing databases on trial for a few key accessibility elements, we hope to provide additional useful information for our colleagues’ consideration as they build the library’s collection.

Designing a Template

Barb Dietsch, ERM Specialist, and Abby Wickes, ERM Librarian, based the pilot on the Library Accessibility Alliance (LAA) E-resource Reports, which are highlighted in detail in an incredibly informative ASERL webinar, “Accessibility of Research Library E-resources”. Using the LAA model, Dietsch and Wickes developed a local Accessibility Evaluation Template for testing with a variety of free tools and manual testing methods according to the POUR rubric, which is adroitly described in the 2020 NASIG session, “Designing for Accessibility”.

Perceivable (can be accessed with more than one sense)

Evaluate for captions, alternative image text, screen reader success

Operable (provides flexible navigation options and can be accessed with a variety of input methods)

Attempt to navigate website using only keyboard

Understandable (behaves in an intuitive, logical, and predictable way)

Attempt to intuitively navigate website, looking for usability issues (e.g., help or documentation, clearly labeled links)

Robust (works across browsers and devices, follows standards)

Attempt to navigate website in multiple browsers, devices; attempt to zoom in to resize text

Using POUR as a guide, the evaluation template also incorporates data from free tools, including the NVDA screen reader (and its Speech Viewer tool) and the WAVE web accessibility evaluation service and browser extensions.
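WAVE also offers a subscription API that returns its findings as JSON, which could eventually feed the template’s WAVE section automatically. The sketch below is purely illustrative and not part of the pilot: it parses a hardcoded response in the general shape WAVE’s API documents (top-level “categories” with per-category counts); the field names and sample values are assumptions, and a live check would require an API key.

```python
import json

# Sample response in the general shape documented for the WAVE API:
# a "categories" object keyed by finding type, each with a count.
# (Hardcoded so this sketch runs offline; a live check would GET
# https://wave.webaim.org/api/request with an API key and target URL.)
SAMPLE_WAVE_JSON = """
{
  "categories": {
    "error": {"description": "Errors", "count": 4},
    "alert": {"description": "Alerts", "count": 11},
    "contrast": {"description": "Contrast Errors", "count": 2}
  }
}
"""

def summarize_wave(raw_json):
    """Collapse a WAVE-style report into {category: count} for the template."""
    categories = json.loads(raw_json).get("categories", {})
    return {name: info.get("count", 0) for name, info in categories.items()}

summary = summarize_wave(SAMPLE_WAVE_JSON)
print(summary)  # {'error': 4, 'alert': 11, 'contrast': 2}
```

A summary like this could be pasted into the “High-level WAVE findings” section of the template alongside screenshots from the browser extension.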

Additionally, the template includes the option to link to existing accessibility language and/or a VPAT (Voluntary Product Accessibility Template, describing providers’ compliance with accessibility standards), if the library has already negotiated a license with the content provider. (These licensing additions to the Accessibility Evaluation Template were incorporated after learning more about a similar evaluation process at the University of Washington through another very helpful and informative Library Accessibility Alliance webinar, “Accessibility Committees: Cultivating Cultures of Accessibility at Your Library”.)

Sharing Findings

After testing the Accessibility Evaluation Template with multiple databases and soliciting feedback from colleagues, we will launch the project for all database trials going forward. The findings from the evaluations will be shared both with DUL colleagues, for incorporation into acquisition decisions, and with content providers, in the hope that they will remedy any accessibility problems the evaluations uncover. As part of our local evaluation process, we also plan to try to reproduce results from existing LAA E-Resource Reports whenever a database has already gone through their more detailed review process. If an LAA report is not yet available for the database in question, we will typically use the form on the LAA site to suggest the more extensive LAA review.

Helpful Resources

In addition to the resources linked throughout this post, we also found these tools, webinars, and websites incredibly helpful when learning more about this topic:

Future Plans

We expect to learn a lot more after launching the pilot evaluations, and we have a few ideas on how to potentially expand our efforts in the future.

  • We’ve learned a lot from other organizations undertaking similar efforts, and it would be beneficial to figure out a way to share our local evaluations more broadly so peer institutions can incorporate findings into their own acquisition decisions.
  • It could also be helpful to more quantitatively assess the resources to come up with a score or color-coded range for easier comparison with other databases under review. However, since online databases can differ greatly in content and format, we anticipate this would be a challenging metric to quantify.
  • Finally, we hope to continue incorporating additional and emerging accessibility tools and resources. In the future, noting whether content is available in the FRAME repository of adapted, accessible materials could also provide helpful information.
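As a thought experiment on the scoring idea above, the following sketch shows one way a color-coded range might work. Everything here is hypothetical — the 0–2 ratings per POUR principle, the weights, and the traffic-light thresholds are our own illustration, not part of the pilot template.

```python
# Hypothetical POUR-based scoring: each principle gets a 0-2 rating
# (0 = major barriers, 1 = minor issues, 2 = no issues found), and the
# total maps to a traffic-light color for quick comparison across trials.
POUR_PRINCIPLES = ("perceivable", "operable", "understandable", "robust")

def score_evaluation(ratings):
    """ratings: dict mapping each POUR principle to 0, 1, or 2."""
    missing = [p for p in POUR_PRINCIPLES if p not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    total = sum(ratings[p] for p in POUR_PRINCIPLES)  # 0..8 overall
    if total >= 7:
        color = "green"
    elif total >= 4:
        color = "yellow"
    else:
        color = "red"
    return total, color

print(score_evaluation({"perceivable": 2, "operable": 1,
                        "understandable": 2, "robust": 2}))  # (7, 'green')
```

As the bullet above notes, the hard part is not the arithmetic but deciding what a “1” means consistently across databases that differ greatly in content and format.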

Database Accessibility Evaluation Template

Download the template

Please take a look at our template, and thank you for any feedback on this pilot project! If you’d like to hear more, Barb Dietsch and Abby Wickes will be presenting on this project at the upcoming November DUL First Wednesday presentation.

Database Trials Accessibility Review:

[Provider: Title]

Resource

Provider:
Title:  
Access URL:
Test search term:
Example page used in testing:
DUL Tester:

Overview Summary

[Paragraph and quick bullet points providing general overview]

Library Accessibility Alliance (LAA) Evaluation

Manual Evaluation according to POUR Rubric

(Perceivable, Operable, Understandable, Robust)

Perceivable (can be accessed with more than one sense)

Evaluate for captions, alternative image text, screen reader success:

Notes | Screenshot(s)
· |
· |

Operable (provides flexible navigation options and can be accessed with a variety of input methods)

Attempt to navigate website using only keyboard (tab and shift tab to go forward and back):

Notes | Screenshot(s)
· |

Understandable (behaves in an intuitive, logical, and predictable way)

Attempt to intuitively navigate website, looking for usability issues (e.g., help or documentation, clearly labeled links):

Notes | Screenshot(s)
· |

Robust (works across browsers and devices, follows standards)

Attempt to view website in multiple browsers, devices; attempt to zoom in to resize text:

Notes | Screenshot(s)
· |

High-level WAVE findings:

Notes | Screenshot(s)
Errors: · [] |
Alerts: · [] |
Notes: · [] |

Communicating Findings

Stakeholders | Item | Status/Notes
DUL Colleagues | BTAA Evaluation, or high-level summary of CRA Accessibility Review |
Content Providers | BTAA Evaluation (or flag that we have submitted to BTAA for review), or high-level summary of CRA Accessibility Review |

Licensing

Link to License Accessibility Language | Link to VPAT