#DivBlk: Principles in Action During a Website Migration
7) Website Usability Testing
Semesters: Fall 2019 – Spring 2020
Committee: Co-chairs Kelli Coles (PhD student) and Michelle Byrnes (undergraduate); Lauren Cooper (librarian); Caleb Trotter (CCP alum); and Keith Jones (Library IT Systems Programmer), with Jim Casey (co-director) providing strategic input
By: Dr. Amani C. Morrison, 2019-20 Council on Library and Information Resources (CLIR) Postdoctoral Fellow in African American Data Curation
One of the primary challenges of our new website was moving from one site to two, with our digital records on an Omeka site and our digital exhibits and other project information on a WordPress site. As a CLIR Postdoctoral Fellow with previous experience researching and developing strategy for large-scale digital humanities projects, I was tasked with leading website usability testing and working closely with the Website Team to strategize site enhancements.
Our team was well-acquainted with the massive undertaking this would be on the back-end, but we also needed to tackle how to ease users into making use of and navigating between our new linked sites. As we neared launch, we were on a compressed timeline (five weeks including Thanksgiving break) to conduct group and individual usability testing to ensure that the improvements to our site were showing up in the places that users would find useful, easily accessible, and clear. We researched best practices; spoke with Erin Daix, the University of Delaware's Librarian and Director of Assessment, who recently led a Library web design project; crafted test scenarios and user profiles; performed in-person and remote individual and group user test sessions; and incorporated insights and changes in an iterative process.
Test Scenarios and User Profiles
Based on our research on best practices for user testing, it was important to outline user profiles in order to understand the different needs of our broad user network, so that we could cover all our bases in our testing sessions. This step involved listing a range of known users—from undergraduate students to digital scholarship librarians, from community members to grant funders—and listing a set of tasks they might perform on our site, understanding that some user tasks would overlap. Using these user profiles, we developed a long list of possible test scenarios that we shortened to eliminate redundancy and to concentrate on both the most critical and most mundane site features.
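The profile-to-task mapping described above can be sketched as a small data structure. This is a hypothetical illustration only: the profile names come from the post, but the tasks and the exact structure are invented for the example; the deduplication step mirrors how overlapping tasks were collapsed into a shorter scenario list.

```python
# Hypothetical sketch: map user profiles to candidate test tasks, then
# collapse overlapping tasks into a single deduplicated scenario list.
# Profile names are from the post; the task wording is illustrative only.
profiles = {
    "undergraduate student": ["locate convention minutes", "find an exhibit"],
    "digital scholarship librarian": ["locate convention minutes", "find citation guidance"],
    "community member": ["find an exhibit", "learn how to partner with CCP"],
    "grant funder": ["find project impact information", "learn how to partner with CCP"],
}

# Keep each task once, in order of first appearance across profiles.
scenarios: list[str] = []
for tasks in profiles.values():
    for task in tasks:
        if task not in scenarios:
            scenarios.append(task)

# Eight tasks across four profiles reduce to five unique scenarios here.
print(scenarios)
```

In practice the real list was further trimmed by hand to balance the most critical features against the most mundane ones, which a purely mechanical deduplication cannot do.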
To home in on the most-used elements of our former site, we used questions crowdsourced from CCP members, and we settled on eight test scenarios that collectively would require users to navigate through the majority of the new linked websites’ offerings. Some test scenarios asked a question, such as “How can I partner with CCP on a panel/publication?,” while other scenarios posed a direct task for a particular site: “Locate all minutes related to the 1848 conventions.” Others required users to navigate between the two sites, such as: “Where can I find information on Henry Highland Garnet?”
Usability Testing Sessions
We tapped into our network of CCP members, affiliates, researchers, Satellite Partners, and Teaching Partners across four universities to conduct thorough usability testing based on the user profiles we crafted.* Using an initial group testing session with our diverse CCP membership of undergraduate and graduate students, librarians, postdocs, and faculty as a jumping-off point for feedback, we incorporated the insights gained into a more robust set of test scenarios that could be used for testing in person and remotely via Zoom.
In subsequent testing, our 30-minute sessions centered on 3-4 test scenarios (chosen based on which profile the tester fit), in which we asked the tester to perform the set of actions on the sites that they thought would lead to the desired results (e.g., locating a record or information about a CCP activity). We took copious notes as each tester talked through their thought process while we watched them navigate the sites to complete the test scenarios. For in-person testing, we connected a laptop to a large monitor for common viewing; for remote testing, we asked testers to use the Zoom “Share Screen” option while navigating our site. Because this is a collaborative project, it was important that at least two team members be present during each testing session to support the collection of thorough feedback through listening, asking clarifying questions, offering guidance, and taking notes.
After completing the 3-4 test scenarios, the remainder of the session time was used to gather immediate feedback from the tester about the process, their impressions of the site’s utility and navigability for the tasks attempted, and suggestions for user experience enhancement.
Working with the rhythms of the academic semester, we decided to incorporate user testing feedback through iterative improvements, such that each tester encountered a website that was more refined than the one before. In this way, we were able to maximize the generosity of time that our testers had to offer while not overextending our team or exhausting our networks. We tested each scenario at least twice (with two different users) to ensure that issues were corrected and that new issues did not arise with particular site functions. By the time we completed our final user test session, the tester was able to breeze through several test scenarios in record time, and the feedback was glowing on clarity, navigability, and ease of use.
We’d call that a hard-won success.
To ensure continued site performance and effectiveness for users, we created a feedback form on our site to collect input on an ongoing basis; submissions will be triaged and addressed by our team as time and resources allow.
As we move forward, some of our site features will undoubtedly shift as we find ways to convey information more effectively. For example, the pop-up message that appears when a user first reaches coloredconventions.org grew out of my work experience with another digital humanities project, Visualizing Emancipation, which uses a pop-up as a greeting and instructional overview to help orient site visitors. With our pop-up, we sought to inform new and returning users about our linked sites in hopes of easing navigation after some of our user testers were unable to locate pertinent information found on one of the sites. Initially set to appear on every visit to the site, the pop-up now appears less often based on cached data so as not to annoy repeat visitors. We will continue to assess the utility and frequency of the pop-up along with other features as we work to maintain overall site navigability and ease of use.
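The cached-data approach to pop-up frequency can be sketched as a simple time-based rule: show the overlay on a first visit, and again only after a set interval has elapsed. This is a minimal illustration, not the project's actual implementation; the function name, the 30-day window, and the idea of passing in a cached "last shown" timestamp (e.g., from a cookie or browser storage) are all assumptions made for the example.

```python
import time
from typing import Optional

# Illustrative re-show window; the real site's setting may differ.
SHOW_INTERVAL_SECONDS = 30 * 24 * 60 * 60  # 30 days

def should_show_popup(last_shown: Optional[float], now: float) -> bool:
    """Show the pop-up on a first visit, or once the interval has elapsed.

    `last_shown` is the cached timestamp of the previous display
    (None if the visitor has never seen the pop-up).
    """
    if last_shown is None:
        return True
    return (now - last_shown) >= SHOW_INTERVAL_SECONDS

# First visit: no cached timestamp, so the pop-up appears.
print(should_show_popup(None, time.time()))                # True
# Repeat visit ten minutes later: suppressed to avoid annoyance.
print(should_show_popup(time.time() - 600, time.time()))   # False
```

Keeping the decision in a pure function like this makes the frequency rule easy to adjust and test independently of where the timestamp is cached.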
Because what’s the point of developing a fancy new site if it’s not useful, anyway?
*Many thanks to the following people for providing invaluable feedback during user testing:
- Gwyn Johns, Librarian, Pennsylvania State University
- Dr. Alex Galarza, Digital Scholarship Librarian, University of Delaware
- Sabrina Evans, Ph.D. student, Pennsylvania State University
- Dr. Rafia Zafar, Professor and former CCP Teaching Partner, Washington University in St. Louis
- Dr. Thomas Keegan, Director of Digital Scholarship & Publishing Studio, University of Iowa (CCP Satellite Partner)
- The CCP team
The Colored Conventions Project was launched & cultivated at the University of Delaware from 2012-2020.