Skelton Primo — Academic Library UX Case Study

Improved Mercer University’s Skelton Primo library interface by identifying usability barriers and recommending redesigns that reduced user abandonment.

Usability Testing

Higher Education

Search Experience

Interaction Design

Academic Library UX

Overview

Skelton Library’s Primo interface is a central tool for students searching for articles, books, and digital resources. Despite being widely adopted across academic systems, the interface often presents usability challenges for novice users. Working on a team of three, I conducted a usability study and proposed UX improvements to enhance discoverability, search clarity, and task efficiency.


High-Level Outcomes:

  • Identified top usability barriers related to search filters and terminology

  • Proposed IA and interaction design enhancements for clearer navigation

  • Improved user ability to complete common academic research tasks

Industry

Education

My Role

UX/UI Researcher

Organization

University Libraries / Skelton Library

Timeline

January 2021 - March 2021

Business Problem


Users frequently struggle with locating materials on the current Primo interface. This results in:


  • Inefficient research workflows

  • Overreliance on librarian assistance

  • Missed or overlooked resources


Business Objective: Understand the interface’s current usability issues so that engineers can enhance the user experience, enabling users, mainly students, to locate resources independently, efficiently, and with minimal friction.

Research Questions


  • Can users locate and use advanced search features effectively?

  • Are filters, labels, and categories intuitive for academic research tasks?

  • Which parts of the interface cause hesitation or repeated errors?

Research Methods

Approach: Moderated usability testing and comparative analysis
Participants: 5 Faculty/Staff Members at Mercer University

Tools: Zoom

Tasks Observed:

  • Searching for a known article

  • Using filters to narrow search results

  • Accessing full-text materials

  • Locating physical resources

Artifacts Produced:

  • Test Plan & Script

  • Observation Notes

  • Formal Usability Report (for client)

  • Presentation (for client)

Figure 1: Script/tasklist for usability testing

Key Findings


A. Filter and Label Confusion

Users did not recognize certain buttons because they lacked labels or guides explaining their function. Participants also misunderstood filter terminology such as Availability, Resource Type, and Full Text Online.

B. Accessibility Issues

During usability tests, users reported that the text/font sizes on the pages were too small, which negatively impacted readability and overall user comfort.

C. Navigation Disconnects

Participants struggled to locate essential actions like saving, exporting, or accessing advanced options.

D. Website Abandonment

Users noted that they were more likely to abandon the interface for other external tools when faced with confusion or obstacles on the interface.

Figure 2: Examples of label issues that users encountered

UX Recommendations

1. Clarify Filter Terminology with Plain Language

Replace ambiguous labels with task-oriented language.


2. Introduce a Results Page Visual Hierarchy

Distinguish primary actions (download, open, view) from secondary ones.


3. Combine Related Filters

Grouping related filters reduces cognitive load and improves scannability.


4. Improve Advanced Search Entry Points

Ensure advanced options are easy to locate and clearly signposted.


5. Add Quick-Start Guidance for New Users

Brief contextual tooltips or short guides support novice researchers.

Impact

Expected improvements include:


  • Increased search success rates

  • Reduced dependency on library staff for basic tasks

  • Faster discovery of relevant materials

  • More intuitive filtering and navigation for novice users

  • Higher user retention on the interface

Reflection

Key lessons from this case study include recognizing that:


  • Academic platforms require balancing complexity with clarity.

  • Small changes to filter language dramatically improve usability.

  • Students benefit from guided search pathways when learning new systems.

Future Opportunities/Areas of Improvement

During the study, I identified a few limitations and areas of improvement, along with potential fixes for each.


  • Broader Group of Participants for Usability Testing

    • Suggestion: Although we reached our target group for the usability tests, including medical students in recruitment would have made the study more focused, as they are more likely to interact with the Skelton Primo interface.


  • Limited time for usability testing

    • Suggestion: With the project lasting only two months in total, we did not have enough time to test with the number of participants we originally planned, which is why we ran only five sessions with five participants. Ideally, I would request additional time or resources to complete more tests.

Jeremiah Pulliam

Copyright © 2025 by Jeremiah Pulliam