LAC Session Type
Poster
Date & Time
Friday, November 8, 2024, 7:30 PM - 8:30 PM
Location Name
Atrium Ballroom
Name
Where’s the Search Box?: Usability Testing Research Guides Created in LibGuides and Adobe Express
Description

Purpose & Goals

This poster explores students’ use and perceptions of online library research guides, comparing guides created in Springshare’s LibGuides product with webpages created in Adobe Express, a platform that is not library-specific. Librarians routinely devote a large amount of time to creating and maintaining online research guides for students, and for many, LibGuides is the default platform. However, as with any undertaking that demands such a high time investment, assessment is needed to determine whether LibGuides is still the best choice for guide creation, or whether general-purpose technology, which students may also encounter outside the library context, might better serve the purpose of linking students to resources and providing basic research guidance. The home institution where this study was conducted is an Adobe Creative Campus, and some librarians there had begun using Adobe Express webpages, which students may already be familiar with, to share the same content typically contained in a research guide. The purpose of this poster is to compare and contrast common benefits discovered and issues encountered by students using the two platforms. Library-specific software is created with libraries’ unique user needs and instructional contexts in mind, and so provides a high level of functionality on the librarian side. Still, because librarians are not the intended audience of online research guides, seeking a student perspective is key to understanding the extent to which guides accomplish their goal of empowering students to conduct research. Gathering data on guide use can inform librarians’ decisions about where a pivot to new platforms or new design practices may be beneficial.

Design & Methodology

The methodology used in this study is usability testing. In a usability test, a small group of users is asked to perform a series of set tasks using a website or platform. An observer provides the tasks, takes notes, and prompts participants to share their impressions verbally using the “think-aloud” method, in which participants talk through their thoughts and observations as they complete the tasks. Usability tests focus on platform issues, not user skills, since the goal is for a software platform to be usable by patrons of all skill levels. Because the issues participants identify tend to overlap heavily, a group of five participants is usually sufficient to gather enough data to inform a platform redesign. In the present study, two groups of six students each were recruited, so that each platform (LibGuides and Adobe Express) had its own tester group, preventing participants’ experience with one platform from coloring their experience with the other. All participants were undergraduate students with varying levels of experience and confidence in using research guides and conducting research. The researchers created guides in both LibGuides and Adobe Express, replicating content and features as closely as possible and following design best practices for each platform. Students were then given tasks asking them to search for databases and books, find research tips and tools, and get help from librarians, using the guides as starting points. Students’ interactions with the guides, their think-aloud comments, and their pre- and post-test questionnaires were recorded and analyzed for shared themes and frequently encountered pain points.

Action & Impact

The findings from this study suggest several directions in which librarians may change their research guide creation process. With a better understanding of the typical issues students encounter when using both LibGuides and Adobe Express, librarians at the study’s home institution will be better able to design research guides that direct students to important resources and keep them from getting sidetracked or confused by non-essential content. Preliminary data shows that each platform has its strengths and weaknesses, and that there are traps students tend to fall into in both; being aware of these will inform better guide design, regardless of which platform librarians adopt for guide creation. In addition to designing better guides, librarians will also be able to make better-informed decisions about which platform to use for guide creation in the first place. Even if the institution retains LibGuides as the content platform for most contexts, given the convenience of its built-in integrations with other library products, there are still contexts in which Adobe Express guides may be more suitable, depending on students’ demonstrated navigation habits, how guides are used or introduced in class, and the experience level of users. Understanding these factors will allow librarians to more confidently select the appropriate tool from the array of options available. Lastly, conducting the study has reinforced the value of usability testing in assessing the user experience of online platforms, and future research may apply the methodology to other areas of library services.

Practical Implications & Value

The anticipated community response to the research covered in this poster takes a few forms. First, because many academic institutions rely on Springshare’s LibGuides platform for research guide hosting, community members will become more aware of the issues students encounter when using different design features in LibGuides. The data from that portion of the usability test will help inspire better design practices and may also encourage librarians to instruct students in using guides rather than assuming their navigation will be seamless. Second, the project invites community members to consider using other software platforms for digital learning object creation, beyond software designed specifically for library use. Adobe Express and products like it have their own sets of issues, but they may be better suited for certain purposes; at the very least, librarians should be proactive in staying aware of and considering all available options. Finally, this project contributes to the culture of assessment by encouraging the audience to routinely examine the user experience of even long-ingrained services and products, to understand how they are received and what issues users may encounter. The usability testing method can be particularly helpful to academic libraries, since it allows practical assessment with just a small participant group. This project pulls the need for assessment into the sphere of user experience and offers an example of how such a project can be carried out.

Keywords
Usability, user experience, LibGuides, Adobe, digital learning object