LAC Session Type
Paper
Name
Using analytics and qualitative methods to improve and sustain online tutorials and research guides
Description

Purpose & Goals

Academic libraries produce many online learning objects, but with this abundance of materials comes the need for robust assessment. Librarians from the University of North Carolina Greensboro (UNCG) create, edit, and market a suite of research tutorials, as well as many Springshare LibGuides (research guides). The research tutorials, ULTRA (University Libraries Tutorials for Research Assistance), have become a popular research resource, and LibGuides are integrated into the learning management system (LMS) as well as the library website, where they are heavily used. There are now hundreds of course and subject LibGuides and dozens of research modules in use by UNCG and global patrons. This paper discusses asynchronous assessment strategies for LibGuides and ULTRA, how these virtual resources have changed in response to feedback, and what future directions we should take to improve on and sustain our success. The paper's research question is: "How can academic librarians best perform assessment using analytics and qualitative methods to improve and sustain online tutorials and research guides?"

Design & Methodology

Assessment can highlight the effectiveness of online tutorials and research guides by looking at patron research needs. Some libraries assess their tutorials and LibGuides by examining a variety of data, including analytics and patron surveys; Blummer (2007) evaluated academic library tutorials using this kind of mixed-methods study. Implementing tutorials through an LMS and within a course allows librarians and instructors to measure whether information literacy learning outcomes are met (Fontane, 2017; Henrich & Attebury, 2012). Being able to compare tutorial types, such as video versus interactive HTML5-based web pages, can help determine what approach academic libraries should take when creating research resources (Lantz et al., 2017; Stonebraker, 2015). Assessing whether patrons gained knowledge after taking tutorials, whether through post-tests or usability studies, is useful when designing research resources (Fontane, 2017; Held & Gil-Trejo, 2016; Lindsay et al., 2006). For this online tutorial and LibGuide assessment project, we used a mixed-methods approach. Data was drawn from online analytics tools, as well as a feedback form that users can fill out toward the end of each tutorial. We also conducted a diversity, equity, inclusion, and accessibility (DEIA) audit of each tutorial. Through these evaluations we hope to ensure accessibility, respect for diversity and inclusivity, and transparent content that encourages continual learning. The DEIA audits were conducted by Library and Information Science (LIS) graduate interns, who used rubrics to score each tutorial. Lastly, students are being surveyed about the ULTRA tutorials' relevance to assignments and research practices in their courses. Citations and more literature on asynchronous online assessment: https://go.uncg.edu/s9otjg
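As a purely illustrative sketch (not part of the paper's method), rubric-based audit scores like those the LIS interns produced can be averaged per tutorial to flag content needing revision. The rubric categories, score scale, and tutorial names below are invented for illustration; the actual UNCG rubrics and data are not reproduced here.

```python
# Hypothetical sketch: averaging DEIA rubric scores across auditors.
# Category names, the 1-5 scale, and all scores are invented examples.
from statistics import mean

# Each intern scores a tutorial on several rubric categories (1-5).
audits = {
    "Citing Sources": [
        {"accessibility": 5, "inclusive_language": 4, "transparency": 5},
        {"accessibility": 4, "inclusive_language": 4, "transparency": 5},
    ],
    "Evaluating Information": [
        {"accessibility": 3, "inclusive_language": 5, "transparency": 4},
    ],
}

def summarize(audits):
    """Average each rubric category across auditors for every tutorial."""
    summary = {}
    for tutorial, scores in audits.items():
        categories = scores[0].keys()
        summary[tutorial] = {
            c: round(mean(s[c] for s in scores), 2) for c in categories
        }
    return summary

for tutorial, averages in summarize(audits).items():
    print(tutorial, averages)
```

A summary like this could then be combined with analytics and survey responses to prioritize which tutorials to revise first.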
Findings

Having a variety of assessment data and methodologies to improve online tutorials and LibGuides has yielded many different findings. So far we have found:

- Issues with the learning management research modules, reported through the feedback form on the ULTRA tutorials, with professors using older versions of the materials. This has helped us rethink our communication strategies with instructors to better integrate the modules within Canvas.
- Patterns of access to LibGuides through Canvas and the library website, which help librarians understand how students are connecting to materials, as well as navigation issues within LibGuides.
- Diversity and inclusion concerns within ULTRA, identified so the tutorials best represent our students.
- Survey results showing problems with link maintenance, usability issues, and ideas for improvement for when the modules are migrated to a new system.

Action & Impact

In terms of LibGuides assessment, this data is helpful when reviewing the structure and consistency of LibGuides, as well as how students are finding guides through the website or the Canvas LMS. UNCG University Libraries will be migrating to a new online tutorial system in 2025 due to issues with server space; these tutorials and modules will now run through Springshare LibWizard. This migration has created an ideal opening to update and improve our materials based on feedback and our previous experiences. The DEIA audit and the survey allow librarians to have a sustainable assessment method in place for improving and reviewing content for the research modules. Ultimately, these modules can become more relevant to UNCG-specific research assignments and student needs.

Practical Implications & Value

Librarians are creating more asynchronous online content, services, and teaching sessions than ever before in higher education, demanding increased retrospective analysis of what instructional content exists, what works, what does not, and what projects should come next. The ability to tell the story of how libraries meet the needs of learners (including online and distance students) is essential as we continue to navigate this dynamic and transformative era of academic librarianship. This paper provides a blueprint for asynchronous assessment strategies that academic librarians can adapt to their own institution's team and needs.

Keywords
Research tutorials, LibGuides, asynchronous assessment, mixed methods