LAC Session Type
Poster
Date & Time
Friday, November 8, 2024, 6:30 PM - 7:30 PM
Location Name
Atrium Ballroom
Name
Evaluating One-shot Asynchronous, Online Historical Primary Source Instruction: A Case Study Using Student Feedback
Description

Purpose & Goals

How can we effectively assess students’ ability to evaluate and use primary sources when those skills are taught via an online tutorial in a large-scale (250+ students) writing program course that does not provide librarians access to student work products?

Design & Methodology

We reviewed data from a writing program course taught across two quarters with a combined enrollment of 574 students. Students completed an online tutorial that taught the skills required to analyze primary sources; the examples used in the tutorial were a letter and an image from the Haitian Revolution. The assessment methodology reflects best practices in rubric design and instructional design assessment, including appropriate sample sizes and interrater reliability (a calculation sketch follows the reference list below). The questions students answer about primary sources are grounded in best practices for teaching critical evaluation of primary sources.

Research that supports our work:

Zenobia Chan & Simone Ho (2019). Good and bad practices in rubrics: the perspectives of students and educators. Assessment & Evaluation in Higher Education, 44(4), 533-545. DOI: 10.1080/02602938.2018.1522528

D. Royce Sadler (2009). Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education, 34(2), 159-179. DOI: 10.1080/02602930801956059

Robert L. Johnson, James Penny & Belita Gordon (2000). The Relation Between Score Resolution Methods and Interrater Reliability: An Empirical Study of an Analytic Scoring Rubric. Applied Measurement in Education, 13(2), 121-138. DOI: 10.1207/S15324818AME1302_1

Keeping Up With… Primary Source Literacy: https://www.ala.org/acrl/publications/keeping_up_with/primary_source_literacy
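The poster does not detail how interrater reliability was calculated, so the following is a minimal sketch only: an unweighted Cohen's kappa for two raters who scored the same set of responses, assuming a 0-3 rubric scale and illustrative scores.

    # Minimal sketch: unweighted Cohen's kappa for two raters who scored
    # the same student responses on a 0-3 rubric scale (scores illustrative).
    from collections import Counter

    rater_a = [3, 2, 2, 1, 0, 3, 2, 1, 2, 3]
    rater_b = [3, 2, 1, 1, 0, 3, 2, 2, 2, 3]
    n = len(rater_a)

    # Observed agreement: proportion of responses given identical scores.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement, from each rater's marginal score distribution.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[s] * freq_b[s] for s in freq_a) / n**2

    kappa = (p_o - p_e) / (1 - p_e)
    print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 = perfect agreement; 0 = chance

As a rough convention, kappa values above about 0.6 are read as substantial agreement, a common benchmark when calibrating rubric scorers.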

Findings

Our data show that 50% of students can describe a primary source after completing the tutorial, but many struggle to compare two sources. Most students are able to make inferences and discuss how they would use primary sources for research purposes. We conclude that there are ways we can improve both the instructional content and the assessment questions. Category-level findings follow, with a tallying sketch after the list.

Describe: Half of the students scored highly on describing a source (i.e., identifying people, objects, and activities; correctly identifying the setting; making observations about its creation; and correctly identifying the intended audience). Students struggled most with identifying where the primary source originated.

Compare: Fewer than one third of the students were able to compare the two primary sources in the tutorial. Most students could compare observations about the two sources, but their responses lacked the critical thought that higher-scoring responses demonstrated. Students who scored higher in this category articulated observations around a shared theme.

Infer & Reflect: Most students were able to make inferences and reflect on a source; however, they struggled to discuss a source’s bias. This category includes why a source was created, understanding its bias, and asking questions about it. Students were able to infer the purpose of a source and ask relevant questions about it, but they could not discuss bias beyond obvious observations, e.g., “it was written in the first person.” Students who scored higher were able to connect the historical context of the source with its creator.

Use: Most students were able to demonstrate that they knew how to use the source. This category includes observing what information the source could provide about its context and how that information could be used to support data, argument, or background (the framework presented in the tutorial).
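As an illustration only, here is a minimal sketch of how such category-level tallies can be computed; the four categories match our rubric, but the scores, the 0-3 scale, and the threshold for a "high" score are hypothetical.

    # Minimal sketch: summarizing rubric scores by category to produce
    # figures such as "half of students scored highly on Describe".
    # Scores, scale, and threshold are illustrative assumptions.
    scores = {
        "Describe": [3, 1, 2, 3, 0, 3],
        "Compare": [1, 0, 2, 1, 3, 0],
        "Infer & Reflect": [2, 3, 2, 1, 3, 2],
        "Use": [3, 2, 3, 3, 1, 2],
    }

    HIGH = 2  # assumed cutoff for a "high" score on a 0-3 scale

    for category, vals in scores.items():
        pct = 100 * sum(v >= HIGH for v in vals) / len(vals)
        print(f"{category}: {pct:.0f}% scored highly")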

Action & Impact

Assessment allows instructors to adjust content or delivery where needed to improve student learning. Moving forward, we plan to refine our rubric so that it accurately reflects our intentions for student learning. We also want to create an assessment process that scales to hundreds of students without requiring hours of librarian labor to review student responses (one possible approach is sketched after the list below).

Describe: Student responses indicate that students need more instruction on how to identify where a source originates. We plan to add more explicit content to the tutorial to address this.

Compare: In reviewing the learning outcomes and content, we realized that there was no specific learning outcome for “compare” and that minimal content was presented. We will add a learning outcome and supporting content; specifically, we want to encourage students to focus on shared themes across the text and image rather than simple observations.

Infer & Reflect: We want to improve how we present content on identifying a source’s bias so that students can critically evaluate a creator’s bias within the historical context of the source.

Use: Based on student scores, the tutorial content is solid here; however, many students do not consult the framework we ask them to use when considering how to incorporate a source into their papers. We will draw more attention to it in the next iteration of the tutorial.
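One way to keep review time bounded at this scale is to score a random sample of responses rather than every submission. The poster does not prescribe a sampling method; the sketch below assumes anonymized response IDs and a sample size of 100, both hypothetical.

    # Minimal sketch: drawing a reproducible random sample of responses
    # for rubric scoring instead of reviewing every submission.
    # Response IDs and the sample size of 100 are hypothetical.
    import random

    random.seed(42)  # fixed seed so the same sample can be re-drawn

    response_ids = [f"resp_{i:04d}" for i in range(574)]
    sample = random.sample(response_ids, k=100)

    print(f"Scoring {len(sample)} of {len(response_ids)} responses")
    print(sample[:5])

A fixed seed makes the sample reproducible, so a second rater can score the same subset for reliability checks.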

Practical Implications & Value

This project tackled the challenge of teaching students how to evaluate and use primary sources via an online tutorial, as well as the challenge of assessing qualitative student responses from one-shot instruction for a large population. Summative assessment is particularly challenging for one-shot instruction because librarians typically lack access to student artifacts. Our assessment design provides librarians with a rubric that can be modified to examine student artifacts, draw conclusions about the effectiveness of student learning, and identify ways to improve workshop content.


Keywords
Teaching and learning, primary source assessment, asynchronous instruction assessment, tutorial assessment, primary sources, rubric design
Additional Authors
Dominique Turnbow, Instruction Librarian, UC San Diego Library

Dominique Turnbow combines her expertise in instructional design with over a decade of experience working in academic libraries to deliver information literacy instruction effectively in online environments. In 2002, Dominique received her MLIS from the University of California, Los Angeles, where she began her career as an instruction and reference librarian before moving to the University of California, San Diego. Since receiving her MEd degree in 2013, she has applied her instructional design expertise to the design of online information literacy tutorials.