LAC Session Type
Paper
Name
Using an environmental scan process and an AI assistant to evaluate the current OER and Open Practice landscape
Description

Purpose & Goals

During these uncertain times in higher education, college degree affordability and decreasing student enrollment are issues surfacing on many college campuses. Open Educational Resources (OERs) and no-cost textbooks are touted as a possible student success strategy, and librarians have become primary players in this solution. In addition, the Covid-19 pandemic exposed gaps in traditional library practices, and post-Covid change continues to pivot toward digital resources and online learning. Although many OER best practices and examples are available to help librarians find and promote OERs, Open Practice (OP), or using OERs in teaching, is less studied. At an institution with no formal OER or OP initiatives, and no librarian or staff member officially dedicated to OERs and/or OP, this research was designed to collect data that can be used to jump-start decision-making and planning for new open education initiatives. The researcher is a librarian who also teaches graduate courses in qualitative research methods and uses OERs and OP in her own teaching, but she is not an expert in this area. A 2023 sabbatical provided time to investigate the opportunities, benefits, and challenges of OERs and OP. It also afforded time to explore the new AI assistant features embedded in Atlas.ti, a qualitative data analysis tool taught in her classes. The research questions are:

  • What are the OER and associated open education practice trends being discussed, reported on, and utilized both inside and outside the library? How do these trends align with the researcher’s library and institutional priorities?
  • Based on evidence collected in an environmental scan of the open education landscape, what could an open education initiative at an R1 public university look like?
  • How could the ChatGPT-based AI tools embedded in computer-assisted qualitative data analysis software (CAQDAS), such as Atlas.ti, be used to conduct an environmental scan?

Design & Methodology

This paper presents the results of a 7-step environmental scan process (Wilburn et al., 2016) conducted during a 2023 librarian sabbatical to explore, document, and analyze OER, open practice, and the broader open education landscape. Environmental scanning is an analysis tool used to inform decision-making for designing new policy, program planning, and initiative development. The researcher, not an open education expert, used the scanning process to systematically learn about OERs and open education, as well as to learn how to use Atlas.ti, an AI-assisted qualitative data analysis tool. Data were collected from open and available sources: (1) transcripts of open education webinars, events, and interviews; (2) abstracts of open education scholarly articles; (3) reports and newsletters; and (4) open education listservs, notes, blogs, and websites. In addition, evidence and statistics on state open education initiatives and national peer institutions were gathered. Names of open education organizations, OER experts, and OER textbook titles were also curated as part of the environmental scan process. A Content Analysis (CA) methodology (White and Marsh, 2006) was then used to analyze the data to identify trends and patterns. Data were collected, cleaned, and imported into Atlas.ti for analysis. Two auto-coding methods were used to code and summarize the data: a semi-manual method that used word frequencies to identify candidate codes, and the automated Atlas.ti AI-assistant tool. The AI assistant provided the primary coding of the data, and the researcher served as the second coder, triangulating the coding using manual coding strategies. The biggest lesson learned was that the auto-coding process produced too many codes and categories (between 500 and 600 codes). Codes were manually merged, purged, and reorganized into new categories before thematic analysis was conducted. Categories were then analyzed for patterns and connections, and five themes were identified.
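
To make the semi-manual, word-frequency step more concrete, the sketch below shows one way candidate codes could be pulled from a cleaned transcript. This is an illustrative assumption only, not the tooling used in the study (the study relied on Atlas.ti's built-in word frequency and AI coding features); the file name `webinar_transcript.txt` and the stop-word list are hypothetical.

```python
# Illustrative sketch only: the study used Atlas.ti's built-in word frequency
# and AI coding tools. This standalone example shows the same general idea of
# turning word frequencies into candidate codes for manual review.
import re
from collections import Counter

# Hypothetical, minimal stop-word list; a real pass would use a fuller one.
STOP_WORDS = {
    "the", "and", "of", "to", "a", "in", "that", "is", "for", "on",
    "with", "as", "are", "this", "be", "or", "it", "by", "an", "we",
}

def candidate_codes(text: str, top_n: int = 25) -> list[tuple[str, int]]:
    """Return the most frequent non-stop-words as candidate codes."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS and len(w) > 2)
    return counts.most_common(top_n)

if __name__ == "__main__":
    # Hypothetical input: a cleaned webinar transcript exported as plain text.
    with open("webinar_transcript.txt", encoding="utf-8") as f:
        for term, freq in candidate_codes(f.read()):
            print(f"{term}\t{freq}")
```

In practice, a list like this would only suggest starting points; the researcher would still review each term in context before promoting it to a code.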

Findings

The large number of codes and categories created through the AI-assisted auto-coding tool was overwhelming, but the process also created an opportunity to be totally immersed in the data in a way not usually possible with more traditional coding practices. The categories resulted in a detailed understanding of the open education topic themes, and they also provided tagged excerpts of the countries represented in the data, the relationship between openness and data types, details on the impact on student success, and the range of methodologies described in the abstracts. Codes and themes also indicated the value that OERs and open practices could contribute to other campus initiatives. However, implementing OERs and open education practices is not without challenges; those presented in the data include funding issues, faculty resistance to using OERs that may not be peer-reviewed texts, and the time required to find, adapt, or create OERs. A third level of coding (selective coding), combined with constant comparative strategies, resulted in five themes. These themes will be used to suggest possible next steps for moving forward with OER and open education initiatives at the researcher’s institution. The five themes are:

  • Integrating OERs and open practices on a college campus is a team sport
  • Building campus-wide awareness, open education capacity, and digital competencies that are customized to university culture and context
  • Designing a community of openness to bridge support for different stakeholders
  • Impacting student success through empowering students using equitable and inclusive open pedagogy best practices
  • Extending beyond the OER and aligning open education work to the library and university strategic planning priorities

Action & Impact

Since there are currently no formal OER or open education initiatives on the researcher’s campus, except for open access publishing, it was important to gather broad data and set guidelines around the planning process for new OER and open education initiatives. Findings from this research will be used as recommendations for discussions and for building a foundation for identifying open education planning strategies. The environmental scan provides a 30,000-foot view of the broader open education topic, drawn not just from what is being published, but also from what experts are presenting and discussing at conferences, in meetings, and in organizational communications. The next step is to design a way to visualize the findings so that an advisory group can use them to make decisions and move open education planning forward. The researcher is using the findings to complete a preliminary sabbatical report for the Library Dean and to make recommendations for the library’s role in a campus-wide open education initiative. She will present her findings and recommendations, with a focus on the administrative aspects of designing and implementing open education, to two academic associate vice presidents who are OER supporters and will become critical campus partners in decision-making, planning, and implementation. An online, self-directed Canvas course already in production to introduce faculty to OERs and open education will need to be restructured, based on the research findings, to meet the needs of all stakeholders, not just faculty. Logic model planning, project mapping, and decision-making by the advisory group will lead to an assessment plan for measuring the success of the open education initiatives and their impact on library stakeholders.

Practical Implications & Value

This research contributes to the library assessment community by presenting the how-tos and lessons learned from using an alternative data collection process, the environmental scan. This scan method could be adapted and used for exploring any library initiative or outreach project. We often think of conducting a needs assessment before planning library programming, but the environmental scan provides a broader view of the situational context and trends around a topic. After an environmental scan, a team or advisory group could use the findings to help create a logic model, a needs assessment, an assessment plan, or a communication plan. This research also provides guidance on using AI tools for qualitative data analysis, which will be of interest to the community. The researcher plans to use a Playbook to present information and instructions that people can use to make progress on tasks such as information gathering, training, or using tools. Playbooks are already used on the researcher’s campus for curating online learning policies and processes, in the school of business, and for international grant work, so there is campus familiarity with the term. In this case, an Open Education Playbook will provide a framework for organizing all OER and OP initiative work, policies, training materials, tools, resources, and recommendations for using OERs and open practice in teaching. It will also be a way to create an open education presence, make the case for OERs and open education, and collect and display statistics. A Playbook provides an opportunity to continue building on the open education work; for example, new initiatives such as grants or graduate student opportunities can be posted and spotlighted as the campus moves further along the OER journey. SPARC also uses a playbook format to present OER state policy.

Keywords
OERs, open pedagogy, open education, content analysis methodology, environmental scan process, AI data analysis