LAC Session Type
Poster
Name
Are These Good Investments? Evaluation Rubric for Open Access Investments
Description

Purpose & Goals

Library professionals have long made use of evaluative tools for decision-making around collection development. Traditional tools are adequate for many collection development scenarios, but the increasing variety of acquisition models, combined with entrenched budgetary challenges, calls for new methods and inventive thinking. An R1 academic library's Open Access Task Force developed a rubric to assess specific resource needs and opportunities in a rigorous and transparent manner. Rubrics can be both instruments of collection analysis and informative objects that help guide the values and philosophies underlying our collections practices. This presentation illustrates the rubric in detail, along with examples of initiatives evaluated.

Design & Methodology

The evaluation tool includes criteria for assessing a publishing initiative's value, the risks and benefits to institutional authors, and factors such as acquisition budgets, licensing, author rights, and sustainability. The rubric identifies six categories (cost; sustainability & risk; significance to the institution's authors; ethical practices; equity & mission; and logistical feasibility, including user privacy) and is used to score each initiative received from publishers of all types. The presentation will include a sample of a completed analysis and provide library expenditure data that demonstrate the cost/benefit analysis of open access publishing, and of read and publish agreements specifically.

Findings

Over three years, nine agreements were evaluated and seven were accepted. Across those seven publishers, open access publications increased by 16.8%, and the article processing charges (APCs) defrayed were valued at $479,067 for 229 articles. The renewal cost of these agreements increased over four years by almost $50,000, or 12.9%. While these results appear favorable, the larger question of the affordability of an open access future remains.

Action & Impact

This rubric helps the OA Task Force understand the many ways that a publishing agreement can support our university community. It takes into consideration the mission of the publishers and the value and diversity of their content, and it weighs the author-rights implications of each initiative. In this way, our rubric creates transparency that facilitates decision-making and offers that rationale to all interested stakeholders. As publishers continue to monetize aspects of open access publishing, this effort can help libraries understand which agreements work for their communities, and it provides evidence to support the growing need for budget expansion and for advocacy at levels of government where academic libraries' challenges with open access mandates could be better articulated.
Practical Implications & Value

According to CHORUS, the independent, nonprofit membership organization that tracks open and public access publications, the university's OA publications have increased eleven-fold, from 735 in 2017 to 8,211 in 2022. Academic libraries continue to be funded without year-to-year increases while for-profit publishing costs continue to rise. The aim of the library's Open Access Task Force evaluation rubric is to help our acquisitions unit identify beneficial scholarly publishing agreements and balance the needs of the campus community with the many resources offered by the open research ecosystem.

To date, the rubric has been presented to the statewide consortial assessment committee and at two other conferences. However, we are interested in expanding the study to examine other factors in open access publishing, such as author rights and development fees, and we solicit feedback at presentations on how to incorporate them. LAC 2024 is the only assessment-focused community to view this work. The rubric is publicly available on a free, open research platform.

Keywords
Open access, read and publish, rubric, assessment
Additional Authors
Erin Gallagher, UF George A. Smathers Library
Tara Tobin Cataldo, UF Academic Research Consulting & Services
Suzanne Stapleton, UF Marston Science Library
Perry Collins, National Endowment for the Humanities