Unbundling the small(ish) engineering journal package: Lessons learned
In a discussion group hosted by the STS Scholarly Communications Committee at ALA Midwinter, I had the opportunity to describe a case study in serials collection management at my institution: the “unbundling” of a small(ish) journal package in Engineering. The context for this study should be familiar to most readers: journal subscription costs increasing 4-6% annually (Bosch et al., 2018) against a backdrop of flat materials budgets, which creates challenges for access to serial resources and causes academic libraries to consider cancellation of scholarly journal packages (SPARC, 2019). While the net savings from unbundling our Engineering journal package were modest (about $30K annually), other institutions have effectively applied elements of our process to the evaluation of larger serial packages (Pedersen et al., 2014).
In this Sci Post, I will present lessons learned during our unbundling experience, and propose several practices that other academic institutions might consider in evaluating journals and journal packages for (non-)renewal:
1. Initiate and maintain a conversation with faculty about their serial needs, and about serial resource value, cost, and access. In the five years prior to our evaluation of this small package, we conducted an annual “journal review” of individually subscribed titles and small packages. Faculty in each department were provided a relatively short list of titles, along with data on the cost and usage of each, and asked to categorize them as high/medium/low in terms of how critical each was to their research and teaching. The journal review also provided an opportunity for faculty to express their needs for new subscriptions to unheld titles. This annual review somewhat paved the way for our unbundling exercise, in that faculty had previously been exposed to the metrics used for journal evaluation, and had provided feedback on the titles in the package over several years.
2. Consider a variety of usage modes/metrics when evaluating serial titles for renewal or cancellation. Our decision on how best to unbundle the package (i.e., which titles should be individually subscribed to post-breakup) was based largely on full-text download volume as an indicator of future ILL demand. However, other indicators of use/value (including citation by our faculty, and JCR rank in category) were also compiled, shared with stakeholders, and used to inform the decision. Along these lines, librarians working in the Triangle Research Library Network (TRLN) have proposed the use of a Cost-Per-Cited-Reference (CPCR) metric to supplement traditional Cost-Per-Use (CPU) metrics and other factors in evaluating e-journal packages for consortia (Martin et al., 2016).
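Both metrics mentioned above reduce to simple ratios: CPU divides a title's annual subscription cost by its full-text downloads, while CPCR divides the same cost by the number of references to that title appearing in local faculty publications. A minimal sketch, using hypothetical journal names and figures (not our actual package data), shows how the two can be computed side by side:

```python
# Illustrative sketch only: the titles and figures below are hypothetical,
# not data from the package described in this post.

def cost_per_use(annual_cost, downloads):
    """Traditional CPU: subscription cost per full-text download."""
    return annual_cost / downloads

def cost_per_cited_reference(annual_cost, cited_refs):
    """CPCR (as proposed by Martin et al.): subscription cost per
    reference to the title in local faculty publications."""
    return annual_cost / cited_refs

# (title, annual cost in $, full-text downloads, cited references)
package = [
    ("Journal A", 2400.00, 800, 120),
    ("Journal B", 1800.00, 600, 10),
    ("Journal C", 3200.00, 400, 95),
]

for title, cost, downloads, cited in package:
    print(f"{title}: CPU = ${cost_per_use(cost, downloads):.2f}, "
          f"CPCR = ${cost_per_cited_reference(cost, cited):.2f}")
```

Note how the hypothetical "Journal B" looks inexpensive per download but costly per cited reference, which is why the TRLN authors position CPCR as a supplement to, rather than a replacement for, CPU.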
3. Be transparent by sharing details of the analysis with affected faculty constituents when cancellation looms. When presented with the background and details of our unbundling proposal, engineering faculty in the affected departments grasped the situation readily and reacted positively. Each institution, school/college, and department will differ in how faculty receive such news, but based on this experience we tend to present faculty stakeholders with the facts and let the facts speak (mostly) for themselves.
Bosch, S., Albee, B., and Henderson, K., 2018, "Death By 1,000 Cuts," Library Journal, 143(7), p. 28.
SPARC, 2019, "Big Deal Cancellation Tracking," https://sparcopen.org/our-work/big-deal-cancellation-tracking/.
Pedersen, W. A., Arcand, J., and Forbis, M., 2014, "The Big Deal, Interlibrary Loan, and Building the User-Centered Journal Collection: A Case Study," Serials Review, 40(4), pp. 242-250.
Martin, V., Gray, T., Kilb, M., and Minchew, T., 2016, "Analyzing Consortial "Big Deals" via a Cost-Per-Cited-Reference (CPCR) Metric," Serials Review, 42(4), pp. 293-305.
Author of this post:
Jim Van Loon, Science Librarian, Wayne State University