
ACRL Libraries Transform Toolkit: Assessment

Why assessment of marketing and outreach is challenging

Assessment may be a natural extension of instruction services, but applying it in other realms, such as outreach activities and services, is challenging at best. Relationships, satisfaction, and values are difficult to measure because they do not lend themselves to the same methods that work in the classroom.

The article by Shannon Farrell and Kristen Mastel, "Considering Outreach Assessment: Strategies, Sample Scenarios and a Call to Action," provided a strong roadmap for this section of the toolkit. It remains one of the most comprehensive resources to examine these issues holistically and to recommend specific assessment methodologies based on the goals and type of outreach being done.

In addition, the article "Mapping out a Strategy: Curriculum Mapping Applied to Outreach and Instruction Programs," by Sarah LeMire and Stephanie Graves, outlines a good strategy for getting started on documenting and assessing outreach: applying a curriculum mapping framework to capture this information for each activity and event.

There are several issues that complicate matters, all of which we hope to address in this toolkit:

1. Developing a programmatic approach to outreach activities and connecting them to the institutional mission

2. Exploring different assessment methodologies and understanding how they can be applied to various types of outreach

3. Understanding the type of data collected within the context of the broader goals of the outreach activities in question


[Image licensed under Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) by Cinthya Ippoliti]


Types of outreach assessment

This section is taken from the Outreach Assessment Toolkit developed by Amanda Hornby, Emilie Vrbancic, and Linda Whang of the University of Washington Libraries. For additional information on specific needs assessment methods, Paul McCawley's "Methods for Conducting an Educational Needs Assessment: Guidelines for Cooperative Extension System Professionals" (University of Idaho Extension) is also a great resource.

Each method below is described along with its strengths and limitations.

Capturing comments
Description: Capturing a specific idea or suggestion via paper, whiteboard, or other media.
Strengths: Great for a quick snapshot perspective.
Limitations: Little room for detailed information or for following up when a comment is ambiguous.

Ethnographic observations
Description: A wide variety of methods allowing both direct and indirect interaction with users to determine how they use services and spaces and what their experiences are.
Strengths: Great way to gather in-depth, mostly qualitative information.
Limitations: May require significant staff time and different analytical methods.

Focus groups
Description: Facilitated group discussion on a specific issue.
Strengths: Allow for in-depth discussion; great for determining user perspectives on a specific issue.
Limitations: Recruitment bias can be a problem, sessions are time consuming, and people may not feel comfortable being honest in a group setting.

Headcounts
Description: Counting attendees or users.
Strengths: Quick, quantitative, and relatively easy to collect; an indicator of how well something was attended.
Limitations: No real marker of engagement or of the usefulness of an event.

Interviews
Description: One-on-one conversations with users.
Strengths: Help gather in-depth information and specific points of view; data is relatively simple to collect.
Limitations: Can be time consuming, participation might be an issue, and individual responses might be too piecemeal to build a holistic sense of an issue. It is also hard to develop questions that are not leading or that fully represent what you want to know.

Photographic diaries
Description: Users document their experiences through photographs.
Strengths: Information comes directly from the user, who individually interacts with the subject of the study.
Limitations: Time intensive, and the data can be difficult to quantify (what part of the photo is the focus, etc.).

Self-reflection
Description: Reviewing your own program after the fact.
Strengths: Allows for a meta view of what worked and what didn't.
Limitations: Requires honesty about what the issues were, and the lessons learned must actually be applied to the next program where applicable.

Social media engagement
Description: Collecting user interactions and comments from social platforms.
Strengths: Great variety of information from different platforms, short turnaround, and direct user feedback and perspective.
Limitations: Hard to compare information across platforms; data might be inconsistent or missing; time consuming and hard to track trends over time.

Surveys
Description: Structured questionnaires distributed to users.
Strengths: Fairly quick and easy to set up, quantitatively focused, and can generate a high volume of feedback, especially with multiple-choice questions.
Limitations: Survey design is difficult, respondents experience burnout and are difficult to recruit, and the data collected is typically superficial rather than in-depth.

Vox pops and real-time impressions
Description: Quick sound bites and commentary captured as an event is happening.
Strengths: Immediate, in-the-moment user reactions.
Limitations: Only a snapshot view of what is going on; can be difficult to record and share back, and data might be hard to collect given the multimedia requirements of this capture method.