NISO Training Series: Assessment Practices and Metrics for the 21st Century

This 2018 Training Series Was Sold Out!

Note: Archived recordings of the 2018 program are available for free with registration for the 2019 series.

Objective: To provide consistent training and a baseline of knowledge for information professionals considering and applying assessment practices for scholarly resources across multiple information services and systems.

Who Will Benefit From This Training Series:

  • Academic librarians whose professional responsibilities include assessment or who have been assigned to conduct assessment exercises in their own library
  • Early or mid-career publishing staff in product development or marketing roles who seek a better understanding of assessment activities and the metrics by which a product or service may be evaluated
  • Mid-career managers or supervisors whose roles require them to conduct small and medium-sized assessment projects or exercises

Course Moderator: Martha Kyrillidou, Director & CEO, Qualitymetrics, LLC

Confirmed Guest Lecturers:

  • Starr Hoffman, Director, Planning & Assessment, University of Nevada, Las Vegas;
  • Sarah Murphy, Assessment Coordinator, Research and Education, Thompson Library, Ohio State University;
  • Steven Braun, Data Analytics and Visualization Specialist, Northeastern University Libraries;
  • Nancy B. Turner, Associate Director for Organizational Research & Strategy Alignment, Temple University;
  • Frankie Wilson, Head of Assessment & Secretariat, Bodleian Library, Oxford University;
  • Rachel Lewellen, Head of Assessment and Program Management, Harvard Library;
  • Charles Watkinson, Associate University Librarian, University of Michigan;
  • Scott W.H. Young, User Experience & Assessment Librarian, Montana State University, and Doralyn Rossmann, Associate Professor and Head of Digital Library Initiatives, Montana State University.

Course Dates and Duration: October 19, 2018 – December 20, 2018. The course will consist of eight segments, roughly one per week, each lasting approximately 60-90 minutes. We will schedule a consistent time across the series; in NISO's experience, this is usually a Friday lunch hour, 11:30 a.m. – 1:00 p.m. (Eastern).

Students supply their own computers for participation in the program. Guest lecturers may be featured in specific segments, as the course instructor determines necessary. Exercises may not be required for every course segment.

The course will begin by introducing current assessment activities and practices so that attendees gain practical, immediately applicable knowledge. The closing sessions will encourage attendees to think about emerging needs associated with the assessment of information products and services.

Basic Competencies: This course assumes that the registrant will already have a basic familiarity with the following:

  • Basic Statistical Analysis
  • Data Collection and Management
  • Established forms of reporting usage (COUNTER, SUSHI, etc.)

Event Sessions

October 19 - Basis of Assessment in Academic Libraries: An Overview

This opening session will provide the rationale for and benefits of assessment training. Administrators, librarians, and others working in the academic environment need to base decisions on user and usage data gathered from a variety of library services and systems. This initial overview lays the groundwork for understanding what's different about library assessment in the 21st century, the variety of means for conducting assessment in the library, the skill sets needed, and the challenges to be faced, whether those are issues of handling sensitive personal data or unanticipated resistance from co-workers.

Question to Audience: How do you hope to benefit or what do you hope to gain from this training series?

Attendee, University of Pittsburgh: more focus on outcomes; how do we prove causality?

Attendee, University of Tennessee-Knoxville: One strength of our program is that we have done a lot to establish a culture of assessment, so folks throughout the Libraries are doing assessment. I'd like our assessment program to be better integrated with the university's assessment program.

Attendee, McGill University: What do you see as being the hot current trends in assessment and UX?

Attendee, NIH: What we would like to see: dynamic data analytics (dashboards, data visualization) to assist in library decision-making

Attendee, UC-Berkeley: The strengths are mostly focused on input and output assessment, but not outcome assessment.

Attendee, University of Florida

1. Supported by library leadership and welcomed by the university community.

2. Institutional research data; it would be great to have more consistent data reporting on use of electronic resources.

3. Greater interaction with Institutional Research to be able to examine relationships between student library use and academic success

Attendee, University of Wyoming:

1. student learning assessment is our strength currently.

2. [Missing Information]

3. We want a Libraries-wide assessment program. It's currently smaller in scale.

Question to Audience: What types of data would be meaningful to your assessment concerns? What might be of value to decision-makers?

Attendee, NIH: Data meaningful to decision makers: resource usage statistics, instruction course survey results & feedback, customer satisfaction survey results

Attendee, University of Florida: Data from a multimethod study of overnight library and auxiliary study space use.

Attendee, UC-Berkeley: We would like to have the data that can be integrated among multiple resources, e.g. integrated with university enrollment and faculty data

Attendee, University of North Texas: For collection decisions, CPU is still the primary metric (Note: CPU is cost-per-use). Again, for decisions about subscribing to Big Deals, we use the distribution of uses across all titles (aka "Pareto" or "80/20"). CPU does need to be evaluated in the context of the type of resource.

A few years ago, we led an effort to coordinate information literacy with the local school district.  Our university has been leading efforts in dual-credit programs with local schools. I'm interested in efforts of libraries to reach these kinds of students.

Attendee, University of Pittsburgh: We used instructor survey data to update a first-year course library visit.

Attendee, University of Oregon: We use a median CPU over a period of time (4-5 yrs) and the median subscription price, then compare that cost with the potential ILL costs if we stopped the subscription.

Attendee, Vanderbilt: Agreed, CPU is used regularly. Additionally, we have looked at attendance during holidays to determine which libraries need to be open on holidays and over winter break.
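Several of the responses above rely on the same underlying arithmetic: cost-per-use (CPU), a median CPU taken over several years and weighed against interlibrary loan (ILL) costs, and the distribution of use across all titles in a Big Deal. The Python sketch below works through each calculation; every cost, use count, and the per-request ILL figure is a hypothetical illustration, not real subscription data.

```python
# Hypothetical cost-per-use (CPU) arithmetic, as described by the
# attendees above. All figures are invented for illustration.
from statistics import median

# Five years of invented cost and use data for one journal subscription
annual_cost = {2014: 1200.0, 2015: 1260.0, 2016: 1323.0, 2017: 1389.0, 2018: 1458.0}
annual_uses = {2014: 310, 2015: 290, 2016: 240, 2017: 260, 2018: 220}

# CPU per year, then a median CPU over the period (the multi-year approach)
cpu_by_year = {yr: annual_cost[yr] / annual_uses[yr] for yr in annual_cost}
median_cpu = median(cpu_by_year.values())

# Weigh the median CPU against an assumed per-request ILL cost to gauge
# whether cancelling the subscription and borrowing instead would be cheaper
assumed_ill_cost = 12.0  # hypothetical ILL cost per request
print(f"Median CPU ${median_cpu:.2f} vs. ILL ${assumed_ill_cost:.2f} per request")

# The "Pareto"/"80/20" check for a Big Deal: what share of total use
# comes from the top fifth of titles? (invented per-title use counts)
package_uses = sorted([950, 610, 400, 220, 130, 80, 40, 25, 10, 5], reverse=True)
top_fifth = package_uses[: max(1, len(package_uses) // 5)]
share = sum(top_fifth) / sum(package_uses)
print(f"Top 20% of titles account for {share:.0%} of total use")
```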

October 26 - Collection of Data and Research Design

This second session will address the starting point of any assessment activity – an understanding of what data may be available to the investigator, what additional data may be needed and the process of research design. The lecturer will touch on privacy concerns in the gathering of data as well as the challenges of collecting and working with data contained in third-party provider systems.

Question from the Audience: Can you provide links to some of what was referenced in your presentation?

Sarah Murphy's Tableau Public assessment content for the Ohio State University Libraries can be found on her Tableau Public profile.

This is a link to UNLV Libraries' survey of some of the renovations on our second floor. The results can be filtered using the floor map on the left, so each separate area (and different renovation project) can be viewed individually. 

https://public.tableau.com/profile/unlv.libraries.assessment#!/vizhome/LiedSecondFloorSpaceSurvey/LiedSecondFloorSpaceSurveyResults

November 2 - Using Available Tools

Once the individual charged with assessment has inventoried available data and collected any additional data needed, the next step is to select appropriate software for working with that data. This session will outline a range of APIs, plug-ins, and other available software (Excel, Tableau, MINES for Libraries, etc.) that allow professionals to “get their hands dirty” in productively working with the data.
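As one concrete illustration of the kind of API work this session covers, the sketch below pulls a COUNTER Release 5 usage report through the COUNTER_SUSHI REST API using Python's requests library. The base URL and credentials are placeholders; each content platform publishes its own SUSHI endpoint, and some providers additionally require an api_key parameter.

```python
# A minimal sketch of harvesting a COUNTER Release 5 report over the
# COUNTER_SUSHI REST API. The endpoint and credentials are placeholders.
import requests

BASE_URL = "https://sushi.example-platform.com/sushi"  # hypothetical endpoint
params = {
    "customer_id": "YOUR_CUSTOMER_ID",    # issued by the provider
    "requestor_id": "YOUR_REQUESTOR_ID",  # issued by the provider
    "begin_date": "2018-01",
    "end_date": "2018-06",
}

# tr_j1 is the standard "Journal Requests (Excluding OA_Gold)" report view
resp = requests.get(f"{BASE_URL}/reports/tr_j1", params=params, timeout=30)
resp.raise_for_status()
report = resp.json()

# Each Report_Item carries a title plus monthly Performance counts;
# sum the Instance counts to get total uses per title
for item in report.get("Report_Items", []):
    uses = sum(
        instance["Count"]
        for perf in item.get("Performance", [])
        for instance in perf.get("Instance", [])
    )
    print(item.get("Title"), uses)
```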

November 9 - Formulation, Interpretation, and Presentation of Data

This segment of our assessment training series will engage registrants in thinking creatively about what data might be used and applied to the areas of investigation. Beyond simple counts, what statistical techniques might be interesting to apply to the data? How can a correlation be identified and described as distinct from causation? What represents a reliable benchmark? What metrics should be part of determining the benchmark? How best to approach those resistant to such metrics as reliable indicators?
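To make the correlation-versus-causation distinction concrete, here is a minimal sketch (using invented numbers) that computes a Pearson correlation between two hypothetical measures, library logins and student GPA. A coefficient near 1 describes an association only; it does not show that library use causes academic success.

```python
# Describing a correlation, not causation, between two invented measures.
# statistics.correlation requires Python 3.10+; scipy.stats.pearsonr is
# an equivalent alternative for older interpreters.
from statistics import correlation

logins = [2, 5, 8, 3, 12, 7, 15, 1, 9, 6]                   # hypothetical logins per term
gpas = [2.8, 3.0, 3.4, 2.9, 3.7, 3.2, 3.8, 2.6, 3.5, 3.1]  # hypothetical GPAs

r = correlation(logins, gpas)  # Pearson's r
print(f"Pearson r = {r:.2f}")
# A high r says the measures move together; it does not show that logins
# raise GPAs. Motivated students may simply use the library more
# (a confounding variable), which is why causal claims need stronger designs.
```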

November 16 - Formulation, Interpretation, and Presentation of Data, Part Two

This session delves further into the available data derived from library activity; this might encompass everything from data gathered through the library website to sensor data arising from foot traffic within the library. How is the library assessing the programs it offers? What data arises from mobile devices when delivering location-based services? How might student learning outcomes be evaluated in the context of the library?
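As a small, hypothetical example of working with the sensor data this session mentions, the sketch below uses pandas to roll up invented hourly gate counts and surface the busiest hours of building use; a real deployment would read much larger logs exported from the door-counter system.

```python
# Summarizing invented entrance-sensor readings with pandas: hourly gate
# counts rolled up to show which hours of the day see the heaviest use.
import pandas as pd

data = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2018-11-12 09:00", "2018-11-12 10:00", "2018-11-12 11:00",
        "2018-11-13 09:00", "2018-11-13 10:00", "2018-11-13 11:00",
    ]),
    "entries": [42, 110, 95, 38, 125, 88],  # invented gate counts
})

# Average entries by hour of day across all observed days
by_hour = data.groupby(data["timestamp"].dt.hour)["entries"].mean()
print(by_hour)  # here, 10:00 averages the most entries
```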

November 30 - Reporting and Developing Strategies

Having collected and studied the needed data, what might be the best means of developing a narrative? Training participants will focus in this session on how to explain the story being told by the data. This segment may involve case studies from different institutions to discuss what works in a particular example or what may be missing. The session may include discussions of data visualization, the role of assessment in strategic planning, as well as how to use the story in activities of advocacy and outreach.

December 14 - Developing New Metrics

The digital information environment means that users engage with content and services in ways that the assessment community is still trying to identify and understand. New and nuanced metrics can help tell a deeper story of impact for libraries. In this session, we will examine the complex issue of data privacy and ethics in the context of library assessment, with case studies drawn from altmetrics, learning analytics, community-based assessment, ebooks, and OER. Questions that will frame our discussion include: What privacy concerns does following user data in that context raise for those responsible for assessment? What are the logistics of gathering that data? What are we collecting, and for what purpose?


Thursday, December 20 - Developing New Metrics, Part Two

Perhaps the most theoretical of all the training sessions, this final segment will address more nebulous questions. What information products and services require new metrics? What data might provide insights? Who controls that data? There is a need for collaboration among the various stakeholder communities in developing useful and constructive metrics. How can that be accomplished? What techniques or tools are needed?

Additional Information

  • Cancellations made by October 5, 2018 will receive a refund, less a $35 cancellation fee. After that date, there are no refunds.

  • Registrants will receive detailed instructions about accessing the virtual conference via e-mail the Monday preceding the specific session. Due to the widespread use of spam blockers, filters, out-of-office messages, etc., it is your responsibility to contact the NISO office if you do not receive login instructions before the start of the webinar.

  • If you have not received your Login Instruction e-mail by 10 a.m. (ET) on the day before the specific session in this training series, please contact the NISO office at nisohq@niso.org for immediate assistance.

  • Registration is per site (access for one computer) and includes access to the online recorded archive of the conference. You may have as many people as you like from the registrant's organization view the conference from that one connection. If you need additional connections, you will need to enter a separate registration for each connection needed.

  • If you are registering someone else from your organization, either use that person's e-mail address when registering or contact nisohq@niso.org to provide alternate contact information.

  • Registration includes access to an archived recording of each of the eight sessions included in this series.

  • Registrants will receive an e-mail message containing access information to the archived conference recording within 48 hours after the event. This recording access is only to be used by the registrant's organization.