Expanding the Assessment Toolbox: Blending the Old and New Assessment Practices

Virtual Conference

About the Virtual Conference

Every day, libraries and publishers are asked to demonstrate the value of the content they provide through quantitative metrics and assessments. Existing metrics, such as the Journal Impact Factor, and tools, such as COUNTER and SUSHI, have proven their worth in providing useful data. But as both the forms of content and the ways content is used evolve, alternative forms of assessment are also needed. Data at the container level, e.g., the journal, is no longer sufficient. Downloading the full text as a PDF file is no longer the only (or even the primary) way that users access content. Citation counts alone cannot capture the many new ways that content is shared via social media. Traditional assessment techniques are being modified, completely new measures are being developed, and both old and new need to be blended in a meaningful way that creates a trusted system. Both the creation of these new or blended metrics and the information the metrics provide are generating new services and products.

This Virtual Conference will examine some of the innovative ideas and techniques that are being employed in the never-ending struggle to measure how content is accessed and used. It will include discussions related to usage statistics, altmetrics, gaming the numbers, and open access. NISO's Alternative Assessment Metrics Initiative will also be discussed.

UPDATE: Due to the instructor's schedule, the May 7 Training will now take place on Thursday, May 14. All registrants to the April 29 Virtual Conference will receive the free login by Monday, May 11.

NEW! All registrants to this virtual conference will receive a login to the associated Training Thursday on Implementing SUSHI/COUNTER at Your Institution, to be held on May 14. (Separate registration for the training event only is also available.) If you are unable to attend the Training Thursday in person, you can view the recording of the session.

Event Sessions

Introduction

Speaker

11:00 a.m. - 11:10 a.m.

Keynote Address: The Value of Library-Provided Content: Assessing Usage and Demonstrating Impact

Speaker

Megan Oakleaf

Associate Professor of Library and Information Science
iSchool at Syracuse University

11:10 a.m. - 12:00 p.m.

Librarians have long sought to determine the value of library-provided content, and researchers have developed a variety of tools—both old and new—to approximate its use and impact. Over time, a number of approaches have been deployed: satisfaction and self-report usage surveys, expenditure analysis, quantitative dashboards, competitive comparisons, use counts, citation analysis, impact factors, altmetrics, and more. Each of these strategies can be used to approximate the usage of library-provided content; none of them, in isolation, can provide a complete picture. Furthermore, none of them offer a “magic bullet” solution to librarians who struggle to move past assessing usage and on to demonstrating the true impact of content: the changes that result from users reading, consuming, analyzing, evaluating, debating, and expanding the content they encounter and using it to create new content, solve problems, make decisions, take actions, and so on. This presentation will set the stage for the presentations that follow by providing a lens for understanding the challenges that surround defining, demonstrating, and developing library content value, as well as the progress that has been made towards that end.

Value in Numbers: A Shared Approach to Measuring Usage and Impact

Speaker

Jo Alcock

MSc(Econ) MCLIP, Researcher, Evidence Base
Birmingham City University

12:00 p.m. - 12:30 p.m.


Dismantling a Single-Discipline Journal Bundle: A Triangulation Method for Assessment

Speaker

Diane (DeDe) Dawson

University Library
University of Saskatchewan

12:30 p.m. - 1:00 p.m. 

Academic libraries acquire access to many journals through “Big Deal” packages. As serials prices continue to rise at unsustainable rates, it will become increasingly necessary to consider breaking up these bundles and subscribing only to the most important titles individually. To date, most of the LIS literature has focused on the large, multidisciplinary Big Deals of commercial publishers – but what about the smaller society journal packages? Recently, it appeared that the University Library, University of Saskatchewan, would likely no longer be able to subscribe to the entire American Chemical Society package of 36 journals, and tough decisions would need to be made. In an effort to arrive at the most conscientious and evidence-based decisions possible, three discrete sources of data were collected and compared: full-text downloads, citation analysis of faculty publications, and user feedback. This presentation will describe the triangulation methodology developed – including the unconventional approach of applying a citation analysis technique to usage data and survey responses. When it becomes necessary to break up a smaller bundle of journals that is important to researchers in a particular discipline, this method may provide strong evidence to support librarian decisions as well as involve faculty in the process.
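To make the triangulation concrete, below is a minimal, hypothetical sketch of combining three local data sources into a single ranked list. It is not the presenter’s actual code or data; the journal names, figures, and the rank-sum composite are invented for illustration.

```python
# Hypothetical sketch: triangulating three local data sources to rank journals
# in a bundle. All names and numbers are invented for illustration.

# Three measures per journal: full-text downloads, citations by local faculty,
# and "essential" votes from a user survey.
data = {
    "Journal A": {"downloads": 4200, "citations": 310, "survey_votes": 25},
    "Journal B": {"downloads": 1500, "citations": 90, "survey_votes": 30},
    "Journal C": {"downloads": 800, "citations": 150, "survey_votes": 5},
}

def rank_by(metric):
    """Return {journal: rank}, where rank 1 is the highest value of the metric."""
    ordered = sorted(data, key=lambda j: data[j][metric], reverse=True)
    return {journal: i + 1 for i, journal in enumerate(ordered)}

ranks = [rank_by(m) for m in ("downloads", "citations", "survey_votes")]

# Composite score = sum of the three ranks; lower totals indicate journals that
# score consistently well across all three sources.
composite = {j: sum(r[j] for r in ranks) for j in data}
for journal, score in sorted(composite.items(), key=lambda kv: kv[1]):
    print(journal, score)
```

A journal that ranks near the top on all three measures is a strong candidate for an individual subscription; large disagreements between the sources are exactly what the triangulation is meant to surface.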

Lunch Break

1:00 p.m. - 1:35 p.m.

Preview of May 14 NISO Training Thursday: Implementing SUSHI at Your Institution

Speaker

1:35 p.m. - 1:45 p.m. 
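As background for the training, here is a rough sketch of what a SUSHI (ANSI/NISO Z39-93) report request involves: a small SOAP envelope posted to a content provider’s SUSHI endpoint, requesting a COUNTER report for a date range. The endpoint URL, requestor ID, and customer ID below are placeholders to be replaced with values from your provider, and the envelope is simplified; the authoritative structure is defined by the NISO SUSHI schemas and your provider’s documentation.

```python
# Hypothetical sketch of a SUSHI report request. The endpoint and IDs are
# placeholders; the envelope is a simplified rendering of the SUSHI SOAP
# request for a COUNTER Journal Report 1 (JR1), Release 4.
import requests

SUSHI_ENDPOINT = "https://sushi.example-publisher.com/SushiService"  # placeholder

envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:sus="http://www.niso.org/schemas/sushi"
               xmlns:cou="http://www.niso.org/schemas/sushi/counter">
  <soap:Body>
    <cou:ReportRequest>
      <sus:Requestor>
        <sus:ID>YOUR-REQUESTOR-ID</sus:ID>  <!-- placeholder -->
        <sus:Name>Example University Library</sus:Name>
        <sus:Email>serials@example.edu</sus:Email>
      </sus:Requestor>
      <sus:CustomerReference>
        <sus:ID>YOUR-CUSTOMER-ID</sus:ID>  <!-- placeholder -->
      </sus:CustomerReference>
      <sus:ReportDefinition Name="JR1" Release="4">
        <sus:Filters>
          <sus:UsageDateRange>
            <sus:Begin>2015-01-01</sus:Begin>
            <sus:End>2015-03-31</sus:End>
          </sus:UsageDateRange>
        </sus:Filters>
      </sus:ReportDefinition>
    </cou:ReportRequest>
  </soap:Body>
</soap:Envelope>"""

# Post the request; the COUNTER report XML comes back inside the SOAP response.
response = requests.post(
    SUSHI_ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
response.raise_for_status()
print(response.text[:500])
```

In practice, most sites use an existing SUSHI client or their ERM system rather than hand-rolling requests like this.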

Assessing Game-Based Library Initiatives

Speaker

Kyle Felker

Digital Initiatives Librarian
Grand Valley State University Libraries

1:45 p.m. - 2:15 p.m. 

Game-based library programming seems to promise greater and deeper user engagement with the library, as well as more active learning experiences that improve understanding and retention of skills and information. However, in order to determine whether projects deliver on those promises, assessment of game-based programming and projects is essential. In this presentation, we will explore some basic principles of program assessment and how they can be applied to game-based initiatives. We will examine the practical application of such techniques and principles by looking at the assessment component of Grand Valley State University’s Library Quest mobile game.

Brace for Impact: Using Assessment Evidence to Communicate the Value of Your Library SERs

Speaker

Amanda B. Albert

Distance Learning Librarian, Horace W. Sturgis Library
Kennesaw State University

2:15 p.m. - 2:45 p.m. 

Libraries are blending new and old assessment techniques to gather evidence about their services, expertise, and resources (SERs) and the impact these have on their stakeholders’ lives. As assessment practices become more varied, librarians increasingly need to communicate results in dynamic ways. It is not enough to simply collect the data and share it in an annual report or national library survey. Rather, the data should be used to actively engage stakeholders in order to communicate the full breadth of library impact. This presentation will discuss how to blend the old and the new to communicate library value effectively. It will offer proactive strategies to target and time communication in order to build trust and increase the visibility of libraries by using data as evidence of library influence.

Afternoon Break

2:45 p.m. - 3:00 p.m.

‘Good Enough’: Applying a Holistic Approach for Practical, Systematic Collection Assessment

Speaker

Madeline Kelly

Head of Collection Development, University Libraries
George Mason University

3:00 p.m. - 3:30 p.m. 

Ongoing and systematic collection assessment is essential to building, managing, and justifying strong and balanced library collections. Unfortunately, full-scale collection assessment is too often understood as something labor-intensive and exacting—an involved process that must result in incontrovertible answers. In reality, no assessment tool is perfect, and no answer incontrovertible. Instead, this presentation proposes an alternative “holistic” approach to collection assessment that incorporates a variety of methods into a single, flexible assessment portfolio. The results of each tool, flawed on their own, accumulate into a more reliable sketch of the collection. Applied on a subject-by-subject basis, the portfolio of tools can be adapted to meet the peculiarities of the moment: assessments can be goal-oriented or exploratory, in-depth or brief, collaborative or centralized. Most importantly, a portfolio-based assessment program can be implemented by a team as small as one, making collection assessment feasible for libraries short on staff, time, and statistical expertise.

This ‘good enough’ holistic approach is in place at George Mason University, where five subject assessments have been completed (including three during a one-year pilot program) and another five are underway.

E-Journal Metrics: Exploring Disciplinary Differences

Speakers

Katherine Chew

Research/Outreach Services Librarian, Health Sciences Libraries
University of Minnesota

Mary Schoenborn

Subject Liaison Librarian, Humphrey School of Public Affairs and the Carlson School of Management
University of Minnesota

3:30 p.m. - 4:00 p.m.

Collection librarians have an ongoing need to align acquisition and retention decisions about library resources in order to provide the best possible outcomes for their users and accountability to administrators. In previous collection management research, we developed a decision-making blueprint by incorporating the relationships between the journals that our users downloaded and the journals that our faculty cited in their articles.

In this presentation, we take the next step by exploring the extent to which disciplinary differences exist in the relationships between the downloading of our subscribed journals and a) faculty decisions to author articles in these journals and b) the choices their external peers make as to whether or not to cite our faculty’s articles in these journals. Does the strength of the relationships vary by discipline? Do the social sciences and humanities differ from the physical or health sciences? Are there differences between similar disciplines, such as the physical and health sciences, or within disciplines, such as nursing versus medicine, or are they alike enough for one formula to suffice? Together, these metrics will help fine-tune our sense, at a disciplinary level, of the value that our users assign to our collection through their decisions about which journal articles to download, read, and cite.
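As a rough illustration of the kind of disciplinary comparison described above (not the presenters’ actual method or data), one could compute a rank correlation between per-journal downloads and citation counts within each discipline and compare the coefficients. All figures below are invented.

```python
# Hypothetical sketch: does the download/citation relationship differ by
# discipline? Spearman rank correlation per discipline; all data invented.
from scipy.stats import spearmanr

# Per-journal (downloads, citations-to-faculty-articles) pairs by discipline.
disciplines = {
    "health sciences": [(5200, 410), (3100, 260), (900, 40), (450, 35), (200, 8)],
    "social sciences": [(2400, 60), (1800, 150), (700, 30), (650, 90), (120, 5)],
}

for name, pairs in disciplines.items():
    downloads, citations = zip(*pairs)
    rho, p_value = spearmanr(downloads, citations)
    print(f"{name}: rho = {rho:.2f} (p = {p_value:.3f})")

# Markedly different coefficients across disciplines would suggest that a
# single downloads-based formula cannot serve all subject areas equally well.
```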

NISO Altmetrics Project: Update from 3 Project Working Groups

Speakers

4:00 p.m. - 4:30 p.m.

Development of specific definitions for alternative assessment metrics 
Mike Showalter, Product Manager, Plum Analytics - NISO Altmetrics Project Working Group A Co-chair

Definitions for appropriate metrics and calculation methodologies for specific output types
Mike Taylor, Senior Product Manager, Informetrics, Elsevier - NISO Altmetrics Project Working Group B Co-chair

Development of strategies to improve data quality through source data providers 
Martin Fenner, Technical Lead, PLOS Article-Level Metrics, PLOS - Chair, NISO Altmetrics Project

Roundtable Discussion: Presenters from the day return to answer questions in an open-format discussion

Speaker

4:30 p.m. - 5:00 p.m.

Moderated by: Todd Carpenter, Executive Director, NISO

Questions for Diane (DeDe) Dawson:

Question
Did you consider using anything like the JIF as a factor in your analysis, and was cost any kind of factor at this stage of the analysis?
Answer

No, for this analysis I was mostly just interested in local usage (downloads/citations) and local opinions (user survey). If a journal has a high IF but is not useful locally, then I’m not sure subscribing to it would be a good use of limited funds. I think I would rather subscribe to a lower-IF journal if it was more important to researchers/students locally. As for the cost… I didn’t have a specific budget target to meet at the time of the analysis. I simply needed to get a good idea of the important titles (beyond the obvious ones, e.g., JACS), and be ready with a ranked list to go into the negotiations with.

Question
Did you get faculty feedback on the methodology? Which metrics made the most sense to them? Any outreach to stakeholders outside the Chemistry Dept?
Answer

Interesting question. Faculty were very engaged in this issue, and most responded to the survey – and appreciated the opportunity to provide this input. So far, I haven’t shared any results with them, since I didn’t want to create confusion about our access to this bundle. As I mentioned, the crisis was averted and everyone went back to their daily grind. I have been conflicted about bringing this topic up again in case they think we are back in crisis mode! I do plan to share this with them when the time is right, which may be soon since our current (extended) license runs just until the end of 2015. As for stakeholders outside of chemistry: as I mentioned, there is now evidence that there are other users on campus. I can speculate about where these downloads are coming from, but cannot confirm with the present data. I still believe the chemistry department researchers are the primary users (and the most invested users) of this bundle as a whole.

Question
If you had to subscribe individually, would you have been able to afford all the essential titles? Or would it have been worthwhile to move to individual titles? Would you have ended up paying less with those selected titles? How much more would they cost?
Answer

At the time it looked like we wouldn’t be able to afford the entire bundle without the negotiating power of the consortium, so I did this analysis in case we had to cancel the bundle and just subscribe to titles individually. I didn’t have a specific budget target at the time; I simply needed to be prepared with a list of the most important titles. When I arrived at my overall list of 10 “essential” titles, I looked at the individual subscription costs for each title listed on the ACS website. The total came to a little over half of the cost of the entire bundle – for just those 10 titles. This was the print price, though; I couldn’t find the electronic subscription price. So this suggested there would be some cost savings if needed.

Additional Information

  • Cancellations made by Wednesday, April 21, 2015 will receive a refund, less a $35 cancellation fee. After that date, there are no refunds.
  • Registrants will receive detailed instructions about accessing the virtual conference via e-mail the Friday prior to the event. (Anyone registering between Monday and the close of registration will receive the message shortly after the registration is received, within normal business hours.) Due to the widespread use of spam blockers, filters, out of office messages, etc., it is your responsibility to contact the NISO office if you do not receive login instructions before the start of the webinar.
  • If you have not received your Login Instruction email by 10:00 a.m. (ET) on the Tuesday before the webinar, please contact the NISO office or email Juliana Wood, Educational Programs Manager, at jwood@niso.org for immediate assistance.
  • Registration is per site (access for one computer) and includes access to the online recorded archive of the conference. You may have as many people as you like from the registrant's organization view the conference from that one connection. If you need additional connections, you will need to enter a separate registration for each connection needed.
  • If you are registering someone else from your organization, either use that person's e-mail address when registering or contact Juliana Wood to provide alternate contact information.
  • Conference presentation slides and Q&A will be posted to this event webpage following the live conference.
  • Registrants will receive an e-mail message containing access information to the archived conference recording within 48 hours after the event. This recording access is only to be used by the registrant's organization.