This Training Series Now Sold Out!
Objective: To provide consistency of training and a baseline of knowledge for information professionals in considering and applying assessment practices for scholarly resources across multiple information services and systems.
Who Will Benefit From This Training Series:
- Academic librarians whose professional responsibilities include assessment or who have been assigned to conduct assessment exercises in their own library
- Early or mid-career publishing staff in product development or marketing roles who seek a better understanding of assessment activities and the metrics by which a product or service may be evaluated
- Mid-career managers or supervisors whose roles require them to conduct small- and medium-sized assessment projects or exercises
Course Moderator: Martha Kyrillidou, Director & CEO, Qualitymetrics, LLC
Confirmed Guest Lecturers:
- Starr Hoffman, Director, Planning & Assessment, University of Nevada, Las Vegas
- Sarah Murphy, Assessment Coordinator, Research and Education, Thompson Library, Ohio State University
- Steven Braun, Data Analytics and Visualization Specialist, Northeastern University Libraries
- Nancy A. Turner, Associate Director for Organizational Research & Strategy Alignment, Temple University
- Frankie Wilson, Head of Assessment & Secretariat, Bodleian Library, Oxford University
- Rachel Lewellen, Head of Assessment and Program Management, Harvard Library
Other names to be announced shortly.
Course Dates and Duration: October 19, 2018 – December 21, 2018. The course will consist of eight segments, one per week, each lasting approximately 60-90 minutes. We will schedule a consistent time across the eight sessions; in NISO's experience, this is usually a Friday lunch hour, 11:30 a.m. – 1:00 p.m. (Eastern).
Students supply their own computers for participating in the program. Guest lecturers may be featured in specific segments, as the course instructor determines necessary. Program exercises may not be required for every single course segment.
The course will begin by introducing current assessment activities and practices so that attendees gain practical, immediately applicable knowledge. The closing sessions will encourage attendees to think about emerging needs associated with the assessment of information products and services.
Basic Competencies: This course assumes that the registrant will already have a basic familiarity with the following:
- Basic Statistical Analysis
- Data Collection and Management
- Established forms of reporting usage (COUNTER, SUSHI, etc.)
October 19 -- Basics of Assessment in Academic Libraries: An Overview
This opening session will provide the rationale for and benefits of assessment training. Administrators, librarians, and others working in the academic environment need to base decisions on user and usage data gathered from a variety of library services and systems. This initial overview lays the groundwork for understanding what's different about library assessment in the 21st century, the variety of means for conducting assessment in the library, the skill sets needed, and the challenges to be faced – whether those be issues of handling sensitive personal data or unanticipated resistance from co-workers.
October 26 - Collection of Data and Research Design
This second session will address the starting point of any assessment activity – an understanding of what data may be available to the investigator, what additional data may be needed and the process of research design. The lecturer will touch on privacy concerns in the gathering of data as well as the challenges of collecting and working with data contained in third-party provider systems.
November 2 - Using Available Tools
Once the individual charged with assessment has inventoried available data and collected any additional data needed, the next step is to select appropriate software for working with that data. This session will outline a range of APIs, plug-ins, and other available software (Excel, Tableau, MINES for Libraries, etc.) that allow professionals to “get their hands dirty” in productively working with the data.
November 9 - Formulation, Interpretation, and Presentation of Data
This segment of our assessment training series will engage registrants in thinking creatively about what data might be used and applied to the areas of investigation. Beyond simple counts, what statistical techniques might yield interesting insights into the data? How does one identify and describe a correlation, as distinct from causation? What represents a reliable benchmark? What metrics should be part of determining the benchmark? How best to approach those resistant to such metrics as reliable indicators?
November 16 - Formulation, Interpretation, and Presentation of Data, Part Two
This session delves further into the available data derived from library activity; this might encompass everything from data gathered through the library web site to sensor data arising from traffic within the library. How is the library assessing the programs it offers? What data arises from mobile devices when delivering location-based services? How should student learning outcomes be evaluated in the context of the library?
November 30 - Reporting and Developing Strategies
Having collected and studied the needed data, what might be the best means of developing a narrative? Training participants will focus in this session on how to explain the story being told by the data. This segment may involve case studies from different institutions to discuss what works in a particular example or what may be missing. The session may include discussions of data visualization, the role of assessment in strategic planning, as well as how to use the story in activities of advocacy and outreach.
December 14 - Developing New Metrics
The digital information environment means that users engage with content in ways that the information community is still trying to identify and understand. In the instance of ebooks (as a potential case study) or in the emerging area of OER textbooks, those working to assess the effectiveness of content, its presentation to the reader and use by the reader may recognize that new metrics of engagement are needed. What aspects of user engagement might be considered as valid? What privacy concerns does following user data in that context raise for those responsible for assessment? What are the logistics of gathering that data? What can be identified as best practices?
December 21 - Developing New Metrics, Part II
Perhaps the most theoretical of all the training sessions, this final segment will be addressing more nebulous questions. What information products and services require new metrics? What data might provide insights? Who controls that data? There is a need for collaboration between various stakeholder communities in developing useful and constructive metrics. How can that be accomplished? What techniques or tools are needed?
Cancellations made by October 5, 2018 will receive a refund, less a $35 cancellation fee. After that date, there are no refunds.
Registrants will receive detailed instructions by e-mail about accessing the virtual sessions on the Monday preceding each specific session. Due to the widespread use of spam blockers, filters, out-of-office messages, etc., it is your responsibility to contact the NISO office if you do not receive login instructions before the start of the webinar.
If you have not received your Login Instruction e-mail by 10 a.m. (ET) on the day before the specific session in this training series, please contact the NISO office at email@example.com for immediate assistance.
Registration is per site (access for one computer) and includes access to the online recorded archive of the conference. You may have as many people as you like from the registrant's organization view the conference from that one connection. If you need additional connections, you will need to enter a separate registration for each connection needed.
If you are registering someone else from your organization, either use that person's e-mail address when registering or contact firstname.lastname@example.org to provide alternate contact information.
Registration includes access to an archived recording of each of the eight sessions included in this series.
Registrants will receive an e-mail message containing access information to the archived conference recording within 48 hours after the event. This recording access is only to be used by the registrant's organization.