The primary selling point of metrics to the academic researcher was the promise that proof of the value of one's work would bring the increased, long-term funding needed to do that work. Prestige, tenure, influence, even celebrity -- these have been stepping stones to securing significant (and much-needed) grants for educational institutions of all sizes and types. But have these incentives been subverted over time or in specific ways? Is publish-or-perish the best mechanism for encouraging substantive study? The integrity of the publishing process, and perhaps the integrity of the funding model for higher education itself, is at stake. This session will look at some of the troubling questions surrounding the incentives offered to the working scholar, researcher, and scientist.
Presenters in this virtual conference will consider the following questions:
· How might institutions and research facilities best weld available indicators of use or influence into a meaningful metric?
· If individual scholarship is best gauged by the value assigned to it by the larger community, then what collection of metrics should be gathered for purposes of determining appropriate rewards in the context of academia?
· How might institutions better address this challenge and reward faculty appropriately?
This virtual conference includes a Training Thursday session, Presenting Metrics for Better Understanding and Use, originally scheduled for February 28, 1:00 p.m. – 2:30 p.m. Eastern Standard Time. This follow-up session has been rescheduled to a later date; registrants will receive sign-on credentials once those details have been finalized.
Confirmed speakers include:
- Holly Falk-Krzesinski, Vice President, Research Intelligence, Elsevier
- Mike Taylor, Head of Metrics Development, Digital Science
- Christine Casey, Editor, Morbidity and Mortality Weekly Report, Centers for Disease Control
- Dave Kochalko, Founder, Artifacts
- Jonathan Adams, Director, ISI, Clarivate Analytics
- Nicky Agate, Assistant Director of Scholarly Communication and Digital Projects, Columbia University
12:00 p.m. – 12:10 p.m. Welcome and Introduction
12:10 p.m. – 12:45 p.m. How did we get here? Is there a clear path forward?
The skills to understand how well we're performing, to reflect on the implications, and to act on the insights are essential to the human state. Our ability to abstract these thoughts and express them – whether in language or numbers – has given us the ability to communicate all of these processes through time and space; to compare, to evaluate, and to judge performance. Research evaluation is one small part of the human condition, and one in which that expression of performance is increasingly communicated through the abstraction of numbers. But amid all of these data, we mustn't lose sight of the essential humanity of our endeavor, an essence that – perhaps – can't be abstracted. While a path is emerging, there are challenges to be faced in the future, and balances to be found between abstracted data and human need.
12:45 p.m. – 1:15 p.m. Why we should stop worrying about high-impact journal indicators and start loving high-impact scientists
There is far too much pressure on researchers to publish in "high impact" journals. The journal brand has become an unsuitable proxy for relevance, influence, and impact; research suggests this approach is too simplistic and has been counter-productive. It is time to redefine what it means to be a "high impact" scientist or scholar in one's field, institution, and career, and as a member of society. We will achieve a more meaningful understanding of scholars' contributions by obtaining a more holistic view of their creative works and work products, their behaviors, their influence on the performance of colleagues, and their advancement of their discipline. In brief, we must shift our focus of attention, and the indicators we rely on for assessing scientists and their scholarship, away from journals and squarely onto individuals. With contemporary discovery tools, we no longer need to depend on the filter of journal brand to identify important science. The crisis of research integrity, reproducibility, and lack of access to data, algorithms, and methods, together with the inability of editorial and peer review processes to address these issues, reinforces the need to "think different" about indicators and how they are used to influence behavior. With a more dynamic set of indicators focused on contributions, behaviors, influence, and impact, our research ecosystem will be better positioned to direct and incentivize behaviors that advance discovery and its attendant socio-economic benefits. We have cost-effective technologies to accomplish such a transition and are eager to collaborate with stakeholders and experts in the NISO community to realize this change.
1:15 p.m. – 1:45 p.m. Profiles, not Metrics: Why it is important to drill into the data that feed into any 'single point' metric
In a recent report, the ISI team drew attention to the information that is lost when data about researchers and their institutions are squeezed into a simplified metric or league table. In this NISO session, I will draw on that report and look at four familiar types of analysis that can obscure real research when misused and four alternative visualisations that unpack richer information that lies beneath each headline indicator. These four examples cover profiling of individuals, journals, research groups and institutions. The aim of enhanced, profiled information is to get away from the idea of metrics as a substitute for informed discussion and move instead towards tools that support better, more responsible research management.
1:45 p.m. – 2:00 p.m. Comfort Break
2:00 p.m. – 2:30 p.m. Administrator (Institutional Use of the Data): Data-informed Strategic Planning for the Research Enterprise
As competition for extramural research funding continues to increase and resources become more difficult to acquire and even maintain, universities and other research institutions are relying more heavily on data to help inform their decision-making. In this session, I will address considerations for research leaders and institutional administrators using research information systems, data, metrics and analytics to support the strategic planning for their institutions’ research enterprise.
2:30 p.m. – 3:00 p.m. Measuring Science Impact Beyond Citations (Case Studies)
Traditional journal metrics help us understand how widely the article content is disseminated. But then what? Three case studies will illustrate how CDC’s Science Impact Framework can describe the importance of your work beyond citation data. The Framework utilizes a combination of quantitative and qualitative indicators to measure outcomes, through five levels of influence: disseminating science, creating awareness, catalyzing action, effecting change, and shaping the future. The complex dynamics between the levels of influence and the intricate environment in which influence materializes create a path of impact which does not necessarily follow a linear progression.
3:00 p.m. – 3:30 p.m. Impact on the Scholar & Researcher: Interview with Nicky Agate
How are researchers adapting to this new world of metrics? What trepidations exist? What concerns might faculty be sharing with you? And how might the information professional allay those concerns regarding the use of analytics and metrics? This interactive session with Dr. Nicky Agate of Columbia University will explore the challenges of developing evaluative metrics for the digital humanities while still satisfying the needs of both the scholar and the institution.
3:30 p.m. – 4:00 p.m. Roundtable Discussion
Cancellations made by Wednesday, February 13, 2019 will receive a refund, less a $35 cancellation fee. After that date, no refunds will be issued.
Registrants will receive detailed instructions about accessing the virtual conference via e-mail the Friday prior to the event. (Anyone registering between Monday and the close of registration will receive the message shortly after the registration is received, within normal business hours.) Due to the widespread use of spam blockers, filters, out of office messages, etc., it is your responsibility to contact the NISO office if you do not receive login instructions before the start of the conference.
If you have not received your Login Instruction e-mail by 10 a.m. (ET) on the day before the virtual conference, please contact the NISO office at email@example.com for immediate assistance.
The NISO registration model assumes one computer in use per site (one registration = one computer in use for a group). You may have an unlimited number of staff from your institution/organization view the live broadcast from that connection. Those unable to listen in to the live broadcast will be able to listen to the archived recording, which is included in the cost of your registration. Please contact NISO (firstname.lastname@example.org) if you have a particular need for additional access to the live broadcast at your institution. If you are registering someone else from your organization, either use that person's e-mail address when registering or contact email@example.com to provide alternate contact information.
Speaker presentation slides and Q&A will be posted to this event webpage following the live conference.
Registrants will receive an e-mail message containing access information to the archived conference recording within 48 hours after the event. This recording access is only to be used by the registrant's organization.
For Online Events
- You will need a computer for the presentation and Q&A.
Audio is available through the computer (broadcast) and by telephone. We recommend having a telephone audio set-up as a back-up even if you plan to use the broadcast audio, as voice over Internet isn't always 100% reliable.