
January 2010

The beginning of a new year is always filled with perspectives on the year past and a look forward to the year ahead. As the last decade came to a close, I served on the closing panel of an NSF-sponsored workshop on Scholarly Evaluation Metrics. Organized by Johan Bollen and Herbert Van de Sompel, the meeting brought together a stunning range of experts on assessment and research evaluation to discuss new opportunities for measuring and quantifying the impact of publications. As we review the decade past and look toward the decade to come, the timing of this topic is quite propitious. Metrics focus our attention not only on the past, but also on the future and what we hope it will become.

Much like the Hawthorne Effect in social science experiments, the act of being measured has had a significant impact on the scholarly world. For example, the Journal Impact Factor, first proposed by Eugene Garfield in 1955 and developed with Irving Sher in the early 1960s, has become a quick proxy for assessing the relative merit of an article or a researcher's influence, determining journals' success or failure and even shaping tenure decisions. Although Garfield himself has stated repeatedly that it is an imperfect measure, the Journal Impact Factor has become a de facto standard against which our community is measured.
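
To make the measure concrete, here is a minimal sketch of the conventional two-year calculation; the journal counts are invented for illustration and are not drawn from any real title.

```python
# Minimal sketch of the conventional two-year Journal Impact Factor.
# The counts below are invented for illustration only.

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Citations received this year to items published in the previous two
    years, divided by the number of citable items published in those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# e.g., 1,200 citations in 2009 to articles published in 2007-2008,
# out of 400 citable items published in those two years:
print(impact_factor(1200, 400))  # 3.0
```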

In an era when so much information is available in so many different forms and venues, are there newer measures we can add to our assessment repertoire? As we consider the landscape of potential new measures, several critical needs will have to be addressed. First, among the many subtly different data streams and measures, any fair assessment must compare apples to apples. At the same time, the community should avoid fragmenting endlessly into subject-specific measures. Additionally, many measures are, like the Journal Impact Factor, prone to problems, and each can be gamed in one way or another. Finally, a great deal of increasingly important information exchange is taking place outside the traditional structure of journal publication and needs to be included. Regardless of the approach, a concerted effort to reach a common understanding of the criteria for a metric is a prerequisite. It took the publishing community nearly a decade to agree on what a "download" is before COUNTER released its first Code of Practice in 2003. Similar front-end efforts on new assessments and measures, if done effectively, can have a profound impact on scholarship for decades to come.

Another area that needs attention early in the coming decade is the management of supplemental materials for journal articles. NISO and NFAIS are sponsoring a workshop (see story below) in Washington, DC on January 22 to discuss current practices and barriers. We expect this to be the first step in developing industry best practices or standards.

Another scholarly research-related project worth watching this year is the Open Researcher and Contributor ID (ORCID), an effort to create an author identifier. Led by esteemed publishers—including Thomson Reuters, Nature Publishing Group, Elsevier, ProQuest, and Springer—and other leaders in the information community—such as CrossRef, the British Library, and the Wellcome Trust (many of whom are NISO members)—the initiative seeks "to resolve the systemic name ambiguity, by means of assigning unique identifiers linkable to an individual's research output." Although ORCID has only just organized, the group hopes to have a working system in place within six months, using software based on Thomson Reuters' ResearcherID. The project team recognizes that if the new author ID system becomes an accepted de facto standard, it will make possible new metrics that can affect funding, promotion, tenure, and more—which brings us full circle to where this column started: new metrics and assessment.

Looking forward, NISO has a busy year ahead, with a new initiative on OpenURL quality metrics just approved (see story below) and several others "waiting in the wings." We look forward to serving our new and renewing members and others in the NISO community in advancing the standards and best practices needed in this new decade.

Best wishes for a successful new year,


Todd Carpenter

Managing Director

NISO Reports

January Webinar: From ILS to Repository and Back: Data Interoperability

NISO's monthly webinar for January 2010 will be From ILS to Repository and Back: Data Interoperability. The webinar will take place on Wednesday, January 13, 2010 from 1:00 to 2:30 p.m. (Eastern time).

While institutional repository (IR) systems are meant to focus on the storage of digital objects, most repositories contain not only objects but also metadata about those items. One of the main functions of library systems is to facilitate the discovery of items and content using metadata. Where is the line between IR systems and the discovery and management tools that libraries use to manage their collections? What strategies can libraries employ to integrate these systems? Is anyone having success?

As libraries increasingly become the home to special digital collections and scanned materials, finding simple ways to exchange information and ensure discoverability of IR content will be critical to the efficient management of both systems. This webinar will provide attendees with an overview of recent research on this topic as well as real-world examples from organizations that are working toward interoperability.
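
For many libraries, the practical starting point for this kind of integration is metadata harvesting over OAI-PMH, which most repository platforms already expose. The sketch below is illustrative only and is not drawn from the webinar content; the endpoint URL is hypothetical.

```python
# Illustrative sketch: harvest one page of simple Dublin Core records from a
# repository's OAI-PMH endpoint so they can be loaded into another discovery
# system. The base URL used below is hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def harvest_titles(base_url):
    """Yield the dc:title of each record in the first ListRecords response."""
    url = base_url + "?verb=ListRecords&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    for record in tree.findall(".//oai:record", NS):
        title = record.find(".//dc:title", NS)
        if title is not None and title.text:
            yield title.text

# Example (hypothetical endpoint):
# for title in harvest_titles("https://repository.example.edu/oai/request"):
#     print(title)
```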

Speakers and topics for the webinar are:

  • Kathleen Menzies (Researcher, Centre for Digital Library Research, University of Strathclyde), OCRIS: Online Catalogue and Repository Interoperability Study – Funded by the JISC Scholarly Communications Group, OCRIS examined the interoperability of LMSs, IRs, and other administrative systems in operation within UK universities. The project team produced extensive recommendations for improving the visibility and usefulness of intellectual assets and associated data by allowing these central and rapidly developing information systems to interoperate. The findings of the OCRIS project will be presented, followed by a discussion of their implications and how they relate to the wider context.

  • Sarah Shreeves (IDEALS Coordinator and Scholarly Commons Coordinator, University Library, University of Illinois at Urbana-Champaign), IDEALS Repository at the University of Illinois at Urbana-Champaign – IDEALS is the digital repository for research and scholarship—including published and unpublished papers, datasets, video, and audio—produced at the University of Illinois.

For more information and to register, visit the event webpage. Registration is per site (access for one computer) and includes access to the online recorded archive. NISO and NASIG members receive a discounted member rate. A student discount is also available. Can't make it on January 13th? Register and you get access to the recorded archive for one year to watch at your convenience.

NISO @ ALA Midwinter

Don't miss the LITA Standards Interest Group session at ALA Midwinter on Saturday, January 16, 2010, from 4:00 to 5:30 p.m., where the NISO Update will be presented. Presenters and topics for the session are:

  • Mark Bide, Executive Director, EDItEUR, will discuss current standardization challenges facing EDItEUR, including e-book identification, subscription products, rights and license communication, and media convergence.

  • Peter McCracken, co-chair of the NISO/UKSG KBART (Knowledge Bases and Related Tools) working group, will discuss the recommended practice report and the project's next steps.

  • Ted Koppel and Ed Riding, co-chairs of the NISO CORE (Cost of Resource Exchange) working group, will review the status of the CORE draft standard for trial use.

  • Todd Carpenter, Managing Director of NISO, will provide an update on current and upcoming work within NISO, including ONIX-PL, Metadata Supply Chain, Physical Delivery of Library Materials, Single Sign-On Authentication, and the ERM Gap Analysis.

Other standards-related sessions at ALA Midwinter that will discuss NISO projects are:

  • NISO AVIAC (Automation Vendors Interest Advisory Committee) Meeting – On the agenda for this open meeting are Defining Compliance and Open Source and Standards.

  • LITA Electronic Resources Management Interest Group – Included at this meeting will be an update on NISO's ERM Data Standards & Best Practices Review Working Group.

  • In the "Know": E-Resource Knowledge Base Management and Standards, ALCTS E-Resources Interest Group (ERIG) – This program will include a presentation by Jason Price (Claremont University Consortium Library and SCELC) on KBART: Improving the Data Supply to Knowledge Bases and OpenURL Link Resolvers, which discusses the NISO/UKSG KBART Recommended Practice expected to be released this month.

  • ALCTS Continuing Resources Section College and Research Interest Group (CRS C&RL IG) – Included in this meeting will be a presentation/update by Hana Levay (University of Washington) on SUSHI (ANSI/NISO Z39.93-2007, The Standardized Usage Statistics Harvesting Initiative (SUSHI) Protocol).

For specific dates, times, and locations of these sessions, visit the NISO@ALA Midwinter webpage.

February Webinar: What It Takes To Make It Last: E-Resources Preservation

Thirty years into the Digital Revolution, we are still grappling with how best to preserve electronic content. Whether born digital or created as electronic versions of analog content, electronic resources are relied upon more and more, and their long-term usability must be ensured. NISO's February webinar, to be held February 10, 2010 from 1:00 to 2:30 p.m. (Eastern time), will address What It Takes to Make It Last: E-Resources Preservation. This webinar will provide attendees with an overview of current digital preservation issues and standards and an example of a working collaborative digital repository.

Priscilla Caplan, Assistant Director, Florida Center for Library Automation, will provide an introduction to e-preservation and a closer look at the PREMIS (PREservation Metadata: Implementation Strategies) Data Dictionary standard.

Jeremy York, Assistant Librarian, University of Michigan Library, will discuss the work underway by HathiTrust to build and preserve a comprehensive and cooperative digital library to archive and share electronic resources. The repository already contains over 195 terabytes of information.

For more information and to register, visit the event webpage. Registration is per site (access for one computer) and includes access to the online recorded archive. NISO and NASIG members receive a discounted member rate. A student discount is also available.

March Forum on Discovery to Delivery

NISO's first in-person event for 2010 will be a one-day forum on Discovery to Delivery: Creating a First-Class User Experience to be held on Tuesday, March 23, 2010 at the Georgia Tech Global Learning Center in Atlanta.

Information is everywhere today, and access to it relies on a seamless discovery process that offers all appropriate options to the unassisted information seeker. The journey from discovery to delivery is accomplished with a variety of different technologies and processes, many of which depend on a reliable and accurate knowledge base of coverage information. As demands on these technologies change and expand, NISO supports a variety of efforts to strengthen and improve them.

This forum will explore new and innovative ways to meet users' needs and expectations. Among the topics that will be discussed are: OpenURL knowledgebase quality, improving single sign-on authentication, using DOIs in repository applications, integrating usage data into search, and delivering content in the networked world.

NISO educational forums are routinely praised for their excellent selection of speakers, who represent a diversity of viewpoints across the scholarly information community, and for their small size, which provides opportunities to network with speakers and other attendees. The agenda for the March forum is being finalized and registration is open. Reserve your space now at the early bird rate.

For more information and to register, visit the forum event webpage.

New NISO Work Item Launched on OpenURL Quality Metrics

NISO's Business Information Topic Committee has approved a two-year project to evaluate OpenURL quality metrics.

This project builds on work already underway by Adam Chandler (Cornell University Library) to investigate the feasibility of creating industry-wide, transparent, and scalable metrics for evaluating and comparing the quality of OpenURL implementations across content providers. As Chandler notes on his blog about the previous work, "At a typical academic library thousands of OpenURL requests are initiated by patrons each week. The problem is, too often these links do not work as expected, leaving patrons frustrated by a lower than desired quality of service." Metrics on the failures would aid both content providers and librarians in fixing the problems and improving the service. Chandler will chair the NISO working group, which will continue his project over the next two years to test and evaluate metrics collection and reporting.
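
As a rough illustration of the kind of measurement involved (not the metrics the working group will define), the sketch below scores an OpenURL key/value request by how many core citation elements it carries; the choice of elements and the resolver URL are assumptions made for this example.

```python
# Illustrative only: a toy "completeness" score for an OpenURL (Z39.88 KEV)
# request. The list of core elements is an assumption for this example,
# not a metric defined by the NISO working group.
from urllib.parse import urlparse, parse_qs

CORE_JOURNAL_KEYS = ["rft.issn", "rft.jtitle", "rft.atitle",
                     "rft.volume", "rft.spage", "rft.date", "rft.aulast"]

def completeness(openurl):
    """Return the fraction of core citation elements present and non-empty."""
    query = parse_qs(urlparse(openurl).query)
    present = [k for k in CORE_JOURNAL_KEYS if query.get(k, [""])[0].strip()]
    return len(present) / len(CORE_JOURNAL_KEYS)

# Hypothetical request as it might arrive at a link resolver:
sample = ("http://resolver.example.edu/openurl?url_ver=Z39.88-2004"
          "&rft_val_fmt=info:ofi/fmt:kev:mtx:journal"
          "&rft.issn=1234-5678&rft.atitle=On+Linking&rft.date=2009")
print(round(completeness(sample), 2))  # 0.43 (3 of 7 core elements present)
```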

Additional information and sign-up for an email interest group list for those who want to follow this project are available from the working group's webpage.

NISO and NFAIS to Hold Roundtable Discussion on Journal Supplemental Materials

The results of a recent survey by Alexander (Sasha) Schwarzman of the American Geophysical Union on how publishers handle supporting materials in scientific journals have generated considerable interest within the information community. In part as a result of this study, but also in recognition of the importance of this issue to the community, NISO and NFAIS will co-sponsor a roundtable discussion on the need for more standardized bibliographic and publishing policies and best practices for supplemental materials. The invitational meeting will be hosted by the American Psychological Association at APA Headquarters in Washington, D.C. on January 22, 2010.

Among the topics expected to be discussed are: the state of the art and current practice for dealing with supplemental materials, Sasha Schwarzman's survey of publishers, the EU proposal that the STM Association and others are working on regarding the preservation of digital scientific information, ICSTI's upcoming meeting on Interactive Publications and the Record of Science, and other initiatives that attendees are pursuing in the area of supplemental journal materials.

The expected outcome of the meeting is the scoping of a new work item to develop best practices for management of supplemental materials that NISO and NFAIS can jointly move forward. A report on the meeting should be available in mid to late February.

Open Teleconference Regarding U.S. Position on AFNOR Appeal of Draft DOI Standard

NISO will hold an open teleconference call on Wednesday, January 13, 2010 at 9:30 a.m. (Eastern time) to review the recent appeal filed by the French standards organization, AFNOR, against the draft international standard ISO/DIS 26324, Digital object identifier system. AFNOR's main objection is that the scope of the DOI, as defined in the standard, is redundant with that of other international identifier standards.

Norman Paskin (International DOI Foundation), chair of the working group for the DOI standard, and Brian Green (International ISBN Agency) will join NISO Managing Director Todd Carpenter to provide more information about the appeal and the forthcoming U.S. position. The AFNOR appeal is currently out for ballot to the ISO TC 46 membership, and NISO has opened a corresponding ballot for the U.S. TC 46 Technical Advisory Group, which is made up of U.S. NISO voting members. NISO is hosting this discussion in its role as a recognized information standards developer that has standardized the DOI syntax (ANSI/NISO Z39.84) and as the consensus body developing U.S. positions through ANSI to ISO—not in the context of NISO's responsibilities as Secretariat of TC 46/SC 9. However, international participation in the call is welcome.

Background information on the appeal can be found in these documents: Notification of TC46 Ballot re AFNOR appeal of ISO/DIS 26324, SC 9 Secretary and WG 7 Convenor responses to the claims in the AFNOR appeal, and AFNOR supplemental documentation.

To join the call on January 13, 2010 at 9:30 a.m. (Eastern time), dial 877-375-2160 (U.S. and Canada) or direct dial (outside the U.S.) +1-480-337-5046 and enter the code 178-00-743. If you are planning to participate, we request that you RSVP by e-mail to nisohq@niso.org by close of business on January 12, 2010.

ISBN and E-books: Survey Regarding Requirements

The International ISBN Agency is conducting an informal survey in the U.S. and U.K. to gather preliminary requirements on several issues related to the assignment of ISBNs to e-books. An overview document describes the issues and is recommended reading prior to participating in the survey. The simple four-question online survey is designed to assess both the real needs of users and the ability of publishers to satisfy them. Interested parties are encouraged to respond.

New Specs & Standards

ISO 639-6:2009, Codes for the representation of names of languages – Part 6: Alpha-4 code for comprehensive coverage of language variants

This new standard specifies a method for establishing four-letter (alpha-4) language identifiers and language reference names for language variants, together with a hierarchical framework for relating them to languages, language families, and language groups that maintains backward compatibility with the other ISO 639 codes. The alpha-4 language identifiers have been developed for use in a wide range of applications, especially in computer systems, where there is a potential need to cover the entire range of languages, language families, and language groups as well as language variants within each identified language.

ISO 16245:2009, Information and documentation – Boxes, file covers and other enclosures, made from cellulosic materials, for storage of paper and parchment documents

This first edition of the standard specifies requirements for boxes and file covers, made of cellulosic material, to be used for long-term storage of documents on paper or parchment. The requirements apply to boxes made of solid or corrugated board and to file covers made of paper or board; they can also apply to other types of enclosure for long-term storage, such as cases, portfolios, tubes, and envelopes made of cellulosic material. The standard is not applicable to the storage of photographic materials.

ISO/TR 12033:2009, Document management – Electronic imaging – Guidance for the selection of document image compression method

This technical report provides information to enable an informed decision on the selection of compression methods for digital images of documents in order to optimize their storage and use. It includes information on image compression methods incorporated in hardware or software in order to help the user during the selection of equipment in which the methods are embedded. ISO/TR 12033:2009 is applicable only to still images in bit map mode and only takes into account compression algorithms based on well-tested mathematical work.

Vocabulary Mapping Framework, Introduction version 1.0 and VMF Matrix version 1.001

The introduction to the structure and development of the Vocabulary Mapping Framework (VMF), covering the first stage of this work through November 2009, provides an overview and technical description of the newly available VMF Matrix tool that can be used to automatically compute the "best fit" mappings between terms in controlled vocabularies in different metadata schemes and messages. The Matrix includes the VMF ontology and mappings from third-party vocabularies and parts of vocabularies from CIDOC CRM, DCMI, DDEX, FRAD, FRBR, IDF, LOM (IEEE), MARC 21, MPEG21 RDD, ONIX, and RDA as well as the complete RDA-ONIX Framework.

IEEE Learning Technology Standards Committee, Request for Comment on Three Published Standards

The IEEE Learning Technology Standards Committee (LTSC) is reviewing three of its published standards to determine whether they need to be revised. Anyone with comments on the standards should contact Schawn Thropp by January 22, 2010. The three standards are:

  • 1484.11.1, IEEE Standard for Learning Technology – Data Model for Content Object Communication
  • 1484.11.2, IEEE Standard for Learning Technology – ECMAScript Application Programming Interface for Content to Runtime Services
  • 1484.11.3, IEEE Standard for Learning Technology – Extensible Markup Language (XML) Schema Binding for Data Model for Content Object Communication

Media Stories

Campus-Based Publishing: Can a Marriage of the Library with the University Press Sparc a Solution to the Serials Pricing Crisis?

IFLA ITS Newsletter, December 2009, p. 9; by John Ben DeVette

An innovation is occurring at some universities where the university press and the library partner to manage the entire publication cycle, from creation through storage and delivery. Termed campus-based publishing, it represents a major change in how universities are addressing scholarly communications. Traditionally, university presses have operated independently of faculty writing and library information management, underutilizing an existing knowledge base. Institutional repositories (IRs) were the first method universities used to capture their own research writings, but copyright issues and management costs have limited some of the benefits of IRs. Four or five publishing organizations dominate academic publishing, own the copyright to much of the published scholarship from the last 50 years, and are thus able to set pricing. The open access (OA) movement is changing this, with 10% of today's journals now OA, growth in IRs that deposit and provide OA to research, and a trend to merge the library and university press. University presses are expert in the publishing workflow, and using new OA tools can significantly reduce the cost of publishing a university's own research. Libraries know how to acquire and archive information and train users in access. This training can be expanded beyond using library resources to managing the university's intellectual property. "The library can become the intra-university marketing arm of the press. The press can become the dissemination arm of the library." Librarians should be proactive in working with the university press to build a campus-based publishing system. (Link to Web Source)

Forget E-Books: The Future of the Book Is Far More Interesting

Fast Company, December 23, 2009; by Adam Penenberg

The end of the printed book is coming, but it's not going to be the e-book that replaces it. A plethora of e-book readers are available or on the way, but this technology basically replicates the existing printed book in electronic form and will be just a stopgap. New technology frequently imitates its predecessor at first but is then transformed by innovation and user demands. Movie cameras were initially used to film live theater before they were used to create an entirely new art form. Books are poised to become theater productions, "with authors acting more like directors and production companies than straight wordsmiths." Multimedia will turn books into works of art, with text side by side with video and layered with photos, hyperlinks, and social networking additions by readers. A special device won't be necessary; a book will be accessible by all types of mobile devices that support multimedia. The opportunity is especially rich for non-fiction authors, who can create a one-stop shop for a particular topic that could include text, newsreels, audio interviews, schematics, interactive maps, virtual tours, photos, hyperlinked bibliographies and indexes, and discussion threads and wikis. Novelists could create virtual, videogame-like worlds complete with video and audio and the ability for readers to become characters and interact. Penenberg emphasizes that he is not predicting the end of reading but instead sees a future where "immersive reading coexists with other literary, visual and auditory modes of expression." Today's youth have been raised on such multimedia-rich experiences and will expect more than just text. Authors should embrace the opportunities. (Link to Web Source)

Beyond Bibliographic Records

Lorcan Dempsey's Weblog, December 6, 2009; by Lorcan Dempsey

Cataloging today is based on manifestations of works, which are represented by the data in a bibliographic MARC record. This same manifestation data is then provided as the output of a search. Newer catalog features are just building on the manifestation concept; faceted browsing, for example, pulls together different manifestations based on their common facet. OCLC's WorldCat Identities uses an alternative approach that takes "data from many records, recombines it, and present[s] it in an integrated way." The identity page for author/translator Donald Ervin Knuth contains an overview listing the total number and types of works, alternative names for Knuth (including transliterations), a graphic timeline showing works both by and about Knuth, a listing of the works most widely held in library collections, graphics of covers, related identities (places and other people), useful links, and a tag cloud of relevant terms. There are many entities—works, places, subjects, time periods—that could be expressed similarly by mining the existing record-oriented data. John Mark Ockerbloom has also blogged about the need for concept-oriented cataloging to organize the vastly expanding sea of information that users are navigating. In addition to WorldCat Identities, Ockerbloom points to the Subject Maps work at the University of Pennsylvania. Instead of managing information as individual containers of data, libraries should look at how the data can be aggregated to create new information for users. (Link to Web Source)

NISO Note: OCLC is a NISO voting member.

21st-Century Rights Management: Why Does it Matter and What is Being Done?

Learned Publishing, v. 23, pp. 23-31, January 2010; by Mark Bide and Alicia Wise

Businesses in the UK that depend on a copyright business model account for 7.5% of the country's economy, which is typical in developed countries. Over time, UK laws have established that copyright is "an economic right belong[ing] to authors, something that can be owned and controlled." Most of Europe approaches copyright as a human and moral right, rather than an economic one. Technology has challenged basic tenets of copyright legislation and management. While the originator still bears all the costs of creating the original, their return on the investment can be wiped out by the almost cost-free ease of digital reproduction and dissemination by others—unless copyright protections are in place. Legislation is not sufficient; technology itself must be used to meet this technology challenge. Publishers currently manage their rights in very diverse ways and may not have good records about which rights they own or communicate clearly which rights they are licensing. The process for obtaining additional permissions is frequently difficult and costly. Many publishers are not functionally organized to effectively manage rights; responsibility is fragmented across many different departments. The same publishers who have been innovative leaders in digital publishing are often apathetic about digital rights management issues. In 21st-century publishing, every transaction in the supply chain is a rights transaction that needs to be managed. "If rights are to be managed effectively in a networked environment, there must also be clear, direct ways to move from the content when it is discovered online (e.g. in a search engine) back to the rights controller so that appropriate permissions can be sought (and provided automatically in as many cases as possible). This information needs to be available anytime, anywhere, on demand." A number of service providers are now in the business of providing rights management services or tools for publishers, including the Automated Content Access Protocol (ACAP), Accessible Registries of Rights Information and Orphan Works (ARROW), Attributor, the Book Rights Registry (for Google Books), clipping services (Clip & Copy, eClips), Creative Commons, iCopyright, model licenses (including the Shared E-Resource Understanding (SERU)), ONIX for Licensing Terms, Ozmo, the Picture Licensing Universal System (PLUS), the Publishers Licensing Society's PLSe rights repository, the Copyright Clearance Center's Rights Central and Rightslink, and Rightsphere. The authors' "key message is that none of the services or activities outlined in this paper is enough on its own, but each is a good way to begin embracing 21st century rights management practices." A standards-based approach, such as ONIX for Licensing Terms, for the collection, maintenance, and communication of rights will reduce costs, improve time to market, provide better control over content, and allow for permissions to be purchased where needed.
(Link to Web Source)

NISO Note: The Copyright Clearance Center and Publishers Licensing Society, Ltd. are NISO voting members. NISO and EDItEUR have a joint ONIX-PL working group to support and contribute to the continued development of the ONIX-PL standard for license expression. NISO held a webinar in December 2009 on ONIX for Publication Licenses: Adding Structure to Legalese; slide presentations and the Q&A write-up are available from the event webpage. The Shared E-Resource Understanding is a NISO project; the recommended practice, FAQs, and a registry of interested implementers are available from the SERU webpage.

An E-Book Buyer's Guide to Privacy

Electronic Frontier Foundation Deep Links Blog, December 21, 2009; by Ed Bayley

The e-book reader is one of the technology must-haves for 2009 holiday shopping. While transforming the way people read books, it also presents significant threats to privacy. These readers can transmit data about both reading habits and a reader's location. Manufacturers have not been clear about what data is being collected and how it is used. The Electronic Frontier Foundation (EFF) has created a Buyer's Guide to E-Book Privacy [see table in article], summarizing the privacy and data collection policies for users of Google Books, Amazon Kindle, Barnes & Noble Nook, Sony Reader, and FBReader. Google Book Search, for example, logs every book and page viewed, including "how long you viewed it for, and what book or page you continued onto next." Its Web History service maintains a listing of all books purchased. Kindle content is only licensed and must be purchased through the Kindle Store; therefore Amazon has a list of all of a user's content. Kindle's privacy policy specifically gives Amazon the right to gather "information related to the content on your Device and your use of it." The type of information it can gather is never spelled out. The Kindle's wireless ability to buy books could also provide Amazon with a reader's GPS locations over time. The Sony Reader works with books and formats beyond what is offered at its own store, so it would have only limited information about a user's reading habits. Open source programs like FBReader allow readers to use content from many sources on many different devices without the user having to provide information.
(Link to Web Source)