
November 2011

Few things are more central to the entire library community than the bibliographic record format, MARC. Developed by Henriette Avram at the Library of Congress in the 1960s, it is widely used today as the basis for most OPAC systems and as a method for exchanging bibliographic information in the library community. Over the decades, MARC has become a linchpin of cataloging, library resource management, and the bibliographic description of items in a collection.

The fact that MARC remains the cornerstone for so much is amazing considering how much computer technology has changed in the 40+ years since MARC's release. This speaks highly of the strength of Avram's vision, but unfortunately it also speaks to the difficulty (and relative unwillingness) of the library community to adapt to change. MARC has grown and expanded over the years, most notably with the development of MARC 21 in the late 1990s and MARCXML in the early 2000s. However, it has developed in an organic way, which has led to problems, even contradictions, in the way information should be encoded in MARC. These problems will only be exacerbated as content becomes more integrated, more complex, more multimedia, and more of a "mash-up" from multiple sources. For example, what would a MARC record look like for an "enhanced book" distributed in EPUB that included not only the text, but also clips from the film version, author interviews, and an audio reading of the text? Content forms like these, and many others that are equally if not more complex, are not unimaginable, nor is their mass distribution far off. If we are rethinking the basic definition of what it means for a resource to be a "book," how can our metadata records lag so far behind the current state of the art?
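To make the structural rigidity concrete, here is a minimal Python sketch, using a hypothetical one-field record, of the Z39.2/ISO 2709 layout that MARC is built on: a 24-byte leader, a directory of fixed-width 12-byte entries (3-byte tag, 4-byte length, 5-byte start offset), and delimiter-separated variable fields. The field content shown is invented for illustration, not taken from any actual catalog record.

```python
# Illustrative parser for the Z39.2 / ISO 2709 record structure underlying
# MARC: a 24-byte leader, a directory of 12-byte entries, then the fields.

FT = "\x1e"  # field terminator
RT = "\x1d"  # record terminator
SF = "\x1f"  # subfield delimiter

def parse_iso2709(record):
    """Return {tag: raw field data} using the leader and directory."""
    base = int(record[12:17])                 # base address of data (leader bytes 12-16)
    directory = record[24:record.index(FT)]   # directory ends at the first field terminator
    fields = {}
    for i in range(0, len(directory), 12):
        tag = directory[i:i + 3]
        length = int(directory[i + 3:i + 7])
        start = int(directory[i + 7:i + 12])
        # field data carries a trailing field terminator; strip it
        fields[tag] = record[base + start:base + start + length].rstrip(FT)
    return fields

# Build a hypothetical one-field record (title field 245).
field = "00" + SF + "aEnhanced book." + FT
directory = "245" + "%04d" % len(field) + "00000"
base = 24 + len(directory) + 1                # leader + directory + terminator
leader = "%05d" % (base + len(field) + 1) + "nam a22" + "%05d" % base + " a 4500"
record = leader + directory + FT + field + RT

print(parse_iso2709(record))   # {'245': '00\x1faEnhanced book.'}
```

Every field must be located by byte offset through the directory, a design optimized for magnetic tape exchange; it is easy to see why richly interlinked, multimedia content strains this model.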

The basic question of what the data structure for description, discovery, and patron service should be for the coming decades remains open. The development of Resource Description & Access (RDA), begun in 2004 and published in 2009, is one element of this process. Earlier this spring, the Library of Congress announced a Bibliographic Framework Transition Initiative, a second element in developing a new bibliographic structure for our community. The goal of the project is to "transform our digital framework" in light of technological changes and budgetary constraints. Earlier this week, LC provided more information about its plans and approach.

There is much to admire and appreciate in LC's leadership here. However, quoting from the site: "This work will be carried out in consultation with the format's formal partners…" Inclusiveness is not the same as openness, and collaboration doesn't necessarily imply consensus outcomes. While LC should be lauded for its leadership and its desire to solve a most difficult problem, the real question for the community is whether this effort should be led by LC alone. The community that uses MARC records encompasses nearly every library, nearly every software provider, and the countless organizations supplying records to the community. Karen Coyle wrote about this on her blog this summer, comparing LC's leadership on this issue to the control exerted by the US over ICANN.

We in the standards community are often reluctant to change things that once worked well, even when they no longer fit a new environment. Perfect evidence of this can be found in Sally McCallum's quote in a Library Journal Newsletter article when speaking about the pace of this transition: "We want change with stability." But can we really have both when we're looking at radical change? This conundrum often leads us to inaction. Many standards, when they come up for their five-year review, are reaffirmed as-is despite an awareness of their shortcomings and flaws. For example, Z39.50 was reaffirmed at the same time that staff at LC were working on a new model for search using web infrastructure, which became SRU/SRW. If there are significant problems with the MARC record structure, Z39.2 shouldn't be reaffirmed when it next comes up for review. Change is difficult and painful and certainly not "stable." Focusing on stability allows people to delay the decision to change, and it makes the "stable" thing increasingly irrelevant as the rest of the world moves on.

The standards that underlie MARC are not LC's alone to manage and transform as it sees fit. Z39.2 (Information Interchange Format), in particular, is a national standard that can only be revised through a consensus-driven process operating within the NISO community. There is also an international equivalent, ISO 2709, which is the product of international consensus within ISO TC 46. These communities have final control over those standards, and while LC represents an important voice, it is by no means the only voice determining the outcome.

LC's technical skill and expertise, as well as its leadership in the management of MARC, should not be underestimated. The MARC standards office at LC is adeptly led and has the best of intentions, with a goal of representing and serving all who use this important format. However, there is a fine line between leadership and control. Hopefully, LC is willing to lead while letting the broader community control, as messy as that process might be.

The process for moving MARC into today's information environment matters. Wouldn't it be better served by the existing, open standards development processes already in place, which have served our community so well in so many areas? This process will take time, but engaging all of the community's members in an open, consensus-based approach will help ensure buy-in to the final outcome and foster adoption once the work is completed.

In the end, the goal of everyone involved is to improve our bibliographic structures so that they are easier to use, less expensive, easier to understand and manage, and better able to connect our community to the larger Internet information community. That, at the end of the day, is the point of all standards.


Todd Carpenter

Managing Director

NISO Reports

November NISO Webinar: New Discovery Tools: Moving Beyond Traditional Online Catalogs

The migration of traditional academic and public library online catalogs to "discovery platforms" promises new ways to expose library collections and other resources tailored to individual patron needs. For patrons, this personalized search and discovery experience can encompass functions and tools unheard of just a few years ago, such as powerful faceted searching and tagging, and can provide a collaborative portal for social media and other online scholarly community initiatives. For libraries, the discovery platform can provide an efficient method for maximizing the use of, and investment in, its online resources.

NISO's November webinar, New Discovery Tools: Moving Beyond Traditional Online Catalogs—to be held on November 9, 2011 from 1:00 - 2:30 p.m. (Eastern)—will explore the discovery platform marketplace, selection and implementation strategies, and usability.

Topics and speakers are:

  • A Web Scale Discovery Refresher: Athena Hoeppner (Electronic Resources Librarian, University of Central Florida Libraries) will put web-scale discovery systems in context by reviewing the core concepts and terminology and looking at the major systems side-by-side.

  • Usability Testing of Discovery Interfaces: Rice Majors (Faculty Director for Libraries IT/Librarian, University of Colorado at Boulder Libraries) will share data about the methodology and results of his own research: a task-based usability test of vendor-provided next-gen catalog interfaces and discovery tools (Encore Synergy, Summon, WorldCat Local, Primo, and EBSCO Discovery Service).

Registration is per site (access for one computer) and includes access to the online recorded archive of the webinar for one year. Can't make it on the webinar date/time? Register and watch the recorded version at your own convenience. NISO and NASIG members receive a member discount; a student discount is also available.

For more information and to register, visit the event webpage.

This webinar is sponsored by CrossRef

NISO/DCMI Webinar: The RDA Vocabularies: Implementation, Extension, and Mapping

During a meeting at the British Library in May 2007 between the Joint Steering Committee for the Development of RDA and DCMI, important recommendations were forged for the development of an element vocabulary, application profile, and value vocabularies, based on the Resource Description and Access (RDA) standard, then in final draft. A DCMI/RDA Task Group has completed much of the work, and described their process and decisions in a recent issue of D-Lib Magazine. A final, pre-publication technical review of this work is underway, prior to adoption by early implementers.

The NISO/DCMI webinar, The RDA Vocabularies: Implementation, Extension, and Mapping—to be held on November 9, 2011 from 1:00 - 2:30 p.m. (Eastern)—will provide an up-to-the-minute update on the review process, as well as progress on the RDA-based application profiles. The webinar will discuss practical implementation issues raised by early implementers and summarize issues that have surfaced in virtual and face-to-face venues where the vocabularies and application profiles have been discussed.

Speakers are:
  • Diane Hillmann is a partner in the consulting firm Metadata Management Associates, and holds an appointment as Director of Metadata Initiatives at the Information Institute of Syracuse. She is co-chair (with Gordon Dunsire) of the DCMI/RDA Task Group and is the DCMI liaison to the ALA Committee on Cataloging: Description and Access, the US body providing feedback on RDA development.

  • Thomas Baker, Chief Information Officer of the Dublin Core Metadata Initiative, was recently co-chair of the W3C Semantic Web Deployment Working Group and currently co-chairs a W3C Incubator Group on Library Linked Data.

Registration is per site (access for one computer) and includes access to the online recorded archive of the webinar for one year. Can't make it on the webinar date/time? Register and watch the recorded version at your own convenience. NISO and DCMI members receive a member discount; a student discount is also available.

For more information and to register, visit the event webpage.

December NISO Webinar: Assessment Metrics

With ever-shrinking library budgets it is more essential than ever to ensure that the library collection is targeted, relevant, and well-used. Return on Investment (ROI) has become the mantra of library management and libraries need to show accountability for collection decisions. A key method for supporting collection development decisions is the use of assessment metrics.

NISO's December webinar on Assessment Metrics—to be held on December 14, 2011 from 1:00 - 2:30 p.m. (Eastern)—will feature speakers who have successfully implemented assessment metrics (such as COUNTER 3, Eigenfactor, and impact factors).

Speakers are:

  • Tim Jewell, Director, Information Resources, Collections and Scholarly Communication, University of Washington
  • Robin Kear, Reference/Instruction Librarian, University of Pittsburgh
  • Oliver Pesch, Chief Strategist, E-Resource Access and Management Services, EBSCO Information Services

Registration is per site (access for one computer) and includes access to the online recorded archive of the webinar for one year. Can't make it on the webinar date/time? Register and watch the recorded version at your own convenience. NISO and NASIG members receive a member discount; a student discount is also available.

For more information and to register, visit the event webpage.

This webinar is sponsored by CrossRef

ESPReSSO Single-Sign-on Authentication Recommended Practice Published

NISO has published a new Recommended Practice, ESPReSSO: Establishing Suggested Practices Regarding Single Sign-On (NISO RP-11-2011), that identifies practical solutions for improving the use of single sign-on authentication technologies to ensure a seamless experience for the user.

Currently a hybrid environment of authentication practices exists, including older methods of userid/password, IP authentication, or proxy servers along with newer federated authentication protocols such as Athens and Shibboleth. This recommended practice identifies changes that can be made immediately to improve the authentication experience for the user, even in a hybrid situation, while encouraging both publishers/service providers and libraries to transition to the newer Security Assertion Markup Language (SAML)-based authentication, such as Shibboleth.

This recommended practice is the result of the NISO Chair's Initiative—a project of the chair of NISO's Board of Directors, focusing on a specific issue that would benefit from study and the development of a recommended practice or standard. Oliver Pesch, Chief Strategist for E-Resource Access and Management Services at EBSCO Information Services and the 2008-2009 Chair of NISO's Board of Directors, chose the issue of standardizing seamless, item-level linking through single sign-on (SSO) authentication technologies in a networked information environment.

The ESPReSSO Recommended Practice is available for free download from the NISO website.

New Specs & Standards

International Digital Publishing Forum, EPUB 3 Becomes Final IDPF Specification

EPUB 3.0, a major revision to the global standard interchange and delivery format for e-books and other digital publications, has been elevated by the membership of the International Digital Publishing Forum (IDPF) to a final IDPF Recommended Specification. Based on HTML5, EPUB 3.0 adds support for rich media (audio, video), interactivity (JavaScript), global language support (including vertical writing), styling and layout enhancements, SVG, embedded fonts, expanded metadata facilities, MathML, synchronization of audio with text, and other enhancements for accessibility.

PREMIS Editorial Committee, PREMIS OWL Ontology Available For Public Review

The PREMIS Editorial Committee has announced the publication of an OWL ontology for the PREMIS Data Dictionary for Preservation Metadata version 2.1, a digital preservation standard based on the OAIS reference model. Until now the PREMIS Data Dictionary was implemented only as an XML schema, which remains ideal for creating, validating, and storing the preservation metadata of a particular digital asset. This OWL ontology allows the same information to be expressed in RDF. With this alternative serialization, information can be more easily interconnected, especially between different repository databases. The PREMIS OWL ontology also links to preservation-specific vocabularies already published by the Library of Congress. Please send comments to the PREMIS OWL Wiki no later than Nov. 10, 2011 to be considered in a revised version.
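The interconnection point can be illustrated without any RDF tooling: RDF reduces every statement to a subject-predicate-object triple, so descriptions of the same resource held by different repositories can be merged by simple set union. A minimal Python sketch follows, with hypothetical identifiers and property names rather than actual PREMIS terms:

```python
# RDF's data model is a set of (subject, predicate, object) triples, so
# metadata about the same identifier can be merged across repositories by
# plain set union. All identifiers and property names here are hypothetical.

asset = "urn:example:asset-42"

repository_a = {
    (asset, "ex:formatName", "application/pdf"),
    (asset, "ex:fixityValue", "md5:0123abcd"),
}
repository_b = {
    (asset, "ex:eventType", "migration"),
}

merged = repository_a | repository_b                    # union of two graphs
about_asset = {(p, o) for (s, p, o) in merged if s == asset}
print(len(about_asset))                                 # 3
```

An XML document, by contrast, is a single tree with one schema, which is why cross-repository merging takes more work in that serialization.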

W3C Library Linked Data Incubator Group, Final Report

The mission of the W3C Library Linked Data Incubator Group, chartered from May 2010 through August 2011, has been "to help increase global interoperability of library data on the Web, by bringing together people involved in Semantic Web activities—focusing on Linked Data—in the library community and beyond, building on existing initiatives, and identifying collaboration tracks for the future." This final report of the Incubator Group examines how Semantic Web standards and Linked Data principles can be used to make the valuable information assets that libraries create and curate—resources such as bibliographic data, authorities, and concept schemes—more visible and re-usable outside of their original library context on the wider Web. Among the recommendations are that library standards bodies increase library participation in Semantic Web standardization, develop library data standards that are compatible with Linked Data, and disseminate best-practice design patterns tailored to library Linked Data.

Dublin Core Metadata Initiative, New DCMI Schema.org Alignment Task Group

The DCMI Architecture Forum has formed a new DCMI Schema.org Alignment Task Group to define and publish mappings (alignments) between Schema.org vocabularies and DCMI Metadata Terms. A Task Group wiki has been set up. Discussion in the Task Group will take place on the dc-architecture mailing list. Membership in the Task Group is open to any interested member of the public.

Media Stories

Data is the New Black
The Signal: Data Preservation [LC blog], October 14, 2011; by Leslie Johnston

At a recent Storage Meeting, cultural heritage organizations referred to their content as "data." Librarians and archivists now talk about data in general, not just about metadata. The meaning of "Big Data" can vary depending on an individual's perspective of "big." Some organizations are adding over a TB of data every week. "The Twitter archive has 10s of billions of tweets in it. The Chronicling America collection has over 4 million page images with OCR. Web Archives, such as the one at the Library of Congress, may be comprised of billions of files." While these are all collections, they are also large data repositories requiring a supporting infrastructure. Will libraries and archives be able to provide effective searching of such large collections as well as the kind of tools that researchers will want to use for analysis? What about large and frequent downloads from those collections? As organizations move forward with their "Big Data" repositories, these are issues that must be addressed. (Link to Web Source)

University Presses Lead the Way for Publisher-Based Ebook Systems
Information Today Newsbreaks, posted October 10, 2011; by Nancy K. Herther

Michael Jensen of the National Academies Press stated in 2010 that university presses needed to prepare for the disruptions and changes of a "universal digital environment of knowledge ubiquity." In the e-book arena, university presses have responded in a way that could be a model for commercial publishers. There are some 130 university presses in the U.S. today, many of which are known for their coverage of specialized areas and for finding audiences for topics that commercial publishers have deemed not profitable enough. The University Press e-book Consortium (UPeC) was formed by five presses in 2010; it worked with academic libraries and solicited partners to provide collections of scholarly e-books to libraries. The resulting partnership, the University Press Content Consortium (UPCC), a merger of Project MUSE Editions and UPeC, released a beta in August 2011 and plans a launch in January 2012. The focus is on a "non-competitive, publisher-neutral, and format-neutral" e-book collection and search functionality. While the initial launch will use the PDF format, the plan is to transition to EPUB. Another press project, Oxford's University Press Scholarship Online, with e-books from six press partners, is designed to integrate the full text of scholarly publications into library systems, using an XML format with OpenURL and DOI support for citations within a text. JSTOR announced a Books at JSTOR program in collaboration with nine university presses that will integrate the books with the journals in JSTOR, including links to book reviews and references in the journal articles. NYU Press has taken a strategy of working with multiple e-book markets, which has resulted in a significant increase in its e-book sales. The University of Minnesota Press, which uses MUSE/UPCC, JSTOR, and Oxford as sales channels, has also seen record e-book sales in the last two years. Minnesota Press Director Doug Armato predicts these projects will "demonstrate university press' collective value and expertise." (Link to Web Source)

NISO Note: Project MUSE is managed by the Johns Hopkins University Press, a NISO voting member. OpenURL and the DOI Syntax are NISO standards.

Research Librarians Consider the Risks and Rewards of Collaboration
The Chronicle of Higher Education, October 16, 2011; by Jennifer Howard

The Association of Research Libraries' Membership meeting, held in Washington, DC, on October 12-13, featured presentations on a number of large digital collaboration projects and on how libraries are operating in a print/digital hybrid environment. Council on Library and Information Resources president Charles J. Henry provided an update on the Digital Public Library of America, which he described as a "federation of existing objects" rather than a true library collection. Ed Van Gemert, deputy director of libraries at the University of Wisconsin at Madison, spoke on the HathiTrust digital repository, which his library uses at a third of the cost of local storage. A session on orphan works discussed the recent lawsuit against HathiTrust by the Authors Guild, which could signal an end to the previous reluctance of publishers to sue libraries and universities. Charles Kurzman, a professor of sociology at the University of North Carolina at Chapel Hill, emphasized that many researchers like himself are working in areas where information is not available in electronic format. He mentioned Scribd as a mechanism that researchers are using to share their own digital resources. Trevor Owens with the National Digital Infrastructure and Information Preservation Program at the Library of Congress mentioned the Recollection platform as a way to build interfaces such as maps or timelines to library catalog or similar metadata. However, John V. Lombardi, president of the Louisiana State University system, said digitization and technology investments should be based on whether the results will make the university more competitive. (Link to Web Source)

NISO Note: The Association of Research Libraries, the Council on Library and Information Resources, and the Library of Congress are NISO voting members.

The Digital Preservation Disconnect
InfoStor, posted October 03, 2011; by Henry Newman

A significant area of disconnection between digital preservation archivists and technology vendors is data loss. In the information systems industry, both availability and loss of data are discussed in "terms of '9s.'" The author illustrates this with a chart showing the percentage of data reliability and the corresponding loss of data in bytes when the total data size is 1 petabyte (PB), 50 PB, 100 PB, 500 PB, or 1 exabyte (EB). For example, at the 5th level of "9s," i.e., 99.999% reliability, there is a loss of over 10 GB for every 1 PB of data. When loss is visualized this way, some preservationists may find even the loss at a reliability level of 15 "9s" unacceptable, but the cost of providing integrity at that level is likely prohibitive. The "9s" table also doesn't address integrity once data starts moving between a server and storage, and such end-to-end integrity is very difficult to determine. In discussions about preservation, both archivists and technologists need to agree on realistic expectations. (Link to Web Source)
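The arithmetic behind the "9s" chart is simple to sketch: at n nines of reliability, the expected loss fraction is 10^-n of the total. A few lines of Python reproduce the article's 99.999%-per-petabyte example:

```python
# Expected data loss at "n nines" of reliability: the loss fraction is
# 10**-n, so loss = total_size // 10**n. Integer math keeps results exact.

PB = 10 ** 15  # 1 petabyte in bytes (decimal units)
EB = 10 ** 18  # 1 exabyte in bytes

def expected_loss_bytes(nines, total_bytes):
    """Bytes expected to be lost at a reliability of `nines` nines."""
    return total_bytes // 10 ** nines

# Five nines (99.999%) against 1 PB: 10 GB lost, matching the article's example.
print(expected_loss_bytes(5, PB))    # 10000000000  (10 GB)
# Even fifteen nines against a 1 EB archive still loses 1000 bytes in expectation.
print(expected_loss_bytes(15, EB))   # 1000
```

The absolute loss grows linearly with collection size even as reliability stays fixed, which is exactly the disconnect between vendor guarantees and archivists' expectations that the article describes.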

Journal Title Transfer Problems
UKSG Serial e-news, Issue 256, October 14, 2011; by Lorraine Estelle

Since the beginning of 2009, over 3,400 journal titles have been transferred between publishers. Scholarly societies, which own two-thirds of the ISI-ranked top 500 journals, have been outsourcing their journals to commercial publishers in an effort to cope with the demands of digital publishing. The combination of this outsourcing with the number of transferred titles creates problems for libraries and users, including loss of access (at least temporarily), unexpectedly high price increases, lack of knowledge by libraries about a transfer's impact on a title, and uncertainty about perpetual access to previous years. A 2011 JISC survey of UK academic libraries provided detailed examples of these problems and confirmed that they are continuing despite initiatives like the TRANSFER project. Access problems often occurred because of delays in transferring content. Librarians also identified large time sinks in checking on transferred titles and in updating their management systems. The Virtual Library of Virginia and the Florida Center for Library Automation conducted a similar survey in the US, with findings comparable to the UK survey's. The JISC Electronic Information Resources working group published Society Journal Publishing Transfer: Guidelines to Help Achieve a Successful Transition to bring these issues and recommended guidelines to the attention of society publishers. Included is a recommendation that publishers being considered for outsourcing should confirm that they will comply with the TRANSFER Code of Practice. (Link to Web Source)

NISO Note: The issue of transferred journal titles is one of the problem areas that the NISO Presentation and Identification of E-Journals (PIE-J) Working Group's forthcoming recommended practice will address. Visit the PIE-J webpage for more information.