
January 2013

Each year, organizations like NISO need to reinforce their value to their constituencies and seek a renewed commitment from their community. This commitment can be made on an individual basis or, as it is with NISO, on an organizational basis. The question is ultimately one of value and mission: do participation in this group and the contributions made here exceed what could be achieved if the money were invested somewhere else, presuming of course that there are resources to invest anywhere? Each year (at least!), we should take a moment to reflect on this value.

Where would subscription agents be without the data exchange made possible by identifiers such as the ISSN or the ISBN? How would publishers distribute content without paper, ink, binding, or file format standards? Booksellers would have challenges gathering information on their available products and sales without structures like ONIX, BISAC, or barcodes. How much less efficiently would libraries manage their catalog information if bibliographic information structures like MARC, AACR2, or Z39.50 didn't exist?

What could you do if your organization were 5% more efficient? 10%? 20%? How many libraries are managing 50% or more of their collections budget in electronic format with electronic resource management staff smaller than the staff managing the print collections? How many orders can your organization process because they arrive in electronic format associated with ISBNs or ISSNs that link to shared cataloging or metadata? These efficiencies are only possible because of the standards that make information distribution more effective.

One might respond that these systems already exist and don't need much maintenance or upkeep. Setting aside the obvious registration needs of some identifier and schema standards, there are long-term costs associated with the benign neglect that comes with that perspective. Things break. Technology begins to fail when it doesn't keep up with a changing environment. Our community is facing the outcome of that situation today in many systems. The bibliographic information exchange system launched nearly 40 years ago is straining. The ISSN system is pressed to address the digital transformations in content distribution that have taken place since the identifier was introduced decades ago. The ISBN system is struggling to address the proliferation of multiple digital formats of the same title.

As we look forward to the new year, each of you should ask yourself: Where is your organization 10% or 20% less efficient than it could be, and how can we collectively squeeze out that inefficiency? I would be willing to bet that removing that inefficiency is not your organization's core competency and that removing it will not differentiate you from your competitors. That inefficiency is simply a cost borne by the entire system, and removing it will make all of us more competitive and efficient.

In 2013, NISO will be doing three things to help identify inefficiencies in our marketplace. First is a strategic planning exercise, already underway and led by the NISO Architecture Committee, to identify issues on an 18-36 month horizon. Second, a forum of executives is being planned to identify inefficiencies and feed into this horizon planning effort. Finally, NISO recently received grant funding from the Mellon Foundation to help define community interests and needs related to bibliographic information exchange. Other ideas are being pursued to help focus attention on identifying and addressing community problem areas.

The model of identifying community problems and addressing them through consensus standards development is one that has proven itself effective, though certainly not perfect. It requires each one of us in the community to contribute our ideas, our time, and our energy to finding and implementing solutions. We have made progress. We will continue to do so. However, there is a lot before us and it is time to get moving.

We are looking forward to doing great things in 2013 and to your participation in reaching that success.


Todd Carpenter

Managing Director

NISO Reports

NISO/DCMI January Webinar: Translating the Library Catalog from MARC into Linked Data: An Update on the Bibliographic Framework Initiative

In May 2012, the Library of Congress announced a new modeling initiative focused on reflecting the MARC 21 library standard as a Linked Data model for the Web, with an initial model to be proposed by the consulting company Zepheira. The goal of the initiative is to translate the MARC 21 format to a Linked Data model while retaining the richness and benefits of existing data in the historical format.

In the joint NISO/DCMI webinar Translating the Library Catalog from MARC into Linked Data: An Update on the Bibliographic Framework Initiative, to be held on January 23 from 1:00 - 2:30 p.m. EST, Eric Miller of Zepheira will report on progress towards this important goal, starting with an analysis of the translation problem and concluding with potential migration scenarios for a broad-based transition from MARC to a new bibliographic framework.

Miller is co-founder and president of Zepheira, which provides solutions for managing information across boundaries of person, group, and enterprise. Until 2007, Miller led the Semantic Web Initiative for the World Wide Web Consortium (W3C) at MIT and was one of the key leaders in the development of the Resource Description Framework (RDF) and other Semantic Web technologies.

Registration is per site (access for one computer) and closes at 12:00 pm Eastern on January 23 (the day of the webinar). Discounts are available for NISO and DCMI members and students. All webinar registrants receive access to the recorded version for one year.

Visit the event webpage to register and for more information.

February Webinar: Metadata for Preservation: A Digital Object's Best Friend

Over the past decade, as the scholarly community's reliance on e-content has increased, so too has the development of preservation-related digital repositories. The need for descriptive, administrative, and structural metadata for each digital object in a preservation repository was clearly recognized by digital archivists and curators. However, in the early 2000s, most of the published specifications for preservation-related metadata were either implementation specific or broadly theoretical. In 2003, the Online Computer Library Center (OCLC) and Research Libraries Group (RLG) established an international working group called PREMIS (Preservation Metadata: Implementation Strategies) to develop a common core set of metadata elements for digital preservation. In 2005, and then again in 2008, PREMIS published versions of its Data Dictionary for Preservation Metadata. Currently, the PREMIS data dictionary and corresponding XML schema are being implemented by digital repositories around the world.
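
To make the kind of information PREMIS records concrete, the sketch below builds a minimal PREMIS-style object entry in Python, with an identifier, fixity (checksum) information, size, and format, using only the standard library. The element names follow the PREMIS Data Dictionary, but the namespace value and the pared-down structure shown here are assumptions for illustration, not a schema-validated record.

```python
# A minimal sketch of a PREMIS-style <object> entry for one archived file.
# The namespace URI is assumed (PREMIS v2); validate real output against
# the published PREMIS XML schema.
import hashlib
import xml.etree.ElementTree as ET

PREMIS_NS = "info:lc/xmlns/premis-v2"  # assumed v2 namespace URI
ET.register_namespace("premis", PREMIS_NS)

def q(tag: str) -> str:
    """Qualify a tag name with the PREMIS namespace."""
    return f"{{{PREMIS_NS}}}{tag}"

def premis_object(identifier: str, data: bytes, fmt: str) -> ET.Element:
    obj = ET.Element(q("object"))
    oid = ET.SubElement(obj, q("objectIdentifier"))
    ET.SubElement(oid, q("objectIdentifierType")).text = "local"
    ET.SubElement(oid, q("objectIdentifierValue")).text = identifier

    # Fixity: record a checksum so later audits can detect corruption.
    chars = ET.SubElement(obj, q("objectCharacteristics"))
    fixity = ET.SubElement(chars, q("fixity"))
    ET.SubElement(fixity, q("messageDigestAlgorithm")).text = "SHA-256"
    ET.SubElement(fixity, q("messageDigest")).text = hashlib.sha256(data).hexdigest()
    ET.SubElement(chars, q("size")).text = str(len(data))

    fmt_el = ET.SubElement(chars, q("format"))
    desig = ET.SubElement(fmt_el, q("formatDesignation"))
    ET.SubElement(desig, q("formatName")).text = fmt
    return obj

record = premis_object("demo-0001", b"example file contents", "text/plain")
print(ET.tostring(record, encoding="unicode"))
```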

Join NISO for the February 13 webinar on Metadata for Preservation: A Digital Object's Best Friend to learn more about PREMIS and how it is being implemented.

Speakers and topics are:

  • Rebecca Guenther, Consultant – Meet Your Data: As a former Standards Specialist at the Library of Congress and co-chair of PREMIS, Ms. Guenther will begin the webinar by outlining the types of information that should be associated with an archived digital object and how these requirements translated into the development of the PREMIS Data Dictionary and corresponding XML schema.

  • Amy Kirchhoff, Archive Service Manager, Portico – Ms. Kirchhoff will describe how Portico's digital preservation repositories have applied this standard.

Registration is per site (access for one computer) and closes at 12:00 pm Eastern on February 13, 2013 (the day of the webinar). Discounts are available for NISO and NASIG members and students. NISO Library Standards Alliance (LSA) members receive one free connection as part of membership and do not need to register. All webinar registrants and LSA webinar contacts receive access to the recorded version for one year.

Visit the event webpage to register and for more information.

February Virtual Conference: Future Perfect: How Libraries Are Implementing Emerging Technologies

Virtual conferences are a new type of educational event that NISO is offering this year. These 5-6 hour conferences are held online in webinar-like formats, with occasional breaks in the schedule for participants. The longer length allows the depth of coverage of a conference coupled with the convenience of a webinar.

As former Library Journal Editor-in-Chief Francine Fialkoff wrote in an editorial last year, "Libraries Should Be What Users Want—With a Little Help from Librarians." Libraries everywhere are planning, strategizing, experimenting, and conversing with their users in order to determine how they can best serve as successful learning spaces in rapidly changing technological, social, and economic environments. Creation is often seen as an important theme of future community spaces. Supporting 3D printers and other means of creating and publishing content are just a few of the ways libraries may demonstrate their willingness to extend their capabilities in managing shared resources beyond those of an information provider to those of an information creator.

In the first NISO virtual conference of the year, Future Perfect: How Libraries Are Implementing Emerging Technologies—to be held on February 20, 2013 from 11:00 a.m. to 5:00 p.m. (Eastern Time)—a variety of experts will discuss some of the technologies that offer the most promise for libraries. We aim to interact with the audience as much as possible to extend the conversation with those who are attending.

Topics and speakers are:

  • Keynote: Overview of the emerging technology landscape/how to keep up with emerging technologies – Jason Griffey, Head of Library Information Technology, University of Tennessee, Chattanooga

  • Augmented Reality – Christine Perey, PEREY Research & Consulting

  • 3D Printing: Makerspaces – speaker TBA

  • 3D Printing: Intellectual Property Concerns – Michael Weinberg, Senior Staff Attorney and Innovation Evangelist, Public Knowledge

  • Espresso Implementation – speaker TBA

  • Library in a Box (Bibliobox) – David Fiander, Web Services Librarian, Western University

  • Roundtable discussion: Implementing Emerging Technologies at Your Institution – Facilitated by Todd Carpenter, Managing Director, NISO

Registration is per site (access for one computer) and closes at 12:00 pm Eastern on February 20, 2013 (the day of the virtual conference). Discounts are available for NISO members and students. All virtual conference registrants receive access to the recorded version for one year.

Visit the event webpage to register and for more information.

New ISQ Issue Focuses on the Future of Library Systems

The final 2012 issue of Information Standards Quarterly, on the theme of the Future of Library Systems and featuring articles on the new library services platforms, is now available in open access on the NISO website. The earlier term, integrated library system (ILS), is associated with the functionality and concepts for managing print collections and the metadata about them. These new products and projects cast a wider net, consistent with the expansion of library collections to include a complex assemblage of electronic and digital materials in addition to their physical inventories.

Guest Content Editor Marshall Breeding has assembled a collection of articles that present a range of products and projects from this new realm of library services platforms. Carl Grant provides an overview of this new genre and an introduction to each of the major products in the category.

A series of articles follows this introduction with discussions of real-world implementations of several of these systems. Paul Bracke relates the experience of Purdue University Libraries as a development partner with Ex Libris for Alma and how the system fits within that institution's strategic transformation already underway. Gentry Holbert presents the experience of Spring Hill College as one of the early adopters of OCLC's WorldShare Management Services. William Eric Atkinson describes how the Orange County Library System migrated from Innovative Interfaces' Millennium ILS to that company's Sierra services platform, taking advantage of its APIs to enable integration with a variety of local applications. Michael Winkler and Robert H. McDonald provide an overview and update of the Kuali OLE project that is building a next-generation, enterprise-oriented library system as open source software.

Ted Koppel of Auto-Graphics contributes an article on the Cost of Resource Exchange (CORE) project that started as a standard and ended as a recommended practice, due to a lack of uptake during the draft for trial use stage.

The NISO Reports column by Nettie Lagace discusses the publication of the NCIP standard revision and a new COUNTER-SUSHI Implementation Profile, as well as the Mellon grant that NISO received to understand the requirements for a new bibliographic framework.

As always, the issue concludes with the Noteworthy column on recent standards-related developments of interest to the community and a summary table on the status of NISO's in-development projects.

ISQ is available electronically in open access on the NISO website. Both the entire issue and individual articles may be freely downloaded. Print copies are available by subscription and as print on demand. For more information and to access the free electronic version, visit: www.niso.org/publications/isq/2012/v24no4/.

New Specs & Standards

HTML5 Definition Complete, W3C Moves to Interoperability Testing and Performance

The World Wide Web Consortium (W3C) has published the complete definition of the HTML5 and Canvas 2D specifications. Though not yet W3C standards, these specifications are now feature complete, meaning businesses and developers have a stable target for implementation and planning. HTML5 is the cornerstone of the Open Web Platform, a full programming environment for cross-platform applications with access to device capabilities; video and animations; graphics; style, typography, and other tools for digital publishing; extensive network capabilities; and more. The W3C community continues to enhance existing HTML features and develop new ones, including extensions to complement built-in HTML5 accessibility, responsive images, and adaptive streaming. To reduce browser fragmentation and extend implementations to the full range of tools that consume and produce HTML, W3C now embarks on the stage of W3C standardization devoted to interoperability and testing (called "Candidate Recommendation").

ISAN and EIDR to Provide Seamless Registration of Content IDs to Leverage the Strength of Both Systems

The Entertainment ID Registry (EIDR) and the International Standard Audiovisual Number International Agency (ISAN-IA) have embarked on efforts to support seamless registration of content IDs in either system to enable content producers and distributors to take full advantage of the capabilities of both systems. ISAN-IA and EIDR plan to link their two systems so that any ISAN registrant can obtain alternate EIDR IDs whenever needed in EIDR-based solutions. Similarly, EIDR registrants should be able to obtain alternate ISAN IDs to link their EIDR ID hierarchies into ISAN-based solutions. The two IDs and ID systems will be linked and cross-mapped to ensure easy interoperability for all users.
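
As a minimal sketch of what such a cross-mapping amounts to as a data structure, the Python example below links hypothetical ISAN and EIDR identifiers so that a lookup in either direction returns the alternate ID. The identifier values are made up, and the real registries issue and resolve IDs through their own services.

```python
# A minimal sketch, under assumptions, of a bidirectional cross-map
# between ISAN and EIDR identifiers. Real IDs are issued and resolved
# by the registries themselves; the values below are placeholders.
class CrossMap:
    def __init__(self):
        self._isan_to_eidr = {}
        self._eidr_to_isan = {}

    def link(self, isan: str, eidr: str) -> None:
        # Register the pair in both directions so either registry
        # can expose the other's ID as an alternate identifier.
        self._isan_to_eidr[isan] = eidr
        self._eidr_to_isan[eidr] = isan

    def eidr_for(self, isan: str):
        return self._isan_to_eidr.get(isan)

    def isan_for(self, eidr: str):
        return self._eidr_to_isan.get(eidr)

registry = CrossMap()
registry.link("ISAN-XXXX-XXXX-XXXX",                 # placeholder, not a real ISAN
              "10.5240/XXXX-XXXX-XXXX-XXXX-XXXX-X")  # placeholder EIDR-style DOI
print(registry.eidr_for("ISAN-XXXX-XXXX-XXXX"))
```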

Media Stories

Standards and Data Citation
Chapter 26 in: For Attribution: Developing Data Attribution and Citation Practices and Standards: Summary of an International Workshop, 2012; by Todd Carpenter

In the print world, we take for granted standards for such things as page numbering, which is critical for use in citations. In the digital environment of reflowable text, page numbers become meaningless, and there is currently no good method for citing a particular section of such a digital object. Instead, we are moving toward standardizing around key identifiers that can serve as actionable links and build on the opportunities of a machine-intermediated world. A good digital citation should disambiguate the item, provide location information, attribute and disambiguate the author, and support reuse and preservation. Developing a standard is important, but adoption is the more critical part of a standard's lifecycle. Among those who need to be deeply engaged in data citation standards adoption are researchers, educators, data centers, publishers, promotion and tenure committees, administrators, funding agencies, consumers of the data, and repositories. (Link to Web Source)
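
As a rough sketch of those requirements in practice, the Python example below assembles a machine-actionable citation from an item identifier (a DOI, which both disambiguates the item and resolves to its location) and a disambiguated author (an ORCID). All of the identifier values shown are hypothetical placeholders.

```python
# A minimal sketch of a machine-actionable data citation built from the
# elements named above. The DOI and ORCID values are hypothetical
# placeholders, not real records.
from dataclasses import dataclass

@dataclass
class DataCitation:
    author: str
    author_orcid: str   # disambiguates the author
    title: str
    publisher: str      # the data center or repository holding the item
    year: int
    doi: str            # disambiguates the item and provides location

    def actionable_link(self) -> str:
        # Resolving the DOI yields the item's current location,
        # even if the repository later moves it.
        return f"https://doi.org/{self.doi}"

    def as_text(self) -> str:
        return (f"{self.author} ({self.year}). {self.title}. "
                f"{self.publisher}. {self.actionable_link()}")

citation = DataCitation(
    author="Doe, J.",
    author_orcid="0000-0000-0000-0000",
    title="Example observational dataset",
    publisher="Example Data Center",
    year=2012,
    doi="10.0000/example.dataset.1",
)
print(citation.as_text())
```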

NISO Note: Complete workshop proceedings are available online. The NISO Digital Bookmarking and Annotation Sharing working group is developing a standard syntax for how bookmarks and notes should be located in a digital text.

SUSHI: Delivering Major Benefits to JUSP
Ariadne, December 5, 2012; Paul Meehan, Paul Needham and Ross MacIntyre

The Journal Usage Statistics Portal (JUSP) implementation was only possible through the Standardized Usage Statistics Harvesting Initiative (SUSHI) protocol, which automates the retrieval of usage data. By reducing manual effort by more than 97%, JUSP was able to substantially increase the number of publishers supported and institutions served in the UK Higher Education community. JUSP began implementing SUSHI following its mandated use in Release 3 of the COUNTER Code of Practice. In addition to the publisher data, the JUSP website provides other reports and analytical tools, and JUSP now offers its own SUSHI server to enable participating institutions to re-harvest their data from JUSP into their own systems.

The lack of existing SUSHI client software that could be adapted required JUSP to create a Perl client, which was done in only a couple of days. Unfortunately, not all the publishers involved with JUSP were supporting SUSHI, and JUSP often had to work with a publisher to develop its SUSHI server. The generic SUSHI client also often had to be adjusted for each publisher, especially due to differing authentication requirements. "By September 2012, over 100 million individual data entries had been collected; this represents three and a half year's worth of data for 140 institutions." In a typical month, JUSP handles some 2,800 data files; gathering these manually would take some 180 days of staff time, while with SUSHI the work is automated and done in 24 hours. Some errors have been encountered, e.g., server time-outs, but most of these are fixed by a subsequent retrieval, and JUSP has worked very successfully with the publishers to address recurring issues, such as incorrect or missing identifiers.

As an intermediary between the library and the publisher, JUSP needed to build a SUSHI server capability in addition to its client retrieval process, and it used the lessons learned on the retrieval side to build a reliable server. For authentication, which is not addressed in the SUSHI standard, JUSP chose to use IP addresses combined with the standard's Requestor ID element. Some participating libraries, however, choose to use third-party software rather than SUSHI, and JUSP had to establish support for those applications. The current implementation supports COUNTER Release 3; support for Release 4 is underway. The newly issued COUNTER-SUSHI Implementation Profile should help in obtaining more consistency among publisher implementations. "The key lesson that can be drawn...is that a centralised, reliable harvesting service based on using SUSHI to collect data is practical, cheap to run, and provides enormous economies where staff time is concerned." (Link to Web Source)
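
To give a sense of what such a harvesting client involves, here is a minimal sketch in Python (JUSP's own client is written in Perl) that posts a SUSHI ReportRequest for a COUNTER Release 3 JR1 journal report. The endpoint URL, requestor ID, and customer ID are hypothetical placeholders, and the simplified SOAP envelope and namespace should be checked against the published SUSHI schema; a production client would also handle authentication and retries for the transient errors described above.

```python
# A minimal sketch of a SUSHI harvesting client: POST a SOAP ReportRequest
# and read back the response containing the COUNTER report. The endpoint
# and IDs are placeholders, and the envelope is simplified; validate real
# requests against the SUSHI schema and the service's WSDL.
import urllib.request

ENDPOINT = "https://sushi.example-publisher.com/SushiService"  # hypothetical

def report_request(requestor_id: str, customer_id: str,
                   begin: str, end: str) -> bytes:
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:s="http://www.niso.org/schemas/sushi">
  <soap:Body>
    <s:ReportRequest>
      <s:Requestor><s:ID>{requestor_id}</s:ID></s:Requestor>
      <s:CustomerReference><s:ID>{customer_id}</s:ID></s:CustomerReference>
      <s:ReportDefinition Name="JR1" Release="3">
        <s:Filters>
          <s:UsageDateRange>
            <s:Begin>{begin}</s:Begin>
            <s:End>{end}</s:End>
          </s:UsageDateRange>
        </s:Filters>
      </s:ReportDefinition>
    </s:ReportRequest>
  </soap:Body>
</soap:Envelope>""".encode("utf-8")

req = urllib.request.Request(
    ENDPOINT,
    data=report_request("jusp-requestor", "institution-42",
                        "2012-01-01", "2012-01-31"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
with urllib.request.urlopen(req) as resp:
    counter_report_xml = resp.read()  # COUNTER report embedded in the response
```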

NISO Note: Visit the SUSHI webpage for more information on the standard, the COUNTER-SUSHI Implementation Profile, the schemas, and implementation support documentation. NISO members mentioned in this article are: Elsevier, Ex Libris, and Oxford.

Library of Congress' BIBFRAME Initiative (two parts)
The Digital Shift, December 12, 2012 and January 2, 2013; by Roy Tennant

LC's release of Bibliographic Framework as a Web of Data: Linked Data Model and Supporting Services provides a direction for the initiative to move beyond MARC. The report shows a plan to: differentiate between works (the conceptual content) and instances (physical manifestations); focus on unambiguous identification of information entities; and leverage and expose relationships between and among entities. Shortly after the report's publication, open source software to convert MARCXML records to BIBFRAME Resources was released. The software is available in a Python version that produces JSON files and an XQuery version that produces RDF. Underway is a service that will allow the submission of MARC record batches for transformation. (Link to Web Source for Part 1. Link to Web Source for Part 2.)
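
As an illustration of the work/instance split the report describes (and not the Library of Congress conversion software itself), the Python sketch below pulls a title and an ISBN from a toy MARCXML record and emits two linked JSON resources, one for the work and one for the instance. The property names and identifier URLs are simplified assumptions.

```python
# An illustrative sketch of MARCXML-to-linked-data conversion: the record's
# conceptual content becomes a "work" and its ISBN/carrier details become a
# linked "instance". Field choices and URLs are simplified assumptions.
import json
import xml.etree.ElementTree as ET

MARC_NS = "{http://www.loc.gov/MARC21/slim}"

MARCXML = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <datafield tag="245"><subfield code="a">Example title</subfield></datafield>
  <datafield tag="020"><subfield code="a">9780000000002</subfield></datafield>
</record>"""

def subfield(record, tag, code):
    """Return the first matching subfield value, or None."""
    for field in record.iter(MARC_NS + "datafield"):
        if field.get("tag") == tag:
            for sub in field.iter(MARC_NS + "subfield"):
                if sub.get("code") == code:
                    return sub.text
    return None

record = ET.fromstring(MARCXML)
work_id = "http://example.org/works/1"  # placeholder identifier
resources = [
    {"@id": work_id,
     "@type": "Work",
     "title": subfield(record, "245", "a")},
    {"@id": "http://example.org/instances/1",
     "@type": "Instance",
     "instanceOf": work_id,              # exposed work/instance relationship
     "isbn": subfield(record, "020", "a")},
]
print(json.dumps(resources, indent=2))
```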

NISO Note: NISO has received a grant from The Andrew W. Mellon Foundation to coordinate the needs and requirements of key communities—including libraries, technologists, and library system providers, as well as other international standards development organizations—in the development of the new Bibliographic Framework. For more information or if interested in participating, contact Nettie Lagace.

New Players, New Priorities - Part 1: Governments and Politics Enter Scientific Publishing; Part 2: The Problematic Role of Funders; Part 3: It's Never About the Money; It's Always About the Money
The Scholarly Kitchen, December 10-12, 2012; by Kent Anderson

The traditional scientific publishing model involving scientists, academics, and commercial and university publishers has maintained autonomy from governments and funding organizations. In the last 10 years, though, publishing changes, especially in open access (OA), have been breaking down this separation. The proposal in which the term "open access" was coined was authored by two US government employees, and that paper proposed a system in which the government handled submissions. The trend toward involving government and funding agencies "has the potential to erode traditional editorial and political firewalls." The illogical argument is being made that if taxpayers fund research they should have free access rights to any published reports from that research. Yet forcing government or funding agencies to pay the journal article costs of Gold OA will actually reduce the funding available for research. Some government statements have indicated a desire to place more economic constraints on publishing and to shape how information sharing occurs. "Governments are placing not only an extra burden on publishers and editors by requiring them to check for or participate in compliance, but [are] also placing an extra burden on investigators and authors....By allowing momentum to grow behind a trend that would bind publishers to suppliers rather than users of the literature, and tie our industry to public financing and government rule-making about what scientific publications can and cannot do, we are risking a system of objective evaluation that has served science well as an independent method of publication that is driven by quality."

A number of non-governmental funders are aligning themselves with OA and are imposing mandates on what gets published and where, on the launching of new titles, and even on how editors get paid. With corporate authors' institutions more willing to pay OA fees, the ability to pay can influence which industries and research ultimately get published. Some funders are becoming publishers themselves, a clear conflict of interest with unbiased review and selection. "It's worth reminding ourselves of three factors that make funders so powerful, and potentially so difficult to resist, in scientific publishing—their relative size, their ability to coordinate large and long-term funding strategies, and their inherent alignment." Green OA, free repository publication, has been shown to be impractical and unviable.

The subscription model's real problem is not prices; it is the tremendous growth in the amount of information being published at the same time that budgets for journals are dropping. OA publishing simply shifts the money from the subscribers (demand side) to the authors (supply side), and government and funders can apply more pressure and leverage on the supply side over what and how information gets published. "Author interests align with more publications—authors want publication, and the fewer barriers, the better. Editorial review, statistical review, rejection, resubmission—all these things are barriers. They are also expenses." When readers are left out of the equation, quality can suffer, because no reader validation of quality through buying power occurs. The supply side has tremendous amounts of money and can align in ways that will influence scientific behavior in many areas. Some publishers are happy to take OA funding monies and see them as a more stable stream of revenue.
"OA has the potential to change very fundamental assumptions and the social role of scientific publishing. It shifts our world from "hands off" to "hands on" for funding and government bodies. It takes the power of the purse-strings away from readers and scientists and their proxies in the market and gives it to funders and governments, who have different priorities and a demonstrated willingness to leverage science grants in order to achieve political, organizational, and bureaucratic goals." (Link to Web Source for Part 1. Link to Web Source for Part 2. Link to Web Source for Part 3.)