
September 2008

One of the biggest challenges for standards development organizations like NISO is ensuring the implementation and uptake of the final standard. The end results of our work at NISO are adopted on a voluntary basis by libraries, information suppliers, publishers, and the vendors that provide search, organization, or management tools to the community.

Achieving consensus is already a difficult process. Trying to persuade members of the information community that it is in their business interests to adopt a particular technological approach adds to the challenge. To help address these concerns, NISO's Topic Committees have put into place a set of criteria that focus on the business case of standards to help guide decisions when reviewing new projects and during reaffirmation of existing standards. But this only goes part of the way. The role of everyone involved in NISO must extend beyond the development phase by encouraging the use of our standards, as well as of related or shared standards from other standards development organizations.

One thing NISO can do is communicate broadly how standards can benefit the community. NISO has invested tremendously in expanding our educational programs to help with that communication. By educating purchasers of content and services about the benefits of information standards, we are helping to create a demand for products that conform to those standards. Educated content producers and vendors can see how being an early adopter of standards could give them a competitive edge.

We've had good successes over this past year with our educational events. Most recently, we have hosted two web seminars: one in conjunction with ALCTS on standards in a library environment and a second on OpenURL. Upcoming web events on ONIX-PL, SUSHI, and Identifiers will focus attention on other current initiatives. The in-person Collaborative Library Resource Sharing seminar and the Performance Measurement and Assessment seminar will provide a broader view of their topics. We hope that the participants in these meetings will take away not only a deeper understanding of the subject matter and how it will positively impact their work, but that they will also call for greater adoption of our standards.

We encourage each of you to actively learn how standards can positively impact your organization and to work toward having your colleagues and suppliers conform to standards. By creating an awareness of the benefits of standards for both the developer and the consumer, and encouraging these groups to actively ask for and use these information standards, we will have taken another step towards speeding up the adoption rate for everyone.

With kindest regards,


Todd Carpenter

Managing Director

NISO Reports

OpenURL Webinar Shows How Far Link Resolution Has Come and Where it Still Can Go

Phil Norman (OCLC) and Peter McCracken (Serials Solutions) were the presenters at NISO's August 21 webinar, OpenURL Implementation: Link Resolution That Users Will Love.

Phil provided background on the OpenURL standard from its early pre-NISO specification, now referred to as version 0.1, through its development as an ANSI/NISO standard. The pre-NISO version addressed the "appropriate copy" problem, focusing on electronic journal content licensed by libraries. Version 1.0, as defined in the NISO standard (ANSI/NISO Z39.88-2004), expanded the framework to allow new genres and new descriptions of existing genres. It also provided for an OpenURL registry and a maintenance agency. OCLC was selected by NISO in 2006 to manage the OpenURL registry.

Phil described the various components of an OpenURL "ContextObject" as defined in the standard and explained the purpose of each. He also reviewed what types of entities are included in the registry: namespaces, metadata formats, character encodings, ContextObject formats, transports, and community profiles. A community profile defines the core characteristics for a specific application of OpenURL. Two initial profiles were defined in the standard: the Level 1 and Level 2 San Antonio Profiles (SAP). Level 1 uses the Key/Encoded-Value (KEV) ContextObject Format, which is limited to the representation of only one object. Level 2 uses the XML ContextObject Format, which can represent one or more objects as an XML document. Both profiles were designed for scholarly information applications. SAP Level 1 is the type of application seen today in most library systems. A third profile, the Dublin Core Community Profile, was in trial use at the time the standard was issued and became an official profile in 2007. As one might infer, it supports the Dublin Core metadata format. The newest profile, Request Transfer Message, is currently in trial use. It supports the same XML formats as the SAP Level 2 profile, plus several additional XML metadata formats, including MODS and ONIX for Books.
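
To make the KEV format concrete, here is a rough sketch of building a SAP Level 1 OpenURL for a journal article. The resolver base URL is hypothetical; the keys follow the KEV journal metadata format registered in the standard, and the cited article is Van de Sompel and Beit-Arie's D-Lib piece that introduced OpenURL.

```python
from urllib.parse import urlencode

# A minimal SAP Level 1 (KEV) ContextObject describing one journal article.
# The resolver hostname is hypothetical; a real library would substitute
# the base URL of its own link resolver.
context_object = {
    "url_ver": "Z39.88-2004",
    "url_ctx_fmt": "info:ofi/fmt:kev:mtx:ctx",
    "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
    "rft.genre": "article",
    "rft.atitle": "Open Linking in the Scholarly Information Environment",
    "rft.jtitle": "D-Lib Magazine",
    "rft.volume": "7",
    "rft.issue": "3",
    "rft.issn": "1082-9873",
}

openurl = "https://resolver.example.edu/openurl?" + urlencode(context_object)
print(openurl)
```

Because KEV is just key-value pairs on a URL query string, any system that can build a hyperlink can act as an OpenURL source.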

Several OpenURL implementations beyond the traditional library application were described. Google Scholar provides links from Scholar search results to resources held by the participating institution of the user's choice. COinS (ContextObjects in Spans) allows linking to an OpenURL resolver from any HTML webpage; Wikipedia has examples of COinS implementation. OCLC has developed an XML interface from WorldCat Link Manager to WorldCat Local. New applications are underway as well, such as a request service for JPEG images. The standard was deliberately defined in a way that encourages and supports creative implementations.
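
A COinS span simply embeds a URL-encoded KEV ContextObject in an otherwise empty HTML element. A minimal sketch of generating one (the metadata values are illustrative, not taken from the webinar):

```python
from urllib.parse import urlencode
from html import escape

# COinS convention: an empty <span> whose class is "Z3988" and whose title
# attribute carries a URL-encoded KEV ContextObject. A browser plugin or
# page script can detect such spans and rewrite them as links to the
# user's own link resolver.
ctx = {
    "ctx_ver": "Z39.88-2004",
    "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
    "rft.genre": "article",
    "rft.atitle": "Open Linking in the Scholarly Information Environment",
    "rft.issn": "1082-9873",
}
coins_span = '<span class="Z3988" title="%s"></span>' % escape(urlencode(ctx))
print(coins_span)
```

Since the span carries no visible content, pages that include COinS look unchanged to readers without a resolver-aware tool.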

In addition to the official registry for the OpenURL standard, OCLC provides a commercial OpenURL Gateway, a central link resolver "knowledgebase." They also provide a WorldCat library registry service that includes OpenURL information in the library's record, which allows third parties to provide services with OpenURL links to a particular library's resources for its authorized users. Libraries are encouraged to sign up on the WorldCat Registry and enter a profile that includes the URLs of the services provided.

Peter McCracken followed with a discussion of efforts underway to improve OpenURL services through better data transfer and more accurate data. While OpenURL has been a great leap forward in library services by providing users with content they would not otherwise have found, it still doesn't get users to content as easily as it should, and inaccurate data or incorrect implementation can lead to bad links. Lack of knowledge about the standard means that some vendors and libraries who could benefit from OpenURL still aren't using it.

A 2007 study underwritten by the UK Serials Group (UKSG), Link Resolvers and the Serials Supply Chain, provided recommendations on how to address some OpenURL implementation issues. UKSG and NISO have jointly sponsored a working group to follow up on the report's recommendations. The group, called KBART (Knowledge Bases And Related Tools), is co-chaired by Peter McCracken and Charlie Rapple (TBI Communications) and includes members from link resolver/ERM suppliers, publishers, subscription agents/aggregators, libraries, and consortia.

The KBART working group intends to address all three of the major problem areas: 1) lack of knowledge, through more and better information for content providers not yet using OpenURL; 2) incorrect implementations, by identifying problem implementations, providing opportunities for vendors to grade themselves, standardizing the transfer of data among participants, and offering more and better examples of working implementations; and 3) inaccurate data, the hardest problem to solve, for which the group is still debating possible approaches.

Initially the group plans to issue best practice recommendations, hopefully in time for the UKSG Annual Conference next spring. They may go on to develop a standard, if that approach appears to be appropriate. The KBART group maintains webpages on both the UKSG and the NISO websites. An email "interest group" list is available for those who want to receive regular updates of the group's progress or post questions and suggestions. To subscribe, send an email to: kbart_interest-subscribe@list.niso.org.

Presentation slides from the OpenURL webinar and links to additional resources on the topic are available from the event webpage.

Registration Open for the ONIX-PL and SUSHI Webinars

The next two webinars in NISO's Demystifying Standards series will focus on "ONIX for Publications Licenses (ONIX-PL): Simplifying License Expression" and "Standardized Usage Statistics Harvesting Initiative (SUSHI): Beyond Trial into Real Use."

ONIX-PL, to be held on September 10, 2008 from 1:00 - 2:30 p.m. (eastern time), will feature Alicia Wise, Chief Executive, Publishers Licensing Society (PLS) and Chair of NISO's new ONIX-PL working group. She will begin the webinar by describing the need for ONIX-PL, the benefits it provides for various stakeholders, and the ongoing maintenance she and the NISO working group will be doing with this standard, and will provide an overview of the work done to bring the trial version of the standard to its present stage.

Jeff Aipperspach, Senior Product Manager, Serials Solutions, and Rick Burke, Executive Director, Statewide California Electronic Library Consortium (SCELC), will follow with their hands-on perspectives of the trial they are conducting with ONIX-PL, a project that should provide a model for future ONIX-PL implementations. SCELC and Serials Solutions are partnering with a number of publishers to test the transmission of licensing data using the ONIX-PL messages.

Register at the ONIX-PL webinar event webpage.

SUSHI, to be held on October 2, 2008 from 1:00 - 2:30 p.m. (eastern time), will feature Adam Chandler, Coordinator, Service Design Group, Cornell University Library, and Co-Chair, SUSHI Maintenance Advisory Group. Adam will open the webinar with a technical perspective, sharing more about the relationship between SUSHI and COUNTER and possible next steps for this standard.

Hana Levay, Information Resources Librarian, Collection Management Services, University of Washington Libraries, will then give a real-library perspective, sharing one example of how SUSHI was not only implemented at the University of Washington, but how it is being applied in a practical way. Topics will include setting up SUSHI (in this case, using Innovative Interfaces' ERM), the kinds of reports being supplied via SUSHI, integrating usage statistics into a collection development assessment tool via an ERM, and examples of how usage reports are being used in decision making.

Register at the SUSHI webinar event webpage.

International Update: TC46/SC9, Identification and description

Newsline will periodically provide updates on developments in the international standards committee ISO TC46 (Information and documentation) SC9 (Identification and description), for which NISO is the new Secretariat.

Expect to see a lot of ballot activity this fall. Two standards have recently been submitted to ISO for issuance as Draft International Standard (DIS) ballots. DIS ballots run for five months.

  • ISO/DIS 690, Guidelines for bibliographic references and citations to information resources, combines and updates the two parts of the standard into a third edition. It applies to all the different kinds of information resources that might be cited, both print and electronic, including software, audio, and visual resources.
  • ISO/DIS 26324, Digital Object Identifier System, is a new standard that specifies the syntax, description, and resolution functional components of the Digital Object Identifier (DOI®) system, and the general principles for the creation, registration, and administration of DOI names. (The NISO standard on the DOI, ANSI/NISO Z39.84-2005, specifies only the syntax.)
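
The DOI syntax specified in Z39.84 is a registrant prefix (beginning "10.") and a suffix, separated by a slash. A rough sketch of checking and splitting that syntax (the validation rule here is a simplification; the example DOI, 10.1000/182, resolves to the DOI Handbook):

```python
# A DOI name has the form "10.<registrant>/<suffix>". The suffix is an
# opaque string chosen by the registrant; resolution of DOI names is
# handled by the Handle System (e.g., via https://doi.org/).
def split_doi(doi: str) -> tuple[str, str]:
    prefix, _, suffix = doi.partition("/")
    if not prefix.startswith("10.") or not suffix:
        raise ValueError("not a valid DOI name: %r" % doi)
    return prefix, suffix

prefix, suffix = split_doi("10.1000/182")
print(prefix, suffix)  # 10.1000 182
```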

Two standards are in their final editing stages and should also be ready for balloting this fall.

  • ISO/DIS 27729, International Standard Name Identifier (ISNI), is a new standard that specifies an identifier for the public identity of parties, that is, the identities used publicly by parties involved in the creation, production, management, and content distribution chains of the media content industries. It aims to provide an efficient means of disambiguating such public identities in the digital era, so that the roles participants play can be recognized accurately and the content they are involved in creating can be managed effectively.
  • ISO/CD 27730, International Standard Collection Identifier (ISCI), is the first committee draft of a new standard to specify a unique international identification system for each collection and fonds (an archival term for a body of papers from a common source) and for part(s) of collections and fonds. It specifies the structure of the identifier and promotes its use in conjunction with already existing identifier systems.

Edition 2 of ISO 10957, International Standard Music Number (ISMN), was approved by the SC9 members for publication and will also be submitted to ISO in September. A French translation of the standard, which normally takes about two months, will be required before final publication.

Collaborative Library Resource Sharing Forum Will Address Standards, Developments, and New Models for Cooperating

NISO will host a two-day forum on Collaborative Library Resource Sharing: Standards, Developments, and New Models for Cooperating on October 6-7, 2008, in the Georgia Tech Global Learning Center in Atlanta. Participants will explore areas where collaborative effort and standards can help improve library efficiency through resource sharing. These include interlibrary loan, physical resource management, collaborative storage and preservation, and related open source developments. New developments in each of these areas will help to improve efficiency in library resource sharing and hopefully improve user outcomes and satisfaction.

Confirmed speakers and topics include:

  • Matt Goldner (OCLC), Opening Keynote
  • Margaret Ellingson (Emory University), Current Resource Sharing Environment
  • Cecelia Boone (MINITEX), Union Lists in Support of Resource Sharing
  • Valerie Horton (Colorado Library Consortium), Physical Delivery
  • Rob Walsh (Envisionware) and Paula Kelsall (Library & Archives Canada), Resource Sharing Standards Update
  • Ted Koppel (Auto-Graphics), Gail Wanner (SirsiDynix), and Rob Walsh (Envisionware), Vendor Roundtable
  • Gail Wanner (SirsiDynix), Rethinking Resource Sharing
  • Evan Simpson (Brandeis University), Open Content Alliance – A New Approach to ILL
  • Adam Wathen (Kansas State University Libraries), Improving Library Workflow Efficiencies

If you are looking for ways to reduce costs and improve efficiency through collaborative resource sharing, you won't want to miss this seminar. Visit the event webpage for more information and to register. Early bird registration ends on September 17.

New Specs & Standards

COUNTER Code of Practice for Journals and Databases, Release 3

Major changes in this new release include two new library consortium usage reports specified only in XML format and incorporation of the use of the SUSHI (Standardized Usage Statistics Harvesting Initiative) protocol (ANSI/NISO Z39.93:2007). A complete list of changes is in the introduction of the document. The deadline for implementation of this new release is August 31, 2009.

ISO 11620:2008, Information and documentation – Library performance indicators

Edition 2 of the standard that "specifies the requirements of a performance indicator for libraries and establishes a set of performance indicators to be used by libraries of all types. It also provides guidance on how to implement performance indicators in libraries where such performance indicators are not already in use."

ISO 4217:2008, Codes for the representation of currencies and funds

Seventh edition of the standard that specifies the structure for unique three-letter alphabetic codes (alpha-3) and equivalent three-digit numeric codes that represent global currencies and funds. These codes are widely used in database applications.

Call for Comments: INCITS 453-200x, Information technology – North American Profile of ISO 19115:2003 – Geographic information – Metadata (NAP – Metadata, version 1.2)

This proposed new standard intends to be an inclusive document addressing ISO 19115:2003, Geographic information - Metadata, and the accepted modifications for use in North American applications. It includes best practices to guide data providers in capturing geospatial metadata. Obtain an electronic copy from: INCITS or ANSI. Send comments to: Barbara Bennett, ITI (INCITS).

Call for Comments: NFPA 909-200x, Code for the Protection of Cultural Resources Properties – Museums, Libraries, and Places of Worship

This proposed revision to the 1995 version applies to culturally significant structures and to their contents. Such structures include, but are not limited to, buildings that store or display museum or library collections, historic buildings, and places of worship. These structures also include spaces within other buildings used for such culturally significant purposes. For a review copy, contact the National Fire Protection Association (617) 984-7248, www.nfpa.org.

Call for Participation: IMS Global Learning Consortium, Next Generation IMS ePortfolio Work

The IMS ePortfolio specification currently plays an important role in enabling the standards-based transfer and archiving of complete portfolios in higher education. The proposed new work on the specification will take two forms: a maintenance release and a mapping between educational and workplace formats. A core group of IMS members, preferably from both education and industry, along with other partners within the larger ePortfolio community, is needed. If you are interested in joining this effort or learning more about it, please contact: PortfolioCall@imsglobal.org.

Media Stories

The Importance of Identifiers
Metalogue (8/22/2008) ; Gatenby, Janifer

Author Janifer Gatenby uses the analogy of a passport number as an individual's unique identifier, comparable to identifiers used for information resources. The ISBN (International Standard Book Number), one of the most widely known identifiers, is only one of a number of identifier standards published by the International Organization for Standardization (ISO). Other ISO identifier standards include ISSN for serials, ISMN for music, ISRC for sound recordings, ISAN for audio-visual works, and ISIL for libraries, with several projects still underway, such as ISNI, a name identifier; ISCI, a collections identifier; and the DOI, a digital object identifier. With the exception of ISIL and ISCI, all of these identifiers are used in commercial trade, which is why only some 30% of records in the WorldCat database have international identifiers. In the Internet world, identifiers are critical to accessing identical resources on multiple sites and must also be embeddable in a URL. Gatenby points out that URLs, which are addresses, make poor identifiers due to their frequent changes. Resolution systems, such as the DOI, have emerged to address this problem. Identifier registration does not always require simultaneous registration of the associated metadata, which can create problems. OCLC is implementing two identifier services, xISBN and xISSN, that will allow linking and retrieval between records in the WorldCat database and related resources in outside services. (Link to Web Source)

NISO Note: See the story in this issue containing an update of the work of the ISO Identification and description committee.

Member States Drag Feet on European Digital Library
EU Observer (08/12/08) ; Mahoney, Honor

The European Commission (EC) is urging member states to strengthen efforts to make Europe's cultural heritage available online. Although plans for a European digital library are already underway, efforts to make works digitally available have been plagued by funding problems and a lack of technical know-how. The EC says that European libraries contain over 2.5 billion books but only about 1 percent of archival content has been made available online. The EC says that more funding needs to be allocated to the digitization effort, calling the 27 member states' efforts to date "small scale" and "fragmented." There are also significant discrepancies between member states' progress toward the goal. Some countries, such as Slovenia, are making "exemplary" progress on the project, while only one in four German museums that have digitized material offer online access to that material. In Poland, only 1 percent of digitized material is available online. Copyright issues also remain to be solved, particularly in regard to orphan works, where artists cannot be found to give consent to digitization. Despite lagging efforts from some member states, the commission says it is determined to push ahead with plans for a European Digital Library by the end of 2008. (Link to Web Source)

The OAI2LOD Server: Exposing OAI-PMH Metadata as Linked Data

XML Daily Newslink (08/06/08) ; Haslhofer, Bernhard; Schandl, Bernhard

The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) is experiencing a period of growing popularity among digital libraries and archives. Institutions acting as data providers can easily expose their metadata through OAI-PMH by implementing lightweight wrapper components on top of existing metadata repositories. OAI-PMH's design is based on the Web Architecture, but it does not treat its conceptual entities as dereferenceable resources. Furthermore, selective access to metadata is still unavailable: for example, one can retrieve the metadata for a certain digital item, but cannot retrieve all digital items created by a given author. The OAI2LOD Server provides a possible solution to such shortcomings by following the Linked Data design principles and by providing SPARQL access to metadata. The ongoing Object Reuse and Exchange (OAI-ORE) standardization, a set of standards for the description and exchange of aggregations of Web resources, indicates that Linked Data will play a significant role in the context of digital libraries and archives. OAI-PMH and OAI-ORE overlap in that Resource Maps can be included in OAI-PMH responses, which allows for group retrieval and harvesting of aggregation information. A tighter integration of these two standards could provide several benefits. For example, if OAI-PMH metadata repositories expose their content as Web resources by assigning HTTP-dereferenceable URIs, these items could be used in OAI-ORE aggregations. The OAI2LOD Server could act as a conduit between the two standards. (Link to Web Source)
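
For context, an OAI-PMH request is a plain HTTP GET with a "verb" parameter. A sketch of constructing a GetRecord request for one item's Dublin Core metadata (the repository base URL and identifier are hypothetical):

```python
from urllib.parse import urlencode

# OAI-PMH requests are ordinary HTTP GETs. GetRecord returns the metadata
# for a single item; there is no verb for queries like "all items by
# author X" -- a harvester must fetch everything and filter locally,
# which is the selective-access gap the article describes.
base_url = "https://repository.example.org/oai"  # hypothetical provider
params = {
    "verb": "GetRecord",
    "identifier": "oai:repository.example.org:1234",
    "metadataPrefix": "oai_dc",
}
request_url = base_url + "?" + urlencode(params)
print(request_url)
```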

NISO Note: The Dublin Core Metadata Element Set (ANSI/NISO Z39.85:2007), mentioned in this article, is a NISO standard.

The Networked Library Service Layer: Sharing Data for More Effective Management and Co-operation
Ariadne (07/08) Vol. 56 ; Gatenby, Janifer

Current trends demand a reexamination of the architecture of the integrated library system (ILS) for smoother integration into today's electronic resource management (ERM) environment, writes OCLC's Janifer Gatenby. The Digital Library Federation has launched two initiatives, the ILS Discovery Interface Group and the Electronic Resource Management Initiative, to produce recommendations for interoperability of ILS and ERM systems. One way of reexamining the architecture of the ILS and other library systems is to examine the data held and assess what would be the optimum level of storage for that data. Certain characteristics, such as needing more effective exposure on a Web-scale site, the ability to attract users and user contributions, and the ability to make tasks less complicated or more accurate, are strong indications that sharing the data would be beneficial. As library collections are increasingly shared, there may be significant advantages in cost and efficiency in moving more acquisitions and licensing data and processes to the network level, where they can be shared among the ILS, ERM, and other repositories within the library. Placing data at the network level will help disentangle the ILS, ERM, resolver, digital management, digital repository, and reference systems, and make their data accessible to all systems. However, disentangling data requires a standards layer that does not yet exist, which may be best achieved by encouraging adoption through the use of existing extensible standards. (Link to Web Source)

NISO Note: NISO is addressing ILS/ERM integration with its new CORE (Cost of Resource Exchange) project to develop a protocol for the transfer of cost and related financial information from an ILS Acquisitions module to an ERMS. The VIEWS project mentioned in this article, after transfer to NISO, resulted in a recommended practice, Best Practices for Designing Web Services in the Library Context. The NISO standards mentioned in this article are available for free download: Z39.50, OpenURL, and NCIP. (Note that NCIP is undergoing a revision.) OCLC is a NISO voting member.

Creating Scholarly Tools and Resources for the Digital Ecosystem: Building Connections in the Zotero Project
First Monday (08/08) Vol. 13, No. 8 ; Cohen, Daniel J.

George Mason University professor Daniel J. Cohen cites the open source Zotero Project at the university's Center for History and New Media as a guide for what Web 2.0 should be for universities, museums, and libraries. The Zotero Project seeks to deliver a high-quality bibliographic, research, and note-taking tool, and Cohen believes that "one critical element of the Zotero Project has been the way in which we think of any scholarly tool or resource as existing in an interconnected digital ecosystem—that is, the way in which the Project looks beyond itself." He notes that the exponential growth of online resources available from the library and museum community over the last 10 years made it clear that the premier site for research was the Web browser, which inspired the notion that Zotero should be embedded in the browser. "Existing on top of the browser as an extension (rather than within it as a Web application) means that unlike traditional standalone applications, Zotero can 'read' what is going on in the browser and act on that information," Cohen says. The technology used to link with the digital ecosystem's natural scholarly resources takes the form of "translators," which are small bits of code generated by development teams both inside and increasingly outside the Zotero Project so that the tool can know when it is studying a digital object and what actions can be taken on that object. The support of many interchange formats, including MARC, MODS, COinS, and others, also has been a focus of the Zotero Project, one that establishes communication between Zotero and many other services and software. Cohen says making tools aware of existing standards and other broadly implemented technologies improves their usefulness without the tool-builder taking any action. (Link to Web Source)

NISO Note: RefWorks is a business unit of ProQuest, a NISO voting member.

At Libraries, Taking the (Really) Long View
Inside Higher Ed (07/23/08) ; Guess, Andy

Librarians from research universities and other institutions are working to solve a variety of problems related to digital content preservation: constructing storage devices that monitor and repair data while remaining easily scalable; building in redundancy by distributing and duplicating data across storage devices and across the country; developing universal standards to keep formats readable in the distant future; and creating interfaces, such as open software protocols, that manage digital holdings and make content accessible to the public. Some solutions are still in development, and various institutions are trying different approaches, with some corporations competing with each other while others collaborate on open source solutions. "For the most part, they're all untested," says Purdue University professor of library science and interdisciplinary research librarian Michael Witt. "None of the solutions have withstood the test of time yet." Stanford University associate director of digital library systems Tom Cramer has been working on the Stanford Digital Repository (SDR), which currently hosts geospatial data and content from other scholarly sources. The SDR provides a "trusted environment for long-term digital information storage and preservation activities," according to the project's website. Meanwhile, Sun Microsystems has established the Sun Preservation and Archiving Special Interest Group, a collaboration among universities, research librarians, and the government that will periodically discuss digital archiving issues. The fragility of corporate partnerships was highlighted by Microsoft's discontinuation of the Live Search Books project, further encouraging libraries to go with open-source software. Since many of the challenges are human rather than machine, libraries are beginning to create new roles, such as "digital preservation officer."
(Link to Web Source)

NISO Note: For more on preservation, view the presentation slides from NISO's March 2008 forum on Digital Preservation: Planning Today for Tomorrow's Resources. Stanford University is a NISO Library Standards Alliance member.

Web-Security Inventor Charts a Squigglier Course
Wall Street Journal (08/13/08) P. B5 ; Smith, Ethan

Carnegie Mellon University professor Luis von Ahn, the primary inventor of the Captcha online security test, has updated the system to make the test more secure. The new ReCaptcha system also has users assist in the digitization of old books and newspapers. The new system presents users with two words containing distorted characters. Both words are taken from an old book or newspaper article that has been scanned into an online library. One of the words was recognized by the scanning software, while the other was unrecognizable to the computer, possibly because of a smudge or some other imperfection on the original document. The user tries to decipher the distorted characters of both words, and if the user matches the first word correctly, which the computer already knows, then the user's reading of the unknown word is recorded. Multiple Web users will be shown the same unknown word as part of different tests, and when three people have submitted the same answer for the unknown word, it is considered solved and added to the library database to be inserted into the digital version of the document. Deciphering these unknown words is one of the greatest challenges for the Internet Archive's library digitization effort, since scanning software generally has an accuracy rating of only about 80 percent for books published before 1900. About 40,000 websites now use the free ReCaptcha system, and when fully operational, von Ahn expects it to process about 160 books a day for the Internet Archive. "It's a really mind-blowing application," says Internet Archive founder Brewster Kahle.
(Link to Web Source)
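
The three-user agreement rule described in the story can be sketched as a simple vote counter. This is an illustrative simplification, not ReCaptcha's actual implementation:

```python
from collections import Counter

# Tally user transcriptions of an OCR-unreadable word; once a threshold
# number of users (three, per the article) agree on the same reading,
# accept it as solved. Hypothetical sketch, not ReCaptcha's real code.
def resolve_word(transcriptions, threshold=3):
    votes = Counter(t.strip().lower() for t in transcriptions)
    word, count = votes.most_common(1)[0]
    return word if count >= threshold else None

print(resolve_word(["villain", "villian", "villain", "villain"]))  # villain
print(resolve_word(["smudge", "sludge"]))  # None (no agreement yet)
```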