
August 2011

We're deep into the summer doldrums, looking for escapes from the heat wave over much of the U.S., but the pace of projects doesn't slow for the weather, baseball season, or family vacations. Given the typically slow nature of standards development, one might be surprised that things at NISO seem to move so quickly we can barely catch our breath at times. In June, during the NISO update at the ALA conference in New Orleans, we discussed the pace of work underway at NISO.

This month marks my fifth anniversary as Managing Director of NISO, and reflecting on our accomplishments since 2006, it is clear that a lot has changed. The leadership infrastructure launched by the Board of Directors tripled NISO's capacity to get projects moving and keep them on track. There are now more projects underway than at any time in NISO's history: 17 active groups. We have decreased the average time from launch to publication to just under two years. NISO's job is not only to develop and publish standards and recommended practices; it extends post-publication to education and training, and we have expanded that work as well with webinars, teleconferences, and in-person forums. Through these educational programs, a revamped and now open access Information Standards Quarterly magazine, the monthly Newsline e-newsletter, and the quarterly Working Group Connection update, we are building awareness and understanding of our work. The working groups are all taking a more active role in presenting and writing about their projects, and many projects continue after publication with a Standing Committee to promote adoption and provide implementation support. We're also well positioned to take on new initiatives and new projects, with a new E-book SIG underway and four proposals under consideration at the moment.

Five years can seem a lifetime, or it can fly by like a sparrow. Fortunately for me, the last five years at NISO have been much more the latter than the former. I'm looking forward to the next five years with the same excitement I felt the first day I walked into the NISO office. I expect that we will continue to make progress in our five key areas: incubation, inclusion, development, consensus, and education. It will be exciting to continue to lead and participate in this pursuit. Thank you to everyone who has made this ride possible, productive, and enjoyable.

Enjoy the rest of your summer, everyone.


Todd Carpenter

Managing Director

NISO Reports

August Webinar: Tangible Assets – Management of Physical Library Resources

Although access to digital information is ubiquitous today, there is still a very solid demand for material in physical form. Libraries spend significant time and resources in the storage, preservation, and delivery of physical items, which remain critical to the core library values of user access and resource sharing. Yet in today's financial environment, libraries must find ways to be more cost-effective, ideally without reducing service levels. NISO's August webinar on Tangible Assets – Management of Physical Library Resources, to be held August 10, 2011 from 1:00 - 2:30 p.m. (Eastern time), will cover recent efforts in making work with physical materials as efficient as possible and will share creative solutions for managing these still-valuable library assets.

Speakers and topics include:

  • Driven to Distraction: Best Practice Recommendations on Library Delivery. Valerie Horton (Executive Director, Colorado Library Consortium) will review the NISO Recommended Practice for Physical Delivery of Library Resources, which is out for public comment until August 21, 2011. The document provides recommendations for improving performance and reducing the cost of moving materials between a library that owns an item and another library whose patron wants to use it.

  • Collaborative Retention of Monographs. Timothy Cherubini (Director, Regional Services, LYRASIS) will discuss findings on the development of a national cooperative infrastructure for print monograph retention, derived from a two-day IMLS-funded workshop held in fall 2010.

  • Storage at the Joe and Rika Mansueto Library, University of Chicago. David Borycz (Special Projects Librarian, University of Chicago) will discuss the university's innovative new onsite, underground library storage facility, which uses automated technology to retrieve requested materials.

For more information or to register, visit the event webpage.

NISO/DCMI Webinar: International Bibliographic Standards, Linked Data, and the Impact on Library Cataloging

The International Federation of Library Associations and Institutions (IFLA) is responsible for the development and maintenance of International Standard Bibliographic Description (ISBD), UNIMARC, and the "Functional Requirements" family for bibliographic records (FRBR), authority data (FRAD), and subject authority data (FRSAD). ISBD underpins the MARC family of formats used by libraries world-wide for many millions of catalog records, while FRBR is a relatively new model optimized for users and the digital environment. These metadata models, schemas, and content rules are now being expressed in the Resource Description Framework language for use in the Semantic Web.

This joint NISO/DCMI webinar on International Bibliographic Standards, Linked Data, and the Impact on Library Cataloging, to be held August 24, 2011 from 1:00 - 2:00 p.m. (Eastern time), provides a general update on the work being undertaken. It describes the development of an Application Profile for ISBD to specify the sequence, repeatability, and mandatory status of its elements. It discusses issues involved in deriving linked data from legacy catalog records based on monolithic and multi-part schemas following ISBD and FRBR, such as the duplication which arises from copy cataloging and FRBRization. The webinar provides practical examples of deriving high-quality linked data from the vast numbers of records created by libraries, and demonstrates how a shift of focus from records to linked-data triples can provide more efficient and effective user-centered resource discovery services.
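The application-profile idea mentioned above can be sketched in miniature: a profile records, for each element, whether it is mandatory and whether it may repeat, and a record can then be checked against those constraints. The Python sketch below is purely illustrative; the element names and rules are invented and are not the actual ISBD element set or profile.

```python
# Hypothetical sketch of an application-profile check. The profile lists,
# for each element, whether it is mandatory and whether it may repeat.
# Element names and rules are invented, not the real ISBD element set.

PROFILE = {
    "title":     {"mandatory": True,  "repeatable": False},
    "edition":   {"mandatory": False, "repeatable": False},
    "publisher": {"mandatory": False, "repeatable": True},
}

def validate(record):
    """Return a list of profile violations for a record
    given as {element: [values]}."""
    errors = []
    for elem, rules in PROFILE.items():
        values = record.get(elem, [])
        if rules["mandatory"] and not values:
            errors.append(f"missing mandatory element: {elem}")
        if not rules["repeatable"] and len(values) > 1:
            errors.append(f"non-repeatable element repeated: {elem}")
    return errors
```

A record with a single title passes; one with no title, or two titles, is flagged. A real profile would also specify element sequence, which a sketch like this would track with an ordered list rather than a dict.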

Speakers for the webinar are:

  • Gordon Dunsire, a freelance consultant with 25 years of experience working in academic libraries and ten years in digital library research. He is a member of IFLA's ISBD/XML Study Group and FRBR Review Group, and he chairs the IFLA Namespaces Task Group. He is currently a member of a W3C Incubator Group on Library Linked Data

  • Thomas Baker, Chief Information Officer of the Dublin Core Metadata Initiative, former co-chair of the W3C Semantic Web Deployment Working Group, and currently co-chair of a W3C Incubator Group on Library Linked Data

For more information or to register, visit the event webpage.

NISO Forum: The E-Book Renaissance: Exploring the Possibilities Exposed by Digital Books

E-books have existed in the library landscape for over a decade, but only in the last few years has their use grown to finally become the game-changer so long anticipated. Availability, distribution, licensing, discoverability, current and future access, and usage of e-books all require content providers and libraries to change many of their existing processes and find new ways to do business. Amid this upheaval is a wealth of opportunities for new collaborations and initiatives.

The NISO Forum, The E-Book Renaissance: Exploring the Possibilities Exposed by Digital Books, to be held October 24-25, 2011 in Baltimore, Maryland, will probe platform interoperability, archiving, and preservation issues from a variety of industry, scholarly, and consumer viewpoints.

Planned topics include opening and closing keynotes; panel discussions on publisher and content provider issues and on vendor and platform provider perspectives; a discussion of libraries, librarians, and e-books; a presentation on users, patrons, and their devices; a review of e-book standards; roundtable discussions on topics from the new NISO E-book Special Interest Group; and an Ask Anything session for attendees.

Don't miss the chance to participate in this community discussion for advancing e-book development and support. Early bird discounts are available to those who register by October 12. For more information and to register, visit the event webpage.

Recommended Practice on Test Modes for SUSHI Servers Issued for Trial Use

A new NISO recommended practice, Providing a Test Mode for SUSHI Servers (NISO RP-13-201x), has been issued for a trial use period ending January 31, 2012. The Standardized Usage Statistics Harvesting Initiative (SUSHI) Protocol is a NISO standard (ANSI/NISO Z39.93-2007) that automates the retrieval of COUNTER usage statistics by libraries. Developing a SUSHI client requires testing against the SUSHI servers from which usage data will be harvested. The new Recommended Practice describes how content providers should provide access to their SUSHI servers in a test mode so that clients can be set up more easily and quickly, benefiting both libraries and content providers.
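As a rough illustration of why a test mode helps, a client typically needs to point the same request-building code at a different endpoint while it is being set up. The sketch below is hypothetical Python: the endpoint URLs, the test-mode convention, and the skeletal request body are all invented and do not follow the normative SUSHI (ANSI/NISO Z39.93) schema.

```python
# Hypothetical sketch of switching a SUSHI client into test mode.
# Endpoints and the request skeleton are invented for illustration;
# a real client must follow the ANSI/NISO Z39.93 schema exactly.

PROD_ENDPOINT = "https://stats.example.org/sushi"       # hypothetical
TEST_ENDPOINT = "https://stats.example.org/sushi-test"  # hypothetical

def build_report_request(requestor_id, customer_id, report="JR1",
                         release="3", test_mode=False):
    """Return the endpoint to call and a skeletal request body
    for one COUNTER report."""
    endpoint = TEST_ENDPOINT if test_mode else PROD_ENDPOINT
    body = (
        f"<ReportRequest>"
        f"<Requestor><ID>{requestor_id}</ID></Requestor>"
        f"<CustomerReference><ID>{customer_id}</ID></CustomerReference>"
        f"<ReportDefinition Name='{report}' Release='{release}'/>"
        f"</ReportRequest>"
    )
    return endpoint, body
```

With a documented test mode, the only thing that changes between setup and production is the endpoint flag, which is exactly what makes client development faster.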

The draft Recommended Practice and an online comment form are available from the SUSHI Server Recommendation webpage. All content providers who provide COUNTER usage statistics are encouraged to implement the recommendations during the trial and provide their feedback.

New on the NISO Website

New Specs & Standards

Joint Steering Committee for Development of RDA, DCMI/RDA Task Group, and ALA Publishing, First RDA Vocabularies Published

The first group of RDA controlled vocabularies has been reviewed and approved, and their status in the Open Metadata Registry (OMR) changed to published. This status change, from "new-proposed" to "published," signals that the final steps have begun in reviewing the work of the DCMI/RDA Task Group and ensuring that the RDA vocabularies (both elements and controlled vocabularies/concepts) are available in a stable form for the builders of applications.

The finished vocabularies are: RDA Aspect Ratio, RDA Form of Musical Notation, RDA Format of Notated Music, RDA Layout of Cartographic Images, RDA Mode of Issuance, RDA Other Distinguishing Characteristic of the Expression of a Legal Work, RDA Production Method for Tactile Resource, RDA Reduction Ratio, RDA Scale, RDA Sound Content, and RDA Status of Identification.

W3C Working Draft, User Agent Accessibility Guidelines (UAAG) 2.0

The User Agent Accessibility Guidelines Working Group has published an updated Working Draft of the User Agent Accessibility Guidelines (UAAG) 2.0. UAAG defines how browsers, media players, and other "user agents" should support accessibility for people with disabilities and work with assistive technologies. The Working Group also published an updated Working Draft of Implementing UAAG 2.0. For information on changes and providing feedback, read the invitation to review the UAAG 2.0 Working Draft.

W3C Candidate Recommendation, Ontology for Media Resources 1.0

The Media Annotations Working Group is calling for implementation of the Candidate Recommendation Ontology for Media Resources 1.0. The term "ontology" is used in this document in its broadest possible sense: a core vocabulary. The intent of this vocabulary is to bridge the different descriptions of media resources by defining a core set of descriptive metadata properties, along with their mappings to elements from a set of existing metadata formats. The document also presents a Semantic Web compatible implementation of the abstract ontology using RDF/OWL. It is mostly targeted at media resources available on the Web, as opposed to media resources accessible only in local repositories.

Media Stories

The Journal Usage Factor Project: Results, Recommendations and Next Steps
COUNTER Project Report, July 2011; by Peter Shepherd

The Journal Usage Factor (JUF) is a proposed new measurement of journal impact and quality intended to complement the journal Impact Factor (IF) from ISI and compensate for some of its weaknesses. Unlike the IF, which is based on citation data, the JUF looks at actual usage of an online journal, so data collection and reporting can begin immediately after publication. Phase 1 of the project looked at the usefulness and viability of a JUF. Librarians rated a potential JUF second in importance for acquisition decisions and third for retention and renewal decisions. Of the authors surveyed, 62.5% felt the IF was given too much weight in assessing authors' work, and 70% welcomed an additional JUF measure. Phase 2 tested the proposed JUF formula using real COUNTER data and included a statistical analysis performed by CIBER. Because the data showed high variance in usage between items, it was recommended that the formula be changed to use the median rather than the mean in calculating the JUF. A mean usage factor should include a confidence level to address statistical "noise." A maximum 24-month window for collecting data is sufficient, and shorter windows of 6 or 12 months could be considered in the future. JUF comparisons need to be made within broad subject domains, as the usage trend over time varies by subject. The JUF data did not show a statistical association with citation impact and thus provides very different information. The measure is highly subject to "gaming," especially via software agents. The study also showed that additional indicators for journal usage half-life or a reading immediacy index might be useful. Phase 3 will build on these conclusions; next steps include preparing a draft Code of Practice for the Journal Usage Factor, developing an updated subject journal classification taxonomy, and running a trial with a subset of publishers.
(Link to Web Source)
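The median-versus-mean recommendation is easy to see with toy numbers: per-item usage is typically highly skewed, and a single heavily used item can pull the mean far away from typical usage. The figures in this Python sketch are invented for illustration; they are not JUF project data.

```python
# Sketch of why the median was recommended over the mean: per-item
# usage is highly skewed, and one outlier item dominates the mean.
# All numbers are invented for illustration.
from statistics import mean, median

# downloads per article for a hypothetical journal over the usage window
usage = [2, 3, 3, 4, 5, 5, 6, 900]  # one heavily used outlier item

mean_factor = mean(usage)      # pulled far upward by the outlier
median_factor = median(usage)  # robust to it
```

Here the mean is 116 while the median is 4.5; the median better reflects how a typical article in the journal is actually used, which is why the project recommended it for the JUF formula.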

Joining an Open Source Community: Creating a Symphony Connector for the XC NCIP Toolkit
Code4Lib, Issue 14, 2011-07-25; by Michelle Suranofsky

Lehigh University, as a member of the Pennsylvania Academic Library Consortium Inc. (PALCI), needed to implement the NCIP protocol (ANSI/NISO Z39.83) to communicate with the new Relais EZ-Borrow resource sharing system that PALCI installed. Lehigh chose to use the eXtensible Catalog (XC) NCIP Toolkit even though an API bridge between it and their SirsiDynix Symphony ILS would have to be written. NCIP version 2 was chosen because the new version of the XC Toolkit was just becoming available and the Relais system supported it as well. The Toolkit core code was set up as a project in Eclipse on a Tomcat server. Connector code had to be written for each supported NCIP service; only four of the NCIP services were required by PALCI. A remote service manager class and several error and constraint classes were also required. The XC documentation on connector development was an excellent aid. Testing was done by adding some JavaScript to the Toolkit's index.html file, which eliminated cutting and pasting; JMeter was used for performance testing. The Symphony Connector Java code that was developed was made available as open source from the XC NCIP2 Toolkit Google Code website. The Perl scripts that make direct calls to the Symphony API are available to other Symphony API customers on the "sirsiapi.org" website. The production service has been used over 4,000 times since March 2011 without any problems. While the existing Symphony API code works nicely for the four NCIP services, scaling up to all 50 services may need a different design to maximize Java class reuse. XC holds biweekly calls with those working with the XC NCIP2 Toolkit, which are very beneficial to toolkit users. (Link to Web Source)
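The connector pattern described above can be sketched as a dispatch table: one handler per supported NCIP service, with a problem response for everything else. This Python sketch is illustrative only; the article's connector is written in Java, the handlers here are stubs, and the article does not state which four services PALCI required, so the service names below are merely examples of NCIP services.

```python
# Illustrative sketch of an NCIP connector dispatch: one handler per
# supported service, a Problem response otherwise. Handlers are stubs;
# a real connector would translate each request into ILS API calls.

def lookup_user(request):    return {"service": "LookupUser", "ok": True}
def accept_item(request):    return {"service": "AcceptItem", "ok": True}
def check_out_item(request): return {"service": "CheckOutItem", "ok": True}
def check_in_item(request):  return {"service": "CheckInItem", "ok": True}

HANDLERS = {
    "LookupUser": lookup_user,
    "AcceptItem": accept_item,
    "CheckOutItem": check_out_item,
    "CheckInItem": check_in_item,
}

def dispatch(service_name, request):
    """Route an incoming NCIP service request to its handler,
    or return a Problem response for unsupported services."""
    handler = HANDLERS.get(service_name)
    if handler is None:
        return {"Problem": f"Unsupported service: {service_name}"}
    return handler(request)
```

The article's closing observation maps directly onto this shape: four entries in the table are manageable, but fifty would push toward a design with more shared logic between handlers.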

NISO Note: The NCIP Protocol is a NISO standard (ANSI/NISO Z39.83-2008) that is available for free download. Additional support information and an NCIP Interest Group email list are available from the NCIP Standing Committee workroom area. SirsiDynix is a NISO voting member.

The Open Annotation Collaboration Phase I: Towards a Shared, Interoperable Data Model for Scholarly Annotation
Journal of the Chicago Colloquium on Digital Humanities and Computer Science, v. 1, no. 3, 2011; by Timothy W. Cole, Myung-Ja Han

A number of annotation technologies exist, many designed for specific disciplines, and all based on different models that may have limitations for particular purposes. The Open Annotation Collaboration (OAC) was formed in 2009 to develop a standard model and method for creating and distributing scholarly annotations of Web resources. A general model was developed first, with domain-specific models based on the generic model to follow in a later phase. The OAC work is built on a set of guiding principles that include interoperability across tools and collections, linking of an annotation body to an annotation target, separation of annotation and annotation body, use beyond text formats, accommodation of multiple body and/or target resources, and utilization of extensible classes, entities, properties, and relationships. A fall 2010 alpha model was consistent with Linked Data Initiative and Semantic Web Representing Content in RDF recommended practices. Scholarly annotation use cases were used to define the baseline requirements but have also identified further issues to pursue, such as allowing an annotation to refer to, for example, all instances of a digital novel, not just a particular edition. The specialized use case of the Emblematica Online project shows a number of requirements that are met by the draft OAC annotation model, such as grouping three related annotation instances into a multi-target annotation. However, some requirements, such as contextual constraints, are not completely met by the current model. The model will be further refined in Phase 2 and the identified issues explored. (Link to Web Source)
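The multi-target annotation mentioned above can be pictured as plain RDF-style triples: one annotation node linked to a single body and several targets. The URIs in this Python sketch are invented, and the property names only loosely echo the draft OAC vocabulary; it is an illustration of the idea, not the normative model.

```python
# Sketch of a multi-target annotation as plain RDF-style triples.
# URIs are invented; property names loosely echo the draft OAC
# vocabulary (hasBody, hasTarget) and are illustrative, not normative.

anno = "http://example.org/anno/1"
triples = [
    (anno, "rdf:type", "oac:Annotation"),
    (anno, "oac:hasBody", "http://example.org/notes/comparison.html"),
    # one body annotating three related parts of an emblem at once
    (anno, "oac:hasTarget", "http://example.org/emblem/12/image"),
    (anno, "oac:hasTarget", "http://example.org/emblem/12/motto"),
    (anno, "oac:hasTarget", "http://example.org/emblem/12/epigram"),
]

bodies = [o for s, p, o in triples if p == "oac:hasBody"]
targets = [o for s, p, o in triples if p == "oac:hasTarget"]
```

Separating the annotation node from its body and targets is what makes this model interoperable: any tool that understands the vocabulary can follow the links, regardless of where the body or targets live.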

NISO Note: NISO has received a Mellon Foundation grant to hold two pre-standards workshops on E-Book Annotation Sharing and Social Reading. The workshops will be open to the public but space is limited. Registration information will be available shortly.

Linked Data: A Way Out of the Information Chaos and toward the Semantic Web
EDUCAUSE Review, v. 46, no. 4, July/August 2011; by Michael A. Keller

The use of the Resource Description Framework (RDF) offers the possibility of addressing four existing issues with the discovery and access of information by students and faculty. These issues are: 1) too many independent silos of information; 2) discovery tools that are too imprecise and have inadequate recall; 3) library metadata that is not linked to the Web; and 4) competition from Google and other search engines that obscure the many resources they do not search. The Semantic Web approach to information focuses on relationships between metadata. Library bibliographic data using authority files translates well to RDF triple statements. Semantic Web linked data can also use or build on web technologies, such as URIs in place of URLs and RDF in place of HTML. Using the new linked data to enable improved search and discovery will require: translating existing metadata to RDF triples, using the new grid of these triples to "focus precision and expand recall," making the RDF-encoded data freely available to web crawlers, and allowing users to create annotations and other RDF triples tied to data generated by others. A large prototype of RDF-coded linked data from existing library metadata could be a test bed for creating and testing new mechanisms for discovery. (Link to Web Source)
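The translation step can be sketched simply: a bibliographic record whose headings already carry authority identifiers maps naturally onto RDF triples, with the identifiers becoming linkable URIs. This Python sketch uses invented example.org URIs and generic Dublin Core-style property names; it is an illustration of the idea, not Keller's implementation.

```python
# Sketch of translating a bibliographic record into RDF-style triples.
# All URIs and the record itself are invented; property names are
# generic Dublin Core-style labels used purely for illustration.

record = {
    "id": "http://example.org/bib/123",
    "title": "Linked Data and Libraries",
    # authority-controlled headings, already resolved to (invented) URIs
    "creator_uri": "http://example.org/auth/names/jane-doe",
    "subject_uri": "http://example.org/auth/subjects/semantic-web",
}

def to_triples(rec):
    """Emit (subject, predicate, object) triples for one record."""
    s = rec["id"]
    return [
        (s, "dcterms:title", rec["title"]),
        (s, "dcterms:creator", rec["creator_uri"]),
        (s, "dcterms:subject", rec["subject_uri"]),
    ]
```

Because the creator and subject are URIs rather than text strings, two records sharing a heading become linked through the same node, which is the "grid of triples" the article describes leveraging for precision and recall.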

NISO Note: To learn more about linked data and library cataloging, register for the NISO/DCMI August 24 webinar on International Bibliographic Standards, Linked Data, and the Impact on Library Cataloging. For a related article, see MARC21 as Data: A Start by Karen Coyle in Issue 14 of Code4Lib.