
April 2011

Since its earliest days, publishing has been a business of technology, driven by content production standards that streamline the process of creating and replicating content for distribution. This is as true today as it was in the 15th century. Standards for digital content creation and production, which are still in their infancy, moved several steps forward recently.

Last week, NISO released two draft standards for trial use: the revised DAISY Authoring and Interchange specification and the Journal Article Tag Suite (JATS). Both of these standards go to the heart of content creation in ways that smooth production, enhance distribution, and improve access. Both standards existed in earlier forms and saw significant adoption prior to the most recent revisions, and both deserve close attention, review, and adoption from the information community. Both working groups should be commended for their quick work in bringing the projects to this stage.

For the journal community, the JATS standard, long known as the NLM Journal Archiving and Interchange Tag Suite (or the NLM DTD for short) after the organization that created and maintained it, the National Library of Medicine, has grown to be one of the foundational elements of a large segment of journal production since its first release in 2003. Its existence has allowed for the creation of standardized production tools and preservation systems, and it provides a lingua franca for communicating journal content in XML. After a relatively quick development cycle, the latest revision of the tag suite, now standardized through NISO, is available for trial use. For more information about JATS and to participate in the trial, please visit the working group page for the project.

The second draft released for trial use is a revision of the DAISY Digital Talking Book standard, now expanded beyond just books as the Authoring and Interchange Framework. This standard provides for the creation of electronic resources in formats that are usable by people who, for different reasons, have trouble using regular printed media. The standard defines how to use XML to represent different kinds of information resources for transformation into universally accessible formats. The DAISY standard is being incorporated into and closely aligned with the recently announced EPUB revision by the International Digital Publishing Forum (IDPF). For more information, visit the DAISY website http://www.daisy.org/z3986/2011/Z3986-2011A.html or the DAISY Revision working group page.

To understand the value of these two standards, one need only consider the costs of remodeling one's home. Anyone who has considered home renovations realizes that it is easier to put plumbing in during original construction than to change it in an existing structure. The same is true when buildings are retrofitted to provide access to people with disabilities. These potentially significant costs could have been avoided had designers put basic design principles in place from the outset. This is equally true of content creation and distribution. Creating content in XML from the outset, while adhering to community standards, reduces production costs and eases the challenges of content migration and preservation. In addition, creating content with accessibility, preservation, and multiple distribution formats in mind not only reduces costs but also facilitates the one thing content providers seek: greater potential readership.

If you are involved in content creation, either for books or journals, do take some time to review these important projects and give us your feedback. Several other NISO projects are expected to distribute documents this spring, and we look forward to sharing more news in the coming weeks and months.


Todd Carpenter

Managing Director

NISO Reports

April Two-Part Webinar: RFID Systems for Libraries

A new three-part ISO standard on RFID in Libraries (ISO 28560) has been published (see related story in this issue of Newsline). NISO has a revision underway of its recommended practice, RFID in U.S. Libraries (NISO RP-6-2008), to give U.S. implementers guidance on deploying RFID in a way that adheres to the ISO work. NISO's two-part April webinar on RFID Systems will provide background on the use of RFID in libraries and bring attendees up to date on the recent standards and what they mean to both system vendors and libraries. Part 1 provides a more general overview, while Part 2 gets into more technical details. You can register for either or both sessions; there is no requirement to attend Part 1 in order to attend Part 2.

The first part of the webinar, to be held on April 13 from 1:00 - 2:30 p.m. (Eastern time), will provide a broad look at RFID, giving libraries a better understanding of what RFID tags might do to help them and giving attendees some information about what roles various players in the supply chain play in the provision of RFID tags and associated services. Speakers will provide both a library (user) perspective and the supply chain perspective from both a technology supplier and a service supplier.

The second part, to be held on April 20 from 1:00 - 2:30 p.m. (Eastern time), looks more closely at the ISO RFID standard and the NISO Recommended Practice on RFID in U.S. Libraries. This webinar will focus on key portions of the documents to help attendees better understand what they need to know when implementing RFID locally in order to ensure interoperability. In particular, speakers from NISO's RFID working group will discuss the data model as well as security, privacy, and vandalism issues.

You may register for either or both parts of the webinar; registrants to both parts receive a 25% discount. NISO and NASIG members can register at the member rate. There is also a student discount. Can't make it for the live webinar date? Registrants get access to the recorded version for one year. For more information and to register, visit the event webpages: Part 1; Part 2

May Two-Part Webinar: The Future of ILS

Both the back-end and the front-end of the traditional Integrated Library System (ILS) are changing and evolving. The back-end is being impacted by the change to RDA cataloging that has only just begun. The front-end is changing through the integration of the ILS with other systems and a web interface layer as well as the interactivity driven by Web 2.0. RDA implementation is expected to drive additional changes to the user-side of the ILS in ways that are still not fully understood.

NISO's two-part May webinar looks at The Future of the ILS from both of these perspectives. On May 11, Part 1 considers RDA & Cataloging from the perspective of the cataloger and the system vendor. On May 18, Part 2 looks at User Interaction. Both webinars will be held from 1:00 - 2:30 p.m. (Eastern time). You can register for either or both parts. There is no prerequisite to attend Part 1 if your interest is in Part 2.

RDA holds the promise of more closely aligning Functional Requirements for Bibliographic Records (FRBR) and Functional Requirements for Authority Data (FRAD) with library catalogs. While RDA poses a disruptive shift in current cataloging practices, it also carries with it tremendous potential to move the library catalog into the age of the semantic web. In Part 1 of the two-part webinar series, a cataloger and a systems vendor will talk about what an ideal ILS that incorporates RDA might look like.

In Part 2, the webinar shifts to what the future of the ILS will hold with respect to user interaction. Numerous studies have shown that the typical user goes to Google before going to the library's website. What changes are underway or envisioned that can drive the user back to the library's system, or better yet, make the library's data work better in the greater web environment? We're already seeing a Library 2.0 world where many libraries encourage users to add to the bibliographic information supplied by the cataloger. How can we take it even further, going beyond the library system "silo"? And what are the implications of an RDA-driven library catalog for the user interface? What happens in a world where RDA can set data free from the library system to be linked to and mashed up with other data or applications? Part 2 of this webinar looks at developments, both underway and envisioned for the future, that could radically change the way the user interacts with the library's data.

You may register for either or both parts of the webinar; registrants to both parts receive a 25% discount. NISO and NASIG members can register at the member rate. There is also a student discount. Can't make it for the live webinar date? Registrants get access to the recorded version for one year. For more information and to register, visit the event webpages: Part 1; Part 2

Mobile Technologies in Libraries Forum

NISO will be holding a one-day, in-person forum in Philadelphia on May 20, 2011, on the topic of Mobile Technologies in Libraries. The visibility and utility of mobile hardware, software, and connectivity continue to increase rapidly. Libraries are finding it difficult to ignore the implications of a perpetually connected user base that wants to use mobile devices to access information resources traditionally confined to desktop or laptop computers. Library patrons stand to benefit enormously if libraries can effectively offer their information resources in the now-ubiquitous mobile medium.

Topics and speakers for the forum include:

  • Introduction: How Standards Fit (or don't fit) in Mobile Computing – Todd Carpenter, Managing Director, NISO

  • Opening Keynote – Brian O'Leary, Founder and Principal, Magellan Media

  • Using Surveys to Find out What Users Want with Mobile Devices – Bennett Claire Ponsford, Digital Services Librarian, Texas A&M University Libraries

  • MedLine Mobile: The Why, What, and How – Loren Frant, Head of the Health Information Products Unit, National Library of Medicine (NLM)

  • Models for Mobile in Teaching and Learning – Chris Millet, Manager of Advanced Learning Projects at Education Technology Services, Penn State University

  • Embracing Mobile Devices: Libraries and Mobile Technology – Jason Casden, Digital Technologies Development Librarian, N.C. State University Libraries

  • Mobile Interfaces & the Impact on (and Opportunities for) Publisher Content – Speaker TBA

  • "Ask Anything" Session – Bring your questions, comments, and ideas to share with the entire group.

The agenda, registration, and hotel information are available on the event webpage. Get the early bird discount by registering before May 1. NISO members and students receive a discounted rate.

Two Draft Standards Issued For Trial Use: DAISY and JATS

Two NISO working groups have completed their development work and issued draft standards for trial use, following approval by the Content and Collection Management Topic Committee.

Authoring and Interchange Framework (NISO Z39.86-201x) is a revision of the 2005 standard, Specifications for the Digital Talking Book. The revised standard is a modular, extensible architecture for creating conformant profiles that represent digital information resources in XML, producing documents suitable for transformation into different universally accessible formats and no longer limited to book content. The standard does not impose limitations on what distribution formats can be created from it; e-text, Braille, large print, and EPUB are among the formats that can be produced in conformance with the standard. The standard will be of interest to any organization using an XML authoring workflow, developers and publishers of universally accessible digital publications, and agencies interested in creating profiles for new document types to integrate into distribution formats. The draft standard is available online. Trial users are encouraged to submit comments and report all issues. Identified issues will be evaluated and addressed as needed following the trial and prior to final publication of the standard.
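
The profiles and vocabularies are defined by the standard itself, but the single-source principle behind it can be sketched briefly: one XML master document is transformed into several delivery forms. The hypothetical Python below renders a tiny XML document as plain e-text and as large-print HTML; the element names and outputs are invented for illustration and are not the actual Z39.86-201x profile vocabularies.

```python
# Hypothetical sketch of the single-source idea behind the Authoring and
# Interchange Framework: one XML master document, several delivery formats.
# The element names and output formats here are invented for illustration;
# they are not the actual Z39.86-201x profile vocabularies.
import xml.etree.ElementTree as ET

master = ET.fromstring(
    "<doc><title>Sample Title</title>"
    "<para>First paragraph.</para><para>Second paragraph.</para></doc>"
)

def to_plain_text(doc):
    """E-text rendering: title and paragraphs as plain lines."""
    lines = [doc.findtext("title", default="")]
    lines += [p.text or "" for p in doc.findall("para")]
    return "\n\n".join(lines)

def to_large_print_html(doc, font_pt=18):
    """Large-print rendering: simple HTML with an enlarged base font."""
    body = "".join(f"<p>{p.text}</p>" for p in doc.findall("para"))
    return (
        f'<html><body style="font-size:{font_pt}pt">'
        f"<h1>{doc.findtext('title')}</h1>{body}</body></html>"
    )

print(to_plain_text(master))
print(to_large_print_html(master))
```

The same master could just as easily feed a Braille or audio production pipeline, which is the point of keeping the source format neutral with respect to any one distribution format.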

JATS: Journal Article Tag Suite (NISO Z39.96-201x) is a new standard that revises and updates the NLM Journal Archiving and Interchange Tag Suite, commonly referred to as the NLM DTDs. JATS provides a common XML format in which publishers and archives can exchange journal content, preserving the intellectual content of journals independent of the form in which that content was originally delivered. The draft standard defines elements and attributes for describing the textual and graphical content of journal articles as well as some non-article material, such as letters, editorials, and book and product reviews. The draft standard for trial use is available as both an online XML document and a downloadable PDF from the NISO website. An online commenting form is also available for trial users to provide feedback. Supporting documentation and schemas in DTD, RELAX NG, and W3C Schema formats are available on the NLM website.
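
As a rough illustration of the kind of markup the tag suite standardizes, the hedged sketch below uses Python's standard library to pull basic metadata out of a minimal JATS-style fragment. The element names follow the familiar JATS pattern (article, front, article-meta, article-title, contrib-group), but the fragment itself is invented for illustration and is not a complete or schema-validated JATS instance.

```python
# Minimal sketch: extracting basic metadata from a simplified JATS-style
# article fragment. Element names follow the common JATS pattern
# (article > front > article-meta), but this is an illustrative fragment,
# not a complete or schema-validated JATS document.
import xml.etree.ElementTree as ET

sample = """
<article article-type="research-article">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Example Journal of Standards</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>A Sample Article</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <name><surname>Doe</surname><given-names>Jane</given-names></name>
        </contrib>
      </contrib-group>
    </article-meta>
  </front>
</article>
"""

root = ET.fromstring(sample)
title = root.findtext("./front/article-meta/title-group/article-title")
journal = root.findtext("./front/journal-meta/journal-title-group/journal-title")
authors = [
    "{} {}".format(name.findtext("given-names"), name.findtext("surname"))
    for name in root.findall(
        "./front/article-meta/contrib-group/contrib[@contrib-type='author']/name"
    )
]
print(journal, "|", title, "|", ", ".join(authors))
```

Because every producer uses the same element names for the same things, tooling like this can be written once and reused across publishers and archives, which is the interchange value the tag suite is meant to deliver.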

Both standards have six-month trials that end in late September. Following the trial, the respective working groups will revise the standards as needed based on feedback from the trial users and present the standards to the NISO voting members for approval.

Two NISO Standards Reaffirmed

NISO and the American National Standards Institute have approved the reaffirmation of two NISO standards. These standards underwent their regular five-year review where they were recommended for reaffirmation by the Content and Collection Management Topic Committee and approved by the NISO voting members in each of the standards' voting pools. The two standards are:

New on the NISO Website

New Specs & Standards

BSR/IEEE 2301-201x, Guide for Cloud Portability and Interoperability Profiles (CPIP)

A new project to develop a standard to advise cloud computing ecosystem participants (cloud vendors, service providers, and users) of standards-based choices in areas such as application interfaces, portability interfaces, management interfaces, interoperability interfaces, file formats, and operation conventions. For more information or to participate contact Lisa Yacone.

ISO/IEC 10646:2011, Information technology – Universal Coded Character Set (UCS)

This second edition consolidates all of the contents of Amendments 1 through 8 to the 2003 first edition. It specifies the Universal Coded Character Set (UCS) of over 109 000 characters from the world's scripts. It is applicable to the representation, transmission, interchange, processing, storage, input, and presentation of the written form of the languages of the world as well as additional symbols. ISO/IEC 10646:2011 is aligned with The Unicode Standard, Version 6.0, with the exception of the U+20B9 INDIAN RUPEE SIGN, which was accelerated into Unicode Version 6.0. The terminology for encoding forms (and encoding schemes) in ISO/IEC 10646 now matches exactly the terminology used in the Unicode Standard.
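
As a small, concrete illustration of the encoding forms mentioned above (plain Python, nothing specific to the ISO text), the snippet below serializes a single UCS code point, U+20B9 INDIAN RUPEE SIGN, in the three Unicode encoding forms.

```python
# Illustrative only: the same UCS code point, U+20B9 INDIAN RUPEE SIGN,
# serialized in the three Unicode encoding forms. Byte values shown in hex.
ch = "\u20b9"  # INDIAN RUPEE SIGN

for form in ("utf-8", "utf-16-be", "utf-32-be"):
    encoded = ch.encode(form)
    print(f"{form:>10}: {' '.join(f'{b:02X}' for b in encoded)}")

# Expected output:
#      utf-8: E2 82 B9
#  utf-16-be: 20 B9
#  utf-32-be: 00 00 20 B9
```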

ISO 28560:2011, Information and documentation – RFID in libraries

A new three-part standard that specifies a model for the use of radio frequency identification (RFID) tags for items appropriate for the needs of all types of libraries. It provides the framework to ensure interoperability between libraries in the exchange of library items with RFID tags. Part 1: Data elements and general guidelines for implementation specifies the data model, system data elements and user data elements to be used on the RFID tags. Two encoding methods are defined. Part 2: Encoding of RFID data elements based on rules from ISO/IEC 15962 uses an object identifier structure to identify data elements. Part 3: Fixed length encoding deals with the encoding of a basic set of data elements in a fixed length format and the rest of the data elements in optional extension blocks. Parts 2 and 3 are mutually exclusive; the RFID tag would be encoded using only one of the two defined schemes.
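
The bit-level layouts are defined in the standard itself, but the general idea, a mandatory primary item identifier plus optional data elements encoded compactly for a small tag, can be sketched in a few lines. The hypothetical Python below packs a few of the data elements named in ISO 28560-1 into a simple length-prefixed record; the field order, lengths, and framing are invented for illustration and do not reproduce the actual Part 2 or Part 3 encodings.

```python
# Purely illustrative sketch of the idea behind a compact RFID data record:
# a mandatory primary item identifier plus a couple of the optional data
# elements named in ISO 28560-1 (owner institution, set information).
# This is NOT the actual ISO 28560-2 (object-identifier based) or
# ISO 28560-3 (fixed length) encoding; the field order, lengths, and
# framing below are invented purely for clarity.
import struct

def encode_item_record(primary_item_id, owner_library, part=1, total=1):
    """Pack a hypothetical library-item record into bytes for a small tag."""
    pid = primary_item_id.encode("ascii")
    lib = owner_library.encode("ascii")
    return struct.pack(
        ">B%dsB%dsBB" % (len(pid), len(lib)),
        len(pid), pid,   # length-prefixed primary item identifier (barcode)
        len(lib), lib,   # length-prefixed owner library (ISIL-style code)
        part, total,     # set information: part number / total parts
    )

record = encode_item_record("31234000123456", "US-PU", part=1, total=1)
print(len(record), "bytes:", " ".join("%02X" % b for b in record))
```

The interoperability goal is that any library's reader can decode another library's tag, which is why the standard fixes a shared data model and a small number of agreed encodings rather than leaving the layout to each vendor.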

COUNTER, Release 4 Code of Practice: Objectives, Timetable and Process for Development

The fourth release of the COUNTER Code of Practice will combine the two existing codes of practice, one covering Journals and Databases and the other Books and Reference Works. The objectives document describes the plans for that unified release, the inclusion of multimedia works, the functionality of the XML reports, and the use of SUSHI, as well as the implications of the PIRUS2 and Journal Usage Factor projects. A timetable is provided to reach the goal of release in early 2012 and vendor implementation by December 2013.

Microsoft, Intergen and the DAISY Consortium, Save as DAISY for Office 2010 Version 2.5 Beta

This updated version of the Save as DAISY add-in for Office 2010 incorporates a "Lite" version of the DAISY Pipeline transformation tool to create accessible versions of documents. Users can generate the DAISY XML for further processing or generate a fully conforming DAISY file set with full navigation and full text synchronized with audio.

World Wide Web Consortium, Provenance Working Group

W3C has launched a new Provenance Working Group, whose mission is to support the widespread publication and use of provenance information about Web documents, data, and resources. The Working Group will publish W3C Recommendations that define a language for exchanging provenance information among applications. See the Provenance Working Group Charter for more information.

Media Stories

Libraries and Mobile Services
American Libraries, posted 03/22/2011; by Cody W. Hanson

The Pew Internet & American Life Project reported that 82% of American adults own a device that is, or can be used as, a mobile phone. This has important implications for librarians who may not yet see the importance of spending their scarce resources on services for mobile devices. Libraries have been advocates for more broadband Internet, but mobile access may have superseded that need. The Pew report showed that ownership of mobile phones among African Americans and English-speaking Latinos is 7% greater than among Caucasians, and those groups are 8-13% more likely to use their phones to access the Internet. In the last quarter of 2010, more smartphones were manufactured than PCs. Google and Apple are both targeting mobile users. Librarians need to be just as proactive in providing services to these users. (Link to Web Source)

NISO Note: Cody W. Hanson is also the author of the most recent Library Technology Reports (v.47, no. 2, February-March 2011) on Libraries and Mobile Services. The first chapter, Why Worry about Mobile?, is available for free online. NISO is holding a one-day forum on Mobile Technologies in Libraries on May 20 in Philadelphia, PA.

Building a Better ERMS
Library Journal, March 15, 2011; by Maria Collins and Jill E. Grogg

Libraries have been struggling for a decade to implement an electronic resource management system (ERMS) that will effectively manage the nonlinear and nonstandardized processes for electronic resources. "What has materialized, amid a patchwork of standards, is less like a silver bullet and more like a round of buckshot." The authors conducted two surveys, one for librarians and one for vendors (both commercial and open source). The six top ERM priorities identified by librarians were: workflow management, license management, statistics management, administrative information storage, acquisitions functionality, and interoperability. Several other features were also identified as lower priorities. Librarians have been particularly disappointed with the workflow management from acquisition to delivery in existing ERMS products, something they are quite used to in their ILS. The NISO ERM Data Standards and Best Practices Review working group also identified workflow as a gap. Workflow management is further complicated because many other systems beyond the ERMS are involved, e.g., the ILS, link resolvers, and discovery interfaces. The catch-22 of workflow support in ERMS is how to make it flexible enough for local differences and still specific enough to address all areas of the workflow. License management has, in contrast, been a success with ERMS, not just in making the processing more efficient, but also in pushing the terms to the user. The ONIX for Publications Licenses standard was developed to automate the process of adding license terms to an ERMS but is not yet widely adopted. Statistics management is criticized for its lack of SUSHI (Standardized Usage Statistics Harvesting Initiative) support, although more vendors are adding SUSHI, as demonstrated in NISO's SUSHI server registry. Central storage of all the administrative information for e-resources is another ERMS success story. More acquisitions data is needed in the quest for cost-per-use figures. NISO's CORE (Cost of Resource Exchange) protocol for moving acquisitions data between an ILS and ERMS has not yet been adopted by vendors. Lack of interoperability between the ERMS and other library systems is the biggest problem. Data that resides in more than one system has to be manually maintained and is often out of sync. Standards to support interoperability are emerging but adoption varies widely. Some librarians are looking at more holistic systems that use service-oriented architectures and cross over the ILS and the ERMS. Two tables summarize and compare the existing proprietary and open source ERMS products.
(Link to Web Source)

NISO Note: For more information on the NISO projects and standards mentioned in this article, visit the following webpages: ERM Data Standards & Best Practices Review, CORE, KBART, and SUSHI. NISO members mentioned in this article are: Ex Libris, EBSCO Information Services, and OCLC.

Technology, Standards and Today's SDOs: An Interview with Andrew Updegrove
ASTM Standardization News, March/April 2011

Andrew Updegrove, co-founder of the technology law firm Gesmer Updegrove LLP, has written extensively about standards and standards development organizations (SDOs). [See, for example, his Consortiuminfo website.] In this interview with ASTM Standardization News, Updegrove stated that the Internet has exposed the problems with the existing system of over 200 siloed SDOs formed under ANSI. To create one initiative, such as the SmartGrid, dozens of different SDOs will need to create interoperable standards in parallel, and the current system doesn't support that. Private industry and government also need to work in a way where government incentivizes, prioritizes goals, and facilitates standards development. ANSI and NIST (National Institute of Standards and Technology) need to have a closer relationship to target ambitious initiatives that require standards, such as the SmartGrid. Newer non-accredited standards consortia like W3C and OASIS were launched largely out of impatience with the long development cycles of the traditional SDOs. They typically have released their standards internationally from the start rather than going through the traditional national/international (e.g., ANSI then ISO) cycle. The marketplace has driven the acceptance of these new consortium standards. These consortia and moves to open source software have created environments that are less receptive to the old model of paying royalties to patent owners of technology used in standards. Certification programs can be very useful in creating consumer awareness of which products are "plug-and-play"; the Wi-Fi Alliance's certification program, as an example, led to widespread adoption of the underlying standard. A single searchable database of all standards, globally, could be useful for forming better relationships, avoiding duplication of effort, attracting contributors, reducing development costs, and encouraging adoption. One ironic gap is the lack of a standard for describing a standard; "there is no XML schema to make it easy to search, use and analyze data about standards." (Link to Web Source)

NISO Note: NISO is an ANSI-accredited standards development organization.

Distinguishing Published Scholarly Content With CrossMark
Learned Publishing 24(2), April 2011: 87-93; by Carol Anne Meyer

Many versions of an article are easily found through Google searches, including preprints and published versions of record. Also on the rise are retractions of peer-reviewed scholarly articles, 10 times more in the last 20 years according to a Thomson Reuters study. (See also the Retraction Watch blog.) Yet many of these retracted articles continue to be cited in current articles. Many articles also have corrections or other changes (corrigenda, errata, etc.) issued after publication. In 1992, the National Library of Medicine reported a record 2,000 corrections per year. How these corrections are published, communicated, and linked to the original article varies significantly. Often the corrections are identified only with the published version of record. Other versions, such as an author's original manuscript, which may be preferred by some students and researchers because they are free, may have no indication of later corrections. CrossRef held focus group sessions in 2010 regarding "the issues of identifying versions of record and communicating post-publication changes." All three types of participants—researchers, librarians, and publishers—identified issues regarding different versions and lack of knowledge regarding post-publication changes. CrossMark is a forthcoming service from CrossRef that will "clearly identify published versions of scholarly content and provide a simple mechanism for readers to discover changes." By clicking on a CrossMark logo, researchers can determine whether they are using the authoritative version; the CrossMark record can optionally include additional information such as conflict of interest statements, locations of data repositories, the peer review process used, and any updates or corrections. The actual information included could be determined by publishers or defined by specific communities of interest. CrossMark metadata would be deposited to CrossRef along with the DOI metadata. DOIs could also be used to connect researchers to corrected copy. CrossMark logos will only be available for "versions of record" where the publisher is committed to maintaining the content and any information about updates to it. There is no digital rights management associated with a CrossMark; it has nothing to do with copyright or license management. A small group of publishers will pilot CrossMark, with a production roll-out scheduled for mid-2011. Future developments could include a CrossMark logo on relevant search results or in link resolvers to indicate the version of record, and an API to aggregate data for metrics, e.g., quantifying the number of revisions. (Link to Web Source)

NISO Note: The NISO Journal Article Versions recommended practice (NISO RP-8-2008) discussed in this article is available for free download from the NISO website. CrossRef is a NISO voting member.

Towards Transparent and Scalable OpenURL Quality Metrics
D-Lib Magazine, 17(3/4), March/April 2011; by Adam Chandler, Glen Wiley, and Jim LeBlanc

The OpenURL framework was a critical development in the provision of access to electronic full-text resources and in providing seamless linking between a citation and the actual referenced resource. The underlying assumption that makes OpenURL work is that the citation metadata is consistent and accurate, which has not always been true in actual practice. Many articles and studies written since the publication of the standard have confirmed issues with bad OpenURL metadata. However, no systematic research or process exists to measure OpenURL metadata quality. The authors' investigation began with a project to analyze the linking problems from the L'Année philologique database. An automated log processor and reporting software were developed to grade citation metadata for completeness and linking success. The original evaluation criteria were modified to focus on the elements and string patterns that appear most often in OpenURLs. For example, in the element frequency report for journal articles, 64% of link-to targets recommend or require the starting page and 94% of the OpenURL providers in the Cornell sample supply this element, but in the L'Année philologique database the element is never included, which can affect successful resolution to full-text resources. For the report on string patterns, the study showed that whereas L'Année philologique used Roman numerals for volume numbers, 99% of the originating data used Arabic numbering. The success of this study for a single database supplier demonstrated the usefulness of the model and its scalability. As a result, a NISO-sponsored working group, Improving OpenURLs Through Analytics (IOTA), was launched to investigate "the feasibility of creating industry-wide, transparent and scalable metrics for evaluating and comparing the quality of OpenURL implementations across content providers."
(Link to Web Source)
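
The IOTA metrics grow out of exactly this kind of per-element analysis. As a rough sketch of the approach, and not IOTA's actual scoring rules, the Python below parses a KEV-style OpenURL query string and reports which commonly expected citation elements are present; the list of "core" elements and the simple percentage score are illustrative assumptions.

```python
# Rough sketch of per-element OpenURL completeness checking, in the spirit of
# the analysis described above. The weighting and the list of "core" elements
# are illustrative assumptions, not IOTA's actual metric.
from urllib.parse import parse_qs

CORE_ELEMENTS = ["rft.jtitle", "rft.atitle", "rft.volume",
                 "rft.issue", "rft.spage", "rft.date", "rft.issn"]

def completeness(openurl_query):
    """Return (score, missing) for a KEV-style OpenURL query string."""
    fields = parse_qs(openurl_query)
    present = [k for k in CORE_ELEMENTS if fields.get(k, [""])[0].strip()]
    missing = [k for k in CORE_ELEMENTS if k not in present]
    return len(present) / len(CORE_ELEMENTS), missing

query = ("url_ver=Z39.88-2004&rft.jtitle=Example+Journal"
         "&rft.volume=12&rft.date=2011&rft.issn=1234-5678")
score, missing = completeness(query)
print(f"completeness: {score:.0%}; missing: {', '.join(missing)}")
```

Run over a large request log, per-element tallies like this are what make it possible to compare the quality of OpenURL data coming from different content providers.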

NISO Note: For more information on the IOTA project, visit the working group's webpage.