Noteworthy

NISO Receives Mellon Grant to Develop Community Resource of Digital License Encodings

The Andrew W. Mellon Foundation has awarded NISO a grant to develop a community resource of digital license encodings in the ONIX for Publications Licenses (ONIX-PL) format that will be freely available within the Global Open Knowledgebase (GOKb). Libraries that license electronic content will be able to import the encodings into their own electronic resource management (ERM) systems for further local customization and implementation. The project will also fund publicly available training resources that will show community members how to use the encodings for their own purposes.

ONIX-PL, published in 2008 by EDItEUR, is an XML messaging format for encoding and communicating license terms for digital publications in a structured, standardized way. In a “Catch-22” situation, publishers have not moved to encode their licenses in ONIX-PL because ERM systems had not yet been set up to import them. (This is slowly changing; Serials Solutions, for example, has announced the ability to import ONIX-PL into its system.) Additionally, because many licenses were still customized for each library customer, the labor involved in encoding them was more than most publishers wanted to undertake. Libraries, in turn, were sometimes reluctant to accept a publisher’s encoding: many terms are open to some interpretation, and the libraries did not want to be bound by a publisher’s interpretation of the terms.
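By way of illustration, the short Python sketch below assembles a heavily simplified, ONIX-PL-flavored license expression. The element names are illustrative approximations, not the exact EDItEUR vocabulary; the authoritative schema is available from the ONIX-PL page linked at the end of this item.

```python
# A minimal sketch (Python 3, standard library only) of building a
# simplified, ONIX-PL-flavored license expression. Element names are
# illustrative approximations of the real EDItEUR vocabulary.
import xml.etree.ElementTree as ET

def build_license_expression(licensor, licensee, usages):
    """Assemble a toy license-expression tree from a few structured terms."""
    root = ET.Element("PublicationsLicenseExpression")
    parties = ET.SubElement(root, "LicenseParties")
    ET.SubElement(parties, "Licensor").text = licensor
    ET.SubElement(parties, "Licensee").text = licensee
    terms = ET.SubElement(root, "UsageTerms")
    for usage_type, status in usages:
        usage = ET.SubElement(terms, "Usage")
        ET.SubElement(usage, "UsageType").text = usage_type    # e.g., interlibrary loan
        ET.SubElement(usage, "UsageStatus").text = status      # e.g., Permitted
    return root

expression = build_license_expression(
    licensor="Example Publisher",
    licensee="Example University Library",
    usages=[("InterLibraryLoan", "Permitted"), ("CoursePackUse", "Prohibited")],
)
print(ET.tostring(expression, encoding="unicode"))
```

Because terms travel as structured elements rather than contract prose, a downstream ERM system can act on them programmatically, which is what makes a shared repository of such encodings useful.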

GOKb is an element of the larger Kuali OLE initiative to provide open source management systems to the library and academic communities. As announced, the “GOKb will be an open, community-based, international data repository that will provide libraries with publication information about electronic resources. This information will support libraries in providing efficient and effective services to their users and ensure that critical electronic collections are available to their students and researchers.” A similar project, KnowledgeBase Plus (KB+), launched in the UK in 2011 by Joint Information Systems Committee (JISC) Collections, has included in its repository license encodings of all the JISC Collections-subscribed content. However, these encodings are restricted to JISC members’ use, mainly for publisher confidentiality reasons, and are specific to the terms that JISC and the publishers have negotiated, so they have not been a resource for the broader community.

To address these gaps, NISO proposed a project to the Mellon Foundation—which was accepted and awarded the requested grant monies—to gather as many as fifty publisher and library template licenses, encode them using the ONIX-PL format, and deposit them in GOKb for community-wide use under a Creative Commons Public Domain (CC0) license. Library electronic resources staff could then export the encodings from GOKb and import them into their own ERM systems. To ensure consistency with its existing encoding work and to include deposits of the template licenses into KB+, JISC Collections is supporting the project with funding to train NISO’s consultant at EDItEUR on ONIX-PL and on the JISC KB+ system. NISO will be contracting with Selden Lamoureux—Electronic Resources Librarian with SDLinform, a former Electronic Resources Librarian at both North Carolina State University (NCSU) and the University of North Carolina, and previous co-chair of NISO’s Shared Electronic Resource Understanding (SERU) initiative—as the consultant for the project.

To successfully promote the use of the encoded templates, the NISO project will include the development of video training resources for librarians and publishers. These will include tutorials on the ONIX-PL messaging specification, the encoded templates, and how to make adjustments to the encodings to reflect an institution’s specific, negotiated terms, as well as how to deposit those encodings into GOKb and KB+. The training materials will be posted on and freely accessible from the NISO website.

ONIX-PL: www.editeur.org/21/ONIX-PL/

Kuali OLE: https://www.openlibraryenvironment.org/

GOKb press release: https://librarytechnology.org/news/pr.pl?id=16950

KnowledgeBase Plus (KB+): www.jisc-collections.ac.uk/KnowledgeBasePlus/

The ISO Story

The International Organization for Standardization has published a timeline and slideshow of its history, which began in 1946 in London when “65 delegates from 25 countries meet to discuss the future of international standardization.” A year later the organization was officially formed with 67 technical committees, followed in 1949 by the opening of its first office, in a house in Geneva, Switzerland. It took until 1951 for the first standard to be published: ISO/R 1:1951, Standard reference temperature for industrial length measurements.

By the beginning of 2012, ISO had 163 members, over 19,000 published standards, and offices in a modern high-rise with almost 150 employees.

View the timeline and read the whole story at: www.iso.org/iso/home/about/the_iso_story.htm

The STM Report: An Overview of Scientific and Scholarly Journal Publishing

The third edition of the report on the scientific, technical, and medical journal publishing industry discusses the latest trends and business models in scholarly communications. Among the 32 summary points made in the report are:
  • The annual revenues generated from English-language STM journal publishing are estimated at about $9.4 billion in 2011.
  • There were about 28,100 active scholarly peer-reviewed journals in mid 2012, collectively publishing about 1.8–1.9 million articles a year.
  • The USA continues to dominate the global output of research papers with a share of about 21% but the most dramatic growth has been in China and East Asia.
  • Reading patterns are changing with researchers reading more, averaging 270 articles per year, but spending less time per article, with reported reading times down from 45–50 minutes in the mid-1990s to just over 30 minutes.
  • There is a significant amount of innovation in peer review, with the more evolutionary approaches gaining more support than the more radical.... The most notable change in peer review practice, however, has been the spread of the “soundness not significance” peer review criterion adopted by open access “megajournals” like PLOS ONE and its imitators.
  • Social media and other “Web 2.0” tools have yet to make the impact on scholarly communication that they have done on the wider consumer web.
  • The explosion of data-intensive research is challenging publishers to create new solutions to link publications to data, to facilitate data mining and to manage the dataset as a potential unit of publication.
  • Semantic enrichment of content (typically using software tools for automatic extraction of metadata and identification and linking of entities) is now widely used to improve search and discovery; to enhance the user experience; to enable new products and services; and for internal productivity improvements.
  • Text and data mining are starting to emerge from niche use in the life sciences industry, with the potential to transform the way scientists use the literature.
  • While the value of the “Big Deal” and similar discounted packages...is recognised, the bundle model remains under pressure from librarians seeking greater flexibility and control, more rational pricing models and indeed lower prices.
  • Journal publishing has become more diverse and potentially more competitive with the emergence of new business models—open access publishing, delayed free access, and self-archiving.
  • Research funders are playing an increasingly important role in scholarly communication.
  • Green OA and the role of repositories remain controversial.

Ware, Mark, and Michael Mabe. The STM Report: An Overview of Scientific and Scholarly Journal Publishing. Third edition. The Hague, The Netherlands: International Association of Scientific, Technical and Medical Publishers, November 2012. www.stm-assoc.org/2012_12_11_STM_Report_2012.pdf

Study Reveals How Readers Discover Content in Scholarly Journals

Renew Training, run by Simon Inger and Tracy Gardner, has published the results of a six-month research project comparing how readers’ behavior in discovering scholarly journal content changed between 2005 and 2012.

The survey, conducted during May, June, and July of 2012, received over 19,000 responses worldwide. The data was compared to previous surveys conducted in 2005 and 2008. Some 68% of the respondents were from academia and approximately 47% identified themselves as academic researchers.

Among the findings were:

  • Use of a specialist bibliographic database for citation searching, e.g., PubMed, continues to climb.
  • Academic search engines such as Google Scholar are more popular than general web search engines and are the second most popular source for looking up a citation, after the bibliographic databases.
  • Readers faced with a citation seem to know their subject areas well enough to go directly to the web site of the journal to follow up on the citation, whilst the use of library web pages for this purpose has been in steady decline over the period.
  • A&I databases continue to grow as a resource for readers who wish to discover the latest articles in their subject area.
  • Journal alerts show a significant downward trend; they are, however, still the second most popular resource for discovering the latest articles.
  • Specialist bibliographic databases (A&Is) are still the most popular resource for searching for articles on a subject.
  • Library web pages have grown significantly in popularity [for searching for articles on a subject], possibly due to the introduction of web scale discovery services.
  • Library web pages are of most importance to people working in Education Research and Humanities followed by Social and Political Science and Agriculture.
  • Community web sites such as Mendeley and Researchgate are used much less than other starting points for all three behaviours (citation, article, and subject searches).
  • A publisher’s web site has become more important for looking at latest articles in core journals over time.
  • The journal’s homepage has remained important for looking up a citation.
  • Students use Google Scholar slightly more than Google, while, perhaps surprisingly, academic researchers use Google more than Google Scholar—maybe because they are heavier users of A&I databases and turn to search engines for more general searches, negating some of the need for Google Scholar.
  • The group [most] using tablets and phones to access online journal articles is the medical sector.
  • As metadata distribution is maximised and users are able to choose more freely their preferred routes to content, many of the advanced features that users require seem to be migrating to their chosen discovery platforms, leaving the publisher site ever more as a content silo.... However, publishers remain under pressure to maintain a high level of functionality to ensure that they engage with content buyers, authors and editorial boards.

More findings, explanations, and charts are available in the free summary report, and detailed data including demographic breakdowns are available in the full report (purchase required).

Free summary report available at: www.renewtraining.com/How-Readers-Discover-Content-in-Scholarly-Journals-summary-edition.pdf

Full study available from: www.renewtraining.com/publications.htm

ISAN International Agency and Entertainment ID Registry Enable Cooperative Registrations

The International Standard Audiovisual Number International Agency (ISAN-IA) and the Entertainment ID Registry (EIDR) have agreed on processes that will support the seamless registration of content identifiers in either system and leverage their respective capabilities. The ISAN-IA has a broad network of regional Registration Agencies and a centralized database to implement the ISAN identifier standard (ISO 15706). The EIDR has built an automated system designed to integrate with enterprise IT applications. Together, ISAN-IA and EIDR can offer a combined service to the content industry that meets the needs of the broadest spectrum of content producers and distributors.

ISAN-IA and EIDR plan to link their two systems so that any ISAN registrant can obtain alternate EIDR IDs whenever needed in EIDR-based solutions. Similarly, EIDR registrants should be able to obtain alternate ISAN IDs to link their EIDR ID hierarchies into ISAN-based solutions. The two IDs and ID systems will be linked and cross-mapped to ensure easy interoperability for all users.
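As a rough sketch of what such cross-mapped registrations could look like inside a registrant’s own systems, the Python fragment below pairs an ISAN with an alternate EIDR ID and checks each against its commonly documented display form. The sample IDs are format placeholders rather than real registrations, and check-character validation is deliberately omitted.

```python
# A minimal sketch of an alternate-ID record of the kind the linked
# registries imply. The regexes follow the commonly documented display
# forms of the two identifiers; check-character validation is omitted,
# and the sample IDs are format placeholders, not real registrations.
import re
from dataclasses import dataclass

# ISAN display form: 16 hex digits (root + episode), a check character,
# 8 hex digits (version), and a second check character, in groups of four.
ISAN_DISPLAY = re.compile(
    r"^ISAN [0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-Z]"
    r"-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-Z]$"
)
# EIDR display form: a DOI under the 10.5240 prefix with 20 hex digits
# in groups of four plus a check character.
EIDR_DISPLAY = re.compile(r"^10\.5240/[0-9A-F]{4}(-[0-9A-F]{4}){4}-[0-9A-Z]$")

@dataclass
class AudiovisualWork:
    """One work carrying a primary ID and its cross-mapped alternate."""
    title: str
    isan: str
    eidr: str

    def ids_well_formed(self):
        return bool(ISAN_DISPLAY.match(self.isan) and EIDR_DISPLAY.match(self.eidr))

work = AudiovisualWork(
    title="Example Feature",
    isan="ISAN 0000-0001-8947-0000-8-0000-0000-D",  # placeholder, format only
    eidr="10.5240/7791-8534-2C23-9030-8610-5",      # placeholder, format only
)
print(work.ids_well_formed())  # True when both IDs match their display forms
```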

The two organizations also have established focused working groups to address any ongoing technical and operational issues and have jointly published a mapping of their metadata schemas. A priority for both is to ensure that their respective registrants can maximize their investments in either, or both, systems by ensuring backward and forward compatibility and ultimately guaranteeing the ability of registrants in either system to obtain the full benefits of registration without incurring duplicate registration costs.

Jud Cary of EIDR and Keith Hill of ISAN have been designated as Board-level executives to work on these issues, together with the Executive Directors of both organizations.

ISAN-IA: www.isan.org

EIDR: www.eidr.org

ISAN & EIDR Metadata Schema Mapping: www.isan.org/docs/ISAN-EIDR_Metadata_Schema_Mapping.pdf

searchRetrieve version 1.0 Approved as OASIS Standard

searchRetrieve version 1.0, a multi-part specification that defines a generic protocol for the interaction between a client and a server performing searches, was approved and published by OASIS in February 2012. Developed as a web-based successor to the popular Z39.50 standard, searchRetrieve draws heavily on the abstract models and functionality of Z39.50, but removes much of the complexity.

The published standard is available as eight documents: Overview; Abstract Protocol Definition (APD); Binding for SRU (Search/Retrieval via URL) 1.2; Binding for SRU 2.0; Binding for OpenSearch; Contextual Query Language (CQL); Scan; and SRU Explain. The APD serves as a guideline for the development of application protocol bindings. A binding indicates the corresponding actual names of the parameters and elements to be transmitted in a request or response. The Contextual Query Language (CQL) is a formal language for representing queries to information retrieval systems. Scan is a utility protocol that allows a client to request a range of the available terms at a given point within a list of indexed terms and to select terms for subsequent searching. Every SRU or Scan server is required to provide an associated Explain document that describes the server’s capabilities and is retrievable as the response to an HTTP GET command.
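To give a sense of how lightweight the protocol is in practice, below is a minimal sketch of a searchRetrieve request under the SRU 1.2 binding. The endpoint URL is a placeholder and the CQL query is an arbitrary example; only the parameter names come from the binding.

```python
# A minimal sketch of an SRU 1.2 searchRetrieve request (Python 3,
# standard library only). The endpoint URL is a placeholder.
from urllib.parse import urlencode
from urllib.request import urlopen  # used only in the commented call below

SRU_ENDPOINT = "https://sru.example.org/catalog"  # placeholder, not a real server

params = {
    "operation": "searchRetrieve",
    "version": "1.2",
    "query": 'dc.title = "metadata" and dc.date > 2010',  # a CQL query
    "startRecord": 1,
    "maximumRecords": 10,
    "recordSchema": "dc",  # request Dublin Core records
}

url = SRU_ENDPOINT + "?" + urlencode(params)
print(url)  # the entire search is a single HTTP GET

# Against a real SRU server:
# with urlopen(url) as response:
#     xml_body = response.read().decode("utf-8")
```

Under SRU 1.2, a request sent with no parameters at all is treated as an Explain request, so the same base URL also yields the server’s Explain document.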

Included with the standard are eight XML schemas (a short sketch of parsing the default response format follows the list):

  1. SRU (the default format for an SRU response)
  2. Diagnostics (the format for presentation of a diagnostic within an SRU response)
  3. Explain (the Explain format for SRU 2.0)
  4. Faceted results (the format for presentation of faceted results within an SRU response)
  5. Search result analysis (the format for presentation of search result analysis within an SRU response)
  6. XCQL (CQL expressed in XML)
  7. Scan
  8. SOAP Support
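Continuing the request sketch above, the server’s reply is an XML document in the default SRU response format (schema 1 in the list). The fragment below shows how a client might extract the hit count and the wrapped records; the namespace URI is the one conventionally used for SRU 1.2 responses and should be verified against the binding actually implemented.

```python
# Parse an SRU searchRetrieve response (continuing the request sketch).
# The namespace is the one used by SRU 1.2 responses; SRU 2.0 responses
# use a different one, so verify against the binding you implement.
import xml.etree.ElementTree as ET

SRW_NS = {"srw": "http://www.loc.gov/zing/srw/"}  # assumed SRU 1.2 namespace

SAMPLE_RESPONSE = """<?xml version="1.0"?>
<srw:searchRetrieveResponse xmlns:srw="http://www.loc.gov/zing/srw/">
  <srw:version>1.2</srw:version>
  <srw:numberOfRecords>1</srw:numberOfRecords>
  <srw:records>
    <srw:record>
      <srw:recordSchema>dc</srw:recordSchema>
      <srw:recordData><title>Example record</title></srw:recordData>
    </srw:record>
  </srw:records>
</srw:searchRetrieveResponse>"""

root = ET.fromstring(SAMPLE_RESPONSE)
print("numberOfRecords:", root.findtext("srw:numberOfRecords", namespaces=SRW_NS))
for record in root.iterfind(".//srw:record", SRW_NS):
    data = record.find("srw:recordData", SRW_NS)
    # recordData wraps the payload in whatever recordSchema was requested
    print(ET.tostring(data[0], encoding="unicode"))
```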

The searchRetrieve specification is available from: www.loc.gov/standards/sru/oasis/

OECD Study Reports on E-book Developments and Policy Considerations

OECD’s Committee for Information, Computer and Communications Policy (ICCP) has been commissioning a series of studies related to digital content. The latest report in this series, E-books: Developments and Policy Considerations, describes the e-book ecosystem; discusses trends in e-book production, sales, and use; and concludes with a number of policy considerations.

These policy considerations are:

  • Consumer rights with e-books:
    Many consumers believe they have the same rights with e-books as they had with print documents, which is not the case. Publishers and sellers of e-books have a duty to “clearly and conspicuously” disclose any limitations of rights prior to purchase/licensing.
  • Interoperability and consumer lock-in:
    E-book users are frequently “locked-in” to a particular e-reader device or online platform, which either limits the availability of content or forces readers to have multiple devices and/or platform subscriptions. Additionally, many e-books use a proprietary DRM technology. Standards need to be developed, both for e-book interoperability across devices/platforms and for DRM.
  • Distribution rights and consumer “lock-out”:
    E-books have been dropped into the existing system for print publishing sales and distribution that is geographically defined and where “foreign” distribution rights need to be specifically purchased for each local market. A new model allowing worldwide distribution rights for e-books purchased online would be more beneficial to consumers.
  • Competitive structure for e-books:
    “The fixing of book prices by publishers, under the so-called ‘agency model’ for e-books, is under scrutiny by competition authorities in both the United States and the European Commission.” Consumers prefer that the retailers/sellers have the ability to discount e-books. Taxation is also an issue, since the VAT discount allowed for print books has not generally been extended to e-books, which puts them at a disadvantage.
  • Privacy:
    The technology used to store e-book libraries in the cloud and to offer capabilities such as shared highlighting also allows e-book providers to track reading behavior without the reader’s awareness. Greater transparency and consumer education need to be provided about such monitoring.
  • Copyright and piracy:
    In an effort to make illegal copying and piracy of e-books difficult to impossible, technology restrictions are being introduced that interfere with valid uses of the e-book for “public, social, educational or research purposes.”
  • Consumer lending of their books:
    The DRM used with most e-books generally prevents them from being shared between consumers or devices. Users need to be informed about technology or license restrictions on lending and publishers should consider the competitive advantage of offering e-books that can be loaned to others.
  • Library lending:
    Licensing restrictions for e-book lending are imposing increased costs on libraries. DRM technology can make lending difficult due to both device compatibility and restrictions to a geographically-specific edition. This is an area where further study and potential government action may be warranted.
  • Accessibility:
    Few e-readers currently include the functionality needed for print-disabled readers and new multimedia formats may make it even more difficult to “translate” an e-book to an accessible format. “OECD governments should consider options for encouraging publishers to make e-books available in formats (such as EPUB3) which support the software developed for accessibility for people with print disabilities.”
  • The need for more data:
    To ensure that policymakers have the information needed for the growing e-book market, “the organisation and co-ordination of relevant data, at both national and international level, should be considered a priority.”

OECD. E-books: Developments and Policy Considerations. OECD Digital Economy Papers, No. 208. OECD Publishing, October 29, 2012. doi:10.1787/5k912zxg5svh-en. http://www.oecd-ilibrary.org/science-and-technology/e-books-developments-and-policy-considerations_5k912zxg5svh-en