Last week, I had the opportunity to speak at the HighWire Press publishers meeting in sunny Palo Alto, CA. The Director of HighWire, John Sack, had asked me to present on NISO's work in the area of Journal Article Versions. He also asked that I discuss potential new work areas related to research data that came out of the Thought Leader meeting we'd held in November of last year. There is a continuing need to build awareness of the Journal Article Versions recommended practice (NISO RP-8-2008) that was published last year and to encourage its use in the community. If you work with authors, repositories, or publishers, it is worth reflecting on the recommendations in that report. As to the research data, there are a host of standards-related issues that our Architecture Committee is reviewing to determine the best areas for NISO to tackle.
Also, while I was in California I took the opportunity to meet with Peter Brantley and Brewster Kahle at the Internet Archive. We spoke at length on the Google Book Settlement, about which there were several interesting developments the next day. The first was the settlement delay of four months to allow more time for members of the publisher or author classes to decide if they want to opt into or out of the settlement. Also, word leaked out that the Justice Department was undertaking a review of the settlement on anti-trust grounds. We've included a story about that Justice Department review in this issue. As I touched on in March's issue of Newsline, this settlement has broad implications for our community, both publishers and libraries. I encourage you all to become familiar with the issues and implications.
Continuing on the theme of travel, NISO is helping to coordinate the annual meeting of ISO's Technical Committee (TC) 46 on Information and Documentation. This international meeting moves from country to country every year and this year's meeting is in Nairobi, Kenya on May 11-15. NISO acts as the Secretariat for Subcommittee (SC) 9 on Identification and Description. That group will also be meeting next week and pushing forward some of the key projects underway, such as the DOI, the ISNI, and the work on mono- and multi-lingual thesauri. A report of the TC46 meeting will be included in next month's Newsline.
Finally, last month NISO launched a new work project on improving Single Sign-On (SSO) Authentication. This is the Chair's initiative for 2009, proposed by Oliver Pesch, NISO's current Board Chair. We plan to develop best practices that will improve the transmission of authentication method information in order to provide a seamless experience for the user. More about this new project is available below. We are presently calling for working group participants and would welcome your engagement.
With kindest regards,
May Two-Part Webinar: COUNTER and Usage Data
NISO and COUNTER are jointly holding a two-part webinar on COUNTER and Usage Data on the first and second Wednesdays in May. Swets is sponsoring both sessions.
Part 1 held on May 6 was A How-To Guide to COUNTER that introduced librarians to COUNTER reports, definitions, and formats. Last summer COUNTER released the third version of the Code of Practice for Journals and Databases, which has an August 31, 2009 deadline for implementation. In addition to providing new consortium reports, the new release requires data providers to offer reports in XML format and to support the SUSHI protocol for automating the retrieval of COUNTER reports. Speakers for the How-To webinar were Peter Shepherd (Project Director, Project COUNTER), Tansey Matthews (Associate Director, VIVA, George Mason University), and Susan Golden (Product Manager, Serials Solutions).
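For readers unfamiliar with SUSHI (the Standardized Usage Statistics Harvesting Initiative, ANSI/NISO Z39.93), the protocol wraps a COUNTER report request in a small XML payload carried in a SOAP envelope. The sketch below builds a minimal ReportRequest with Python's standard library; the requestor and customer identifiers are placeholders, the namespace handling is deliberately simplified, and a real client would POST the result to a provider's SUSHI endpoint.

```python
import xml.etree.ElementTree as ET

# Simplified: the real schema splits elements across SUSHI and COUNTER
# namespaces; one namespace is used here to keep the sketch readable.
SUSHI_NS = "http://www.niso.org/schemas/sushi"

def build_report_request(requestor_id, customer_id, begin, end,
                         report="JR1", release="3"):
    """Build a minimal SUSHI ReportRequest body.

    The identifiers are placeholders; 'JR1' release '3' asks for the
    COUNTER Journal Report 1 under the Release 3 Code of Practice.
    """
    req = ET.Element(f"{{{SUSHI_NS}}}ReportRequest")
    requestor = ET.SubElement(req, f"{{{SUSHI_NS}}}Requestor")
    ET.SubElement(requestor, f"{{{SUSHI_NS}}}ID").text = requestor_id
    customer = ET.SubElement(req, f"{{{SUSHI_NS}}}CustomerReference")
    ET.SubElement(customer, f"{{{SUSHI_NS}}}ID").text = customer_id
    definition = ET.SubElement(req, f"{{{SUSHI_NS}}}ReportDefinition",
                               {"Name": report, "Release": release})
    filters = ET.SubElement(definition, f"{{{SUSHI_NS}}}Filters")
    date_range = ET.SubElement(filters, f"{{{SUSHI_NS}}}UsageDateRange")
    ET.SubElement(date_range, f"{{{SUSHI_NS}}}Begin").text = begin
    ET.SubElement(date_range, f"{{{SUSHI_NS}}}End").text = end
    return ET.tostring(req, encoding="unicode")

xml_body = build_report_request("example-requestor", "example-customer",
                                "2009-01-01", "2009-03-31")
```

A provider's response would carry the requested COUNTER report back inside a matching ReportResponse, which is what makes the retrieval automatable.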
Part 2 on May 13, 2009 from 1:00 - 2:30 p.m. (Eastern Time), is on New Applications of Usage Data and will explore broader issues surrounding usage data reports, the transfer of usage data via SUSHI, and developing issues related to e-books and multimedia. Peter Shepherd will also speak in Part 2 along with Peter Bienfield (Public Library of Science) and John McDonald (Libraries of the Claremont Colleges).
Registration, which is still open for Part 2, is per site (access for one computer) and includes access to the online recorded archive of the webinar. NISO and NASIG members receive a discounted member rate. A student discount is also available. For information and related resources, visit the event webpages: Part 1, COUNTER; Part 2, Usage.
June Webinar: Library Systems & Interoperability: Breaking Down Silos
In today's information environment, libraries work with a slew of systems from different vendors to manage, develop, distribute, and track their resources, and to provide rich navigational and discovery tools to the end users. Data needs to be re-used in multiple places and continually synchronized. Records from one system must link seamlessly to records in another.
Key to making this work effectively is interoperability. And standards are critical to successful, cost effective, and vendor-neutral interoperability. This webinar will provide a sampling of new work that is taking place to enable information about library resources to be shared between systems.
Topics to be covered include:
- CORE: Exchanging Cost Information Between Library Systems
Learn more about how NISO's CORE (Cost of Resource Exchange) project provides a solution to sharing cost information between library systems. The CORE draft standard is currently in trial use.
- Discovery Systems and Interoperability
Users today expect that the library will be able to provide them not only with a wealth of rich information about in-house resources, but a doorway to content that can be found in online databases, through institutional repositories, and beyond. Learn more about one project looking at how to address the issue of interoperability in this environment.
Assessment and Performance Measurement Forum – Register by May 11 to Receive Early Bird Discounts
NISO will be holding a one-day forum on Performance Measures and Assessment for Libraries: Critical Tools During Challenging Times on June 1, 2009, at the Radisson Plaza Lord Baltimore in Baltimore, MD.
In times like these when belt-tightening is necessary, we especially need to find ways to measure our performance to improve our outcomes. This one-day, in-person seminar will focus on quantitative measures by which libraries can measure their performance and compare it with others. Usage versus cost measurement, the impact of changing delivery methods, and end-users' outcome satisfaction will be explored.
Topics and speakers are:
- Opening Keynote – Steve Hiller, Director of Assessment and Planning, University of Washington
- Restructuring the New Library to Succeed: Assessment and Performance Measures From a Dean's Perspective – Susan Gibbons, Vice Provost and Dean, River Campus Libraries, University of Rochester
- Retaining and Cutting: Collections Development in Tight Times – Mike Poulin, Digital Resources Librarian and Coordinator of Digital Initiatives, Colgate University Libraries
- Considering User Experiences to Assess Services and Facilities at the Library of Virginia – Suzy Szasz Palmer, Director of Research & Information Services, Library of Virginia
- MISO (Merged Information Services Organizations) Survey – David Consiglio, Statistics & Research Methods Support Specialist, Coordinator of Information Services for the Social Sciences and Administration Offices, Bryn Mawr College
- Building Your Own Assessment Plans – Larry Nash White, Assistant Professor, Department of Library Science & Instructional Technology, East Carolina University
Early bird registration ends May 11, 2009. The early bird rate is $135 for NISO members, $165 for non-members, and $60 for students.
For more information and to register, visit the event website.
NISO@ALA: Forum with BISG and NISO Standards Update
NISO and the Book Industry Study Group (BISG) are holding their third annual free forum on The Changing Standards Landscape prior to the ALA Annual Conference in Chicago. This year's forum, scheduled for Friday, July 10 from 12:30 - 4:00 p.m., will focus on standards initiatives and needs in the e-book marketplace.
Scheduled sessions and speakers for the forum are:
- Identify & Describe
- The New ISTC Agency: An Emerging Standard – Andy Weissberg, VP of Identifier Services & Corporate Marketing, Bowker
- ISBN and E-Books: The Use of ISBN for Electronic Texts – Mark Bide, Executive Director, EDItEUR
- Format, Discover and Retrieve
- Toward a Common E-Book Format Standard: EPUB – Michael Smith, Executive Director, International Digital Publishing Forum
- Discovering Online Book Content: BISG's BookDROP – Michael Healy, Executive Director, BISG
- Purchase and Use
- DRM Use in E-Books – Suzanne Kemperman, Director, Publisher Relations, OCLC NetLibrary
- Developing E-Book Sales Models – John Cox, Managing Director, John Cox Associates
- Use of E-books in a Library Context – Sue Polanka, Head, Reference and Instruction, Paul Laurence Dunbar Library, Wright State University
For more information, visit the NISO/BISG Forum event webpage. No registration is required for this free forum, but prospective attendees are asked to RSVP online at www.niso.org/contact.
NISO will also be holding a Standards Update session at ALA on July 12 from 1:30-3:30 p.m. On the agenda are Oliver Pesch, with an update about NISO Board activities, and presentations on the Architecture Committee and the three Topic Committees. These committees set the direction for all of NISO's work and manage the portfolio of standards, recommended practices, and work in development. They will provide a review of work underway and discuss the 2009 goals and future directions.
More information on this meeting and other standards-related sessions at ALA can be found on the NISO@ALA Annual 2009 webpage.
Be sure to stop in and visit NISO at Booth #931.
NISO Announces New Work on Single Sign-On Authentication
NISO Voting Members have approved a new work item on improving single sign-on (SSO) authentication to achieve seamless item-level linking in a networked information environment. A new working group will be formed under the auspices of NISO's Discovery to Delivery Topic Committee to create one or more recommended practices that explore practical solutions for improving the success of SSO authentication technologies and promote the adoption of those solutions to make the access improvements a reality.
This work item is the outcome of NISO's new Chair's Initiative, an annual project of the chair of NISO's Board of Directors. NISO's current Chair, Oliver Pesch (Chief Strategist, EBSCO Information Services), has identified single sign-on authentication as an area that would benefit greatly from study and development within NISO, with a focus on a solution that will allow a content site to know which authentication method to use without special login URLs in order to provide a seamless experience for the user. "By developing recommended practices that will help make the SSO environment work better (smarter)," said Pesch, "libraries and information providers will improve the ability for users to successfully and seamlessly access the content to which they are entitled."
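The "which authentication method" problem Pesch describes can be sketched in miniature. The registry, institution name, and method labels below are hypothetical illustrations of the kind of routing logic a content site might use today; they are not part of any NISO recommendation.

```python
# Hypothetical sketch: a content site choosing an authentication route
# for an incoming user without a special login URL. The registry and all
# names in it are invented for illustration.

from ipaddress import ip_address, ip_network

# Illustrative registry mapping institutions to their preferred SSO method.
INSTITUTION_REGISTRY = {
    "example-university": {
        "networks": [ip_network("192.0.2.0/24")],
        "sso_method": "shibboleth",
        "idp_entity_id": "https://idp.example.edu/shibboleth",
    },
}

def choose_auth_method(client_ip, wayf_cookie=None):
    """Pick an authentication route: a remembered where-are-you-from
    hint wins, then an IP-range match, then fall back to asking the
    user outright."""
    if wayf_cookie and wayf_cookie in INSTITUTION_REGISTRY:
        inst = INSTITUTION_REGISTRY[wayf_cookie]
        return inst["sso_method"], inst["idp_entity_id"]
    addr = ip_address(client_ip)
    for inst in INSTITUTION_REGISTRY.values():
        if any(addr in net for net in inst["networks"]):
            return inst["sso_method"], inst["idp_entity_id"]
    return "prompt_user", None
```

The pain point the working group is targeting is the last line: today, too many users land on the "prompt_user" branch even when their institution has a perfectly good SSO arrangement in place.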
This new work follows on NISO's February 11th webinar on this topic, where the issues and potential benefits of SSO authentication were looked at from library, authentication tool, and content provider perspectives. The webinar was the first step in addressing the issue of SSO authentication; the new working group will enable all these perspectives to come together to focus on the topic as a community. (You can view the presenters' slides from this webinar online.)
In addition to forming the working group, NISO will be establishing an "interest group" e-mail list. If anyone would like to be a part of this new working group or to join the affiliated interest group, contact the NISO office at www.niso.org/contact.
ISQ: Bigger and Better
The Winter 2009 issue of Information Standards Quarterly (ISQ) is our biggest ISQ issue ever (46 pages) and has a completely new design that we hope you will find as awesome as we do. NISO is celebrating its 70th anniversary this year and ISQ has just passed its 20th year. We will be highlighting milestones in NISO's history all year in ISQ. In this issue, our timeline runs from the inception of the Z39 committee in the 1930s through its formal ratification as a members' standards development organization in 1979-80.
Features in this issue include Karen Coyle's take on the need to evolve from a bibliographic "record" approach to one that is "data" driven, allowing Metadata Mix and Match. Abigail Bordeaux teaches us how to Use Standards to Tame Electronic Resource Management.
And there is more. Jay Datema provides an opinion piece on being "hyper" attentive. The new chair of the NCIP Implementers Group, Gail Wanner, tells us of the group's plans for encouraging adoption of revision 2 of NCIP. There are NISO reports, conference reports, and articles on the latest noteworthy news in the information standards world.
The issue concludes with our annual State of the Standards and Year in Review with reports of NISO and ISO TC46 accomplishments in 2008 and a complete portfolio listing of NISO standards, recommended practices, and technical reports.
Everyone can view the Table of Contents online; members and subscribers can download a PDF version in addition to their mailed print copy. If non-members would like information on how to subscribe or to receive a copy of the Winter 2009 issue, contact the NISO office.
As always, we are very much interested in your feedback and suggestions on how we can improve ISQ. We are also seeking contributions for future issues.
Our next issue of Information Standards Quarterly (Spring 2009) will be distributed in the NISO booth at ALA. This is a perfect opportunity for you to reach the NISO community by advertising in ISQ. For more information see: www.niso.org/publications/isq/ads/
New Specs & Standards
This Web Service API facilitates all interactions between a reading system and a service provider, including the downloading and streaming of content. It also allows for content streaming directly over HTTP and HTTPS. Available for public comment through May 15, 2009.
Discusses the choices involved in designing applications for different types of interoperability. At Level 1, applications use data components with shared natural-language definitions. At Level 2, data is based on the formal-semantic model of the W3C Resource Description Framework. At Level 3, data is structured as Description Sets (records). At Level 4, data content is subject to a shared set of constraints (described in a Description Set Profile). Conformance tests and examples are provided for each level.
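The jump from Level 1 to Level 2 is easiest to see side by side. In this hedged illustration (the resource URI and author name are invented), two Level 1 applications merely agree on what a field called "creator" means in English, while at Level 2 the same statement becomes an RDF triple whose property is identified by a URI from the Dublin Core element set.

```python
# Level 1: applications share only a natural-language field definition;
# nothing machine-readable says what "creator" means.
level1_record = {"creator": "Jane Author"}

# Level 2: the same statement as a formal RDF triple. The predicate URI
# (dc:creator) is real; the subject URI and author are invented examples.
level2_triple = (
    "http://example.org/doc/1",                  # subject: the resource
    "http://purl.org/dc/elements/1.1/creator",   # predicate: dc:creator
    "Jane Author",                               # object: a literal value
)

def to_ntriples(subject, predicate, obj):
    """Serialize one triple in N-Triples form (literal objects only)."""
    return f'<{subject}> <{predicate}> "{obj}" .'

serialized = to_ntriples(*level2_triple)
```

Any RDF-aware application can now consume the Level 2 statement without prior agreement about field names, which is the point of the formal-semantic model.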
A major new version of the ONIX for Books standard with eight key areas of change: removal of redundant elements, description of digital products, handling of multiple-item products and series, publishers' marketing collateral, sales and distribution in international markets, products related to a single work, blocked records for more efficient updating, and new schema options. This version is not backwards-compatible with its predecessors.
Establishes the performance indicators for national libraries. It is also applicable to libraries with regional tasks and without a defined population to be served, as many of their evaluation problems correspond to those of national libraries.
Specifies the professional archival application format (PA-AF), which provides a standardized packaging format for digital files. This packaging format can also serve as an implementation of the information package specified by the reference model of the Open Archival Information System (OAIS). Also specifies a metadata format.
Announced on April 20, 2009, this new initiative is intended to be a focal point for collaboration to foster identity community harmonization, interoperability, innovation, and broad adoption through the development of open identity specifications, operational frameworks, education programs, deployment, and usage best practices for privacy-respecting, secure access to online services.
Justice Dept. Opens Antitrust Inquiry Into Google
New York Times (04/28/09); Helft, Miguel
The U.S. Justice Department is investigating the antitrust implications
of the settlement between the Association of American Publishers and the
Authors Guild and Google over its Google Book Search service. Justice is
responding to complaints from groups opposed to the settlement, which
include the Internet Archive and Consumer Watchdog. Justice has not
decided whether it will oppose the settlement, which is still subject to
court review. The far-reaching agreement between Google and authors and
publishers would give Google the right to display copyrighted books online
and sell access to specific texts or entire collections. Libraries and
other institutions also will be able to buy access to the online
collections, with the revenues split between authors, publishers, and
Google. The settlement was agreed to last October after the Authors Guild
sued Google for copyright violations that occurred as part of its Google
Book Search project. The Authors Guild, which represents approximately
8,000 authors, filed the lawsuit in 2005, arguing that Google had no right
to digitize and display books without express consent. Critics say the
agreement gives Google exclusive access to orphan works, whose authors are
unknown or cannot be found. Librarians are concerned that without
competition Google could raise prices. Google says the agreement will
provide access to many out-of-print books as well as provide financial
benefits to authors and publishers. Separately, a federal judge has
postponed until September 5 the deadline for authors to opt out of the
settlement.
(Link to Web Source)
Adding eScience Assets to Data Web
Linked Data on the Web (LDOW) (04/20/09); Van de Sompel, Herbert;
Lagoze, Carl; Nelson, Michael L.
Web resource aggregations are increasingly essential to scholarship as
it embraces new data-intensive, collaborative, and network-based
techniques. In this paper, researchers present a methodology for
identifying and describing Web resource aggregations stemming from the Open
Archives Initiative-Object Reuse and Exchange (OAI-ORE) project. The
precepts of the architecture of the World Wide Web, the Semantic Web, and
the Linked Data initiative serve as the platform of the OAI-ORE
specifications, and thus their incorporation into the eScholarship
cyberinfrastructure guarantees the integration of scholarly research
projects into the Data Web. The OAI-ORE solution tackles the resource
aggregation challenge by expressing the data model in terms of the
primitives of Web Architecture and the Semantic Web--namely, as Resources,
Representations, uniform resource identifiers (URIs), and Resource
Description Framework triples. The aggregation that comprises the central
entity in the data model represents a set of other resources, and this
approach dovetails with the way in which real-world entities or concepts
are included in the Web through the mechanisms proposed by the Linked Data
effort. The resource map has a representation that describes the
aggregation, and the map can be accessed through the aggregation's URI
using the mechanisms defined for Cool URIs for the Semantic Web. Finally,
the resource map's representation consists of a serialization of the
triples describing the aggregation. "While OAI-ORE was motivated by
scholarly communication, we believe that the proposed solution has broader
applicability," write the researchers. "Aggregations, sets, and
collections are as common on the Web as they are in the everyday physical
world. In many situations it would benefit agents and services if
aggregations were unambiguously enumerated and described, essentially
layering an additional level of resource granularity upon the Web."
(Link to Web Source)
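The OAI-ORE data model described above — an aggregation, the resources it aggregates, and a resource map that describes it — can be sketched in a few lines. The URIs below are invented placeholders; the ore: vocabulary terms (describes, aggregates) are the real ones from the OAI-ORE specifications.

```python
# A minimal sketch of the OAI-ORE data model. Each statement is a
# (subject, predicate, object) triple, the primitive of RDF.

ORE = "http://www.openarchives.org/ore/terms/"

# Invented example URIs: a scholarly "compound object" made of two files.
aggregation = "http://example.org/agg/article-1"
resource_map = "http://example.org/rem/article-1"
parts = [
    "http://example.org/article-1.pdf",
    "http://example.org/dataset-1.csv",
]

# The resource map describes the aggregation; the aggregation aggregates
# its constituent resources.
triples = [(resource_map, ORE + "describes", aggregation)]
triples += [(aggregation, ORE + "aggregates", p) for p in parts]

def serialize_ntriples(triples):
    """Serialize URI-only triples in N-Triples form."""
    return "\n".join(f"<{s}> <{p}> <{o}> ." for s, p, o in triples)

resource_map_body = serialize_ntriples(triples)
```

Dereferencing the resource map's URI would return this serialization, which is how an aggregation becomes a first-class, discoverable citizen of the Data Web.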
Discovering Linked Data
Library Journal (04/15/09), P. 48; Bradley, Fiona
The future of library services is directly connected to discovery
capabilities, and Linked Data can improve discovery performance. Libraries
are focusing on discovery to provide more meaningful and helpful search
results, and to give users more information on the materials they are
trying to find. Linked Data gives libraries an opportunity to make new
connections between collections and the world, and can expand discovery
platforms to explore library data, broaden benchmark services, and support
the role of libraries as creators and publishers. Linked Data uses uniform
resource identifiers (URIs) to create connections. Similar to content in a
database, it is possible to assign unique keys to distinguish pieces of
data, and Linked Data does so through HTTP URIs and the Resource
Description Framework. Numerous data sets have already been published as
Linked Data using URIs, including photographs, bibliographic information,
and bibliographic records for library catalogs. The World Wide Web
Foundation's Sir Tim Berners-Lee says there are only four simple principles
to creating Linked Data: Use URIs as names for things; use HTTP URIs so
people can look up those names; provide useful information when someone
looks up a URI; and include links to other URIs so users can find
additional information. To unify data sets that follow these rules,
developers have created several ontologies, which provide a framework to
represent concepts and their relationships to one another. Several
ontologies are already in use, including Friend of a Friend (FOAF), which
describes people and their relationships. In FOAF, each person is uniquely
identified by a URI, which could link to a homepage, email address, or
identity profile on a service such as OpenID.
(Link to Web Source)
NISO Note: Innovative Interfaces, Ex Libris, and Library of Congress are NISO voting members.
The Dublin Core Metadata Element Set is a NISO standard (ANSI/NISO Z39.85).
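Berners-Lee's four principles and the FOAF vocabulary mentioned above can be made concrete in a few lines. The people and homepage URIs below are invented placeholders; the foaf: namespace and terms (Person, name, homepage, knows) are the real FOAF vocabulary.

```python
# A small sketch of the four Linked Data principles using FOAF.
FOAF = "http://xmlns.com/foaf/0.1/"

def foaf_turtle(person_uri, name, homepage, knows_uri):
    """Serialize a minimal FOAF description in Turtle.

    Principles 1 and 2: person_uri is an HTTP URI naming a person.
    Principle 3: looking up that URI should return this description.
    Principle 4: foaf:knows links onward to another URI to follow.
    """
    return "\n".join([
        "@prefix foaf: <%s> ." % FOAF,
        "<%s> a foaf:Person ;" % person_uri,
        '    foaf:name "%s" ;' % name,
        "    foaf:homepage <%s> ;" % homepage,
        "    foaf:knows <%s> ." % knows_uri,
    ])

doc = foaf_turtle("http://example.org/people/alice#me", "Alice",
                  "http://example.org/~alice",
                  "http://example.org/people/bob#me")
```

A crawler that fetched Alice's URI could follow the foaf:knows link to Bob's, which is exactly the follow-your-nose discovery pattern the article describes for library data.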
Researchers Work on Website Credibility
Agence France Presse (04/21/09)
Scientists at the Know-Center, a technology research center in Austria,
are developing software capable of quickly determining if a Web site is a
credible source with reliable information. The program automatically
analyzes and ranks blogs as being of high, average, or little credibility.
Blogs are ranked by comparing statistical properties, such as the
distribution of certain words over time, with news articles on the same
topic from mainstream news sources previously determined to be credible.
Know-Center researcher Andreas Juffinger says the program has delivered
promising results and appears to be on the right track. Similarly,
Japanese researchers are developing a program that mines the Web for
different viewpoints on an issue and presents them to Web users, along with
supporting evidence as part of a "statement map" that illustrates how
different opinions are related. "We really believe that 'statement maps'
can help users come to conclusions about the reliability of a Web site,"
says Nara Institute of Science researcher Koji Murakami. Meanwhile,
researchers at Italy's University of Udine are developing an algorithm to
assign quality scores to Wikipedia articles and contributors. "Preliminary
results demonstrate that the proposed algorithm seems to appropriately
identify high and low quality articles," the research team writes in a
paper presented at a recent World Wide Web conference in Madrid.
(Link to Web Source)
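As a toy illustration of the word-distribution comparison described above (and emphatically not the Know-Center's actual algorithm), one can score a blog post's similarity to text from a source already judged credible with a simple cosine measure over word counts:

```python
# Illustrative simplification: compare a blog post's word distribution
# to reference text from a trusted source. All sample texts are invented.

import math
import re
from collections import Counter

def word_counts(text):
    """Crude tokenization into lowercase word counts."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a, b):
    """Cosine similarity between two word-count vectors, in [0.0, 1.0]."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

reference = word_counts(
    "the settlement gives google rights to display books online")
blog_post = word_counts(
    "google gains rights to display books under the settlement")
unrelated = word_counts(
    "my cat enjoys napping in the afternoon sun")
```

A real system would, as the article notes, compare distributions over time and against many mainstream articles, but the intuition is the same: on-topic posts share the statistical fingerprint of credible coverage.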
New Metasearch Engine Leaves Google, Yahoo
Binghamton University (03/26/09)
Binghamton University professor Weiyi Meng, along with researchers at
the University of Illinois at Chicago and the University of Louisiana at
Lafayette, has developed metasearch-engine technology that uses many small
search engines to generate results that are more accurate and complete than
traditional search engines. Meng says Web users will soon be able to
submit a question to an Internet search engine and receive an actual
answer, rather than a link to a Web page. He says that most of the pages
of the deep Web are not directly crawlable, so his metasearch technology
connects to small search engines to reach into the deep Web. "In
principle, small guys are much better able to maintain the freshness of
their data," Meng says. "Google has a program to 'crawl' all over the
world. Depending on when the crawler has last visited your server, there's
a delay of days or weeks before a new page will show up in that search. We
can get fresher results." He has developed prototype technology that, for
example, would allow for a search of all 64 campuses in the State
University of New York (SUNY) system, as well as the SUNY central
administration. Meng says people could use this to find collaborators
working on similar projects or help prospective students find programs they
are interested in. The technology also could be adapted for large
companies or the government.
(Link to Web Source)
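The heart of any metasearch engine is merging the ranked lists returned by its component engines into one result list. The sketch below uses a simple reciprocal-rank fusion as a stand-in for the more sophisticated merging the researchers describe; the SUNY-flavored URLs are invented examples.

```python
# Illustrative result merging for a metasearch engine: documents ranked
# highly by several component engines rise to the top of the fused list.

def merge_results(result_lists):
    """Fuse several best-first ranked lists.

    A document's score is the sum over engines of 1 / (rank + 1), a
    simple reciprocal-rank fusion; ties in real systems need care, but
    the example scores here are all distinct.
    """
    scores = {}
    for results in result_lists:
        for rank, doc in enumerate(results):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

# Two hypothetical campus search engines returning overlapping results.
engine_a = ["suny.edu/prog1", "suny.edu/prog2", "suny.edu/prog3"]
engine_b = ["suny.edu/prog2", "suny.edu/prog4"]
merged = merge_results([engine_a, engine_b])
```

Because each small engine indexes its own fresh, deep-Web content, the fused list can surface pages that a centralized crawler has not yet visited, which is the freshness advantage Meng describes.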
In Challenge to ILS Industry, OCLC Extends WorldCat Local to Launch New Library System
Library Journal (04/23/09); Breeding, Marshall
OCLC has expanded WorldCat Local's existing cataloging and discovery
tools by adding new circulation, delivery, and acquisition features, a bold
move that could potentially revolutionize the field of library automation,
writes Vanderbilt University Library's Marshall Breeding. OCLC calls
the new project the "first Web-scale, cooperative library management
service." The project will ultimately bring into WorldCat Local the full
complement of functions traditionally performed by locally installed
integrated library systems (ILS). OCLC has positioned itself for this move with a
number of strategic acquisitions in recent years of companies with different
types of library automation products. The cloud computing model of this service
is presented as a significant change from the existing models for automation.
At no extra cost, libraries
that subscribe to FirstSearch WorldCat will get the WorldCat Local "quick
start" service, which is a locally branded catalog interface and search box
that provides localized search results for print and electronic content and
the ability to search the entire WorldCat database and other resources.
OCLC wants to move activity that is now managed library-by-library through
locally or consortially implemented automation systems up to the larger
network level, under the global WorldCat infrastructure. OCLC says WorldCat
Local can help libraries improve efficiency. In July, OCLC will expand
WorldCat Local to include a wide view of content through an integrated
metasearch capability and through large collections of articles and content
indexed directly in WorldCat.org.
(Link to Web Source)