The Paradox of Use
There is a paradox sometimes encountered with new technologies: a technology may appear harmless, or even helpful, when used by an individual or a small group, but when operating at full scale, that same technology may create new and potentially significant problems. The danger it poses is not obvious until the technology becomes ubiquitous.
This concept was outlined by Dr. Melvin Kranzberg, a professor of the history of technology at the Georgia Institute of Technology and the founding editor of the journal Technology and Culture. In 1985, Dr. Kranzberg delivered his Presidential Address to the Society for the History of Technology, in which he put forward his Six Laws of Technology. The first of his Laws was that “Technology is neither good nor bad; nor is it neutral.” He elaborated:
“technology’s interaction with the social ecology is such that technical developments frequently have environmental, social, and human consequences that go far beyond the immediate purposes of the technical devices and practices themselves, and the same technology can have quite different results when introduced into different contexts or under different circumstances.”
Simply put, technology in and of itself is neither good nor bad. How it is applied, and in what context, creates a new environment, and that environment shapes perception of the technology as a long-term benefit or, conversely, as an off-putting danger. Perceptions change as unintended consequences come to light. Kranzberg offered as an example the rise and widespread use of DDT in the 1940s and 1950s. Initially, the pesticide was seen as a useful means of reducing the risk of malaria and typhus among civilians and troops during World War II; however, once its environmental dangers and health risks were recognized, the chemical was banned in most countries during the 1970s.
India has ignored those bans and is the only country that continues to manufacture and use the chemical. This regional use of DDT is ongoing because it significantly reduces the number of malaria cases each year. In the view of Indian leadership, despite the documented risks, that reduction justifies the chemical’s continued use. The intransigence of the Indian government on this matter is illustrative of Kranzberg’s point.
Federated identity might not leap to mind as a technology that could stir up significant controversy or cause societal problems, certainly nothing along the lines of DDT. The systems that support federated identity have been in use for decades. In the early 2000s, institutions increasingly faced the challenge of providing a centralized, yet granular, login system that could support authentication to software and systems across the entire user population of an institution.
Adopting a federated identity approach allows an institution to centrally manage user logins. An institution can recognize who a person is (identity management) while managing customizable rights of access across its entire body of users (access control). Today, the eduGAIN interfederation service, which interconnects national federations of institutional single-sign-on (SSO) services, supports nearly 90 million post-secondary users from 75 countries.
Because the identity federation system was designed to accommodate the needs of the entire institution, the system had to be flexible. It had to handle a spectrum of institutional requirements for a variety of systems. Some systems required a clearly recognizable identity (student courseware), while other systems required that an identity be verified but still shielded (use of library information resources). This flexibility was made possible by splitting authentication into three core components:
- the identity of the user;
- the attributes associated with that individual user, indicating assigned rights of access; and
- what a particular authenticated system needed to know about the specific user.
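The three-way split above can be sketched as three distinct records that the infrastructure keeps separate. This is a minimal illustration in Python; the class and field names are hypothetical, not drawn from any federation standard.

```python
from dataclasses import dataclass, field

@dataclass
class Identity:
    """Who the user is, as known only to the home institution."""
    internal_id: str  # held by the identity provider, never released as-is

@dataclass
class Attributes:
    """What the institution asserts about that user: roles and rights."""
    affiliations: set = field(default_factory=set)  # e.g., {"faculty"}
    entitlements: set = field(default_factory=set)  # assigned rights of access

@dataclass
class ServiceRequirements:
    """What a particular service needs to know to admit the user."""
    required: set = field(default_factory=set)  # the minimal attribute set
```

Keeping these as separate structures is what lets one service learn only an entitlement while another learns a fuller profile of the same person.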
Practically, the system needs a way to recognize that a user is verifiably the user already known to the system. This is a security process and can involve passwords, tokens, key cards, or multi-factor authentication, or, in its simplest form: I see Sara at the desk and I recognize Sara, so I let her pass.
The second element of this system is the set of details that the system knows about the recognized user. Sara might be a chemist; she may be a faculty member; she may be working on this particular research project. She may have inter-institutional credentials to work on a joint project with another institution. Sara may be chair of the department, needing access to certain management systems that might not be available to all (such as a financial or budgeting system). Each instance of permissible use serves as an element of metadata: a specific attribute associated with the individual identity.
Such attributes are used to facilitate access to specific resources, the final element of this chain. A system checks whether this person has the necessary attributes to access a particular service and, if so, allows entry. The authentication system for a course management system, for example, may authenticate Simon as a currently enrolled student. Additional attributes in the context of the course management system include the courses in which Simon is enrolled this semester. For good or ill, once access is permitted, there is the potential for more sensitive information about Simon (his completed assignments so far, the resulting grades, etc.) to be shared with others.
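The final step of the chain reduces to a simple check. This sketch, with hypothetical attribute names and values, shows a service granting entry only when the asserted attributes satisfy its requirements:

```python
def grant_access(asserted: dict, required: dict) -> bool:
    """Allow entry only if every attribute the service requires is asserted."""
    return all(asserted.get(name) == value for name, value in required.items())

# A course management system might require an enrolled-student affiliation.
simon = {"affiliation": "student", "courses": ["CHEM101", "MATH220"]}
print(grant_access(simon, {"affiliation": "student"}))   # True
print(grant_access(simon, {"affiliation": "faculty"}))   # False
```

Note that the service sees only what is asserted to it; whether the `courses` attribute (or anything more sensitive) is released at all is a policy decision, not a technical necessity.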
How can an online system that is designed to share information about the user protect that user’s privacy? The joint NISO/STM RA21 project has been at the forefront of such discussions as it moved forward in developing a Recommended Practice.
The RA21 Recommended Practice, published in June 2019, takes a broad yet stringent approach to attribute release. Although RA21 was originally conceived as a service providing access to library-subscribed resources, the project has grown in scope, if not complexity. The longer the project was discussed, and as more people in the identity federation community became engaged, it made sense to provide a more general solution, one that could be used for any institutional login purpose, from access to library resources, to shared research infrastructure, to enabling access to educational discounts arranged by the institution. Each of these use cases entails a different degree of access to descriptive user attributes. Owing to the flexibility of the identity-federation-based access control structure, the RA21 recommendation includes elements that allow for customized exchange of user attributes.
First, the recommendation explicitly states that for the use case where access is provided to an information resource and where no personalization is needed, only an anonymous entitlement attribute (e.g., eduPersonEntitlement) should be used. This preserves the privacy and anonymity of the user as no personal or trackable information is shared by the institution.
If greater functionality is required, and if there is a specific contractual agreement between the service provider and the institution, then pseudonymous identifiers can be used, which allow tracking, but which still mask the user’s identity. Institutions may also share opaque reporting codes, which identify user groups such as faculty or departments, so that more granular data can be gathered for analysis of institutional resource usage.
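The two release modes described above can be sketched as a single policy function at the identity provider. This is a minimal illustration, assuming a hypothetical institution-held secret and service names; the pairwise identifier shown (an HMAC over user and service) is one common way to make a pseudonym stable per user and per service, so a service can personalize but cannot correlate users across services.

```python
import hashlib
import hmac

IDP_SECRET = b"institution-private-key"  # hypothetical; held only by the IdP

def release(user_id: str, entitlements: set, service: str, contract: str) -> dict:
    """Release the minimum attributes the contract with a service allows."""
    if contract == "anonymous":
        # No personal or trackable information: entitlement only.
        return {"eduPersonEntitlement": sorted(entitlements)}
    if contract == "pseudonymous":
        # Stable per user *and* per service; masks the real identity.
        pairwise = hmac.new(IDP_SECRET, f"{user_id}|{service}".encode(),
                            hashlib.sha256).hexdigest()[:16]
        return {"pairwise-id": pairwise,
                "eduPersonEntitlement": sorted(entitlements)}
    raise ValueError(f"no attribute-release contract for {service}")
```

Under this sketch, the same user presents different pseudonyms to different service providers, while a provider with only an anonymous contract learns nothing beyond the entitlement itself.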
A second element of the privacy-protection approach outlined in the Recommended Practice is the adoption of the GÉANT Data Protection Code of Conduct. Its principles include data minimization, limitations on data use and reuse, prohibitions on third-party data sharing, data security assurances, and approaches to addressing non-compliance. The policy is further supported by the EU General Data Protection Regulation (GDPR), which adds legal weight and significant potential penalties for non-compliant activities.
There is certainly further collaborative work to be done to build on this infrastructure. The identity federation community has tools and approaches to enforce policies, including those regarding privacy. This is done through federation membership agreements, signed by all participants, and through the codes of conduct required of service providers. In addition, norms have been established for certain classes of resources through the definition of entity categories and attribute bundles.
One of the follow-up projects currently being organized will seek to define and specify attribute release policies for access to library resources that can be enforced systematically within the identity federation infrastructure. Another project stream that needs to be specified is the process by which users can consent to additional attribute release. The team at Duke University has done significant work on customizable attribute release and the experience of exposing those options to users. More user testing and development on this approach needs to be undertaken.
Implementation of the Recommended Practice is being advanced by a new coalition, the Coalition for Seamless Access, a partnership among service providers, identity providers, the publishing community, and the library community. The Coalition is currently setting up a beta phase of service based on the RA21 recommendation. During the beta phase, the service will be tested for overall functionality, implementation challenges, improvements to the user experience, and system security and stability. Also during the beta period, work will advance on consent and the formalization of attribute release, as well as on overall outreach and education around single-sign-on services. Following the beta, there will be a review process; subsequently, the Recommended Practice may need to be updated or amended to address identified problems.
While much has been made of the potential problems of moving toward a single-sign-on basis for authentication to library services, this is no reason to abandon the approach. There are challenges, and there is potential for misconfiguration or even abuse, but this is true of every technology system.
Quoting again from Kranzberg’s Sixth Law:
“Behind every machine, I see a face–indeed, many faces: the engineer, the worker, the businessman or businesswoman, and, sometimes, the general and admiral. Furthermore, the function of the technology is its use by human beings–and sometimes, alas, its abuse and misuse.”
It would be absurd to avoid the use of a technology simply because it may be misused. More sensibly, we should take the time needed to identify appropriate and wise uses of a particular technology and take steps to prevent misuse. Doing so requires education in how systems work and in their underlying design. There must be transparency about a system and the data fueling its use if individuals and institutions are to place any trust in claims of “informed consent”.