We scanned the horizon and found something interesting

Posted by John Robinson

Several times recently we have been asked to explain how we arrived at our “decision in principle” to commission a new LMS based on Kuali OLE. At some point we will publish a full report on our project and the processes we used, once we are up and running, but it is worth taking stock at this point, where we are about a third of the way into what is turning out to be quite an adventure (with, it seems, a lot of people cheering us on from the sidelines).

We started the BLMS project in June 2012 with the appointment of a project manager who worked with us to define an initial workplan. One of the first pieces of work was to undertake a “horizon scan” to determine which technology we should be considering and which process we should use to get that technology up and running. A key decision which we needed to make as early as possible was whether we were going to build, commission or procure the new system. Alongside this process we set about writing the detailed technical specification and functional requirements document against which the new system will be specified. The processes we used were designed to involve as many people as possible from the six potential partners in the proposed shared service.

As an aside, we wanted the technical work to be done as early as possible to provide each partner with something to take away from the project, should the shared approach prove unworkable for any reason. All of us have systems which are approaching end-of-life and, in a traditional approach, would be planning for re-procurement. We wanted to be sure that the investment each partner was making in the project would not be wasted if any one of us had to fall back on procuring a new system on our own. Given the amount of work involved in writing a detailed specification for a new system, the benefits of having the systems librarians from the six partners working together on the specification, led by the project manager, were considerable.

What is this “horizon scan” business?

A search for this term yields a rich harvest. On the Science and Technology Facilities Council (STFC) web site, for example:

The Government Office for Science defines Horizon scanning as “The systematic examination of potential threats, opportunities and likely developments including but not restricted to those at the margins of current thinking and planning”. Horizon scanning may explore novel and unexpected issues as well as persistent problems or trends.

The OECD also defines the term:

Horizon scanning is a technique for detecting early signs of potentially important developments through a systematic examination of potential threats and opportunities, with emphasis on new technology and its effects on the issue at hand. The method calls for determining what is constant, what changes, and what constantly changes. It explores novel and unexpected issues as well as persistent problems and trends, including matters at the margins of current thinking that challenge past assumptions.

Our horizon scan exercise turned out to be extremely useful. As Library Management Systems have historically tended to involve commitments of ten years or more (most of the systems we want to replace date from the late 1990s), we wanted to ensure that we selected the best technology to take advantage of the 21st-century information environment. We wanted to understand the vision of the vendors of our current systems and also to find out what options exist among the emerging open-source systems. We organised two separate days: a vendor day and an open-source day. We invited the four vendors with whom at least one partner has a current contractual relationship to present their next-generation LMS. We made it as clear as possible to the vendors that we were not in procurement mode and that this was not an invitation to make a sales pitch. All four of them managed to stick reasonably close to the brief. We gave each vendor a session lasting 90 minutes (45 minutes for presentation, 45 for questions) with a group of about 35 people including systems librarians, subject librarians, heads of service and library directors.

On the open-source day we had three sessions: one from a supplier who has built systems based on Koha (and who had previously demonstrated Evergreen to us); one from The Library Co-op who talked about Koha and also said some very interesting things about migration strategies should we decide to use Koha as a stepping stone to something else; one from Kuali OLE. All of our staff who attended the sessions filled in questionnaires in which they scored the presentations against a range of parameters derived from our strategic, operational and technical drivers.
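
By way of illustration only, turning those questionnaires into a comparable figure for each presentation is essentially a weighted average. The parameter names and weights in this sketch are invented for the purpose and are not our actual scoring criteria.

```python
# Minimal sketch of aggregating questionnaire scores for one presentation.
# Parameter names and weights are invented for illustration; they are not
# the consortium's actual scoring criteria.
from statistics import mean

# Each attendee scores the presentation from 1 to 5 against each parameter.
responses = [
    {"strategic_fit": 4, "operational_fit": 3, "technical_fit": 5},
    {"strategic_fit": 5, "operational_fit": 4, "technical_fit": 4},
]

# Hypothetical relative importance of the strategic, operational and
# technical drivers.
weights = {"strategic_fit": 0.4, "operational_fit": 0.3, "technical_fit": 0.3}

def weighted_score(responses, weights):
    """Average each parameter across respondents, then apply the weights."""
    averages = {p: mean(r[p] for r in responses) for p in weights}
    return sum(averages[p] * w for p, w in weights.items())

print(f"Overall score: {weighted_score(responses, weights):.2f}")
```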

This is not just about open-source

A number of colleagues have written extensively about the benefits of open-source software (or free and open-source software, aka FOSS) and there is no need to cover that ground here. We had spent quite some time during 2011 looking at Koha and Evergreen as prominent examples of open-source LMS, including a very encouraging presentation from the University of Staffordshire Libraries which operate “the first UK academic implementation of the Koha open source Library System”. It is clear that the emergence of viable open-source approaches to LMS is changing the field quite dramatically but the full extent of that change has yet to be realised. This is fundamentally a change in the technology and what it enables, but adopting a new LMS is about more than just the choice of technology. There are systems architectures, service models, support models, hosting arrangements and the complex business of data migration which must also be taken into account, not to mention the permutations which arise in seeking a consortium, shared-service approach.

One thing which is clear is that a choice between an open-source and a vendor-based, proprietary solution is not about saving money. As consortium partners we made a decision early on that our project is designed to obtain better outcomes for the same levels of expenditure, through a shared-service approach and an open mind about the new possibilities which modern technologies offer. We are also clear that a significant capital investment will be required to realise our vision.

A choice of systems architecture — vendor approach

The primary outcome of our horizon-scanning sessions was to demonstrate that we have a choice of systems architecture. Between the four vendors (which we can’t name because they have asked us not to), three clear and quite different models emerged:

  • a business-as-usual approach in which the primary concern of the vendor is to retain its existing customer base, consolidating onto a single, traditional LMS (the outcome of a number of mergers and acquisitions over the past decade) which operates on the proprietary, “silo” model with the only variation being a choice between customer-based or vendor-based hosting arrangements;
  • a proprietary system which attempts to take on board some of the opportunities offered by the new information environment, most notably a fully-hosted solution where customer bibliographic data is merged into a large-scale database which, combined with proprietary resource discovery tools, offers a much richer environment for the end-user (provided the library is happy to move from the silo to a walled garden);
  • a novel approach in which a supplier which already owns a vast bibliographic database proposes to wrap an open API (Application Programming Interface) around it, enabling customers to build or commission their own services.

What we observed was that suppliers of software seem to have woken up to the idea that the main focus is now data and content (as evidenced by the supplier who starts from this position). We gained the distinct impression that the suppliers who start from the position of owning software are as interested in us for our data as for our money (although they obviously want both). Given that, between us, the consortium holds millions of monographs, serials and special collections, we represent an enticing prospect.

The model of an API wrapped around billions of bibliographic records is a fascinating one, but the lack of mature services suggests that a major programming and development effort would be required to benefit from it (the opportunities created by the existence of open APIs should not, however, be underestimated).
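
To give a flavour of the development effort involved, here is a minimal sketch of the kind of lookup service a customer would have to write for itself against such an API. The endpoint, parameters and response fields are entirely hypothetical (we cannot name the supplier, and its API was only outlined to us); the point is simply that the services sit on the customer’s side of the open API.

```python
# Hypothetical sketch: looking up a bibliographic record via an open API
# wrapped around a large shared database. The base URL, query parameters
# and response structure are invented for illustration only.
import requests

BASE_URL = "https://api.example-bibliographic-service.org/v1"  # invented endpoint

def lookup_by_isbn(isbn, api_key):
    """Return the first record matching an ISBN, or None if nothing is found."""
    response = requests.get(
        f"{BASE_URL}/records",
        params={"isbn": isbn},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    records = response.json().get("records", [])
    return records[0] if records else None

record = lookup_by_isbn("9780140449136", api_key="YOUR-KEY-HERE")
if record:
    print(record.get("title"), "/", record.get("creator"))
```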

Vendors which offer the business-as-usual approach seem to be relying primarily upon the inertia of their customer bases.

A choice of systems architecture — open-source approach

We saw three models, which we can name because they have all published their details on-line:

  • Koha (the name means “gift”), originally written by a consortium of New Zealand local libraries who were not satisfied with the commercial offerings — see LibLime Koha, or the Koha community — which is a fairly traditional LMS but with the many benefits of the open-source approach;
  • Evergreen, developed by the US State of Georgia as a consortial LMS for its state library system, which is built around a “main library, branch library” model, again with open-source benefits although (problematically for some of us) lacking in robust Unicode support;
  • Kuali OLE, hosted by the Kuali Foundation, which (as we had not looked closely at it before) genuinely astonished us.

It is fair to say that, until we saw Kuali OLE, many colleagues were inclined to the view that our best option would be to go with the vendor with the most persuasive vision of its future systems. Koha worried us because there has been a fork in the code base (“fork” is open-source jargon for a split between two rival versions of a system, supported by rival teams of developers, a phenomenon which has plagued open source for decades, as anyone with experience of the many “flavours” of UNIX operating systems will testify). The two links above lead to the two Koha camps. Despite the obvious success of the Staffordshire implementation, it is very hard to see where this system will be in five or ten years’ time. Whilst we want to build a consortium system, Evergreen is written for a very different type of consortium and would require considerable re-engineering. Even if this were easy, the worries about Unicode rule it out for those of our libraries which have large holdings of material in non-western scripts and languages.

And then there was Kuali OLE

Curiously, none of us had looked at Kuali OLE until about a week before the horizon-scanning exercise. The Kuali Foundation has been working on open-source systems for university administration for quite some time now, but the OLE (Open Library Environment) initiative is relatively new and still at a beta release (“beta” is software jargon meaning a system which is under development, useable for testing and evaluation but not certified for full production services).

A librarian from Penn State University happened to be in London to visit JISC (Kuali OLE is collaborating on the Knowledge Base Plus project) and was able to come and make a presentation. It is no exaggeration to say that, by the time he sat down, a fair number of us were thinking that here, finally, is a system which makes sense. Here is something which genuinely changes the way in which we can think about our library systems and offers us opportunities which the other options barely touch. Here is a paradigm shift.

Three things about Kuali OLE persuaded us to make it our preferred option:

  1. a group of large university libraries looked at the LMS field and, having decided that nothing on offer addressed their functional requirements, set about building one based on a systematic and detailed analysis of their library workflows, using a combination of their own resources and a large grant from the Mellon Foundation;
  2. the Kuali system, developed and maintained by the Kuali Foundation, has interoperability at its core, offering extensive modularity based on the primary principle that data should be managed once, in the appropriate place, and that software modules should be able to address that data directly rather than importing and replicating it (a minimal sketch of this idea appears below);
  3. whilst offering its software through the open-source model, Kuali is a membership organisation with strong governance and high levels of assurance about both the quality and longevity of its systems, and members have direct and continuing input into the choices about the development of the software and systems.

(Anyone who has struggled to get a vendor interested in their library’s requirements for changes to, or developments of, its LMS will understand why the third point is so significant.)
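
For readers who prefer to see the second point in code, the following is a minimal sketch of the “manage data once” principle, using invented class and method names rather than the actual Kuali OLE interfaces: a circulation module keeps only a reference to a record held in a shared store and fetches it on demand, instead of importing and replicating its own copy.

```python
# Minimal sketch of the "manage data once" principle. The classes and
# method names are invented for illustration; they are not Kuali OLE APIs.

class SharedDocumentStore:
    """The single authoritative home for bibliographic records."""

    def __init__(self):
        self._records = {}

    def put(self, record_id, record):
        self._records[record_id] = record

    def get(self, record_id):
        return self._records[record_id]


class CirculationModule:
    """Addresses the shared store directly instead of copying its records."""

    def __init__(self, store):
        self.store = store
        self.loans = []  # holds record identifiers only, never record data

    def check_out(self, record_id, borrower):
        title = self.store.get(record_id)["title"]  # fetched, not replicated
        self.loans.append((record_id, borrower))
        print(f"Checked out '{title}' to {borrower}")


store = SharedDocumentStore()
store.put("bib:001", {"title": "The Odyssey", "creator": "Homer"})
CirculationModule(store).check_out("bib:001", "reader@example.org")
```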

In summary, Kuali OLE provides us with the opportunity not only to build a truly next-generation LMS, but to approach levels of cooperation (through the focus on interoperability and data-sharing) which go well beyond the scope of a simple “shared-service LMS”.

Where next?

We’ve made our decision in principle. Looking back, it has been quite an adventure, with an outcome none of us expected. What is clear though, is that we have done the easy part. Building the service comes next. Watch this space …

John Robinson
Director of Library & Information Services
SOAS, University of London

Writing on behalf of the BLMS Consortium
