Category Archives: BLMS

SOAS Library is the third live installation of Kuali OLE

On Monday 13th April 2015, the SOAS Library became the third library in the world and the first library outside the US to move to the Kuali Open Library Environment (OLE).  This is one of the first major outcomes of the BLMS project.

Other libraries in the UK, the US and Germany will soon be implementing this new system, which represents a new approach to the design, delivery and support of library management systems.

More details can be found on the Kuali Foundation web site.

John Robinson
Director of Library & Information Services
SOAS, University of London

The development of our LMS specification. Part 2: Why?

This is part 2 of an update from the Bloomsbury LMS project team about the development of our LMS specification and is especially relevant if you are planning an Open Source solution for your library management system. A PDF version is available for printing and reading offline.


1. Context

In 2012 a group of the Bloomsbury Colleges and Senate House Libraries found they had a common vision for a 21st Century LMS shared service.

As captured in Part 1 of this case study, the project team worked through a robust but agile process of developing a Functional Specification – agreed and prioritised to suit all six consortium partners.

Part 2 of this case study takes a more generic view of the purpose of a Functional Specification. This comes at a time when HE libraries may well be questioning the value of specifications in a marketplace where every next-generation system can claim to deliver the same functionality.
Checking that value is more than valid – the time, effort and commitment required to develop a Specification are significant. Even adapting another institution’s specification, or adopting something like the “Unified (‘Next Generation’) Library Resource Management System”, still takes time to validate and tailor the content.

2. What is the purpose of a specification?

A 21st Century LMS is best seen in the setting of enterprise IT – part and parcel of the business and administration systems that underpin the purpose of a university.

A specification is not simply about the selection and tender process for a commercial supplier (whether for software, consultancy or other services).

The specification is an important part of any selection process, but offers far more to the institution in terms of a successful implementation and operational service.

Specifications underpin critical factors such as:

  • Process mapping and understanding
  • Ownership and stakeholder engagement
  • Cultural change
  • Testing, above all
  • Acceptance and sign off
  • Operational services

3. Does an Open Source solution need a specification?

Unequivocally, yes.

As above, specifications have a far wider role to play than basic selection of a new commercial system or service.

In some ways, the specification is more important as there may well be more development or implementation work to get an open source system fully operational. You, as the owner of the system, need to know your requirements inside out to make sure they work as needed, can be adapted within the constraints of the software, or can be developed using a third-party product.

The benefits of having a specification as listed above are equally valid for open source, in particular:

  • How do you know what to implement without having clearly unpicked critical processes?
  • How do you know what to test?
  • How will you be able to support the operational system?

You don’t have the same safety net on the open source front that you have with a commercial system.

Commercial support and other services related to previous or next generation LMSs may be poor in reality, but you have a contract. In theory at least, there is legal comeback as a last resort.

That said, unless an institution is fortunate enough to have a full-blown systems development capability, external support services in some form will be required for successful implementation of this enterprise solution. Even if that just takes the form of an IT contractor or three.

Once more the value of a specification comes into play – so all parties involved in developing the new system are working to the same script.

Open source or not.

Written by Sharon Penfold, BLMS Project Manager.

The development of our LMS specification. Part 1: What, how and why?

This is part 1 of an update from the Bloomsbury LMS project team about the development of our LMS specification. A PDF version is available for printing and reading offline.

What, how, and why?

1. What?

In 2012 a group of the Bloomsbury Colleges and Senate House Libraries found they shared a common vision for a 21st Century LMS. The goals were clear and simple:

  • A flagship shared service model
  • Shared access to resources
  • Interoperability

During the summer of 2012 a scoping, scanning and feasibility exercise was carried out – ending with a Functional Specification, Option Analysis and Options Costing for review in October.

Commercial and open source solutions and providers were assessed in the context of their closest match to the vision, goals and strategic directions of the consortium partners.

2. How did we do it?

  • Specification process and sources

The starting point to get the juices flowing on the Functional Specification was the “Unified (‘Next Generation’) Library Resource Management System” document. The project team used the text-based version of this ‘Unified Spec’ (our shorthand) to focus discussion. (This document was generously shared by another university, courtesy of Ken Chad’s LibTechRFP site.)

Whilst in many ways superseded, the “United Kingdom Core Specification for Library Management Systems (LMS) UKCS Version 3” was also used on a few occasions, to fill in gaps on core library processes that weren’t reflected in the ‘Unified Spec’.

For the more 21st Century elements of our requirements, sources were more varied and heavily reliant on the skills, experience and vision of those involved. Whatever the source, it often acted as a prompt to say “No, we don’t want that, but we do want this”.

  • Starting point: structure

The first step in the “we do want this” process was agreeing the structure of the Specification – based on discussions around the concept of a 21st Century LMS.

Discovery and Resource Management were included on top of core library processes such as Cataloguing and Circulation. These basics of library operations are still critical and effectively format-agnostic: even if the formats and mechanisms of these processes, and the resources concerned, are fundamentally electronic, the principles don’t change.

  • End point: review and prioritisation

After a number of weeks of meetings of the six consortium Systems Librarians, a review of the Specification was used to prioritise each requirement.

This version was then used to check in with the various library specialisms, where staff had been involved along the way to ensure that wishlists and essentials were covered from the professional point of view.

  • Bigger picture

Going far wider than a traditional specification, the context of the Bloomsbury LMS was set and rounded by including aspects such as:

  • A BLMS concept diagram
  • Key usage statistics and existing library systems across the consortium
  • The enterprise context (business systems that interact with LMSs and beyond)
  • Technical requirements at a high level

3. Why?

The benefits of the process followed and its outputs are far wider than the goal of delivering a successful operational system.

The Functional Specification has a longer term and more extensive role to play in the project than a simple selection tool for the most appropriate system.

This 21st Century shared service LMS will succeed as much on the aspects that traditionally fail in major technology projects as on the system itself, i.e.:

  • Ownership
  • Cultural change
  • Process understanding
  • Testing

The Functional Specification for the Bloomsbury LMS project is intended to support all of these aspects, and more – not just to select the software to deliver it.

Written by Sharon Penfold, BLMS Project Manager.

Jisc LMS Change guest blog post on the BIC ‘Battle of the Library Systems’

BLMS Project Manager Sharon Penfold has a guest blog post featured on the Jisc LMS Change blog: Highlights, lowlights and what next – Report from BIC’s Battle of the Library Systems, 28 November 2012.

This is a summary of the BIC ‘Battle of the Library Systems’ event. This was a debate around a house motion of “Open Source is about distributed innovation and will become the dominant way of producing software”, featuring expert panels representing an ‘open’ and a ‘proprietary’ team.

We scanned the horizon and found something interesting

Several times recently we have been asked to explain how we arrived at our “decision in principle” to commission a new LMS based on Kuali OLE. At some point we will publish a full report on our project and the processes we used — once we are up and running — but it is worth taking stock at this point, where we are about a third of the way into what is turning out to be quite an adventure (with — it seems — a lot of people cheering us on from the sidelines).

We started the BLMS project in June 2012 with the appointment of a project manager who worked with us to define an initial workplan. One of the first pieces of work was to undertake a “horizon scan” to determine which technology we should be considering and which process we should use to get that technology up and running. A key decision which we needed to make as early as possible was whether we were going to build, commission or procure the new system. Alongside this process we set about writing the detailed technical specification and functional requirements document against which the new system will be specified. The processes we used were designed to involve as many people as possible from the six potential partners in the proposed shared service.

As an aside, we wanted the technical work to be done as early as possible to provide each partner with something to take away from the project, should the shared approach prove unworkable for any reason. All of us have systems which are approaching end-of-life and in a traditional approach, would be planning for re-procurement. We wanted to be sure that the investment each partner was making into the project would not be wasted if any one of us had to fall back onto procuring a new system on our own. Given the amount of work involved in writing a detailed specification for a new system, the benefits of having the systems librarians from the six partners working together on the specification, led by the project manager, were considerable.

What is this “horizon scan” business?

A search for this term yields a rich harvest. On the Science and Technology Facilities Council (STFC) web site, for example:

The Government Office for Science defines Horizon scanning as “The systematic examination of potential threats, opportunities and likely developments including but not restricted to those at the margins of current thinking and planning”. Horizon scanning may explore novel and unexpected issues as well as persistent problems or trends.

The OECD also defines the term:

Horizon scanning is a technique for detecting early signs of potentially important developments through a systematic examination of potential threats and opportunities, with emphasis on new technology and its effects on the issue at hand. The method calls for determining what is constant, what changes, and what constantly changes. It explores novel and unexpected issues as well as persistent problems and trends, including matters at the margins of current thinking that challenge past assumptions.

Our horizon scan exercise turned out to be extremely useful. As Library Management Systems have — historically — tended to involve commitments of ten years or more (most of the systems we want to replace date from the late 1990s), we wanted to ensure that we selected the best technology to take advantage of the 21st-century information environment. We wanted to know the vision of the vendors of our current systems, and also to find out what options exist among the emerging open-source systems. We organised two separate days: a vendor day and an open-source day. We invited the four vendors with whom at least one partner has a current contractual relationship to present their next-generation LMS. We made it as clear as possible to the vendors that we were not in procurement mode and this was not an invitation to make a sales pitch. All four of them managed to stick reasonably close to the brief. We gave each vendor a session lasting 90 minutes (45 minutes for presentation, 45 for questions) with a group of about 35 people including systems librarians, subject librarians, heads of service and library directors.

On the open-source day we had three sessions: one from a supplier who has built systems based on Koha (and who had previously demonstrated Evergreen to us); one from The Library Co-op who talked about Koha and also said some very interesting things about migration strategies should we decide to use Koha as a stepping stone to something else; one from Kuali OLE. All of our staff who attended the sessions filled in questionnaires in which they scored the presentations against a range of parameters derived from our strategic, operational and technical drivers.

This is not just about open-source

A number of colleagues have written extensively about the benefits of open-source software (or free-and-open-source, aka FOSS) and there is no need to cover that ground here. We had spent quite some time during 2011 looking at Koha and Evergreen as prominent examples of open-source LMS, including a very encouraging presentation from the University of Staffordshire Libraries which operate “the first UK academic implementation of the Koha open source Library System”. It is clear that the emergence of viable open-source approaches to LMS is changing the field quite dramatically but the full extent of that change has yet to be realised. This is fundamentally a change in the technology and what it enables, but adopting a new LMS is about more than just the choice of technology. There are systems architectures, service models, support models, hosting arrangements and the complex business of data migration which must also be taken into account, not to mention the permutations which arise in seeking a consortium, shared-service approach.

One thing which is clear is that a choice between an open-source or vendor-based, proprietary solution is not about saving money. As consortium partners we made a decision early-on that our project is designed to obtain better outcomes for the same levels of expenditure, through a shared-service approach and an open mind about the new possibilities which modern technologies offer. We are also clear that a significant capital investment will be required to realise our vision.

A choice of systems architecture — vendor approach

The primary outcome of our horizon-scanning sessions was to demonstrate that we have a choice of systems architecture. Between the four vendors (which we can’t name because they have asked us not to), three clear and quite different models emerged:

  • a business-as-usual approach in which the primary concern of the vendor is to retain its existing customer base, consolidating onto a single, traditional LMS (the outcome of a number of mergers and acquisitions over the past decade) which operates on the proprietary, “silo” model with the only variation being a choice between customer-based or vendor-based hosting arrangements;
  • a proprietary system which attempts to take on board some of the opportunities offered by the new information environment, most notably the use of a fully-hosted solution where customer bibliographic data is combined into a large-scale database which, combined with proprietary resource discovery tools, offers a much richer environment for the end-user (provided the customer is happy to move from the silo to a walled garden);
  • a novel approach in which a supplier which already owns a vast bibliographic database proposes to wrap around it an open API (Application Programming Interface) which enables customers to build or commission services.

What we observed was that suppliers of software seem to have woken up to the idea that the main focus is now data and content (as evidenced by the supplier who starts from this position). We gained the distinct impression that the suppliers who start from the position of owning software are as interested in us for our data as for our money (although they obviously want both). Given that, between us, the consortium holds millions of monographs, serials and special collections, we represent an enticing prospect.

The model of an API wrapped around billions of bibliographic records is a fascinating one, but the lack of mature services suggests that a major programming and development effort is required to benefit from this (the opportunities created by the existence of the open-APIs should not, however, be underestimated).
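To make that open-API model a little more concrete, here is a purely illustrative sketch in Python. The supplier and its API were not named, so the endpoint, field names and response shape below are all invented; a customer-built service would essentially be a thin client that composes lookups against the supplier's API and extracts the fields its local systems need.

```python
import json
from urllib.parse import urlencode

# Hypothetical base URL -- the vendor's real API was not named in the presentations.
BASE_URL = "https://api.example-bibdata.org/records"

def build_record_url(identifier: str, fmt: str = "json") -> str:
    """Compose a lookup URL for a single bibliographic record."""
    return f"{BASE_URL}/{identifier}?" + urlencode({"format": fmt})

def parse_record(payload: str) -> dict:
    """Pull out the fields a local discovery layer might need
    from a (hypothetical) JSON record."""
    record = json.loads(payload)
    return {
        "identifier": record.get("id"),
        "title": record.get("title"),
        "creator": record.get("creator"),
    }

# A canned response stands in for a live HTTP call in this sketch.
sample = '{"id": "bib123", "title": "Example Title", "creator": "A. Author"}'
print(build_record_url("bib123"))     # the URL a client would request
print(parse_record(sample)["title"])  # prints "Example Title"
```

The point is architectural rather than syntactic: the bibliographic data stays with the supplier, and the customer's development effort goes into clients and services built on top of the API.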

Vendors which offer the business-as-usual approach seem to be relying primarily upon the inertia of their customer bases.

A choice of systems architecture — open-source approach

We saw three models, which we can name because they have all published their details on-line:

  • Koha (the name means “gift”), originally written by a consortium of New Zealand local libraries who were not satisfied with the commercial offerings — see LibLime Koha, or the Koha community — which is a fairly traditional LMS but with the many benefits of the open-source approach;
  • Evergreen, developed by the US State of Georgia as a consortial LMS for its state library system, which is built around a “main library, branch library” model, again with open-source benefits although (problematically for some of us) lacking in robust Unicode support;
  • Kuali OLE, based in the Kuali Foundation, which (as we had not looked closely at it before) genuinely astonished us.

Until we saw Kuali OLE, it is fair to say that many colleagues were inclined to the view that our best option would be to go with the vendor who has the most persuasive vision of its future systems. Koha worried us because there has been a fork in the code base (“fork” is open-source jargon for a split between two rival versions of the system, supported by rival teams of developers, which has plagued open source for decades, as anyone with experience of the many “flavours” of UNIX operating systems will testify). The two links above lead to the two Koha camps. Despite the obvious success of the Staffordshire implementation, it is very hard to see where this system will be in five or ten years’ time. Whilst we want to build a consortium system, Evergreen is written for a very different type of consortium and would require considerable re-engineering. Even if this were easy, the worries about Unicode rule it out for those of our libraries which have large holdings of material in non-western scripts and languages.

And then there was Kuali OLE

Curiously, none of us had looked at Kuali OLE until about a week before the horizon-scanning exercise. The Kuali Foundation has been working on open-source systems for university administration for quite some time now, but the OLE (Open Library Environment) initiative is relatively new and still at a beta release (“beta” is software jargon meaning a system which is under development, usable for testing and evaluation but not certified for full production services).

A librarian from the University of Pennsylvania happened to be in London to visit JISC (Kuali OLE is collaborating on the Knowledge Base Plus project) and was able to come and make a presentation. It is no exaggeration to say that, by the time he sat down, a fair number of us were thinking that here, finally, is a system which makes sense. Here is something which genuinely changes the way in which we can think about our library systems and offers us opportunities which the other options barely touch. Here is a paradigm shift.

Three things about Kuali OLE persuaded us to make it our preferred option:

  1. a group of large university libraries looked at the LMS field and, having decided that nothing on offer addressed their functional requirements, set about building one based on a systematic and detailed analysis of their library workflows, using a combination of their own resources and a large grant from the Mellon Foundation;
  2. the Kuali system, developed and maintained by the Kuali Foundation, has interoperability at its core, offering extensive modularity based on the primary principle that data should be managed once in the appropriate place and software modules should be able to address that data directly rather than importing and replicating it;
  3. whilst offering its software through the open-source model, Kuali is a membership organisation with strong governance and high levels of assurance about both the quality and longevity of its systems with members able to have direct and continuing input into the choices about the development of the software and systems.

(Anyone who has struggled to get a vendor interested in its requirements for changes to or developments of its LMS will understand why the third point is so significant.)

In summary, Kuali OLE provides us with the opportunity not only to build a truly next-generation LMS, but to approach levels of cooperation (through the focus on interoperability and data-sharing) which go well beyond the scope of a simple “shared-service LMS”.
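The “manage data once, address it directly” principle from point 2 above can be illustrated with a minimal sketch (the class and method names here are ours for illustration, not Kuali OLE's): two modules hold a reference to a single authoritative record store instead of keeping private copies, so a correction made through one module is immediately visible to the other, with no import or replication step.

```python
class RecordStore:
    """Single authoritative home for bibliographic data."""
    def __init__(self):
        self._records = {}

    def put(self, record_id, data):
        self._records[record_id] = data

    def get(self, record_id):
        return self._records[record_id]

class CirculationModule:
    def __init__(self, store):
        self.store = store  # a reference to the shared store, not a copy

    def item_title(self, record_id):
        return self.store.get(record_id)["title"]

class CatalogingModule:
    def __init__(self, store):
        self.store = store

    def correct_title(self, record_id, new_title):
        # One update in the one authoritative place...
        self.store.get(record_id)["title"] = new_title

store = RecordStore()
store.put("b1", {"title": "Teh Histories"})
circ = CirculationModule(store)
cat = CatalogingModule(store)
cat.correct_title("b1", "The Histories")
# ...is visible to every module with no re-import or replication.
print(circ.item_title("b1"))  # prints "The Histories"
```

The contrast is with the traditional silo model, where each module (or system) holds its own copy of the data and corrections have to be propagated by export, import and reconciliation.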

Where next?

We’ve made our decision in principle. Looking back, it has been quite an adventure, with an outcome none of us expected. What is clear though, is that we have done the easy part. Building the service comes next. Watch this space …

John Robinson
Director of Library & Information Services
SOAS, University of London

Writing on behalf of the BLMS Consortium

BLMS solution – decision in principle

The Bloomsbury Library Management System consortium has made a decision in principle to develop its 21st Century LMS using Kuali OLE open source software as a platform.

Extensive options analysis and specification work over summer 2012 have indicated that an open source solution will offer the most flexible and future-proof direction to deliver the visionary shared service.

Strategically, Kuali OLE fully supports the direction and goals of the consortium members whilst also providing best value for money in terms of project and recurrent costs.

Technologically, the roadmap for Kuali OLE and the underlying enterprise technology is delivering a truly next generation system.

Detailed planning and specification work will continue during the remainder of 2012. The programme of development work will continue during 2013, with a pilot service targeted to go live in late 2013.

For further information, please contact the Project Manager:

Sharon Penfold
25th October 2012