Kuali OLE at the University of Chicago Library

Last updated October 17, 2014.

Case Study on Open Source Selection and Application

Frances McNamara, Director, Integrated Library Systems and Administrative and Desktop Systems
Stuart Miller, Library Systems Analyst
Tod Olson, Systems Librarian
University of Chicago Library


This case study documents the processes at the University of Chicago Library (Library) that ultimately resulted in the decision to replace the Library’s proprietary systems – SirsiDynix’s Horizon, Innovative Interfaces’ Millennium Acquisitions and Serials Solutions’ AquaBrowser – with the open source VuFind user interface and Kuali OLE (Open Library Environment).


This study is based on internal documentation and interviews with Library staff, complemented by links to documents available online.


This study will seek to highlight the successes, failures and lessons learned during the replacement project.

Research Limitation

The Library is not moving its new systems into production until July 2014. Final determination of success depends on the Library’s planned goal of shutting down its legacy systems by January 2015.

Practical Implications

Other academic libraries considering open source systems may find the University of Chicago Library’s experience of interest.


The Library will be one of the first two adopters of Kuali OLE. This study may help other institutions evaluate and plan for similar system migrations.

In addition to this web format, the case study is available as an ePub and a PDF file.

Background and Origins

Last updated September 27, 2014.

The University of Chicago Library (the Library) houses a large research collection of 11.9 million volumes in six libraries on the university’s main campus in Hyde Park, a neighborhood of Chicago. Details on the size and content can be found at http://www.lib.uchicago.edu/e/about/.

Basic facts about the University of Chicago (UChicago) can be found at http://www.uchicago.edu/about/. The Library serves a student body of 5,692 undergraduate and 9,502 graduate, professional, and other students; 2,186 full time faculty; and 9,500 staff members at the University of Chicago Medicine (details can be found at http://www.uchospitals.edu/about/fact/hospitals-sheet.html).

In the early days of library automation beginning in the 1970s, the Library developed its own mainframe-based system (the Library Data Management System, or LDMS), run on an IBM machine housed in the campus computing data center. When it was decided to retire the mainframe, the Library concluded that purchasing a commercial system was the most viable option and migrated to the Horizon system in 1995. (Horizon was originally owned by a division of Ameritech, which sold the software company to Epixtech; Epixtech later changed its name to Dynix and was bought by Sirsi to become SirsiDynix, the current owner.) The acquisitions portion of Horizon never completely met the needs of the Library, so those functions were migrated in 1997 to what is now Innovative Interfaces’ Millennium system (although the Library uses only its acquisitions functions). Providing users with information about on-order titles was accomplished by regularly scheduled exports from Millennium that were then batch imported into Horizon.

Like any client-server system, Horizon used a relational database management system to handle data storage functions; in this case, Sybase was the product and its cost was typically bundled into the overall cost of Horizon. However, the Library was able to obtain better pricing through the University’s existing campus-wide Sybase license. The University’s IT Services unit also negotiated the annual Sybase maintenance cost for the Library. This allowed the Library direct access to Sybase support, which was often helpful for debugging and local development efforts.

As with the LDMS hardware, the Library entered into an agreement with IT Services to run Horizon within the university’s enterprise environment to take advantage of its Storage Area Network, Tivoli Management System for tape backups, and system administration support. Although the Library still had to pay for its hardware and for the support services, this was far more economical than hosting hardware within the Library, which would have required adding at least one full-time system administrator. The Library met regularly with IT Services to deal with any issues, had direct root access to its Horizon servers, and installed software upgrades in coordination with SirsiDynix and IT Services.

The Library also maintained separate test and reporting environments of the Horizon software that allowed for installation and testing of new software releases prior to moving them into production. This could be done under both the Horizon and Sybase software licenses at no additional charge. It also allowed Library staff to develop specialized local applications “around the edges” of Horizon, frequently without vendor involvement. For this reason, Horizon, although proprietary, was a comparatively “open” system as installed at the University of Chicago.

In contrast, Innovative Interfaces’ Millennium was a black box. It resided on its own server in the Library’s system environment with its own tape backup. Upgrades and customizations all had to be performed by vendor support staff, who required direct access to the system. While the Library would have preferred to have a test environment (as it did with Horizon and Sybase), Innovative Interfaces charged a comparatively high price for a second copy, so it was decided not to support one.

The Library had always been aware of the cost of supporting multiple systems, and it had always been clear that continued support for multiple systems was not in the Library’s best interests, either functionally or financially. So when SirsiDynix began work on a completely new system called Horizon 8.0 (also known as “Corinthian”) to replace the existing Horizon AND include desired acquisitions features, the Library agreed to be a development partner, working on specifications and testing with vendor staff from 2005 to 2007. Library staff were trained on Corinthian in the summer of 2006, but the lack of some critical features caused a postponement of the migration. During this period, ownership of SirsiDynix changed twice, but there were repeated commitments to finish Corinthian. However, in February 2007, SirsiDynix announced it would stop Horizon 8.0/Corinthian development.

Experience gained by Library staff on this project was later helpful in work on the Kuali OLE project. During the Corinthian project, the Library contributed to and reviewed functional specifications and participated in testing of the software during development. Staff became familiar with the JIRA bug-tracking system used to communicate with software developers to fix problems, and with the process of frequent upgrades to a test system to verify fixes. Weekly calls with the product manager and various project management tracking tools were used. Analysis for data conversion to that system was also done, identifying data cleanup issues that needed to be addressed prior to migration to any other system.

Horizon had several online public access catalog (OPAC) modules over the years until 2002, when the Horizon Information Portal (HIP) was introduced to replace them; HIP became the sole end-user interface for the Library. By 2006, there had been technical advances in other library search and retrieval systems, and the Library began an investigation of newer, “faceted browse” search systems. In November 2006, the Library issued an RFP and eventually selected AquaBrowser after an extensive review (see http://www.lib.uchicago.edu/staffweb/depts/ils/projects/faceted-browsing/ for more details).

While providing many advances over HIP, Lens (the locally adopted name for AquaBrowser) was never a full replacement due to gaps in functionality, e.g., no support for non-Roman searching, so the Library continued to support both HIP and Lens. Then AquaBrowser was bought by Serials Solutions, which announced plans to offer the product only as a hosted service without any of the many customizations the Library had already made to it; there were no plans to support non-Roman searching on the locally hosted version of the product. Also during this period, the Library made the EBSCO Discovery Service (EDS) available to users on a trial basis; their favorable reaction led the Library to select it as a replacement for the Ex Libris Metalib system, a Z39.50-based system that sent searches across multiple databases and federated the results into a single list.

By the time the Library began to consider system replacements, the public catalog functions had been to some extent abstracted from the ILS to a discovery layer. Even HIP was a separate system on a separate server to which records were passed from the Horizon system to be indexed and displayed. AquaBrowser also worked by indexing records exported from Horizon and using servlets to reach into Horizon for some dynamic information such as item status.

However, supporting two separate user interfaces, HIP and Lens, plus Horizon and Millennium, was becoming increasingly onerous. In addition, with the demise of Horizon 8.0/Corinthian, the Library decided not to move beyond the Horizon 7.5 release; any new development planned for HIP would require moving to new Horizon releases. This effectively meant that the Library would not be able to install any new versions of HIP. Changes in the ownership of AquaBrowser and terms of service had made its continuation unattractive as well.

Integrated Library System Renewal

In addition to the general undesirability of supporting multiple systems, the technical platform for Horizon became more expensive and more difficult to support as time went on. The Sybase license required an AIX platform; while AIX had previously been used extensively by many other university applications, thereby reducing everyone’s annual maintenance, most of those applications migrated to more cost-effective hardware and software. As a result, the Library’s annual maintenance for Sybase doubled. The only other relational database management system supported by Horizon was SQL Server, and this was not considered a good choice for the size of the Library’s database or for the environment where the Library wanted to run it.

Another problem was that the Horizon database was never upgraded to Unicode, which caused an increasing number of issues as the Library has an extensive collection of titles in non-Roman alphabets. The Library had to run imported record files through conversion programs; while these generally worked, the pre-Unicode standards applicable to Horizon did not always result in completely correct translations. In addition, following the demise of Horizon 8.0/Corinthian (which was Unicode-based), SirsiDynix announced that it would NOT convert Horizon 7.x to Unicode. The only option was to move to the vendor’s Symphony system, which was Unicode-compliant. However, following an assessment by Library staff, that system was judged more or less functionally equivalent to Horizon 7.x; the complexity and cost of a system migration could not be justified if the new system was merely “equivalent” to the old, so this was not considered a good solution.

Environmental scan of commercial systems

Following the halt of Horizon 8.0/Corinthian development, the Library began an immediate scan of the ILS environment in early 2007. To do a quick survey of possible replacements, the Library’s systems staff arranged for conference calls with the University of North Carolina at Chapel Hill to discuss their implementation of Innovative Interfaces’ Millennium and with Boston College Library for a discussion of its implementation of Ex Libris’ Aleph system. A tabulation of the results of those discussions is at: http://www.lib.uchicago.edu/staffweb/depts/ils/projects/ilsreplacement/systemsevaluation.html.

Contacts were also made with Oxford University and New York University about their work with VTLS on its new system, but the discussions led to the conclusion that this was not a production-ready option.

Of all the options available at that time, only Aleph seemed reasonable given our requirements, and Ex Libris did an on-site, day-long demo for the Library. Its preliminary, informal price quote was surprisingly low, but a later formal quote was significantly higher. In the end, the Library could not justify such a significant investment on a system that, while meeting most of the functional requirements, would not give the Library a more technologically up-to-date platform than it already had.

Ex Libris invited the Library to be a development partner on what eventually became Alma, its next generation ILS. However, the Library was reluctant to commit to another vendor’s new system development project. The fact that Ex Libris’ ownership changed in 2007 (bought by a private equity firm, as SirsiDynix had been) was also not encouraging. So the Library declined to participate.

By this time, other developments were suggesting other alternatives.

Requirements documents from Library functional groups

Around 2007, open source ILS products were just beginning to attract attention as alternatives to the commercial products. Evergreen and Koha were the open source library systems that seemed to have the most promise, and information about both was actively collected by the Library. Since systems staff were aware that any new ILS would undoubtedly have certain functional gaps, groups of Library staff were asked to identify the core functional requirements for their areas. Work had already begun on this analysis during the Horizon 8.0/Corinthian development, as it had been necessary to enumerate critical requirements in order to verify that the system would be able to replace existing systems. Before doing a “gap analysis” on the existing open source systems, we asked staff to formalize these lists of requirements. These lists can be found in a chart that links to Word documents at: http://www.lib.uchicago.edu/staffweb/depts/ils/projects/ilsreplacement/.

Technical investigation of open source

Investigation of Evergreen proceeded with discussions with its developers and with some academic libraries, including McMaster University and Project Conifer, a group of academic libraries in Ontario considering developing Evergreen sufficiently to enable its use in academic settings. While Evergreen was developed by and for a consortium of Georgia public libraries, it seemed to have the potential of being scalable and workable as a platform for development of additional features. It was promising enough and mature enough that the Library’s systems staff installed the software and loaded a full copy of the Library database to allow staff to do a gap analysis. Results of that analysis can be found in the chart at: http://www.lib.uchicago.edu/staffweb/depts/ils/projects/ilsreplacement/ with reports by various functional groups. Note that the comments sometimes give an assessment of what it would take to remedy a gap. [A gap assessment of Koha was not done to this level of detail, so that column does not contain this type of report.] Evergreen looked promising, but it definitely lacked some features essential for academic libraries. For instance, academic libraries have a changing patron file that needs to be updated to reflect enrollment and library privileges; fixed due dates are common; and the capability to recall items is often required. Interfaces with various university systems are also needed, e.g., the ability to export voucher data to university payment systems.

The Library sent Stuart Miller to the VALE (NJ) “Next Generation Academic Library System Symposium” in March 2008, which discussed Evergreen, Koha and some other open source library systems, including VuFind. It was openly acknowledged that SirsiDynix’s decision to abandon Horizon 8.0/Corinthian had created a sudden increase of interest in open source products among libraries.

Contact was also made with WALDO, a consortium of fifteen academic libraries in the New York area (the largest being St. John’s in Queens, NY) that was implementing Koha. LibLime, the support company and the North American release manager for Koha, had completed several enhancement projects funded by WALDO; St. John’s was then in production. A conference call with Joshua Ferraro and John Rose of LibLime was held on April 10, 2008. (See “Some Basics on Koha Discussion with LibLime April 2008” on http://www.lib.uchicago.edu/staffweb/depts/ils/projects/ilsreplacement/.)

Koha’s drawbacks were found to be primarily technical; there were questions about its scalability to much larger databases. No large ARL library had been involved, and there was a suggestion that the software might need to branch to support ARL library workflows. Library systems staff continued to monitor developments, but Koha was eventually dropped as an option. It was considered better to wait for the changes supporting academic library work to be completed before expending the effort on a complete evaluation, which would involve installing a local copy with the full library database. By the time those changes were available, the Library had already decided to participate in the Kuali OLE project.

In 2009, the Library also thought it should review the state of the new OCLC WorldCat library management system; a summary report can be found at: http://www.lib.uchicago.edu/staffweb/depts/ils/projects/ilsreplacement/. Because many features did not yet exist at that time, a detailed gap analysis was not thought to be necessary. In addition, there were doubts that this new product could achieve acceptable performance levels, and the Library’s past experience with the quality of OCLC support and services led to skepticism about its ability to provide even an acceptable level of support for a mission-critical service such as an ILS. The fact that the Library is open until 1 AM on weekdays and is heavily used late at night and on weekends tends to influence decisions about whether a hosted service makes sense. Library systems staff do provide support nights and weekends, and it was felt this level of support would be difficult to afford via a hosted service.

Invitation to participate in OLE project phase 1

Meanwhile in 2008, the Andrew W. Mellon Foundation funded a project to design an “Open Library Environment” and work began in August 2008. University of Chicago participated in this project, along with several other large academic libraries, and hosted one of the regional meetings in December 2008. Documents from this meeting including a useful OLE Overview presented by Jim Mouw, Associate Director for Collection Services, University of Chicago Library, can be found at: http://www.lib.uchicago.edu/staffweb/depts/ils/projects/ilsreplacement/OLEUChicworkshop.html. Work on the first phase of this project included training in Business Process Management and production of Tasks and Process Maps for each module of a library system. This resulted in a design document that was presented to Mellon in early summer of 2009.

Interest in OLE also came from Greg Jackson, then director of the University’s IT Services. The OLE Project had discussed the need for a new ILS to work at an enterprise level; for example, a library’s need for patron data would be satisfied by pulling data from a campus database rather than maintaining a duplicate set of patron records as all current ILS systems do. Inspired by a conversation with Library Director Judi Nadler, Jackson raised this issue at a meeting of the Common Solutions Group (CSG – http://www.stonesoup.org/ – a network of campus CIOs). Clearly, an “enterprise level” ILS capable of direct interface with other campus applications would require cooperation and input beyond libraries. CSG, while interested, did not “adopt” this as something that the group could fully support – not because it was undesirable, but because most institutions were just not ready for it. Any OLE design would need to recognize this. This did prompt OLE to consider that repurposing what already existed might be the most practical way to approach a new ILS, e.g., taking Evergreen and adding development suitable for academic libraries (although after review, OLE decided to use Kuali Rice instead). OLE also began to consider whether it should ultimately become a Kuali Foundation project, partly to facilitate cooperation with other open source campus application developers.

Decision to become Kuali OLE partner

At that point, a “build” project for OLE had not been funded, but was in the offing. The Library was concerned at that time that the Horizon system might not be supported by SirsiDynix beyond 2013. [Note: As it turned out, Horizon is still supported by SirsiDynix, and there is, as of March 2014, no announced “end of life” for Horizon.]

Because the Library was not ready to choose a true replacement for the ILS, a “bridge” solution was proposed in October 2009. The Bridge to the Future recommendation (http://www.lib.uchicago.edu/staffweb/depts/ils/projects/ilsreplacement/) summarized the possible options: implement Evergreen; stay on Horizon as is; or upgrade to Sybase 15 and Horizon 7.5.1. It was later found that the cost of the Sybase license was half what was originally estimated, and the final option was eventually implemented. An added concern for the Library at that time was the construction of the Mansueto Library, planned to open in 2011. The Mansueto Library was built to house lesser-used portions of the collection in a structure physically connected to the Regenstein Library. Transfer of parts of the collection to this library allowed newer materials to continue to be shelved in the open stacks, while avoiding the problems associated with offsite storage. There was a requirement that the automated storage and retrieval system in Mansueto interface with the library system to allow users to seamlessly request items in storage. This would have to be done first for Horizon and then with whatever its replacement would be.

In order to apply for Mellon Foundation funding to build the OLE software, it was necessary for a group of libraries to agree to be founding partners and to contribute matching funds. In 2009, the Library decided to become a founding partner, and the grant proposal to build the OLE software was funded by Mellon to run from July 2010 to June 2012 (additional funding was later secured from Mellon through 2014).

The OLE project joined the Kuali Foundation in December 2009. The Kuali Foundation provides the legal and financial framework needed to sustain an open source project, and because all of the Kuali projects come out of the academic environment, it is set up in a way that works well for an academic library.

As with all previous efforts concerning system replacements, the Library’s systems staff recruited assistance from the University’s IT Services department in considering open source software. The Kuali project was already being monitored because of the university systems that fall under its umbrella. While the University of Chicago had no plans to implement other Kuali software at that time, those systems were considered viable options. Identity management staff in IT Services were also beginning discussions about a potential open source identity management system that might be developed.

Decision to become an early adopter

With the decision to become a full partner in the “build” of OLE, the Library had a plan to implement OLE in the summer of 2013. It was thought that the first usable version of OLE would be released in the summer of 2012, and then it would take roughly a year to install, customize, convert data and integrate the system. (Note that a full year was projected because the software would be so new; later implementers should be able to do this in less time.) Not too surprisingly, there were development delays and setbacks, so the Library’s projected implementation date is now July 2014.

The Library was willing to become one of the first OLE adopters for a number of reasons. First, the clock was running out on support for the hardware and software of the legacy systems. In addition to annual support costs for the two systems and the increasing costs for the Sybase license, the Horizon AIX server would soon need to be replaced. Migrating to a more supportable hardware platform would require another license upgrade to run on Linux, and that would be costly. As it turned out, the Library had to replace the HIP server in 2013 because of its advanced age, a project the Library had hoped to avoid.

Second, the Library, as an early adopter, would be able to work directly with the developers to ensure that OLE would work with its large database and that its essential requirements (referred to as its “drop dead” list) would be met. In addition, the Library was extremely eager to move to a Unicode-based system; this basic lack in Horizon was contributing more and more to the overhead support costs.

While any system migration is challenging and complex, the Library believes a migration here is somewhat easier due to a variety of factors. For one, Chicago is far more centralized than many comparable university libraries: all six libraries on campus have used the same system for many years, and all report into one management structure with one director. In addition, there is a single identity management system for users. Library staff are accustomed to participating in review of specifications and beta testing, having done so for Horizon in 1995 and more recently for Horizon 8.0/Corinthian and AquaBrowser. Staff had long been involved in planning for system migration, and many were acting as subject matter experts writing functional requirements for OLE.

Another factor simplifying an OLE implementation was that the Library’s legacy systems had never included Electronic Resource Management (ERM). While OLE includes ERM, it is possible to migrate to OLE and implement the ERM features later.

Discovery Layer Online Catalog

From its inception, Kuali OLE decided NOT to include an online catalog module in its design, due to the wide availability of open source interfaces such as Blacklight and VuFind, two of the most popular. It was assumed that Kuali OLE would support the protocols necessary for connectivity and for users to access “my account”-type features (e.g., items checked out, renewals, lists of requested items, updating addresses, etc.).

So beginning in 2011, the Library conducted a technical evaluation of possible user interfaces and initiated an assessment of user requirements.

Under the leadership of Tod Olson, Systems Librarian, Library systems staff did an investigation of Solr based on the open source systems Blacklight and VuFind to assess the technical feasibility of implementing one of these systems with a Kuali OLE-type database. In January 2012, this produced the Solr Catalog Technical Report (see http://www.lib.uchicago.edu/staffweb/groups/disctools/index.html). To quote the summary findings:

  • Implementation language (user interface level). Blacklight is a Ruby on Rails application. We have no local experience with this platform, and there is a significant learning curve. VuFind is a PHP application, the Library has in-house experience with PHP and there is a lot of support on campus for PHP.
  • VuFind has more of our critical features out of the box, notably browse indexes (title, author, subject, and call no.), a built-in architecture for integrating live circulation data, and a built-in authentication framework. Blacklight is working to close this gap, but currently Blacklight supplies fewer needed features than VuFind.

While either platform would be suitable for building our public front-end for Kuali OLE, VuFind appears to be the better match for the Library. VuFind is the recommended platform.

Note that the recommendation was not so much to implement VuFind as is, but to use it as a platform to develop a user interface for Kuali OLE and replace both HIP and Lens.
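Both candidates are front ends to a Solr index, so the core search interaction reduces to HTTP queries against Solr. As a minimal sketch of that interaction (the base URL, core name, and field names below are hypothetical illustrations, not the Library’s actual schema), a faceted query of the kind a discovery layer issues might be assembled like this:

```python
from urllib.parse import urlencode

def build_solr_query(base_url, query, facet_fields, rows=20):
    """Assemble a faceted Solr /select URL, as a discovery layer
    such as VuFind or Blacklight would issue against its index."""
    params = [
        ("q", query),          # the user's search, in Solr query syntax
        ("wt", "json"),        # ask Solr for a JSON response
        ("rows", rows),        # page size
        ("facet", "true"),     # enable faceting
    ]
    # One facet.field parameter per requested facet (e.g. format, language).
    params += [("facet.field", f) for f in facet_fields]
    return base_url + "/select?" + urlencode(params)

# Hypothetical local index and fields, for illustration only.
url = build_solr_query(
    "http://localhost:8080/solr/biblio",
    "title_full:chicago",
    ["format", "language"],
)
```

Fetching that URL would return matching records plus counts per facet value, which the interface renders as the clickable "faceted browse" sidebar both reports describe.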

Meanwhile, a group of library staff was appointed to collect information from users about their needs and desires for an ideal user interface. Following the agile development concept of collecting user stories, the Library did just that with a series of activities involving a cross-section of Library users. Findings were published in a report (see http://www.lib.uchicago.edu/staffweb/groups/cus/). A quote from the full report describes the methodology:

…With the User Stories method, investigators seek to solicit statements of user need. These stories are typically written in plain, rather than technical, language. Stories are constructed so as to avoid complex or interdependent requirements. Stories do not specify a particular solution, but rather they seek simply to describe the need.

The project team began by reviewing existing sources of data to identify user stories. Sources reviewed include comments from multiple LibQual and Library surveys, user email comments in Knowledge Tracker and Bugzilla, requirements documents from Lens development and from a requirements list developed by Stanford, and from prior usability studies. Approximately one hundred unique stories were drawn from these sources. These stories were categorized, and they informed the design of interview questions and research instruments that were used in subsequent data collection.

Beyond the mining of existing data sources, the project team used several methods to collect data specifically for this study. Library staff conducted twenty individual and group interview sessions, involving a total of twenty-seven participants. Seven interviews were conducted by bibliographers, who recruited faculty from contacts. The remaining interviews were conducted with students from a variety of disciplines and programs, and with one College graduate working as a clinical researcher. These participants were recruited using ads on the UChicago Marketplace site, and on the Library web site, and the Library offered a $15 Amazon.com gift card as an incentive.

Taking these two reports together led the Library to select VuFind as its new “front end” with Kuali OLE as the back end to replace both HIP and Lens. It was decided to introduce VuFind as a “beta test” to library users in early 2014 using the Horizon database. User feedback would be used to fine tune the product. In the meantime, VuFind would be tested against Kuali OLE to ensure that once the latter went into production, VuFind would work with it. Since they would already be familiar with VuFind, library users would not even notice the switch on the “back end” and Library systems staff would have some breathing room between implementing Kuali OLE and implementing VuFind.

Chicago Migration

The Library is currently in the midst of its migration project, with the goal of bringing Kuali OLE and VuFind into production in July 2014. VuFind was introduced to the public in January 2014 as a “beta” system connected to the production Horizon system. It was decided to wait for the OLE implementation to make VuFind the production public interface, since that implementation was pending and it is usually preferable to implement substantial changes over the summer in an academic environment. While VuFind was essentially production-ready against Horizon in spring 2014, the actual migration to it as the production OPAC and the retirement of HIP and Lens were delayed until summer 2014 to coincide with the OLE migration.

In addition to the internal processes to migrate systems, a substantial amount of staff time was contributed to the open source Kuali OLE project itself. In planning for staff resource allocation, it was necessary to recognize the commitment to the OLE project as well as to internal activities. Appendix A lists the required staff contributions by partner sites during this phase of the project. Participation in the project required Library staff to become proficient in the project collaboration tools, including WebEx, Google Docs and JIRA. Some staff were trained in the creation of Selenium scripts, although that effort was eventually abandoned because of constant changes to database structures and screen displays during the very active development cycle for version 1.5; the scripts could not be maintained until more of the development was complete and the user interface was changing less frequently. During this part of the development, the OLE central project’s QA staff were charged with doing the scripting where feasible.

Functional Migration Activities

Last updated September 27, 2014.

To facilitate the Library’s migration process, the University of Chicago Integrated Library System unit (ILS) established an ILS Migration Steering Committee (IMSC) made up of key decision-makers in the Library. The IMSC reports directly to the Library’s Administrative Committee (AdCom); two members of the IMSC are also members of AdCom. The IMSC’s charge and duties are as follows:

IMSC, under the guidance and direction of ILS, exists to make decisions about any matters pertaining to the migration from Horizon and Millennium to Kuali OLE. These matters include, but are not limited to:

  • pre-migration data cleanup
  • data mapping from old to new systems
  • data archiving of historical information NOT to be migrated to OLE
  • OLE configuration settings
  • testing preliminary and final release versions of OLE
  • developing training programs for staff
  • training staff
  • scheduling
  • internal communications

IMSC is the decision-making body for the migration process as a whole and will meet as needed to discuss cross-functional concerns, hear status reports from lead members for the various functional areas, and/or resolve specific functional issues referred to it. It, in conjunction with ILS, will identify and assign migration-related tasks and projects. The IMSC will document its activities on Staffweb and/or Basecamp and provide regular status reports to AdCom (schedule TBD). AdCom provides IMSC with input on any general matters referred to it, and may raise questions, intervene on any issue, or give directives as needed.

IMSC members acting as leads for a functional area are expected to speak for all staff stakeholders in that function (regardless of work unit). As such, each lead member will form an informal subgroup of stakeholders for the purpose of soliciting input and reaching decisions on migration matters related to the functional area. Should a subgroup be unable to reach decisions on any matter, IMSC as a whole will make the decision. Subgroups will meet on their own (with the ILS liaison) as needed and may be invited to meet with the full IMSC as required.

IMSC will also coordinate its work as needed with the Discovery Tools Group as it works to implement the Library's new user interface in conjunction with OLE. The Web Program Director as an IMSC member-at-large will be the Group's liaison.

Following OLE implementation, it is envisioned that IMSC will be replaced by a group to advise ILS on implementation of new OLE releases, define and prioritize enhancements, etc.


All members will be expected to:

  • attend meetings (frequency to depend on nature/scope of pending/current projects)
  • respond to questions, make recommendations, etc., as requested by ILS concerning specific matters related to the migration
  • assist ILS in communicating major decisions to staff
  • perform and/or coordinate assigned tasks

In addition, lead members for a functional area will be expected to:

  • represent the requirements/concerns of staff stakeholders of the functional area (regardless of library work unit)
  • recruit staff stakeholders
  • lead discussions with stakeholders on migration matters and make decisions as required
  • perform and/or coordinate assigned tasks such as data cleanup, review of data mapping documents, test software, etc.
  • determine training requirements for the functional area and assist ILS in identifying trainers and developing training classes

“Lead members for a functional area” refer to those IMSC members who lead “working groups” in the following areas:

  1. Cataloging
  2. Acquisitions
  3. Circulation
  4. Serials Receiving
  5. Reporting

The working groups’ members are key staff members in their respective functional areas and work with a designated ILS staff member on any matters related to moving data and operations from Horizon and Millennium to Kuali OLE for the respective functional areas. VuFind implementation is managed by a separate group that includes technical support from ILS; insofar as a migration of VuFind from Horizon to Kuali OLE is concerned, ILS’ role is primarily technical, although the circulation working group will be involved in testing “my account” functions when these are moved to VuFind/Kuali OLE.

Most of the actual work of performing gap analyses, testing, developing training, etc. is done by members of the IMSC working groups. It is important to note that since ALL Library units use one system, we have explicitly made clear that the working groups “represent the requirements/concerns of staff stakeholders of the functional area (regardless of library work unit)”. This is an effort to make clear that while individual library departments performing the same function (e.g., the Library has four cataloging departments) may have differing policies or procedures, use of the ILS must be the same.

One of the most important tasks of the IMSC is to ensure that staff receive the necessary training in Kuali OLE, training that includes both instruction in how to use the new ILS AND any changes to policies/procedures/workflows that the new system will require. The IMSC has adopted a training plan, included as Appendix B.

Details on our migration – along with links to the IMSC charge, training plan and other documents – can be found at: http://www.lib.uchicago.edu/staffweb/depts/ils/kuali/index.html.

Data Migration and Integration Issues

Last updated September 27, 2014.

For bibliographic and holdings data migration, the fact that the Library was coming from a non-Unicode system required additional work. Kuali OLE also introduced a new type of holding called an EHolding record. Over time the Library had dealt with electronic books, journals and other resources in a variety of ways: in some cases there were 856 fields in the MARC records, and in others there were also copy or item records. When it came time to represent these materials in VuFind, it was decided to take advantage of the migration to OLE to regularize their representation by always using an EHolding record.
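The regularization rule described above can be sketched in a few lines. This is purely illustrative, not the Library’s actual conversion code: the dict-based record shape, the field names, and the heuristic that an electronic-location link or an existing electronic copy record signals an e-resource are all hypothetical simplifications.

```python
# Hypothetical sketch: decide whether a migrated record should become an
# OLE EHolding ("eholding") or an ordinary holding ("holding").
# Record representation and heuristic are illustrative assumptions.

def holding_type(record):
    """Return 'eholding' for electronic resources, 'holding' otherwise."""
    has_link = bool(record.get("fields", {}).get("856"))
    has_e_copy = any(c.get("electronic") for c in record.get("copies", []))
    return "eholding" if (has_link or has_e_copy) else "holding"

ebook = {"fields": {"856": ["http://example.org/ebook"]}, "copies": []}
print_book = {"fields": {"245": ["A printed title"]},
              "copies": [{"electronic": False}]}

print(holding_type(ebook))       # -> eholding
print(holding_type(print_book))  # -> holding
```

Regularizing on a single rule of this kind, rather than carrying forward each historical cataloging variant, is what allows every e-resource to come across as an EHolding regardless of how it was originally represented.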

There were some special problems in data conversion because we were moving records from two separate systems, Horizon and III. These required custom data conversion scripts developed by in-house programmers.
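As an illustration of the kind of problem those scripts solve (a hedged sketch, not the in-house programmers’ code), order data exported from one system must be attached to bibliographic records exported from the other, matched on some shared control number; the field names here are hypothetical:

```python
# Illustrative sketch of merging two source systems: order rows from
# Millennium are attached to bibliographic records from Horizon by a
# shared control number (an OCLC number here). Unmatched orders are
# collected for manual review, as any real conversion would require.

def attach_orders(horizon_bibs, millennium_orders, key="oclc"):
    """Return (merged_bibs, unmatched_orders)."""
    by_key = {bib[key]: dict(bib, orders=[]) for bib in horizon_bibs}
    unmatched = []
    for order in millennium_orders:
        bib = by_key.get(order.get(key))
        if bib is None:
            unmatched.append(order)   # needs data cleanup / review
        else:
            bib["orders"].append(order)
    return list(by_key.values()), unmatched

bibs = [{"oclc": "123", "title": "Example title"}]
orders = [{"oclc": "123", "fund": "GEN"}, {"oclc": "999", "fund": "SER"}]
merged, leftovers = attach_orders(bibs, orders)
print(len(merged[0]["orders"]), len(leftovers))  # -> 1 1
```

The unmatched bucket matters in practice: records that cannot be linked automatically are exactly the cases that drive the pre-migration data cleanup the IMSC is charged with.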

MARC authority records were not planned to be supported in Kuali OLE until after the implementation of the early adopter sites. For a number of years, the Library had sent new bibliographic records to Backstage Library Works and received back corrected bibliographic records along with new and corrected authority records. It was necessary to adapt this practice: corrections for bibliographic records would continue to be loaded into Kuali OLE, but authority records would be loaded only into VuFind, where they are used for cross references in the browse indexes.
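The routing decision can be illustrated with a small sketch. In MARC 21, authority records carry “z” in Leader position 6, which is one way a load process can tell the two record types apart; the destination labels below are invented for illustration and are not part of any OLE or VuFind API:

```python
# Toy sketch of splitting a returned Backstage file: authority records
# (MARC leader/06 = 'z') go only to the VuFind index, while corrected
# bibliographic records are loaded into Kuali OLE. Destination names
# are hypothetical.

def route(leader):
    """Decide the load target from a 24-character MARC leader."""
    if len(leader) != 24:
        raise ValueError("MARC leader must be 24 characters")
    return "vufind-authority" if leader[6] == "z" else "ole-bib"

bib_leader = "00714cam a2200205 a 4500"   # leader/06 = 'a' (language material)
auth_leader = "00714cz  a2200205n  4500"  # leader/06 = 'z' (authority)
print(route(bib_leader))   # -> ole-bib
print(route(auth_leader))  # -> vufind-authority
```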

Certain other features were not planned to be part of the initial Kuali OLE release and required that local custom helper applications be developed. For instance, the Library used spine label printing in Horizon, although not all libraries use that feature of an ILS. Tracking payments accepted at circulation desks also had limited support in Horizon and none in the new system, so supplemental helper applications were required to support local workflows. The Library had developed many customized reports by pointing MS Access databases at the Sybase Relational Database Management System (RDBMS) of the Horizon system; gradual adaptation of these reports to the Kuali OLE RDBMS is expected to take some work by the staff involved.

In addition, there are critical integrations required for the system to be functional. Specific Kuali OLE application programming interfaces (APIs) were developed to allow integration with the Dematic Automated Storage and Retrieval System used in the new Mansueto Library; it was necessary to contract with the vendor to rewrite their side of that integration. The Library had moved Course Reserve functions to the Atlas ARES system and written a custom integration for the “place on reserve” and “remove from reserve” functions; Kuali OLE docstore APIs will be used to replace the custom RDBMS-level integration used for Horizon, and we are working with Atlas to implement this connection. NISO Circulation Interchange Protocol (NCIP) messaging to support participation in the UBorrow and Borrow Direct projects was also a necessary part of the project and requires testing with the vendor systems. Finally, the ability to continue to extract payment information and format it correctly for loading into the University Comptroller’s system to pay our vendors was another critical integration required before going into production.

It was particularly critical that the VuFind Horizon connector be replaced by an OLE Connector in order to have the public catalog work correctly and provide the My Account features to allow self-service requesting and renewals. Because Villanova, where the primary VuFind development is done, was a Kuali OLE partner, they provided the OLE Connector based on the OLE APIs developed for this purpose.

A Basecamp project was used to track work on these various projects and a snapshot of that project is provided in Appendix C.

Technical Migration Activities

Last updated September 27, 2014.


The Integrated Library Systems group included a manager, a Library Systems Analyst, a Systems Librarian, a Database Administrator, a Senior Programmer Analyst and a Library Operations Assistant who also did some web programming. To accommodate the need to support the open source systems, an entry-level Programmer Analyst position was added to help support VuFind, and a Senior Programmer Analyst with Java skills was hired to help support Kuali OLE. These staff would support the migration, as well as the integrations with other systems and the add-on custom applications needed for optimal use of the system. In addition, web programming staff and the Web Program Director of the Digital Library Development Center worked on the VuFind project to customize that system.


While a number of commercial library systems are moving to “cloud-based” offerings, there was no real impetus to consider such a solution at UChicago. Indeed, in a university setting there can be obstacles to such an implementation. For instance, during the course of the project, Lehigh University, another Kuali OLE partner, decided that university financial data should not reside in the cloud. Chicago had not made any such general policy, but major system implementations do require security and architecture reviews, and patron database information in particular would be problematic to host on a commercial, vended system. VuFind and OLE will be implemented on virtual servers hosted by university computing; in the future, if the university offers cloud-based hosting services, it will be possible to take advantage of them. OLE itself is being developed on equipment in the Amazon cloud, demonstrating that it can run in that environment. At the moment, universities have some legal reservations about agreements to run on cloud-based commercial systems, so it was seen as reasonable for the Library to follow university policies in this area rather than attempt to run the library systems separately. The intention is to take advantage of the university enterprise systems for storage, backup and system administration.

Plans for migration of the library system will undergo a review by the university ITS Technical Architecture Committee and also a security review. Appendix D contains the representative list of questions for these reviews. A separate PDF is attached which contains the diagram that the Library provided for that review.

A basic difference in implementing open source software is the need to pull down source code to a development environment and to develop a process to deploy new versions and fixes. This is true for both VuFind and Kuali OLE. This required some upgrades of equipment in the library for development and testing before deployment to production environments in the university data center.

Post-Migration Planning/Sustainability

Last updated September 27, 2014.

Based on previous experience, the Library is well aware of the possibility of system instability during the first few weeks of production. Library staff will be forewarned of this; contingency plans allow the Library to resume using Horizon or Millennium if that should become necessary. “Hope for the best and plan for the worst” is a prudent guiding principle for any new software implementation.

During the first few months of production, Library staff will undoubtedly find bugs and discover gaps in our training. ILS staff will be prepared to spend most of its time with intensive troubleshooting.

The Kuali OLE software development schedule calls for an implementable version to be completed in late spring 2014 and for patches to be available during the summer of 2014 for the two early implementer libraries. The experience of other Kuali projects has been that early implementers developed a number of bug fixes and customizations for local implementations that were difficult to fold back into the codebase later. As a result, the OLE project plans to support and merge the early implementers’ patches into the 2015 version of the software, which will include additional functionality desired by all of the partner libraries. Implementation at the University of Chicago is planned with the intention of continuing to participate in development and testing of this next version. For the first year, local functionality to supplement Kuali OLE features will rely on helper applications separate from OLE, rather than modifications to the code itself.

Procedures for code contributions are being developed by the Kuali OLE Technical Council, and it is anticipated that there will be ways to do this by the time more partners implement in 2015. Appendix E contains the draft of these procedures. These processes are already in place for VuFind software and the library has contributed code where appropriate during the customization of VuFind for local use. While it has always been the plan to customize VuFind for local usage, early and intense involvement with the OLE development has resulted in a strategy to rely on local customizations as little as possible in the ILS implementation.

Our training plan calls for staff meetings about a month after the initial production date, organized by functional areas, to provide an opportunity for training refreshers and discussion of any ongoing issues. Additional follow-up meetings may be required, depending on the volume of issues.

Once Kuali OLE is reasonably stable, ILS staff will need to migrate seven years’ worth of past acquisitions data from Millennium to a database of some kind. Horizon historical circulation data will also be migrated to a data warehouse to support reporting. The intention is to reactivate a project to implement a data warehouse and a more sophisticated system for reports and analytics in the year following implementation of Kuali OLE.
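The planned archiving step amounts to a small extract-and-load job: historical rows pulled from the legacy system are written into an independent relational store where reporting queries can still run after Millennium is retired. The sketch below uses SQLite purely for illustration, and the table and column names are hypothetical:

```python
# Illustrative archive-and-report sketch: load legacy acquisitions
# payment history into a standalone relational store (SQLite here,
# in-memory for the example) so reporting survives system retirement.
# Schema is hypothetical, not the Library's actual warehouse design.

import sqlite3

def archive_payments(rows):
    conn = sqlite3.connect(":memory:")
    conn.execute(
        """CREATE TABLE payment_history
           (order_num TEXT, fiscal_year INTEGER, amount_cents INTEGER)"""
    )
    conn.executemany("INSERT INTO payment_history VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn

conn = archive_payments([
    ("o1001", 2012, 4500),
    ("o1002", 2013, 12000),
])
total, = conn.execute(
    "SELECT SUM(amount_cents) FROM payment_history"
).fetchone()
print(total)  # -> 16500
```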

During the six months after implementation, any remaining useful data will be moved from Horizon and Millennium. Those systems will cease to be updated at the time of cutover. They will be available for consulting until they are retired in January 2015. HIP and AquaBrowser will be unavailable once the cutover to OLE happens, as they do not point to OLE. They will be retired as soon as the OLE system is stable.

Lessons Learned

Last updated September 27, 2014.

It probably goes without saying that any system migration is a difficult and time-consuming process for all library staff; it is even more so when the new system is still in active development. We would certainly advise any library to wait for a completed product before attempting a migration unless: (1) you feel confident that your staff can handle being beta testers at the same time; AND (2) there are compelling reasons – financial, functional, or other – to move off your present system as soon as possible. Both of these applied to the University of Chicago.

Commitment to developing a community-sourced project such as Kuali OLE requires a library to take a hard look at its staffing levels and available skill sets. The ability to draft functional requirements, write test scripts, perform Quality Assurance/Quality Control (QA/QC) work, and write coherent documentation is necessary for any software development project, and these skills are not necessarily widespread among library staff. There is sometimes also a perception that this work must come “after” other duties – which it cannot if development schedules are to be met. Libraries used to complaining about the inadequacies of their vendors need to realize that with community-source developed products, what was formerly “them” is now “us”, and whatever inadequacies exist can only be traced back to our own doorsteps. In other words, a commitment to develop software cannot be taken lightly. At least as Kuali OLE is financed, these efforts depend on library staff rather than on paid employees, as is the case with commercial vendors.

We anticipated that running open source would require more technical resources; we hired two additional programmers, and we believe that this turned out to be a wise move on our part. We could not possibly have gotten as far as we have with either our VuFind or Kuali OLE implementation without these additional resources. Even if we had decided to run open source without being a development partner, we would still have needed at least one more programmer.

Because of the active development, it turned out to be impractical to contract out data conversion and training; had we waited for software development to be complete, it probably would have been cost effective to contract out some of that work. The internal work for the project also ran approximately a full year. There were many iterations of data conversion, installation and setup because we were not working with the final, stable version of the software; implementation probably could have taken half the time if not for this situation. On the other hand, it was necessary to forge ahead with setup questions and with data cleanup and conversion in order to meet the desired schedule.

Work with the Kuali project has been useful in forging alliances with our university computing groups. The Library has benefited by association with an open source project that is contributing to other areas of academic computing, particularly in the area of identity management.



APPENDIX A. Kuali OLE staff contributions

Last updated September 27, 2014.

OLE Partner Resource Requirements

These are the MINIMUM required resources for each partner.

(Contribution includes meetings, travel, decision making)

  • OLE Board Voting Member: 1 one-hour meeting per month; 2 one-day face-to-face meetings per year
  • OLE Functional Council Voting Member: 2 two-hour meetings per month
    • Ad-hoc Workgroup Activities: 4 two-day face-to-face meetings per year
  • OLE Technical Council Voting Member: 1 one-hour meeting per month
    • Ad-hoc Workgroup Activities: 2 two-day face-to-face meetings per year

(Contribution includes meetings, travel, workgroups, decision making)

  • Subject Matter Expert Deliver Module 10%
  • Subject Matter Expert Describe Module 10%
  • Subject Matter Expert Select & Acquire Module 10%
  • Technical Matter Expert System Integration & Infrastructure 10%

(Contribution includes meetings, travel, script writing, test execution)

  • Deliver Module Tester 15%
  • Describe Module Tester 15%
  • Select & Acquire Module Tester 15%
  • System Integration & Infrastructure Tester 15%

These are the CRITICAL resources required for each module and may be in addition to, or a part of, the above MINIMUM required resources.

(Coordination of meetings, spec writing leadership and delivery, JIRA management)

  • Module Lead Subject Matter Expert 30%
  • Module Business Analyst 50%

(Coordination of meetings, test script writing, testing execution, JIRA management)

  • Module Testing Coordinator 25%

APPENDIX B. Training Plan

Last updated September 27, 2014.

General Overview

When transitioning from Horizon/Millennium to OLE, most of the Library’s departments will need to determine what staff training will be needed in order for staff to continue to perform their primary responsibilities without undue interruption.

Note: While the Library may have multiple departments performing the same general function(s) – e.g., we have four cataloging departments, in D’Angelo Law Library, East Asia, Regenstein Library and Maps – the ultimate goal is to have ONE training class (or series of classes if needed) that meets the needs of ALL departments performing the same function, e.g., one “Introduction to Cataloging on OLE”, not four.

All Library staff members will need to learn basic features of OLE in order to perform any function now done in Horizon or Millennium, e.g., how to login; searching the Library’s database of bibliographic, holdings and item-level data; navigating from one type of record to another; recognizing common characteristics of screen layouts; permissions; etc.

In addition to this “basic training” (which becomes the foundation on which topic-specific training can be developed), Library staff will need to learn to use OLE to perform tasks specific to their responsibilities, e.g., create an order, loan a book to a patron, create metadata for a new title, receive a serial issue, etc. In addition to learning the functional “mechanics” of OLE, supervisors must also determine whether their current basic policies/procedures (based on Horizon and/or Millennium) will need to be altered. The “mechanics” plus the “policies/procedures” result in “workflows”, which may be defined as both:

  1. A sequence of steps to perform a standard task, some of which are done by operating a portion of the ILS
    1. Example: when Cataloging needs to create a new bibliographic record, an operator imports to OLE a record created in OCLC Connexion.
    2. Example: when Acquisitions must order a title, an operator creates a Requisition and then approves it to create a Purchase Order.
    3. Example: when a patron presents an item to borrow at a Circulation Desk, the operator scans the patron’s barcode from the ID and then scans the item’s barcode.
    4. Example: a Serials Receiving operator locates the correct record for a serial title and records receipt of a specific issue.
  2. Policies/procedures guiding decisions that must be made when performing any step in a sequence
    1. Example: a Cataloging operator must know when NOT to create a new bibliographic record, e.g., when the item in hand is an additional copy.
    2. Example: an Acquisitions operator should be able to determine when a pre-order search is or is not necessary.
    3. Example: when a patron is blocked from borrowing an item, the Circulation operator without override capability must be able to either explain the situation to the patron OR refer the patron to the appropriate staff person. The Circulation operator with override capability must know when it is appropriate to exercise that function.
    4. Example: the Serials Receiving operator must know what to do when the serial issue in hand has already been received.

Note: A “step” as used above can be defined very broadly (e.g., “create a Purchase Order”), and when that is the case, there may be a set of “sub-steps” that in turn could be a combination of “mechanics” (e.g., “select a Vendor from the list”), some of which may require a policy/procedure decision (e.g., “always order from Vendor X unless told otherwise”). One might also argue that making a “policy/procedure”-based decision is, in itself, a “step”. The division of the two may be artificial in practice, but the distinction can often be helpful when trying to determine training content for a new system or when defining a logical structure for a training class.

Most importantly, OLE offers features not present in Horizon or Millennium. In addition, there may be certain features that we have in our older systems, but that will not be in OLE 1.5 (some of which are deferred to OLE 2.0). So, effective OLE training will require coverage of new policies/procedures along with the “mechanics” of the features. Some of these are: 

  1. OLE permits us to link one item record to multiple bibliographic records. This will mean changes in how the Library handles analyzed series because we will no longer create “dummy” items for the analyzed title record. This also affects the treatment of “bound with” items.
  2. OLE provides two types of holdings records: “holdings” (or “instance”) for titles in a physical format (print, microform, DVD, etc.) and “eholdings” (or “einstance”) for virtual titles. Rules for when to use one or the other will need to be defined.
  3. OLE allows for acquisitions to be integrated with the other functions. Workflows currently designed to use Horizon and Millennium will need to be changed and these should be reflected in training.
  4. OLE has a circulation feature that alerts an operator to count the number of pieces in a single barcoded item during the Return process whenever the piece count is greater than “1”. The Library does not currently enter Piece counts into its item records. If it decides to do so, this should obviously be incorporated into training for anyone who creates/updates OLE item records and for those who process Returns. If the feature is not to be used until a later date, the training may decide to cover this feature only in passing.
  5. OLE allows for four different types of requests (recalls, holds, paging, copying) along with the ability to allow some patrons to have requested items physically delivered to them (as opposed to having items placed on hold and picked up at Circulation Desks). How or if the Library will use these must be decided along with how a staff operator can place or fulfill these requests using OLE. OLE circulation operators should also be acquainted with how patrons will place their requests using VuFind.
  6. While serials receiving will be part of OLE 1.5, the ability to claim individual serial issues will not arrive until OLE 2.0. Instructions on an interim solution/workaround may or may not need to be part of the initial training; if not, then follow-up training will be required. [This assumes staff who receive serials also claim missing issues; in fact, there may be staff who ONLY claim missing serials in which case it, of course, makes sense to have separate training for them.] 

Note: The above list may not be complete. Each functional area will need to identify those OLE features/functions we intend to implement now (and those we will implement later) for which there are no existing counterparts in Horizon or Millennium.

Those developing training should obviously seek guidance from supervisors or other decision makers about how their respective departments intend to use (or not) these new features.

As a resource, class developers may find existing OLE documentation to be useful. The draft material (eventually to be finalized) for 1.5 can be found at:  http://site.kuali.org/ole/1.5.0-M2-SNAPSHOT/reference/html/Index.html.
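Some of the new-feature training content reduces to very simple rules once stated precisely. For example, the piece-count alert described in item 4 of the list above amounts to the following check (a toy illustration, not OLE’s actual code; the field name is hypothetical):

```python
# Toy sketch of OLE's piece-count behavior at Return: an item whose
# recorded piece count exceeds 1 should trigger a count-pieces prompt
# for the operator. "piece_count" is a hypothetical field name.

def needs_piece_count_check(item):
    """True when a returned item should trigger the count-pieces prompt."""
    return item.get("piece_count", 1) > 1

print(needs_piece_count_check({"barcode": "b1", "piece_count": 3}))  # -> True
print(needs_piece_count_check({"barcode": "b2"}))                    # -> False
```

Stating features as rules in this way can help class developers decide whether a feature needs hands-on training or only a brief mention.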

Training Principals

The ILS Migration Steering Committee (IMSC) has four Working Groups (Cataloging, Acquisitions, Serials Receiving, and Circulation), each with a designated Lead (Janet Fox, Head, Data Management Services; Scott Perry, Head, Collections Support; Julie Stauffer, Head of Law Acquisitions & Electronic Resources; and David Larsen, Head of Access Services & Assessment, respectively). Each Group has a membership of key staff (named by the Leads) who are identifying gaps in OLE functionality, testing functions by performing daily work in the local test environment, and serving as key resources for developing the training in the four areas.

Working Group Leads have the general responsibility for making sure that appropriate training – combining both OLE mechanics and the policies/procedures of the Library – is developed for the four major areas. [ILS takes responsibility for development of the “basics” class.] Leads should decide who attends the “train the trainer” sessions (see below for details), and they identify who will develop content and who will deliver the training (these could all be the same people or could be different people as decided by the Working Groups).

Note: All Leads and many in the Working Groups were involved in the development of Corinthian training. Any training material developed for that effort (and still extant) may prove to be useful with, of course, appropriate updating and changes.

Stuart Miller, Library Systems Analyst in ILS, and Jane Ciacci, Staff and Organization Development Librarian in Library Human Resources, are resources for anyone developing training. 

Developing OLE Training: Major Tasks

  1. Have key staff from the Working Groups attend the “train the trainer” session on April 9th. Appendix 1 contains the outline of this course, the purpose of which is to acquaint attendees with basic principles of designing training for an adult/professional audience.
  2. On April 11th (after the first “train the trainer” session), ILS will run through the content of its “Introduction to OLE Basics” so that Working Group members developing other OLE training classes will have a sense of what they need NOT cover in any detail. The “basics” class is a prerequisite for all other OLE training. [Note: With a few notable exceptions (e.g., use of “pop-ups”), this “basics” class and all other classes should assume that library staff are already familiar with Internet browsers. If anyone believes a staff member needs browser training, the Library has Firefox for Dummies available online; if staff are unfamiliar with the “Dummies” series, the word is meant to be humorous, not insulting, and titles in this series are very good for explaining the basics.]
  3. Between April 11th and May 22nd, develop training sessions (using skills/directions/suggestions from the April 9th training) for each functional area, including (but not necessarily limited to) determination of the following:
    1. Identify overall content to be covered and intended audience(s). [Note: As part of audience determination, each Working Group needs to consider whether student or part-time employees should be trained along with full-time staff OR whether it would be best to develop specially focused training for part-timers. This will of course depend upon the functional area and the extent to which part-timers have distinctly different work assignments from full-timers.]
    2. Determine if multiple audiences require multiple classes, e.g., a class “Introduction to Basic OLE Acquisitions” for all with another class “Introduction to Advanced OLE Acquisitions” for a smaller audience OR (depending on amount of material) “Introduction to OLE Acquisitions” covering everything in one class of n hours, first half for all, second half for selected staff. [Note: For staff not in one of the four areas: Acquisitions has assumed responsibility for determining the training needs of bibliographers; Stuart Miller has contacted Amy Mantrone, Head Binding and Shelf Preparation, about training needs for shelf prep staff. If there are other staff that may not “fit” into one of the four functional areas, send this information to Stuart Miller.]
    3. Identify content/length of each class along with mode of instruction (e.g., lecture/demo, hands-on, etc.). [Note: All classes should include a brief mention of how to obtain support/get answers about OLE when the Library moves into production; ILS will provide all trainers with this information.]
    4. Compile a list of which individuals (from any department) must attend which class. These lists should eventually be checked against the registration lists so that we are sure ALL affected staff take the training prior to moving OLE into production. Please make sure to consult with the appropriate managers or supervisors in ALL departments in ALL libraries when compiling these lists.
    5. Identify tasks to be done in OLE by only a few staff (e.g., updating foreign currency exchange rates; printing spine labels; etc.), make a list of points to be covered, and make arrangements to provide this training outside of a classroom setting, making sure to include ALL affected staff; share this info with the other Working Groups.
    6. Prepare any materials, e.g., PowerPoint, “cheat sheets”, handouts, documentation, etc.
  7. Decide how many sessions of each class to offer and on what days in June 2014. The lists compiled in step 4 (above) will help determine how many sessions will be required.
  8. As soon as possible, book a room for each session of each class; the Crerar training rooms (one seats 20, one seats 10) are being held open for any OLE hands-on training. Contact the Public Services Assistant at Crerar Library. For any other room, follow normal booking procedures. [If you book a room such as JRL A-11, you will need to contact Building Services about room set-up.]
    9. Decide on who will present the training.
  10. As practical/possible, arrange for a rehearsal of each class with an audience of Working Group members to give feedback.
    11. Submit registration information for each class to Jane Ciacci (see Appendix 2 for a format to follow) as soon as you have all the necessary details. Make your description of the intended audience as clear and as exact as possible. Please do this ASAP and certainly no later than May 9th. Earlier is better.
  4. Attend a “train the trainer” follow-up session in May (date TBD) to present and discuss work-to-date, get fine-tuning advice, ask questions, etc.

Post-Implementation Training

After OLE has been in production for approximately a month, ILS – in conjunction with the IMSC Working Group Leads – will schedule sessions for each of the four functional areas. Leads will use these sessions for any follow-up training that may be required. The sessions will also provide staff with an opportunity to ask questions, verify facts, suggest needed enhancements, etc. [Note: This is a supplement to the expected trouble-shooting/follow-up activities that we anticipate during the first month of OLE production that will be handled as needs arise.]

Schedule (to be expanded)

April 9: Train the trainer session

April 11: Attend Introduction to OLE Basics preview (for those developing other OLE training)

April 11-May 22: Develop training content

April 11-May 22: Submit training registration information to Jane Ciacci for posting to all staff

May [TBD]: Train the trainer follow-up session

May 22-May 30: Introduction to OLE Basics (5 sessions)

June: Deliver OLE training

Outline for April 9th “train the trainer” session

Last updated September 27, 2014.

Session Objectives:

Participants will be able to:

  1. Identify training outcomes and link to learner motivation
  2. Support learners before, during and after training
  3. Create effective learning objectives to drive training design
  4. Identify and utilize principles of effective training design for adult learners and learning styles

Pre-Work: Complete Learning Styles Assessment

Participant Materials: Designing Engaging Software Training (ASTD 16 pgs, $20/per participant)

[The outline below was originally laid out as a table with three columns: Agenda Section/Sequence, Learning Objective, and Learning Style. The legend of learning styles and instructional methods follows.]

Learning styles:

  • Auditory
  • Kinesthetic
  • Visual
  • Combo

Instructional methods:

  • Instructor Lecture
  • Instructor Demo
  • Read/review material or job aid
  • Participant Demo
  • Pairs instruct/drive
  • Case Study
  • Application Project
  • Quiz



Session Purpose & Agenda



Instructor welcomes/introduces self

Purpose/Agenda/Ground rules/Parking Lot

Participants identify elements of effective or ineffective training, based on their experience.

Participants work in small groups to identify the stakeholder benefits of ILS software implementation.



Setting the Stage for Effective Training

Identify Training Outcomes (Skills/behaviors used)

Create Effective Learning Objectives

Plan for Support



In ILS module teams, participants complete Tool:
Analyzing Audience Needs and Motivations Planning Tool

Lecture/discuss: Qualities of Effective Learning Objectives

Application: In ILS Module team create learning objectives

Lecture/Discussion: What activities and which roles have most impact on training application?

Application: Participants identify activities/planning elements to consider in design, delivery and post training to ensure transfer of training.

Job Aid:
Effective Pre/Post Training Support




Participants identify Learning Style (if not completed)



Learning Styles, Learning Design and Interactivity



Lecture/discuss: Principles of Adult Learning

Discuss:  Impact of Learning Styles and Adult Learning Principles on training design

Video: Mel Silberman on Interactivity in Adult Learning Design

Discuss: Interactivity in Technical Training design

Methods for Technical Training Interactivity

Study/Teach Activity:  Use Training Resource Guides/Books, each of three teams becomes “expert” on methods for creating:  Case Studies, Partner Activities and Job Aids.

Job Aid/Tools:
Training Design Template & Training Design Competencies checklist



Documentation/ Support Material Design



Lecture/discussion: What makes support/documentation effective?

Review UChicago examples (Gems, Time)

Explore with group:

  • Training environment
  • Availability of documentation (cloud based?)
  • Kuali resources



Action Planning & Close



Review/Identify participant questions/concerns & possible methods to address them (e.g., follow-up by Library trainees/Kuali Subject Matter Experts (SMEs), or build into the Facilitating Effective Training session).

Discussion: What barriers do you anticipate? How can you use knowledge and tools from today to reduce their impact or eliminate?

Review Bibliography/Resources

Close: Motivation/call to action

Quiz: Knowledge Check

Session Evaluation



Sample Registration Information

Last updated September 27, 2014.

Note: “Introduction to OLE Basics” is the prerequisite for any other OLE training class.

Course Information:
Introduction to OLE Basics

Who Should Attend?
All Library Staff Currently Using Any Features/Functions of Horizon or III Millennium

What Will This Cover?
During this two-hour lecture/demonstration, you will learn how to access and navigate around Kuali OLE, the Library’s new integrated library system to be implemented this July. You will be introduced to design features common to all of OLE’s functional components. Some specific topics include:

  • Opening OLE with your login/password using the Internet browser of your choice
  • Using browser tabs to navigate and help with workflows
  • Performing searches for basic records
  • Displaying records from search results
  • Learning standard navigation features
  • Performing basic create/edit functions for commonly used records
  • Reviewing features/functions common to basic records
  • Finding where Horizon data is located in OLE
The two-hour session includes ample time for a question-and-answer period; it may end early if there are few questions.

Is There a Prerequisite?
There is no prerequisite.
Note: This class is a prerequisite for additional classes that cover OLE acquisitions, cataloging, circulation and serials receiving to be held in June 2014. Watch email for registration announcements.

Registration is required; select one session. There is a maximum of 50 attendees per session, so sign up now for your preferred day/time.

APPENDIX C. Basecamp project snapshot April 2014

Last updated September 27, 2014.

OLE Data Migrations and Integrations


  • Acq Fit Gap analysis #1: Wed, Jan 1
  • Deliver Fit Gap analysis #2: Wed, Jan 1
  • Cataloging Fit Gap analysis #1: Wed, Jan 1
  • Full db and Fit Gap analysis by all modules: Fri, Apr 4
  • Monthly review of Decision to go for July 1: Wed, Feb 12
  • Performance test passed Dale Arntson: Fri, Apr 25
  • Training plan in place: Fri, Mar 28
  • Full VuFind running against full OLE db: Fri, Apr 25
  • Full test db with all integrations: Mon, Jun 2
  • Full VuFind db with incremental updates: Fri, Apr 25
  • GO LIVE: Tue, Jul 1

VuFind for OLE

  • Test 1.5 exports Frances McNamara: Fri, Apr 11
  • Test Villanova's OLE Connector: Fri, Apr 25
  • Test VuFind setup: Fri, Apr 18
  • VuFind My Account: Fri, Apr 25
  • Plan for switch of production VuFind: Fri, Apr 25
  • Test full db with incremental updates: Fri, Apr 25
  • Go live: Tue, Jul 1

Authentication and authorizations

  • Convene group on AuthZ/AuthN Tod Olson: Fri, Apr 11
  • Set up permissions in OLE Maura Byrne: Fri, Apr 18
  • Logons for non-university folks
  • Atlas ARES integration access issues
  • ITS security review

Describe Data conversions

  • Bib/Holding/Item Hzn conversion fixes for review
  • Bib Unicode conversion fixes for review Frances McNamara: Fri, Apr 11
  • Einstance conversion REVISIONS Frances McNamara: Mon, Feb 17
  • Serials Receiving conversion REVIEW Frances McNamara: Fri, Apr 11
  • Analytics and Bound with Conversions Frances McNamara: Fri, Apr 25
  • Linked Bibs Conversion Frances McNamara: Fri, Apr 25

Describe Integrations

  • Authorities workaround for 1.5 Frances McNamara: Fri, Apr 11
  • Spine Label Printing: Fri, Apr 4
  • Bindery Slip printing: Fri, Apr 18
  • Serial Label printing: Fri, Apr 18
  • Donor processing (in 1.5) Frances McNamara: Fri, Apr 18
  • Cataloging reports Frances McNamara: Fri, May 2
  • OCLC monthly processing Frances McNamara: Fri, Apr 25
  • Hathi monthly processing Frances McNamara: Fri, Apr 25
  • Other MS Access reports including SCRC Frances McNamara: Fri, May 23
  • New Acq List from VuFind Maura Byrne: Fri, May 16

Deliver Data Conversions

  • KRMS rules REVIEW Cheryl Malmborg: Fri, Apr 11
  • CKO's, Requests, Blocks: Fri, Apr 11
  • Saving any old data not moved to OLE: Fri, Jun 27
  • Fines and Fees conversion Cheryl Malmborg: Fri, Apr 18

Deliver Integrations

  • ARES integrations Cheryl Malmborg: Fri, Apr 25
  • Dematic integration (requires 1.5) Cheryl Malmborg: Fri, Apr 25
  • Circ Reports: Fri, Apr 25
  • Google Book Processing: Fri, Apr 25
  • Privileges (lockers, visitors, etc.): Fri, Apr 25
  • Cash Drawer workaround: Fri, Apr 25
  • Registrar Blocks with David Larsen: Fri, May 9
  • Other MS Access dbs of staff: Fri, May 30
  • Lost/Missing processing Cheryl Malmborg: Fri, Aug 29

Select & Acquire

  • Specifications for data conversions
  • Vendor data conversion jemiller@uchicago.edu: Fri, Apr 18
  • Orders/Standing orders data conversion: Fri, Apr 11
  • Test III Firm orders conversion: Fri, Apr 11
  • Old financial data: Fri, Dec 5

Select & Acquire Integrations

  • 1.5 MARC 9xx processing/Preprocessor MARC/EDI: Fri, Apr 18
  • Evouchers: Fri, Apr 25
  • Patron request workflow for acq Frances McNamara: Fri, Apr 25
  • Filters: identify and provide replacements (meet with acq supervisors) Frances McNamara: Fri, Apr 25
  • Order/invoice/approval import profiles for vendors: Fri, May 9
  • Acq reports: Fri, May 9
  • Other MS Access/ III reports by staff: Fri, May 30
  • Annual Stewardship report (test in 1.5) Frances McNamara: Fri, Apr 11
  • Test 1.5 Milestone 1 order and invoice loads Frances McNamara: Fri, Apr 18
  • Web form PO requests create reqs in OLE: Fri, May 9

APPENDIX D. Questions from Technical Architecture Review Committee

Last updated September 27, 2014.

1. Functional purpose

  1. What is the functional purpose to be served by this design?
  2. Is this one element of a suite that together achieves the purpose?
  3. Are there particular use cases that especially shape the design? Please describe them.
  4. What are the overall system availability and time to recover requirements?
  5. What's the expected life cycle for the system to be built?
  6. Does this replace something else?
  7. Does it overlap with something else?

2. Data management, data security, and data retention

  1. What data must be provided from systems external to this design?
  2. What data will be provided by this system to systems external to it?
  3. By what technical means will data move into and out of the system?
  4. How sensitive is any of this data? Is any subject to specific compliance requirements?
  5. How will data be protected, both at rest and in transit?
  6. Are there specific retention requirements for any of the data produced by this design?
  7. Are the stewards of this data involved in the design process?
  8. Has a data usage agreement been prepared?
  9. If the data is going to be used outside the US borders, are there any restrictions, laws, policies, or practice that should be considered?

3. Users and access management

  1. Who are the intended users of this system? Are they all UChicago people (including UC Medical Center)?
  2. Will any VIPs be among the users? Senior faculty?
  3. What credentials will be used? How will non-UChicago people access the system?
  4. What authentication technology will be used, and how will the system be protected by it?
  5. Are there different roles (sets of access privileges) that a user may have with this system? How will those be managed?
  6. Will you use Shibboleth?
  7. Will you use Grouper?
  8. Are there requirements for user audit (a record of who took which management action when) or point-in-time audit (what did things look like at a given time in the past)?

4. Client environment

  1. What technologies will users use to interact with the system?
  2. What platforms (desktop, laptop, mobile) are to be supported?
  3. Must client software be distributed and maintained in this design?
  4. What requirements must the client environment meet?
  5. How will security of the client software be validated?
  6. Are there any export laws or sourcing requirements for technology that is being considered for use outside the US?

5. Hosting requirements

  1. Servers & storage
    1. What set of servers with what operating systems are needed by the design?
    2. Will all servers be virtual? (And if a vendor supplied system, does the vendor support the system in a virtualized environment?)
    3. What are the storage, backup, and restoration requirements and how will those be met?
    4. Capacity planning and modeling? What’s the test plan to validate the ability to meet the required capacity?
    5. Performance goals?
  2. Databases
    1. What database technologies and versions will be used in the design?
    2. How many databases will be used, with what operational requirements (size, auditing, redundancy, etc.)?
    3. What is the average number of concurrent users?
    4. Are there any character set requirements?
    5. Is access to the database server required?
    6. What are the availability requirements?
    7. What is the maintenance window?
    8. Are there database options required as part of the install?
    9. What are the backup retention requirements?
    10. Will the database contain sensitive data?
  3. Platforms & Middleware
    1. What middleware technologies will be used in the design, e.g., application servers, .NET, web servers, integration brokers, ESBs, web services, etc.?
    2. Will this system be hosted on an existing in-house platform such as K-split?
    3. Does the design include or depend on Application Service Providers, SaaS, IaaS, PaaS, or any other sort of services operated externally to UChicago? Please describe.
    4. If not covered above, please describe how this system will leverage any existing infrastructure services operated by IT Services. Also, please identify any infrastructure service needs of the design that are not met by current ITS operated infrastructure services.

6. Network requirements

  1. Performance and functional requirements
    1. What will be the footprint on the network, i.e., number of physical ports, interface type and speed?
    2. What are the estimated bandwidth, latency, and jitter requirements?
    3. What load balancing requirements are there?
    4. Is a proxy of any sort (OSI layer 3 and upwards) needed (e.g., VPN, ssh bastion, port forwarding, HTTP proxy or reverse proxy)?
    5. Is the physical architecture documented?
  2. Firewall requirements
    1. What set of ports will provide user access (use placeholder IP addresses if they are not already assigned)?
    2. What set of ports will provide administrative access?
    3. What set of ports will provide access to back-end services such as storage, database servers, system monitoring, system management, syslog server, etc.?
    4. Who, either specifically or by role, will be authoritative for identifying the users permitted through firewalls to access user-facing ports?

7. Monitoring, metrics, and logging

  1. What metrics are needed for capacity planning, diagnostic, availability, and usage tracking needs?
  2. How will the system be monitored or instrumented to produce those metrics?
  3. What Key Performance Indicators are defined for the system, i.e., targets for performance, availability, etc.?
  4. What is the expected load in terms of concurrent users, transaction volume, etc.?
  5. What are the issues that will need ongoing governance to address, and what are the KPIs and metrics needed to enable those decision processes?
  6. What monitoring is needed to ensure the security and integrity of the system?

8. Reporting

  1. Are there particular reporting requirements?
  2. Does the design include appropriate integration with ITS operated business intelligence or reporting services?
  3. Have data confidentiality, classification, and sensitivity been considered in the reporting requirements?
  4. What type of authorization will be used to ensure that data confidentiality is preserved where needed?

9. Workflow

  1. What workflow requirements are implemented by the design, and how?
  2. Are there business process implications that will be impacted by the technology, and vice versa?
  3. Is a new workflow engine being added where an existing one might be utilized?

10. Other dependencies and integrations

  1. What co-requisites or dependencies are integral to the design that have not been mentioned above? E.g., email, VoIP, SharePoint, webshare, IM, etc.
  2. Do you have the necessary documents, tools, and skills to do the integrations required?
  3. Have you discussed the integration requirements with other units to ensure resource availability to do the necessary integrations?

11. Application development

  1. How is the system produced and maintained? Are we adequately staffed for that?
  2. How does this application's architecture relate to that of other applications we maintain?

12. Vendor support & viability

  1. How well does the vendor support this product or service?
  2. How viable is this vendor?
  3. How does this vendor align with ITS strategic vendor management?

13. Compliance

  1. How have you addressed accessibility and Section 508 compliance?
  2. Is there any PII (Personally Identifying Information) transmitted, processed, or stored in this system? Same question for ePHI (electronic Personal Health Information).
  3. Same question for payment card data or other personal financial account information.
  4. Are children under the age of 13 going to access this system?
  5. Will students use this? Is this application going to depend upon information about students in any way?

14. Mobile Technology

  1. Will authentication and authorization be required for the application?
  2. Are there privacy terms required for the application?
  3. Is this interface already usable on a small screen device (do not consider tablet devices)? If not sure, contact User Experience Consultant, Web Services.
  4. Is there already an app developed? If so, then complete the “Mobile App Disclosure Form” at: https://nsitwebservices.wufoo.com/forms/mobile-app-disclosure-form-for-v...
  5. If the current web interface isn't optimized for mobile, then determine if there is a significant set of use cases which would compel us to optimize the interface for mobile.
    1. Would someone use this site while not at their desk to get work done? If yes, go to the next question.
    2. Would someone return to this site on an almost daily basis? If yes, go to the next question.
    3. What are the 3 things people use the site most for? Can those be fit onto a small screen?
  6. Ensure compliance with the Mobile Security Guidelines at: https://mobile.uchicago.edu/page/security-guidelines

APPENDIX E: OLE Contribution Requirements

Last updated September 27, 2014.

Tested and formalized by Atlanta Kuali Community Workshop, April 2014

Adam Constabaris, NCSU +1
Dale Arntson, Chicago +1
John Pillans, IU +1
Grover McKenzie, Penn +1
Michelle Suranofsky, Lehigh +1
Jeff Fleming, Duke +1
David Lacy, Villanova +1
Shian Chang, UM +1

The JIRAs to be used for “testing” this process will be identified and added to this section.

These are the deliverables required for a source code contribution to the OLE codebase.

Regardless of issue type, any contribution to the OLE project must go through the standard approval process of the OLE Functional Council, be scheduled for a release, and be managed through the OLE Project Manager.

  • Enhancement Process
    • Design Review
    • Code Review
    • Documentation
    • Unit Tests
  • Bug Fix Process
    • Code Review
    • Unit Tests (as needed)
    • Documentation (as needed)
  • Documentation
    • Documentation files or patch files for DocBook
    • Conversion to DocBook format is requested
  • Considerations for Contribution Review


  • Are the functional requirements and acceptance criteria documented somewhere so we know how it’s supposed to work?
  • What are the known issues, gaps, bugs, etc.?
  • Does the contributor have a Roadmap for delivered functionality?
  • Is the intention for this to be maintained as core OLE?
  • How will the contributors be involved? Consider continuing interest, maintenance, expectations and desires.


  • Is the architecture documented?
  • What technologies are being used (KIM, KNS, KRAD, other libraries), and why?
  • What automated tests are there? Any gaps in testing? Will manual testing be required?
  • What access would we have to the original team for questions?
  • Is the contributing team available to make updates as needed from code review?
  • Can the documentation be delivered in DocBook format?

The Contribution Process

1) Contribution Proposal

  • The first step in contributing to OLE is to create an issue in JIRA per the following guidelines. If a related JIRA already exists in the OLE Project queue, the two can be linked. The contribution will remain in the OLE Feedback Queue until it is adopted for integration into OLE.
  • Enhancements should be added to OLE Feedback Queue.
  • Attach all requirements or design documents to the JIRA.
  • Bug fixes should be added to the OLE Feedback Queue.
  • If possible, add a patch file to the JIRA.
  • If a Kuali partner institution is interested in contributing a large number of bug fixes, then contact the OLE Project Manager to see if we can arrange an easier way to get your fixes directly into the codebase.
  • Code Share (these are items that someone has developed and would like to make available to the community but not in the baseline code)
    • not a free-for-all (will require some sort of stewardship)
    • may be in language not suitable for inclusion in OLE core
    • may be a “hit and run” contribution (contributor will not be around to handle questions/enhancements)
    • may have a license/license requirements incompatible with ECL 2.0
    • may be useful but have incomplete/fuzzy requirements and acceptance criteria
    • there is a space in Kuali GitHub for contributions like this for some projects.
    • should involve documentation of criteria for these contributions (contact info, baseline requirements, etc.) 
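As a concrete illustration of the patch-file step above (a sketch only: the file names and the OLE-XXX key are placeholders, and from a Subversion working copy the single command `svn diff > OLE-XXX.patch` would replace the `diff` call):

```shell
# Produce a unified-diff patch file suitable for attaching to a JIRA issue.
# before.txt/after.txt stand in for the original and modified source files.
printf 'printSpineLabel(item);\n' > before.txt
printf 'printSpineLabel(item, copies);\n' > after.txt
# diff exits with status 1 when the files differ, so ignore its exit status.
diff -u before.txt after.txt > OLE-XXX.patch || true
head -n 3 OLE-XXX.patch
```

The resulting OLE-XXX.patch contains the standard `---`/`+++` headers and hunks that the reviewing developer can apply with `patch -p0` or an equivalent tool.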

2) Project Review

  • Review planned 3rd party licenses to ensure compatibility with ECL 2.0
  • See Kuali Foundation 3rd Party Licenses page
  • Enhancements are reviewed by the SME group of the appropriate module:
  • SME group reviews the new contributions.
  • New contributions are escalated by the SME group to the Functional and/or Technical Council for scoping or further analysis.
  • Development Managers review for integration to source code.
  • Bug fixes are reviewed by the QA Manager/Lead SME for placement.
  • Development Managers review the bug fixes for integration to source code.
  • Once an OLE Feedback Queue JIRA has been functionally accepted for inclusion in a version of OLE, it is moved to the OLE Queue (Main) for work.

3) Contribution Development & Code Review

  • Ideally, the contribution development and code review process will be an iterative process to help ensure that the contribution can easily be incorporated into the core code.
  • Determine if the code is developed on a branch of the core OLE code.
  • Contributors develop the code.
  • Once code is complete, an OLE Development Manager will coordinate a code review.
  • If the enhancement involves database changes, those changes will be reviewed for compliance in a manner similar to the Kuali Rice Database Change and Migration Policy.
  • The QA Lead will conduct a code review of the tests and ensure that your code is in compliance in a manner similar to the Kuali Rice Unit Testing and Build Policy.
  • The Documentation Lead will review the documentation to ensure it is complete in a manner similar to the Kuali Rice Documentation Policy.
  • Creating regression/functional acceptance tests is strongly encouraged and communication with the OLE QA team should be well under way. Contributions should be regression tested using the OLE QA suite.
  • We recommend that your code change has been exercised in an OLE testing environment.
  • There will probably be some changes recommended during the reviews before the contribution can be accepted. Any objections that come up during the review process must be resolved.
  • A style/conventions guide will need to be developed, similar to or derived from the OLE Kuali Coding Conventions and Java Style Guidelines.
  • The Code Steward is the role in the project which has the authority to review and accept the code. If a dispute arises, the Technical Council will vote.

4) Contribution Final Acceptance

  • Code (including tests) are merged back into the core OLE code.
  • Documentation is incorporated into core documentation code.

Checking In Contributions

  • Any time you check in code to the OLE project, you must specify the *associated JIRA key (e.g. OLE-XXX)* for the corresponding work, in addition to a normal svn check-in comment.
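The check-in convention above can be illustrated, and even checked locally, with a small shell sketch (the message text is hypothetical, and the commented-out `svn commit` shows where the real check-in would occur):

```shell
# A check-in comment for OLE should begin with the associated JIRA key,
# e.g. "OLE-1234", followed by a normal descriptive commit message.
msg="OLE-1234 Fix spine label printing for serials receiving"

# Simple local sanity check: reject a message without a JIRA key prefix.
if printf '%s\n' "$msg" | grep -qE '^OLE-[0-9]+ '; then
  echo "message OK"
  # svn commit -m "$msg"   # actual check-in (requires a working copy)
else
  echo "missing JIRA key" >&2
  exit 1
fi
```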

Tips on How to Develop a Contribution Properly

Other Ways to Contribute

  • Code contributions are not the only things we might seek from project members or outside organizations; documentation and testing contributions should also be accepted.