SirsiDynix Etherpad

From Code4Lib
Jump to: navigation, search

"Integrated Library System Platforms on Open Source"

by Stephen Abram, MLS, FSLA ~ Vice President of Innovation, SirsiDynix

Chief Strategist, SirsiDynix Institute


This document was copied from the original, leaked report posted on WikiLeaks (<http://wikileaks.org/wiki/SirsiDynix_Corp_restricted_lobby_paper_against_Open_Source_technologies,_Sep_2009>) for the purpose of collaborative annotation, fact-checking, and commentary/critique.

This is a snapshot of the Etherpad document taken 1-Nov-2009 at 6:45 EST and may not reflect more recent revisions of said document.



Introduction: caveat emptor

On February 18, 1815, Hector M. Organ purchased 111 hogsheads (111,000 pounds) of tobacco from Peter Laidlaw and Company. It was the same day that the news broke of the signing of the Treaty of Ghent between the United States and Britain, which ended the War of 1812 and lifted the naval embargo that had drastically depressed the price of American tobacco by 30 to 50 percent.


Organ, who had discussed the news of the treaty with his brother, speculated that the price of tobacco would rise within the next two days. Laidlaw, however, was unaware of the news at the time of the sale. During the discussion of the contract, Laidlaw asked Organ if he was aware of any reason for the price to be higher. But Organ remained silent about the lifting of the embargo, and so kept the price low.


The next day, when prices rose, Laidlaw incurred a large loss on the sale relative to the previous day's price, and repossessed the tobacco by force.


A lawsuit ensued, which eventually reached the Supreme Court, where a unanimous ruling from the John Marshall court established the "caveat emptor", or "let the buyer beware", doctrine in the United States. Under this ruling, "the buyer cannot recover from the seller for defects on the property that rendered the property unfit for ordinary purposes." While this ruling happened almost two centuries ago, some buyers still ignore the most critical facts of their purchases.

http://en.wikipedia.org/wiki/Laidlaw_v._Organ
Odd case to support "caveat emptor", when you think about it: it was the seller who was unhappy with the deal, not the buyer.


Today, we see that happening when libraries enter talks about moving their Integrated Library Systems to open source platforms. What we have found is that they are often unaware of the significant drawbacks of open source systems and of what those systems cannot offer at this point in time.


Therefore, to help buyers become aware of the limitations of open source, we set out to clarify what open source is, how it is different from proprietary software platforms, and why Integrated Library Systems (ILS) are not ready for open source at this point.


So what is open source?

The concept of open source is fairly misunderstood and quite vague.

I'd like some kind of citation with this. Open Source is not misunderstood or vague in the library community.
Most organizations courting the idea of open source development do so because they feel they can project their dreams and desires onto a blank slate and have the features they want sitting at their fingertips quickly and easily.
Actually I think most organizations see OSS as on the whole cheaper. This may or may not be the case at the outset, but I believe that's the most obvious belief. No one thinks one system is going to fulfill all their hopes and dreams, proprietary or otherwise. With a tiny bit of research, they also likely discover that there is someone in their area specializing in OSS customization and support. It's good to have someone local and in your time zone to call upon.


This is a misunderstanding of how open source software development works. By definition, "Open source is an approach to the design, development, and distribution of software, offering practical accessibility to a software's source code."

Source: http://en.wikipedia.org/wiki/Open_source


A more technically correct term to define what open source is would be "peer production development," meaning that the open source model allows concurrent input of differing agendas and ideas to the development of software. Essentially, anyone can join the collaboration effort with the goal of making it stronger and more feature-rich.

He quotes a definition, but never gets past the first point. OSS isn't only about design.


Some of the most successful open source developments include the Linux operating system, the Apache HTTP Server, the Internet Protocol address system, and Mozilla's Firefox browser.

Also: Sakai, Moodle, Drupal, TYPO3


The open source community repeatedly points to these efforts as the poster children of how successful open source can be. However, each of these developments has a major issue in common: they were developed because the public demanded it--they each had a critical mass.


Nevertheless, it should be noted that it is rare for completely open source projects to be successful. Rather than focusing on best-in-class software choice decision-making, these projects often end up being archipelagos of systems driven by a philosophical principle that is anti-proprietary.  

Criteria for measuring success needed?


How are open source developments and proprietary platforms different?

There are a number of assertions that proponents of open source claim as the strengths of open source, including:

  • Total cost of ownership (TCO)
  • Opportunity costs
  • Software as a Service (SaaS)
  • Features and Functions
  • Customization
  • Security
  • Networking
  • Open Formats
  • Necessary expertise
  • Testing
  • Integration
  • Community-driven
  • Scalability
  • Speed
  • Reliability

There are many more arguments on behalf of the open source community, but we will focus our attention on these subjects due to the importance of these assertions.


Total Cost of Ownership (TCO): The Real Price

Open source proponents state that open source has a much lower price and a much lower total cost of ownership (TCO). What they tend to leave out, however, are the entry costs of switching systems.

Does this suggest that if you're a brand new library, you should go Open Source? Is legacy ever a really good reason to use software?
This also highlights the danger of vendor lock-in, which a lot of libraries are trying to get away from with FLOSS
Especially in the library market, the two main open source players haven't been around long enough nor do they have enough clients to provide evidence for this argument.


There is a difference between price and TCO. Open source proponents tend to focus on the "free" license that commits them to the software. And if it is free, they are not committed to keeping it, since no costs are out of pocket. They can even switch freely if it does not work out for them.

I have never actually heard this argument. Would love a citation.


However, all software has a true TCO, which includes the sales price, initial implementation time and costs, any hardware and software upgrades, hosting costs, maintenance and technical support, and training (or re-training). It is important to determine the overall cost of adopting a new model.
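To make the TCO comparison concrete, here is an illustrative sketch in Python. The cost categories are the ones listed above; every number is a made-up placeholder, not real vendor pricing.

```python
# Illustrative TCO sketch: cost categories from the paragraph above; all
# figures are invented placeholders, not real vendor pricing.
def total_cost_of_ownership(costs: dict, years: int = 5) -> int:
    """One-time costs plus recurring costs over a planning horizon."""
    one_time = sum(costs.get(k, 0) for k in ("license", "implementation", "migration"))
    recurring = sum(costs.get(k, 0) for k in ("hosting", "maintenance", "training", "upgrades"))
    return one_time + recurring * years

# A "free" license with high recurring costs can still cost more overall.
oss = total_cost_of_ownership(
    {"license": 0, "implementation": 40_000, "hosting": 6_000, "maintenance": 15_000})
proprietary = total_cost_of_ownership(
    {"license": 50_000, "implementation": 20_000, "hosting": 4_000, "maintenance": 10_000})
```

Either column can come out cheaper depending on the actual figures, which is exactly why the overall cost, not the license price, is the number to compare.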

In short: PLEASE DON'T JUMP SHIP. IT WILL HURT!


It is very unlikely that an open source solution is any less expensive than a proprietary solution. In fact, in all of the data SirsiDynix has collected, we are not seeing quotes that conflict with this assertion. Indeed there are very few green fields in the ILS marketplace. Most libraries already have an ILS and receive upgrades as part of their maintenance contract from us or other proprietary vendors. These maintenance contracts are a small percentage of the initial price. 

I am pretty sure that at the UK AGM, some people were told they'd be responsible for upgrades - can anyone else confirm?
As far as I understand it, Unicorn/Symphony customers have to apply their own upgrades and patches
Correct. Unless you are a SaaS (Software as a Service) customer, you are responsible for downloading patches / upgrades, applying them yourself and doing any cleanup afterward.
And if things go wrong, at least as of 2008, the last time I tried it, you can only get support for upgrades during Eastern US time zone working hours, *even if you have paid extra for 24-hour/critical support*.


To convert to an open source option like Evergreen or Koha using vendors like Equinox or LibLime the library must start over with conversions and implementations, usually paying another vendor or consultant to accomplish these. As open source companies assert, it is free like kittens, not free like beer.

This is also true for proprietary software. The difference is that with OSS you might have more freedom of choice as to whom to contract to do the above; or you could do it yourself if you understand the code and/or available documentation
Generally open source companies don't say "free like kittens" they say "free as in free speech, not free as in beer" http://www.gnu.org/philosophy/free-sw.html
prev. commenter, the free-as-in-kittens argument says that there are hidden costs after the free acquisition (taking care of the animal for the rest of its life), which isn't the case with the free-as-in-beer argument. Technically speaking it is a valid point to make, but still very suspect... I can't find any indication of Autonomy Interwoven actually being an "open source" company; can someone cite one way or the other?
prev. post The point is he's wrong, open source companies don't say that, they say 'free like speech, not free like beer' because it may cost money.


Generally there will be significant limitations to the hardware and operating system options. This limits the ability to cooperate consortially or share resources with host cities or institutions that may use a different standard. The library is at risk of being an island in the community.

This is also true for proprietary software. The difference might be that with OSS you can take the available source code and recompile it or otherwise make it work within whatever hardware or other restrictions apply; or possibly have more freedom in choosing who can do it for you.


SirsiDynix offers--and has offered for many decades--a wide variety of options for servers, operating systems and plug-ins. Open source ILS offerings do not offer the diversity of choices that SirsiDynix offers.

But, as an SD client, do I have the freedom to adapt or extend the software to my own situation? I assume SD's software isn't OSS. =)
Where you can, e.g. with the OPAC products, you will normally forfeit technical support if you make changes that are outside of the ones allowed by the supplied admin tools.  The HIP documentation has a statement to that effect.

[ CHART ] "Open source proponents and proprietary companies disagree on the total cost of ownership."


Proponents claim that even if open source requires more expertise, the TCO is ultimately lower. Companies claim that the required expertise is daunting and that the other costs of proprietary solutions are exaggerated. (These charts illustrate concepts, not actual numbers.)


I'd agree that TCO *might* be higher in OSS vs. proprietary. It might also be higher in one proprietary ILS vs. another one. It might also be higher in one OSS vs. another OSS. The reasons given above don't seem to be dealing with TCO, but "look, OSS is limited", "look, we offer a wide variety of options".. not actually giving reasons why OSS TCO is higher vs. proprietary overall.


Opportunity Costs

Some software isn't compatible with open source. Choosing any solution may foreclose on other software. This opportunity cost may not be apparent for years when the need for the other software emerges.

this is true for open source or proprietary software


In many markets, there are major systems in accounting, intranets, e-learning, and so on that must tie in to the ILS. In many cases, open source is still the minority solution because, for example, the number of Linux desktops is meager compared to Microsoft Windows desktops. By choosing a Linux desktop, a user closes the door on some software because it may never be created for or ported to Linux. Add to this the major changes in allied systems that require an adaptation for the ILS and the issue grows exponentially.

First, the Linux desktop is hardly the right comparison for the ILS. Second, the opportunity costs of a proprietary ILS can be enormous as well - for example, as proprietary ILSs have been slow to add modern search functionality (relevance, faceting), we've lost a couple of years at least in comparison to the current open-source offerings. We've watched helplessly as Amazon has lapped us. This is crucial, user-facing functionality; the cost in user trust of our systems has been huge. (Though I suppose I should add: citation needed).
Two words: fuzzy searches.
since we're talking about server software it might make more sense to compare apache to iis http://news.netcraft.com/archives/web_server_survey.html


So for libraries that choose an open source system, the opportunity to integrate different systems into the solution is limited, at best.

When the APIs are closed, the opportunities to integrate with other systems are nil. With OSS, you can build bridges and work with other libraries to build those bridges together.
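As a concrete example of the kind of bridge-building mentioned above: SRU (Search/Retrieve via URL) is an open, standard HTTP search protocol that many catalogues expose, so any system can construct a query against it. A minimal sketch (the endpoint below is a hypothetical example, not a real server):

```python
# Sketch of integration via an open protocol (SRU 1.1). The base URL is a
# hypothetical example endpoint, not a real server.
from urllib.parse import urlencode

def sru_search_url(base: str, cql_query: str, max_records: int = 10) -> str:
    """Build a standard SRU searchRetrieve request URL from a CQL query."""
    params = {
        "operation": "searchRetrieve",
        "version": "1.1",
        "query": cql_query,              # CQL: the standard query language for SRU
        "maximumRecords": str(max_records),
    }
    return f"{base}?{urlencode(params)}"

url = sru_search_url("https://opac.example.org/sru", 'dc.title = "hamlet"')
```

Because the protocol is openly specified, the same request works against any conforming server, whatever ILS sits behind it.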


SaaS

Real cost savings in the ILS come from improving the architecture of the whole system. This can be done through Software as a Service (SaaS), where a proprietary software developer like SirsiDynix hosts a library's ILS and takes over responsibility for hardware upgrades, updating, backup, and hosting activities.

Some institutions already have people to do this on their own servers. Other proprietary vendors can do this too. OSS-support vendors can do this (e.g. Acquia support everything "drupal", even the custom bits you add on). Is this related to OSS at all?


The emergence of SaaS is growing very fast across all types of technology-dependent industries. It is cost effective, more flexible, and delivers more significant benefits than traditional software installations, with few downsides.


This can result in total cost of ownership savings of nearly 50 percent. With the best professional hosting facilities available, SirsiDynix operates on a global basis. From the point of view of the end user, the ILS workflows and the Online Public Access Catalogue (OPAC) are invisible and are truly adaptable to the Internet.

These last two paragraphs name something that at least two of the current OSS ILS vendors in the US are also offering--hosting at world-class facilities is not something you have to close the source to offer.  These two paragraphs, therefore, really don't relate to OSS ILS solutions *directly*, but more to the companies that offer them.


While some open source ILS companies are offering hosted solutions, these solutions are not at the scale or professionalism [low blow - OS companies lack professionalism?] of a proprietary SaaS solution, nor do they provide the service level agreements or service expectations that SirsiDynix commits to. Some open source SaaS services are hosted on servers in a small vendor's office, which are not professional hosting solutions and come with extremely high risk to the library.

Who, currently?  If you're going to accuse an organization of un-professional solutions, name names, so they can answer questions about it.
I believe that Dynix UK used to host customer systems from their small-ish offices in rural Chesham? Vaguely remember being given a tour of their server room circa 2003.


Features and Functions

When one is evaluating the differences between open source ILS and proprietary ILS, the theories need to be overridden by practical applications.


It is one thing to subscribe to a belief system when one is talking philosophy. It is quite another when the discussion turns to provable issues like specific ILS programs features, reliability, security, power, speed, and ease of customizing the software for specific needs.  The use of agreed international standards is essential to using the wide range of third-party products used in libraries as well as any that may be considered in the future.

In short: don't go OSS just for the sake of going OSS. Doesn't everyone look at the features and functionality before making a decision? Who are these people who do not do so?


Generally, the available open source ILS platforms have less than half of the features and functions of any SirsiDynix ILS

citation needed
. Some of these features and functions may not be essential to some clients; some will be.
A client is not "some". We do not buy software based on how well it serves other people's needs. It's about our needs. If we don't need the functionality you're proud of, we don't need the functionality and don't need your solution.
  However on this order of scale, and with that potential number of needed features, SirsiDynix has the ability to offer libraries the most robust feature set on the market.
Citation needed
It becomes incumbent on the library's decision-making process to clearly outline what they are giving up or planning to develop on their own if they choose to go open source.
I agree you have to plan ahead. It's also relevant if you want to change to another proprietary ILS.


When we compare where we are today with proprietary platforms versus where we are with open source systems, the development priorities for Evergreen and Koha are the same priorities that SirsiDynix had in the 1980s. How many years will it take for them to achieve a full feature set, if ever?

"When will these systems be everything to everyone?" This seems like a strawman argument to me. I don't want a system that makes toast when I don't want toast. I want the system I need, and that is all. If a current OSS can do that, why not use it?
Like true Evolution, there is no end result, just an adaptation to the current environment.


Proprietary software has more features. Period.

Citation, please.
Sometimes fewer features is a "feature".
Proprietary software is much more user-friendly.
Citation, please.
SirsiDynix has been building this ILS for more than 30 years
so, 30 year old code is, somehow by definition, superior to recently developed code?
. It has a feature set second to none.
Citation, please.
It is important to note that a SirsiDynix ILS has two main user groups--the library workers who process the resources for the library (acquisitions, cataloguing, circulation, ILL, etc.) and the end-users who use the OPAC features and other add-ons like self-check. Open source software developers are spending the majority of their time and resources on getting the back room operations right, 30 years after we already completed the process.
Not to be tacky, but here, SA proves that he has not been involved in the Koha community much.  Some very subtle functionality features are being worked on quite a bit lately, not "back room operations."
Is this SD telling us that they've made it and don't intend to improve their systems? Since the process is "completed"? Do we want software that considers itself "completed" in 2009?
It is perhaps worth noting that the Symphony Development Forum for the UK (a group of customers charged by the company with prioritising the development of the product for the UK marketplace) recently disbanded itself.  The Forum members felt that they were wasting their time and that the company was generally unreceptive to their development requests.


Customization

Probably the most attractive claim by the open source community is its ability to be customized by anyone, for anyone. This claim is technically true. Much of the desire for customization comes from Innovative Interfaces Inc. (III) clients. However, III has a long history and tradition of not allowing its clients to write APIs to the underlying data and fields in the ILS.

Speaking as a vendor, many of the customization requests we are getting are from *Horizon* clients, not III.


Meanwhile, SirsiDynix consultants have written custom API programs since the company introduced the Application Programming Interface (API) nearly 20 years ago. Other proprietary software companies like LibLime and Equinox have always offered customization to their clients.

meant open source, or meant examples other than Liblime and Equinox?

However, it should be stated that customization is not without risk. Extensive customization, especially with little or no documentation, can make upgrades and changes increasingly difficult. SirsiDynix mitigates this with our API training as well as the option to have our consultants review APIs for errors and bugs.

I believe you're required to sign an NDA if you take API training? I know the SirsiDynix customer email list for API discussion is restricted only to people who have taken the class.  Customers can't just go out and download scripts that make use of SirsiDynix's API.
That is essentially correct.  When I took API in 2002, you did *not* have to sign an NDA, but I understand they do now.  The list is closed.  API-enabled customers can download scripts from a repository set up a long time back, but non-API-trained customers are not supposed to.
It may also be worth adding in the approximate cost of taking the API course. I've heard five figures but cannot source this.
2002 price: $3,000...anyone have newer numbers?
A figure of $5,000 was being quoted by customers at the 2006 EMEA conference in Barcelona
Their "API" is essentially command-line tools for manipulating the database.  The documentation those tools spit out for themselves, at least in 2003, was inconsistent with their actual workings.
In that case, Horizon customers have always had a free (i.e. no charges for use or training) "API".  Command line tools for performing dozens of tasks (including bulk bib and item deletion, importing and exporting MARC records, (re)indexing tasks, consistency checking, etc) are (and always have been) freely available to Horizon users and these tools are updated (inc. usage docs) with each release of the product.


In the open source world anyone can make significant changes to open source code. This is often presented as a great option to management who don't completely understand the consequences of too much customization. Too much source code change can result in completely new versions that are neither forward nor backwards compatible with the innovations of the overall open source community. Rogue programming teams may decide to create a better version, while exclaiming "Damn the torpedoes."

The ability to fork software is a strength.  Actual forks may or may not be a good thing.
The company effectively "forked" the Horizon product by adding region-specific customisations.  As these regional enhancements were not fully integrated back into the core codebase for the product, many international customers have been unable to upgrade to the version of Horizon currently used by US libraries. My understanding is that the Unicorn/Symphony product is also forked for the UK market, but that these customizations are implemented via the product's API.
The result is that the relationship to the core kernel of the software can be broken or made "odd". In some of the big open source communities, there is an individual or group who gives permission to make changes in the software.
This is a lesson that has been learned by many people who work in OSS development and support. Our Drupal developers Will Not Hack Core, and it's stated specifically in our contract.
For example, Linus Torvalds, the genius behind the Linux platform, is materially involved in every Linux code addition to protect the kernel.
Linus is not involved with every Linux addition. He explicitly does not want to oversee every change. [ Find Reference ]


Customization can be a risky undertaking. Again, customization comes with the caveat emptor warning.

How risky, though, is the inability to do customizations that are needed?
Agreed. We had to undertake customizations "in house" to deliver functionality that was missing from the product and that was unlikely to be delivered by the vendor in the foreseeable future.
I imagine such in-house customization of a proprietary product is at risk of breakage come time to upgrade.
Indeed, but we felt that the benefits of the customizations outweighed the time required to re-apply them after upgrades. Obviously, had the functionality been added by the vendor themselves in a timely manner, customization by customers would not have been required.


Libraries considering an open source ILS should seriously review how they handle version control and customization, as well as who handles the responsibilities and contracts for customization. If they don't, they may end up being an outlier or be forced into a proprietary environment like Red Hat.

As opposed to a proprietary environment like SirsiDynix Unicorn/Symphony?  SA has an interesting definition of "proprietary", I am thinking, and it's probably not the same as the one the rest of us use.
This equally applies to vendors such as SirsiDynix -- see previous comments regarding the forking of Horizon for the non-US market.


Security

Open source is often represented as more secure. This, too, is debatable. Some of the most security-conscious entities, like the United States Department of Defense, restrict the use of open source software for fear that it could pose a terrorist opportunity.

http://www.informationweek.com/blog/main/archives/2009/10/dod_says_yes_to.html;jsessionid=K1DMBECLDPL1XQE1GHPSKH4ATMY32JVN : DOD Says Yes To More Open Source, 28-Oct-2009
http://www.cmswire.com/cms/enterprise-20/cia-invests-in-open-source-lucene-solr-search-004830.php : CIA Invests in Open Source Lucene, Solr Search, Jun 16, 2009


It is not an accident that SirsiDynix ILS systems and SaaS operations are the choice of the U.S. military--possibly the most security-conscious environment in the entire world.

No, it's not an accident--merely good marketing--but it may be an accident that they do not know that, without purchasing additional layered products, patron and staff userids and passwords are stored in the database and transmitted in plaintext (at least, as of Unicorn GL 3.1, this was true).  It seems to me that calling OSS ILS products insecure here is a pots-and-kettles problem.
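For reference, the standard alternative to plaintext credential storage needs nothing beyond a language's standard library. A minimal sketch in Python (the iteration count and salt size are illustrative choices, not a recommendation from any vendor):

```python
# Minimal salted password hashing sketch using only the standard library.
# Store (salt, digest) instead of the plaintext password; the iteration
# count and salt length here are illustrative.
import hashlib, hmac, os

def hash_password(password: str, salt: bytes = None) -> tuple:
    """Return (salt, digest) for storage; a fresh random salt per password."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)
```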


In open source, anyone can release code, but extensive testing is needed to ensure that code is secure. The three big open source applications--Firefox, Apache, and Linux--have communities large enough to do this testing and to find and isolate threats.

Are they even able to do it better?
  It would be naive for the library market to think that the ILS community of open source programmers is large enough to create this assurance--it isn't even close.
**If** the community is not large enough, is this document intended to prevent it from ever becoming so?
Define "large enough", please.  I would contend that the Koha community, in particular, is plenty large enough to handle this; indeed, we are doing so.  It is naive of SA to think that all of those programmers have to be on paid staff or customer-beta-testers to produce valid test results.


To date the ILS has not been a target for security threats, although associated systems for servers and communication have. This may change if a large installed base of open source ILS platforms emerges.

http://www.librarytechnology.org/ltg-displaytext.pl?RC=10575. 
Breeding states "The article describes the author's observation that at least some implementations of the Dynix ILS lacked even the most basic security precautions. He provides a basic recipe for discovering servers that run Dynix and identifying ones with lax security, and what exploits might be successful in gaining access."
More recently (pos. 2006?), a method of gaining access to the entire operating system filesystem (via a directory traversal exploit in the outdated JBoss software used by the Horizon/Dynix ILS) was posted by a customer.  Rather than roll out a more recent version of JBoss to customers via an upgrade, SirsiDynix instead released a document explaining how to implement a "quick fix" workaround. As per the Breeding document, it would be a safe assumption that many customers did not implement the fix.
The Horizon OPAC currently supplied to UK academic sites contains numerous security holes that, to date, have not been fixed by the company. At least one academic site has had their OPAC server compromised on multiple occasions.


Networking

Some open source vendors claim that open source is more network-friendly and relies on the Internet and other networks for its performance.

Citations?
Unfortunately for the ILS community, this is a grossly over-stated exaggeration.


The proprietary ILS market currently utilizes large-scale networks that work at speeds and performance measurements that far exceed any open source ILS installation anywhere.

Citation, please.
In fact, SirsiDynix SaaS solutions are world class, and our references in consortia and large complex accounts demonstrate the ability of a SirsiDynix ILS to perform on a network scale at excellent performance.
I thought Georgia developed Evergreen/Pines primarily because Unicorn failed to scale?
See #8 at http://evergreen-ils.org/dokuwiki/doku.php?id=faqs:evergreen_faq_1


Open Formats

An open format is a published specification for storing digital data, which can be used and implemented by anyone. As a result, the format is interoperable among diverse internal and external platforms and applications.
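The practical payoff of an open format is that anyone can implement it. MARC's record structure (ISO 2709) is publicly specified: a 24-byte leader, then a directory of 12-byte entries (3-byte tag, 4-byte field length, 5-byte start position) ended by a field terminator. A minimal, illustrative reader of the directory tags follows; it is not a full MARC parser (a real library such as pymarc would be used in practice):

```python
# Minimal illustration of reading an openly specified format: pull the field
# tags out of a MARC (ISO 2709) record's directory. Not a full MARC parser.
FIELD_TERMINATOR = b"\x1e"

def directory_tags(record: bytes) -> list:
    """Return the 3-character field tags listed in the record's directory."""
    directory = record[24:].split(FIELD_TERMINATOR, 1)[0]  # skip the 24-byte leader
    assert len(directory) % 12 == 0, "each directory entry is 12 bytes"
    return [directory[i:i + 3].decode("ascii") for i in range(0, len(directory), 12)]
```

Because the byte layout is published, this reader required no vendor documentation at all, which is exactly the interoperability the paragraph above describes.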


The argument by the open source community is simply that open formats are better. SirsiDynix agrees. We try to use open formats and international standards as much as possible. Ideally, this would be all the time. But the reality is that open formats are not always the most "open" to formats that a host city or institution uses. It is our opinion that the ILS works with the formats that are needed by their clients rather than engaging in missionary work for greater openness.

How about something as basic as RSS feeds that aren't created via screen-scraping?


Data management and migration highlight a number of these issues. Open formats are helpful in this area but even accepted standards like MARC have many legacy issues, data quality repair issues, and a company like SirsiDynix has infinitely more experience in migration and implementation issues than any new vendor, open source or not. [If you wanted to argue that LibLime or Equinox do not respect the skills and depth at SirsiDynix, just ask why they have hired so many alumni from SirsiDynix.]

I think question is rather "why did so many experienced staff leave SirsiDynix in a relatively short period of time?" Whilst many high profile/senior staff joined LibLime and Equinox, many others were also snapped up by the other commercial vendors (III, Ex Libris, Polaris, etc). At the time, the customer base expressed concern about the apparent "brain drain" and loss of product knowledge from the company.


Necessary Expertise

Is open source harder to deploy? All software solutions require some expertise to deploy, secure, and maintain. Some open source software is technically challenging and requires considerable expertise. This is a particularly important point in the library market where there is rarely a large systems department with a variety of programming levels and skills quickly available internally.

Is this intended to convey that SirsiDynix's software does not require equal expertise to deploy?
One admin's experience--39 hours for a Unicorn upgrade (v 2001-2002), vs two hours for a Koha installation.  Which one sounds "harder?"
As a Horizon admin of 4 years and counting, I still feel that I barely understand how the system works. The product documentation, whilst comprehensive, is also voluminous.


Libraries considering open source should clearly evaluate the skills required.

Should read: Libraries considering hosting and running software should clearly evaluate the skills required.
This might involve hiring an expensive consultant.
Consultancy charges made by SirsiDynix, e.g. for on-site training or customisation work, are on a par with the charges made by most commercial consultants and, one would imagine, prohibitive for many smaller libraries.
Libraries would be well advised to remember that they have a long tradition of working with application software, and that managing a proprietary ILS involves a different skill set from managing an ILS that must be extensively customized to assure performance. Application programming is different from development programming.
Open source ILSes are also customized to meet user needs


The employment market for development programmers is different from that for application programmers. It also requires a different type and level of project manager and software leadership. These people are extremely rare and cost more. And most libraries cannot cover the salaries required to retain the talent they need. Moreover, these programmers won't necessarily be in the library programming space, meaning that libraries will have to compete with a larger development market than the limited library programming space. Indeed, it is an interesting strategy for some library programmers to upgrade their skills in the library open source environment and leave as their worth increases.

It would be interesting to note how SirsiDynix has coped with this very issue, as the company has changed hands twice in recent years and shifted operations from Huntsville, Alabama to Provo, Utah.
The non-compete clause in SD employee contracts (oft rumored to be 2 years) would mean that SD developers would be unable to work within the "library programming space" once they leave the company.


Testing

SirsiDynix has rigorous testing procedures. These are brought about through large investments in automated professional testing programs and procedures, regression testing, a mature beta testing process, managed protocols, and testing with partners. We certify some third parties using actual tests to ensure that the customer experience is as seamless as possible. We test for scalability and for the stress of large numbers of users. We test for all major browsers. We test on all supported servers and operating systems. We test aggressively and well. We test at every step of the development process. We do all of this before we have actual clients partner with us to beta test the pre-release candidates. Over the past few years our product has arrived in new releases with a higher standard of performance and more features than ever before. We have released 20 major releases and upgrades in the past two years on time.

The Horizon upgrades in the last two years have been region-specific and focused on the US customer base. Many Horizon customers in Europe are running systems that have not seen an upgrade for several years. However, the company has assured customers that the next release of Horizon will be a "global" release suitable for many European customers (there is no current date for this "global" release). The lack of upgrades means that libraries are tied to using older Windows OSs and outdated versions of Java (e.g. 1.4.2). It should also be noted that these libraries are still paying the same levels of maintenance costs, despite receiving no fixes to known security holes or upgrades to functionality. "Caveat emptor" indeed, Stephen!


This is not the pattern that open source initiatives follow.

Citation, please.
Testing is the responsibility of the original programmer and their buddies.
Inflammatory, not factual.
Very poor choice of phrase.
Then the philosophy is "caveat emptor", or "Installer beware!" And the testing heavily falls on the early adopters.
This, I acknowledge, is true, but early adopters of any software project, open or closed source, often bear the brunt of testing or bug identification.
SD should not try to present here that they have been the barely-adequate testers for the last three decades.  The memory of pre-2002 Unicorn upgrades is still present in people's minds.  We're old, but not that old.  Anecdotes from the period should adequately document that whatever "testing" they were doing was *not* adequate, in those days.
Dynix (pre-merger with Sirsi) was also regularly accused of releasing untested and/or buggy upgrades. Jack Blount admitted as much and promised to change the testing regime.


Yet, when reviewing the list of bugs in the open source ILS software as compared to the same bugs for the proprietary software, investigators have to go back decades in the list to find the same bugs open source platforms are fixing today.

Surely that's simply because Unicorn is considerably older than the OS ILS products?
Seems so to me, too.  Age is not a virtue in and of itself.  Anyone want to switch to DOS?  It's an older, more-developed product than this Windows business...
Except for certain classes of bugs that are age-independent (buffer overruns, etc.), bugs are generally specific to the application they are found in.  Some of the most well-loved proprietary ILS software (e.g. Polaris) benefits from more modern development.  Mature can also mean fossilized.


This is evidence of a very young development program and the lack of real management in the process. The open source process is too organic and lacks tight priorities and strong management oversight.

As someone who has been keeping half-an-eye on the OS ILS development program over the last 18 months, it seems to be relatively well organised with some clear (customer driven) priorities.


Integration

Some argue that it's difficult to integrate open source with proprietary solutions. It's always a professional task to make software work well.


The truth is that the software world will always be one of hybrid solutions. SirsiDynix has a long tradition of using open source in our solutions, properly tested and integrated, as well as ensuring that our APIs and portal solutions allow for integration of any desired solution. We also ensure that these work with all of our ILS solutions, multiple platforms, operating systems, servers and browsers.

Not to ask too pointed a question here, but when will SD come out with a version of Symphony that will allow use of MySQL?  It's been a customer request at least as far back as 2003, about the time I joined the SD user community. (should be sourceable)  If you want SQL compliance, you have to go with Oracle, which is spendy, or you can use the older ISAM databases.  If you're committed, show it.
Does Symphony actually use Oracle as a relational database, or as a glorified file system for buzzword compliance?
Unicorn originally used an ISAM flat-file database.  Unicorn/Symphony customers have the option of using Oracle instead, if they would prefer that.  My understanding is that the ISAM flat-file structure was ported verbatim to Oracle, so it makes little (or no?) use of relationships, normalization, referential integrity or triggers to maintain consistency.
Perhaps ironically, Horizon actually DOES have a fairly well-normalized, sensible relational database structure. Horizon has been cancelled and Unicorn/Symphony continued instead, with its apparent flat-file-ish schema. Horizon's reasonably normalized schema does make it fairly easy to 'hack' additional features on, compared to the typical proprietary ILS.
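To make the distinction concrete, here is a minimal sketch of what referential integrity in a normalized schema buys you, something a flat-file structure ported verbatim to Oracle would forgo. This uses SQLite for illustration only; the table and column names are hypothetical, not Horizon's or Symphony's actual schema:

```python
import sqlite3

# In-memory database; foreign-key enforcement is off by default in SQLite.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Hypothetical normalized schema: each item row references a bib record.
conn.execute("CREATE TABLE bib (bib_id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("""CREATE TABLE item (
    item_id INTEGER PRIMARY KEY,
    bib_id  INTEGER NOT NULL REFERENCES bib(bib_id))""")

conn.execute("INSERT INTO bib VALUES (1, 'Hamlet')")
conn.execute("INSERT INTO item VALUES (100, 1)")  # valid: bib 1 exists

# An orphan item (no matching bib record) is rejected by the database
# itself, instead of silently accumulating as it could in a flat file.
try:
    conn.execute("INSERT INTO item VALUES (101, 99)")
    orphan_allowed = True
except sqlite3.IntegrityError:
    orphan_allowed = False

print(orphan_allowed)  # False: the constraint held
```

The point of the "hackability" comment above is exactly this: when the database enforces the relationships, a third party can bolt on features without reverse-engineering the application's consistency rules.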

Community-driven

"Open source exists because a large community of motivated, generous programmers work together. Some are corporate employees, but open source development thrives on volunteers. Even users without programming or other technical skills find ways to help by filing bug reports, writing documentation, or answering questions on email lists." 

Source: http://www.netc.org/openoptions/pros_cons/principles.html#community . It continues: "Current users report a sense of belonging and accomplishment by sharing and collaborating. This cooperation and focus on the common good resonates with why they work in education." I think the "common good" in this quote is broader than the common good of SirsiDynix customers in SA's following paragraph.


There is no difference between this assertion about open source and the SirsiDynix approach. Indeed we have many decades of experience in tracking development suggestions and requests and testing, reviewing and replicating bug reports from programmers and users alike. SirsiDynix also has a history of participation in the care and feeding of a community of users and programmers that share and collaborate with us and with each other for the common good.

Is there a difference between an open community and a community that is forced to be closed?


Scalability

Some open source system vendors describe their software as "consortially aware" or having been built for consortia from the ground up. This is fairly weaselly language.

...said the pot to the kettle
Yes, this software can be "consortially aware" without any of the attendant performance (one didn't even support the Z39.50 international ISO standard until recently!). Having been designed for a single consortium such as PINES does not guarantee that the software will work for another consortium's needs, particularly with the diversity of needs and variety of system architectures that exist in a fully dimensional marketplace.
"The shortcomings of proprietary systems are pretty much the same across all vendors, and the biggest shortcoming is in “multiness” - the number of sets of rules you can use." -- http://www.swissarmylibrarian.net/2009/10/29/ma-open-source-info-session-notes


If clients are concerned about their ability to scale they should check the actual performance of the ILS in actual complex and consortia environments. The PINES system is actually a very poor performer at its current scale of small public libraries. For example, all large library systems in Georgia have generally decided to stick with SirsiDynix. In fact, several library systems in Georgia have declined the use of the Evergreen system specifically due to scalability and response times.

Citation, please.
One tester of that system wrote, "Slow response time in Evergreen Staff Client. This includes unexpected "crashes" and "frozen" screens which may or may not be due to response time lag. This problem causes extreme delay and long lines at Circ Desk and results in both major staff and patron frustration."
Source: http://pines.georgialibraries.org/files/PINES%20priorities%201009.xls , cell J13. The solution, scheduled for November (not clear what year as the spreadsheet isn't dated - need to find its context on the site), is a server hardware upgrade. This isn't a "tester" who tested Evergreen and declined to use it, as far as I can tell: it's a problem report from libraries using the PINES system. PINES != Evergreen.


SirsiDynix encourages libraries to visit our large-scale clients and see the sub-second search performance with tens of thousands of users. Such SirsiDynix Symphony clients as the Toronto Public Library, Alliance Library System, Los Angeles County Public Library, and more enjoy very strong and stable performance.

For searches, no need to visit these places in person.  Try comparisons directly from their websites.  Toronto and PINES appear to have similar search speeds (a few seconds) when searching for "fiction".  A keyword search for "It" fails altogether at Toronto, because it is a "stop word".  No one appears to have "google speed", but given the scale of Google's "hardware", I'm hardly surprised.
This is not the case with the small libraries of Georgia who are captive to sub-optimal open source systems. Indeed, despite stating that they were built for consortia, simple consortia features are not available or supported. Add to this the even better performance and TCO improvements of the SirsiDynix SaaS solutions and we offer a much better solution with significantly superior performance.
Sub-second search performance doesn't require closed-source, it requires efficient hardware and programming.  Hardware is (relatively) cheap, particularly in hosting environments, and having many interested programmers looking at code, rather than a hired few, is one way to lead to increased code performance.  The hired way *may* be an effective one, but it is certainly not the *only* one.
Using a 2Mb broadband connection, IE8 and HTTPWatch to measure the response time to the nearest millisecond, a sample search for "Germany" on a SirsiDynix SaaS hosted instance of "Enterprise" for a US public library took 7.231 seconds to load the core content of the page (i.e. main HTML, clickable search results and facets). Cover scans fully loaded after 16.352 seconds. The final HTTP request was completed at 64.611 seconds. From a user's point of view, the page was usable & browseable after 7 seconds. Many factors may have affected this sample test, but it was far from "sub-second". The test was performed on a Sunday morning, so it is doubtful that the SaaS server was under load at the time. Subsequent searches completed more quickly (averaging 3.724 seconds for the page to be usable), which indicates the SaaS implementation relies on the caching of the page's JavaScript elements to achieve optimum performance (n.b. this is not a criticism!). A visual inspection of the HTML shows that the developers have not chosen to optimize the HTML (e.g. large sections of unnecessary whitespace take up around 45% of the HTML).
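Anyone wanting to repeat a rough version of this timing test needs nothing more than the Python standard library. This sketch times only the main HTML fetch (approximating the "core content" figure, not the full page load with cover scans and scripts); the OPAC URL shown is a placeholder, not a real SirsiDynix endpoint:

```python
import time
import urllib.request

def time_fetch(url: str) -> float:
    """Return the seconds elapsed fetching the full response body for url."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()  # read the whole body, as a browser must before rendering
    return time.perf_counter() - start

# Example (placeholder URL; substitute your own OPAC search-results page):
# elapsed = time_fetch("https://example.org/opac/search?q=Germany")
# print(f"core HTML loaded in {elapsed:.3f} s")
```

Unlike HTTPWatch, this ignores images, stylesheets and JavaScript, so it will understate the user-perceived load time; it is only a first-order check on the "sub-second" claim.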

Speed

End-users are not satisfied with sub-Google performance. The expectation has been set outside of the ILS market and the ILS market doesn't get by without meeting it. Therefore, SirsiDynix is focused on speed.

As opposed to what?  Useful results?  Enter "to be or not to be" into a Unicorn/Symphony search box, and enjoy the meaningless-to-patrons error message that results.  Try the same thing in Google, Koha, or Evergreen, and enjoy finding Shakespeare.  Speed is not the be-all and end-all of search-engine work.  Failing to produce meaningful results would, I think, constitute "sub-Google" performance.
Agreed. Also try typing in the name of any book or movie whose title starts with 'and' -- without putting the title in quotes.
Or the example that Joshua Ferraro seems to love:  "It."
See comments in previous section -- the Enterprise developers chose to include a sizeable amount of whitespace "bloat" in the HTML (i.e. it is not optimized for speed).
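The stop-word failure mode the commenters describe is easy to reproduce: an indexer that naively strips stop words from the query has literally nothing left to match when the title *is* a stop word. A minimal sketch, with an illustrative stop-word list and catalogue (not any actual ILS's behavior):

```python
# Illustrative stop-word list and tiny "catalogue" of titles.
STOP_WORDS = {"a", "an", "and", "the", "it", "to", "be", "or", "not"}
TITLES = ["It", "And Then There Were None", "Hamlet"]

def naive_search(query: str) -> list[str]:
    """Drop stop words from the query, then match remaining terms in titles."""
    terms = [t for t in query.lower().split() if t not in STOP_WORDS]
    if not terms:
        return []  # the whole query was stop words: nothing left to search for
    return [title for title in TITLES
            if all(term in title.lower() for term in terms)]

print(naive_search("It"))                  # [] -- the title "It" is unfindable
print(naive_search("to be or not to be"))  # [] -- entire query discarded
print(naive_search("hamlet"))              # ['Hamlet']
```

A more careful engine falls back to the raw query (or indexes titles as whole phrases) when stop-word removal leaves nothing, which is presumably what Google, Koha and Evergreen get right here.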


Our stress testing is done at the professional stress-testing facilities of Sun Microsystems and Microsoft, and on UNIX servers. We test at 50,000 users per configuration for over a week. We use advanced automated testing procedures that cost money but deliver a definite positive result and tell us where to invest time in improving the performance of our software.


In addition we also test for all major browsers and try to ensure compliance with all standards and browsers evident in our market. This includes PC and Macintosh.

The W3C HTML Validator lists 20 errors and 68 warnings for a sample search results page from a SaaS instance of Enterprise (http://is.gd/4JYC7). It is not possible to check the Unicorn OPAC, as that seems to require a valid session ID (i.e. you cannot create a persistent URL for a search).
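Full W3C validation can't be reproduced offline, but a far cruder smoke test, whether a page is even well-formed markup, can be run with the standard library alone. This sketch uses an XML parser, so it only applies to pages claiming XHTML; the snippets are made up for illustration:

```python
import xml.etree.ElementTree as ET

def is_well_formed(markup: str) -> bool:
    """Crude check: does the markup parse as well-formed XML?
    Much weaker than W3C validation (no DTD/attribute checks),
    but it does catch unclosed tags and mismatched nesting."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

print(is_well_formed("<div><p>ok</p></div>"))   # True
print(is_well_formed("<div><p>broken</div>"))   # False: <p> never closed
```

Twenty validator errors on a vendor-produced results page is the kind of thing this sort of check would flag in a continuous-integration run before release.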


This has not been the case in the open source ILS systems. If anything, one of the major complaints by users and clients is that it is so slow. Simple searches in PINES can hang for minutes

data, please
, resulting in the "searching..." bar popping onto the screen to encourage user patience. This is unacceptable in ILS software, which is why we test our system so rigorously.


Reliability

Finally, one of the biggest claims of open source proponents is that it is more reliable. They argue that since any programmer can find and fix bugs, the software will be repaired and improved more quickly. There is, however, no guarantee that the bug you want fixed will engage a member of the community to fix it. A bug fix can work in one environment and not others and the testing is up to each individual organization in open source.

Whereas in closed source, the customer is wholly dependent on the supplier's decision to put money and time toward fixing the problem.  This is not a problem if the needs or the problems are widespread, but if your issue is small or you are one of only a few customers who have it, you have no way of fixing it yourself and no guarantee that your supplier will deem the problem important enough to fix.


With open source, the advantage depends on the participation of enough competent programmers who are deeply committed to the entire development process. Without enduring, sufficient, talented interest, an open source project is doomed to fail, and many do.


Unfortunately for the open source proponents in the ILS community, there currently isn't a critical mass that is demanding the development of open source software. At this point in time, the open source community for ILS software is tiny.

Really? Really? That'd be why there are several OS ILS support companies, conferences and oh, I don't know, all the people currently editing this document? And surely, the reason for writing this screed is because SD can see that people are looking to change and move away from stagnant/"mature" software/companies to software that allows them to move with their users' needs?


Therefore, the reliability of ILS software developed on an open source platform is questionable. Just like proprietary software, the reliability of an open source program depends on clear feedback after rigorous use in a variety of environments. But that simply cannot be the case at this point in time because the variety of environments is small, and the critical mass needed has not been reached.

Again, SA asserts that the user/developer community is too small to be effective.  I cannot speak for the Evergreen community, but it seems to me that the Koha community has easily surpassed a "critical mass" point, particularly in India, the US, and Western Europe, with a significant and growing Pacific Rim presence, as well.  The Koha IRC is manned nearly around-the-clock with knowledgeable members of the community.


Open Source and Libraries

Although many in the ILS industry are taking an in-depth look at the viability of open source development over the long run, we believe the movement is premature. Moreover, we are joined in our opinion by none other than Cliff Lynch, the head of the Coalition for Networked Information and a leading thinker in the library space.

http://www.cni.org/


Cliff called the development of the open source ILS by OLE, PINES, etc. one of the "stupidest strategies ever undertaken" in the library world. At a time when libraries should be investing in systems to improve the priority issues in the end-user's research, discovery and learning experience, here we have a cadre of libraries investing in the reinvention, or at least recreation, of something they already have, and have at a cheaper cost than the redevelopment effort.

Cliff Lynch clarifies position here: < http://www.librarytechnology.org/blog.pl?ThreadID=134&BlogID=1 >


In addition, these projects do not have a compelling vision of what the end result will be and appear to be driven by library workers' desires rather than institutional strategies or end-user needs.

As a SD customer, I'm not convinced that the company has a compelling vision of either.
As such, they are tying up resources in an open source ILS effort at a time when budgets are constricted and other priorities are much more important and strategic.
As a non-US Horizon customer, a sizeable amount of our budget is tied into the annual maintenance costs for the product, despite not receiving an upgrade to the product for at least 2 years. These costs would easily cover the salary of a competent & experienced full-time developer.


SirsiDynix on Open Source

SirsiDynix is not de facto against open source. We use open source software a great deal in our development efforts, in our software and in our company. We easily support clients using the poster children of open source software: Linux, Apache, and Firefox. We have done so for many decades. Simply put, it's a good solution when it matches the needs of our clients.

Linux support for Unicorn is fairly recent.  2003? 2004?  (source this, someone?)  certainly not "many decades".  Don't know how long Apache has been an option for Unicorn web services, though the staff client is just now getting to the Web--also not "many decades."  Firefox, by that name, has only existed since 2003/2004.


SirsiDynix has been an early leader in building more open library management systems and indeed, being more open to even greater integration. This is especially true in the user experience end of our products where clients have added hundreds of applications onto our OPAC easily using our API strategy. We also have a very long track record in being open to our customers with beta tests, discussion forums, user groups, feedback mechanisms, and more.


However, SirsiDynix has also been in the ILS industry since 1979 and has developed the best-in-class solutions year-in and year-out. We've led the development of some of the most advanced features and capabilities of ILS platforms. So we know a thing or two about what it takes for library systems to be successful.

So you might also recognize that healthy competition and open debate are key to a robust marketplace. It's important for individual libraries and institutions to have opinions, but those opinions also need to be their own. Pressure from other groups - be they other vendors, the open source community, or other librarians examining the evidence and providing perspective - should only help to refine all products and test claims such as these.
It may be anecdotal, but I have the impression that whenever Sirsi and/or Dynix have acquired competing technology via mergers, they tended to discontinue the superior application for business reasons.

While we encourage the development of open formats, we must discourage libraries from jumping headlong into an open source platform on which to operate their ILS. At this point in the production cycle, jumping into open source would be dangerous, at best.


Caveat emptor!

Good advice at any time, and especially if there is a lot of money involved.  Weigh the options, take a good hard look at the TCO, with *real* numbers for your library--get quotes--and talk to real customers and users, not marketing folks or lobbyists.