Competency N – Evaluation

Competency N: Evaluate programs and services using measurable criteria.


Meaning and Importance
It’s important to know how a library’s services are being used, and by how many people. How many people came to this program? How many people are using this database? Has attendance changed over time? What factors may have led to an increase or decrease in usage? The answers to these questions can inform how much of the budget should be spent on programs and resources, whether new resources should be added to the collection, or whether existing resources need more marketing.

Measurable Criteria – Statistics
Knowing how many people are using a service, or attending a program, is a quick indicator of how successful that program actually is. It’s not the ONLY indicator, but comparing current attendance against prior sessions, watching for upward or downward trends, and comparing the times and days that programs fall on is a great way to see WHY some programs are more successful than others.
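To make that kind of comparison concrete, here is a minimal sketch of the idea in Python. The program names, dates, and attendance counts are hypothetical examples, not real figures from my library:

```python
# Minimal sketch: comparing attendance trends by program and day of week.
# All program names, dates, and counts are hypothetical examples.
from collections import defaultdict
from datetime import date

attendance = [
    ("Storytime", date(2023, 1, 3), 14),
    ("Storytime", date(2023, 2, 7), 22),
    ("Storytime", date(2023, 3, 7), 27),
    ("Teen Game Night", date(2023, 1, 6), 18),
    ("Teen Game Night", date(2023, 2, 3), 11),
    ("Teen Game Night", date(2023, 3, 3), 7),
]

# Group counts by program and weekday, in date order, to see the trend.
by_program = defaultdict(list)
for program, when, count in sorted(attendance, key=lambda row: row[1]):
    by_program[(program, when.strftime("%A"))].append(count)

for (program, weekday), counts in by_program.items():
    direction = "up" if counts[-1] > counts[0] else "down"
    print(f"{program} ({weekday}): {counts} -> trending {direction}")
```

Even a simple printout like this makes it easy to spot that one program is growing while another, held on a different day, is shrinking.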

Similarly, tracking usage statistics for databases, ebooks, computers, and other services allows librarians to see which are the most used and thus the most useful. If every computer is full from opening to close during the week, then perhaps adding more computers would be a good idea. If only two patrons are using a particular database each month, out of a potential user base of thousands, then the money spent on that database might be better moved to a different resource.
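One common way to frame that comparison is cost per use. This is only an illustrative sketch; the subscription prices and usage counts are invented, not real vendor figures:

```python
# Minimal sketch: cost per use for two subscription databases.
# The prices and usage counts are invented for illustration only.
databases = {
    "Database A": {"annual_cost": 5000, "annual_uses": 2500},
    "Database B": {"annual_cost": 5000, "annual_uses": 24},  # about two uses a month
}

for name, stats in databases.items():
    cost_per_use = stats["annual_cost"] / stats["annual_uses"]
    print(f"{name}: ${cost_per_use:,.2f} per use")
```

A database that works out to a few dollars per use is much easier to justify at budget time than one that works out to hundreds.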

Libraries that use grant money to fund a program or service are often required to report back to the grantor about those programs and services. For instance, libraries that receive federal grant money for programs or services need to send in statistics at the end of each year (or quarter). The grantor wants to know that the grant money is being put to good use, and that often means high numbers of visitors or uses. Tracking that information is important for the library as well, as it could potentially lead to more or less grant money in the future.

There are several ways that libraries can track usage data. My current library uses a combination of manually recorded statistics and electronically generated reports. For instance, at the reference desks we keep track of the reference and technology questions asked. We also count attendance at each program we host and at all outreach events. On the electronic side of things, we run usage reports on the number of new library cards issued, database usage, ebook checkouts, book/DVD checkouts, and hours of computer usage. At the end of each month, we collate all of these statistics into a spreadsheet, which is then rolled into a larger spreadsheet at the end of the year.
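As a rough illustration of that monthly roll-up, here is a sketch of how the collation step could be automated with pandas. The folder name and the “category”/“count” column names are assumptions for the example, not the actual layout of my library’s spreadsheets:

```python
# Minimal sketch: rolling monthly statistics CSVs into an annual summary.
# The folder name and the "category"/"count" columns are assumptions.
from pathlib import Path

import pandas as pd

monthly_files = sorted(Path("stats_2023").glob("*.csv"))  # e.g. 2023-01.csv ... 2023-12.csv

frames = []
for path in monthly_files:
    month_df = pd.read_csv(path)
    month_df["month"] = path.stem  # remember which month each row came from
    frames.append(month_df)

year_df = pd.concat(frames, ignore_index=True)

# Totals per category: reference questions, program attendance, checkouts, etc.
annual_totals = year_df.groupby("category")["count"].sum()
annual_totals.to_csv("annual_statistics_2023.csv")
```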

Evaluation
On the librarian side of things, an information professional needs to be able to analyze and evaluate a service themselves. This usually comes into play when deciding on a new service to add to the library’s offerings, or when deciding to remove a service that no longer serves a good purpose. Librarians need to take multiple elements into account: ease of use; whether the service is relevant and necessary to the library and its patrons’ needs; whether it’s up to date and modern; the cost of initial setup and of maintenance over the course of several years; how good the customer service and/or tech support is; whether updates are provided for free or for a fee; and whether the service is easy or difficult to implement.
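One informal way to weigh all of those elements at once is a simple weighted scorecard. The criteria weights and 1–5 scores below are hypothetical placeholders that would come from staff discussion, not from any standard instrument:

```python
# Minimal sketch: a weighted scorecard for comparing candidate services.
# The weights and 1-5 scores are hypothetical placeholders.
weights = {
    "ease_of_use": 0.25,
    "relevance_to_patrons": 0.25,
    "five_year_cost": 0.20,
    "support_quality": 0.15,
    "update_policy": 0.15,
}

candidates = {
    "Service X": {"ease_of_use": 4, "relevance_to_patrons": 5, "five_year_cost": 3,
                  "support_quality": 4, "update_policy": 3},
    "Service Y": {"ease_of_use": 3, "relevance_to_patrons": 4, "five_year_cost": 5,
                  "support_quality": 2, "update_policy": 4},
}

for name, scores in candidates.items():
    weighted = sum(weights[criterion] * scores[criterion] for criterion in weights)
    print(f"{name}: weighted score {weighted:.2f} out of 5")
```

The point isn’t the exact numbers; it’s that writing the criteria down and weighting them keeps the evaluation from resting on a single factor like price or familiarity.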

Vendors will usually send free samples or demos to libraries so that they can evaluate the service, but librarians often have to ask questions and do further research to determine whether the service is truly valuable to their library and its patrons. Even without a demo or trial to work from, it’s often possible to evaluate a vendor based on the information provided on its websites. Librarians must be diligent in analyzing and evaluating a service before adding or removing it. A badly designed new service can be just as problematic as a service that hasn’t been updated in five years and is hopelessly out of date, after all.

Evidence
In INFO 241 Automated Library Systems, we had to complete a vendor evaluation. I chose to evaluate COMPanion Corp., a Salt Lake City-based company that provides several library services, including an ILS called Alexandria. I had previous experience with Alexandria from a former elementary school library job, but I hadn’t had the chance to formally evaluate it or COMPanion Corp.’s other services. This was a good opportunity to do so, and I wrote a paper analyzing the company’s available services and each individual service offering from the perspective of a charter school librarian who was looking to make a purchase.

This involved researching COMPanion Corp.’s websites, which was not entirely easy to do because (at the time) they didn’t have everything in one place. Alexandria had its own website, but most of the detailed information about its programs was on a different site. I had to spend several hours searching for information, gathering it into one place, and then writing a cohesive evaluation.

The evaluation included pricing information, ease of use, available upgrades and add-ons, customer support information, and help documentation. I did this for all of COMPanion Corp.’s applications, which included Alexandria, KeepNTrack (volunteer management software), and Textbook Tracker. I then evaluated the vendor itself, including the positive and negative aspects of the company. This involved comparing the data provided by COMPanion Corp. against data from professional sources such as Marshall Breeding’s Library Technology Guides and websites where users reviewed their services.

By completing this assignment, I learned the importance of evaluating not only an individual application or piece of software, but also the vendor itself. I learned how to dig past the data a vendor provides about itself, and how to present this information cohesively to upper management. It was a good learning experience, and I will be able to take it with me into my future career.

Finch, Anastasia – Vendor Evaluation_ Companion Corp

Future Application
When I am in the position of evaluating a service or a program, I will be more informed about how to best conduct that analysis. Through my coursework at SJSU and my personal experience at my public library job, I will be able to evaluate not only with statistics gathered hands-on during programs and from usage reports, but also with information provided by vendors and by independent, data-driven websites.
