Top 10 LibQUAL+® Resources
Bruce Thompson, Texas A&M University and Baylor College of Medicine
A Google search on "LibQUAL+" yields approximately 50,000 hits, and more than 50 refereed journal articles have been published on the protocol. Given this wealth of information, it can be hard to identify the key resources for learning about LibQUAL+ and for using LibQUAL+ scores to improve library service quality. This page presents a subjectively determined "Top Ten" list of resources for people interested in learning about LibQUAL+. The resources are divided into sections covering "before," "during," and "after" the implementation.
Explains the Logistics of Doing LibQUAL+ - Read Before You Do LibQUAL+
- Sample LibQUAL+ Results Notebook
This PDF download is an example of one of the major deliverables that each institution receives after LibQUAL+ has been implemented. Participants also receive their quantitative data in .csv files, SPSS syntax files, and the open-ended qualitative comments.
- Self-Paced Flash Tutorial
This tutorial explains the "zones of tolerance" rating system, the three basic dimensions of service quality measured by LibQUAL+, and how to read some of the charts used to report results.
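The "zones of tolerance" idea can be illustrated with a small calculation. In this hypothetical sketch (the ratings below are invented, not actual LibQUAL+ data), each item receives minimum, desired, and perceived ratings on the 1-9 scale, and the perceived score is located relative to the zone bounded by the minimum and desired ratings:

```python
def zone_of_tolerance(minimum, desired, perceived):
    """Classify a perceived score relative to the zone of tolerance.

    Returns the service-adequacy gap (perceived - minimum), the
    service-superiority gap (perceived - desired), and a verbal label.
    """
    adequacy_gap = perceived - minimum
    superiority_gap = perceived - desired
    if perceived < minimum:
        label = "below minimum expectations"
    elif perceived > desired:
        label = "exceeds desired expectations"
    else:
        label = "within the zone of tolerance"
    return adequacy_gap, superiority_gap, label

# Hypothetical ratings on the 1-9 LibQUAL+ scale
adequacy, superiority, label = zone_of_tolerance(minimum=6.0, desired=8.0, perceived=6.8)
print(round(adequacy, 1), round(superiority, 1), label)
# 0.8 -1.2 within the zone of tolerance
```

A positive adequacy gap with a negative superiority gap, as here, is the typical "inside the zone" pattern; a negative adequacy gap flags an item where perceptions fall short of even minimum expectations.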
How to Know (and Explain to Others) That LibQUAL+ Scores are Trustworthy (Psychometrically Reliable and Valid)
- Colleen Cook's Ph.D. Dissertation
Cook, Carol Colleen. (2002). A mixed-methods approach to the identification and measurement of academic library service quality constructs: LibQUAL+™. (Doctoral dissertation, Texas A&M University, 2001). Dissertation Abstracts International, 62, 2295A (University Microfilms No. AAT3020024).
This doctoral dissertation provides a comprehensive overview of the development of LibQUAL+®, including a very extensive presentation of the qualitative interviews with library users at various institutions, which provided a grounding of the protocol within the mindset of library users.
- Article Summarizing the Qualitative Grounding of LibQUAL+
Cook, C., & Heath, F. (2001). Users' perceptions of library service quality: A LibQUAL+™ qualitative study. Library Trends, 49, 548-584.
This article provides a somewhat shorter presentation (versus the dissertation) of the qualitative grounding of LibQUAL+® within users' perceptions of library service quality.
- Article Illustrating Evidence that the Scores are Trustworthy
Thompson, B., Cook, C., & Thompson, R.L. (2002). Reliability and structure of LibQUAL+™ scores: Measuring perceived library service quality. portal: Libraries and the Academy, 2, 3-12.
This article illustrates the numerous quantitative analyses conducted to support a conclusion that LibQUAL+ scores are reliable and valid.
After the LibQUAL+ Results Are In: Ways to Understand and Explore LibQUAL+ Results
- Using score norms for benchmarking:
Cook, C., Heath, F. & Thompson, B. (2002). Score norms for improving library service quality: A LibQUAL+™ study. portal: Libraries and the Academy, 2, 13-26.
This article provides a tutorial on using percentile ranks or norms tables for service quality benchmarking purposes.
Thompson, B., Cook, C., & Kyrillidou, M. (2006, April). Stability of library service quality benchmarking norms across time and cohorts: A LibQUAL+™ study. Paper presented at the Asia-Pacific Conference of Library and Information Education and Practice, Singapore.
A discussion of the stability of LibQUAL+® norms benchmarking tables over time.
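How a norms table works can be sketched with a small, hypothetical example (the peer-institution means below are invented, not published LibQUAL+ norms). A library's mean score on a dimension is converted into a percentile rank against the distribution of means in the norm group:

```python
from bisect import bisect_right

def percentile_rank(score, norm_scores):
    """Percentage of scores in the norm group at or below the given score."""
    ranked = sorted(norm_scores)
    return 100.0 * bisect_right(ranked, score) / len(ranked)

# Invented peer-institution means for one dimension (not real norms)
peer_means = [5.9, 6.1, 6.3, 6.4, 6.6, 6.7, 6.9, 7.0, 7.2, 7.4]
print(percentile_rank(6.8, peer_means))  # 60.0
```

A mean of 6.8 sits at the 60th percentile of this invented norm group: the raw score only becomes interpretable once it is located within the distribution of comparable institutions, which is exactly what the published norms tables provide.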
- How to Confirm YOUR LibQUAL+® Data are Trustworthy
Thompson, B., Kyrillidou, M., & Cook, C. (2006, September). How you can evaluate the integrity of your library assessment data: Intercontinental LibQUAL+® analysis used as concrete heuristic examples. Paper presented at the Library Assessment Conference: Building Effective, Sustainable and Practical Assessment, Charlottesville, VA.
This tutorial explains and illustrates how to use SPSS to confirm that the quantitative data from your institution are reliable and valid.
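The kind of internal-consistency check the tutorial performs in SPSS can also be sketched directly. The function below implements the standard Cronbach's coefficient alpha formula; the item responses are invented purely for illustration:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item).

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def variance(xs):       # sample variance (n - 1 denominator), as SPSS uses
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(items[i][p] for i in range(k)) for p in range(n)]
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))

# Invented responses from five people to three items on the 1-9 scale
items = [
    [7, 6, 8, 5, 7],
    [6, 6, 7, 5, 8],
    [7, 5, 8, 4, 7],
]
print(round(cronbach_alpha(items), 2))  # 0.92
```

Values near or above .80 are conventionally taken as evidence of adequate score reliability; the tutorial shows how to run the equivalent analysis on your own institution's raw data file.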
- How to Create Your Own LibQUAL+ Charts using Excel
Charting LibQUAL+™ Data, by Jeff Stark.
This user-friendly tutorial uses screenshots and narrative to show how to create graphs that illustrate findings beyond those presented in the standardized LibQUAL+ report.
We also offer radar and bar chart templates that allow you to produce your own radar and thermometer charts by inserting data values from your survey results notebook or raw data file in the appropriate fields.
Radar Chart Template
Thermometer Chart Template
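The radar-chart template can also be approximated outside Excel. The sketch below (Python with matplotlib, using invented dimension means rather than actual survey data) draws one spoke per dimension and one closed polygon each for the minimum, perceived, and desired means:

```python
import math

import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Invented example means for the three LibQUAL+ dimensions
dimensions = ["Affect of Service", "Information Control", "Library as Place"]
series = {
    "Minimum":   [6.2, 6.0, 5.5],
    "Perceived": [7.1, 6.4, 6.3],
    "Desired":   [7.9, 7.8, 7.2],
}

# One spoke per dimension, evenly spaced around the circle
angles = [2 * math.pi * i / len(dimensions) for i in range(len(dimensions))]
angles.append(angles[0])  # repeat the first angle to close each polygon

fig, ax = plt.subplots(subplot_kw={"polar": True})
for label, scores in series.items():
    ax.plot(angles, scores + scores[:1], label=label)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(1, 9)  # LibQUAL+ items use a 1-9 scale
ax.legend(loc="lower right")
fig.savefig("libqual_radar.png")
```

The gap between the "Minimum" and "Desired" polygons traces the zone of tolerance around the chart, with the "Perceived" polygon ideally falling between them on every spoke.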
Disseminating Results: Code of Conduct
Institutions may share their own data within their institutions in any way they see appropriate for promoting and improving library services. Institutions should not use other libraries' data in any way that would compromise or harm the reputation of other institutions. Institutions may use other libraries' data in a confidential manner without disclosing the institutional identity of other libraries. Access to the password-protected area where the results from LibQUAL+ are posted should be controlled by the director, or the designated coordinator, of the participating library.
In a "New Measures" environment, if we are to learn from one another and improve libraries, we must refrain from comparisons that suggest that some institutions are better than others based on the LibQUAL+ protocol. LibQUAL+ allows institutions to compare user PERCEPTIONS of service delivery against expectations; a library may assert that it is doing a better job than another of meeting user expectations (based on gap scores), but it is not useful to assert that a library is BETTER than another. Libraries may compare their results with those of peer institutions to identify and emulate best practices in meeting user expectations and managing user perceptions. Perceptions and attitudes can change rapidly as a result of local circumstances; rank ordering is not useful in this context. LibQUAL+® attempts to serve as a tool for local diagnosis and for cross-institutional comparisons aimed at learning from one another.
LibQUAL+ is only one of multiple methods an institution may adopt to evaluate its services regularly and systematically and to ensure that it is meeting the needs of its users. ARL will continue to offer opportunities for libraries to share their experiences and uses of the data, so that libraries can learn from exemplars and identify best practices in meeting user expectations and managing user perceptions.