Running a Survey

This page contains more information on the steps involved in running a survey. For a detailed explanation, please read the LibQUAL Procedures Manual.


We've decided to run a survey. What are the next steps?

Pre-Registration Tasks

Before you register, consider your survey population and sample size. Some institutions choose to survey all user groups at their institutions, while others survey a sample. Make sure you have financial support and buy-in from your parent institution, as well as IRB approval, which many institutions require. IRB approval is handled entirely at the local level; LibQUAL does not need to be informed about your local policies and does not require proof of approval.

Before or shortly after registering, you'll need to identify your data source for your email invites. A good source for this may be your campus or institutional computing office, administrative records, or library patron database.

Create a User Account

If you are a new institution (or are not sure if your institution has run a survey in the past), please email to get set up with an institutional and personal account.

Register for a Survey

When you're ready, register for a survey. Registration for the current calendar year is open until mid-November (note that the annual survey cycle typically closes around December 10). Registration for the next calendar year opens each August. The following steps are involved in the registration process:

  • Select a survey (select the year in which you wish to participate)
  • Select additional reports, if any (you can always order custom reports at a later time)
  • Add a subscription (optional)
  • Select your institution and primary contact
  • Select a consortium, if any
  • Select an institution type
  • Select your language(s)
  • Verify and update your contact information
  • Payment (pay by credit card at the time of registration or by check or wire/ACH after registering)
    An automatically-generated invoice will be emailed to the primary contact immediately upon registration.
  • Review your registration and select "finish"
    *Please click "finish" only once to avoid duplicate registrations; processing may take a few moments

Customize Your Survey

After you register, you will immediately be able to configure your survey in your stage 1 dashboard. Here you can:

  • enter administrative permissions
  • customize your demographic questions, survey title, logo, and incentives
  • select your Lite view percentage
  • select or add optional questions
  • identify your position options
  • identify your discipline options
  • enter branch libraries
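The Lite view percentage determines what share of respondents receive the shorter 8-item Lite form rather than the full 22-item survey. As a rough sketch only (the actual routing is done server-side by LibQUAL; the function name and mechanism here are assumptions for illustration), the setting behaves like an independent random draw per respondent:

```python
# Illustrative sketch: a Lite view percentage of, say, 40 means each
# respondent is routed to the 8-item Lite form with 40% probability
# and to the full survey otherwise. This is NOT LibQUAL's code; it
# only shows the idea behind the setting.
import random

def assign_form(lite_percentage, rng=random.random):
    """Return 'lite' with probability lite_percentage/100, else 'long'."""
    return "lite" if rng() * 100 < lite_percentage else "long"
```

Setting the percentage to 0 gives every respondent the full survey; 100 gives everyone the Lite form.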

Previewing Your Survey

At the bottom of each of your survey customization tabs, you will see a preview survey link, highlighted in pink. Prior to opening your survey, you must fill out the preview survey (answer each question and hit "submit") in each language the survey is being offered in. Once the preview is complete, the "Open Survey" button will activate and you can move to stage 2: monitoring your survey.


When is the best time to launch the survey?

Experience has shown that Monday or Tuesday morning is the best time of the week to send out your survey announcements and follow-up messages. Libraries should look for a quiet period during the semester to launch the survey locally. We recommend at least a three-week survey run.

You may open your survey (that is, click the "Open Survey" button to generate your survey URLs) prior to the date you wish to invite respondents. This is especially useful for promotional purposes. Remember that once the survey is open, the URLs are live and will collect any data submitted.


Whom should we contact if we experience technical problems with the web survey form?

If you experience technical problems with the survey form, you should first contact your local IT personnel. If the problem cannot be resolved locally, send an email to, providing the following information about your computer system: platform (e.g., Windows XP, Mac OS X), browser type and version (e.g., Chrome, Firefox), and a brief description of the problem, including what was being done when the problem occurred and the page on which the problem was experienced.


Data Security

The LibQUAL survey is anonymous. The survey collects information on users' perceptions of library service quality. Because this is a web-based survey, respondents consent to participate by electing to fill out the survey questionnaire. Participating institutions are responsible for providing an explanation of the survey and information pertaining to its confidentiality.

For more information on data storage and security, please visit the Data Security and IRB page and the ARL Policy for Protecting Human Subjects page.


How should we handle paper surveys?

Each paper survey should be assigned a consecutive number before being given to respondents. Paper survey data should be entered EXACTLY as it appears on the page. During administration, you will see a specific place in the system for entering data from paper surveys. Data from paper surveys must be entered online before your institution’s survey closes; after the closing date, no additional data can be added. Libraries should keep a tally of the number of forms entered by hand.


Is the preview survey data made available to survey participants?

Data from the preview survey is not collected and is therefore not available to survey participants.


When will we receive a list of our local incentive prize winners?

As soon as you close your institution’s survey, the de-duplicated, randomly generated list of local prize winners will be automatically compiled. You can view the list of your local winners online in your stage 4 survey dashboard.


Who will notify the local incentive prize winners?

Each participating library will be responsible for publicizing the incentive drawings and notifying their local prize winners.


Analyzing Your Data

Complete vs. Valid Surveys

Completed: all questions were answered, except the few items that are not required. The non-required items are "the branch you use most often," the comments box, and the respondent's email address (for the incentive prize).

Valid: meets all of the following criteria:

  1. All 22 core items (8 for the Lite version) have a response, and a user group was selected.
  2. Not too many N/A responses: fewer than 12 for the long version or 5 for the Lite version.
  3. Not too many logical inconsistencies: fewer than 10 questions (fewer than 4 for the Lite version) have responses where the minimum rating is higher than the desired rating.
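The three validity rules above can be sketched as a simple check. This is an illustration only: the dict keys and the assumption that a checked n/a box counts as "a response" for rule 1 are ours, not LibQUAL's official schema.

```python
# Sketch of the validity rules for the long (22-item) protocol.
# Field names ("min", "des", "per", "na") are illustrative, not the
# official LibQUAL column labels.

def is_valid_long(responses, user_group):
    """responses: list of 22 dicts with 'min', 'des', 'per' ratings
    (1-9 or None) and 'na' (True if the n/a box was checked)."""
    # Rule 1: every core item has a response (a checked n/a box is
    # assumed to count), and a user group was selected.
    if user_group is None:
        return False
    if any(r["per"] is None and not r["na"] for r in responses):
        return False
    # Rule 2: fewer than 12 n/a responses.
    if sum(1 for r in responses if r["na"]) >= 12:
        return False
    # Rule 3: fewer than 10 items where the minimum rating exceeds
    # the desired rating.
    inconsistent = sum(
        1 for r in responses
        if r["min"] is not None and r["des"] is not None and r["min"] > r["des"]
    )
    return inconsistent < 10
```

For the Lite version, the same logic applies with 8 items and thresholds of 5 (rule 2) and 4 (rule 3).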

Data Syntax Definitions

For xxx_mn, xxx_pr, and xxx_de fields:

  • Score of 1–9: Normal range of possible responses
  • Score of -1: Indicates no response was given for this question component
  • Field left blank: Indicates question was not asked (lite protocol)

For xxx_na fields:

  • Score of -1: Question was presented but no response was given to any of its parts
  • Score of 0: One or more of the question’s components was answered; the "n/a" box was NOT checked.
  • Score of 1: The "n/a" box was checked. In this instance, all other question components are marked as if no response was given (-1 for perceived, minimum, and desired items; -99 for computed components).
  • Field left blank: Indicates question was not asked (lite protocol)

For xxx_ad, xxx_su fields:

  • Score of -8 to 8: Normal range of possible scores based on question component responses
  • Score of -99: Indicates that a score could not be computed because one or more of the required components was left blank
  • Field left blank: Indicates question was not asked (lite protocol)
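The sentinel codes above can be decoded mechanically. In this sketch the gap formulas (adequacy = perceived minus minimum, superiority = perceived minus desired, which yield the -8 to 8 range above when responses run 1-9) are stated as assumptions for illustration, and the function names are ours, not the column labels from the downloadable key.

```python
# Sketch of decoding the sentinel codes for one question's fields.
NO_RESPONSE = -1     # component shown but left blank
NOT_COMPUTED = -99   # gap score could not be computed

def decode(raw):
    """Map a raw field value to a usable 1-9 score, or None otherwise."""
    if raw is None or raw == "":          # blank: not asked (Lite protocol)
        return None
    if raw in (NO_RESPONSE, NOT_COMPUTED):
        return None                       # sentinel: no usable response
    return raw

def gap_scores(mn, de, pr):
    """Return (adequacy, superiority) from minimum, desired, perceived;
    with 1-9 inputs both fall in the -8..8 range described above."""
    if None in (mn, de, pr):
        return (None, None)               # mirrors the -99 "not computed" case
    return (pr - mn, pr - de)
```

For example, minimum 4, desired 8, perceived 6 gives an adequacy gap of 2 and a superiority gap of -2.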

The numbering system used in the data file is different from the one used in the results notebook. Along with your raw data, you can download keys to the column labels and the option identifiers from the data repository (