From the Users’ Perspective: The UCSD Libraries User Survey Project by Dawn E. Talbot, Gerald R. Lowell, and Kerry Martin
The University of California, San Diego Libraries conducted its first comprehensive user survey in 1996. This article describes the user-driven survey methodology, its successes and failures, and conclusions about the survey process. The survey provided valuable benchmark data and has led to improved services for users.
Dawn E. Talbot is Information Manager, Center for Magnetic Recording Research, University of California, San Diego, 9500 Gilman Dr., La Jolla, California 92093-0401 <[email protected]>; Gerald R. Lowell is University Librarian, University of California, San Diego; Kerry Martin was a consultant at the time of this project.
User surveys can provide reliable and valid data for planning service improvements and for budget planning. Most often, however, well-intentioned staff, unskilled in survey design and process, spend extensive time and effort conducting user surveys that lack the methodology required to produce statistically valid findings. Like other institutions, the University of California, San Diego (UCSD) Libraries had from time to time conducted surveys to elicit feedback from users regarding services. These ranged from point-of-contact questionnaires to surveys of specific groups of library users, but none had been comprehensive in scope or captured the views of significant representatives from all campus library groups. Beginning in 1995, following library organizational changes and the adoption of a new mission statement, resources were allocated for an extensive library survey. After months of time-consuming planning, the Library conducted its first comprehensive, statistically valid user survey in the spring of 1996. Library staff developed questionnaires with involvement from faculty and from graduate and undergraduate students, so that the survey topics were important to these users. This article describes this user-driven survey process, including its successes and failures. Detailed results from the survey are not covered here.1 Rather, the focus of this article is on the methodology used for this comprehensive survey effort that garnered a high response rate.
OVERALL GOALS FOR THE UCSD SURVEY
The decision to conduct a comprehensive user survey followed the introduction of a broad change to the Library’s management philosophy and organizational structure. During this restructuring process, the Library management team pursued a planning process outlined by James C. Collins and Jerry I. Porras.2 Subsequently, a Vision Statement was adopted that contained the following goal: “By 1998, when surveyed, 90% of the UCSD Libraries’ primary users (faculty, staff, and students) will rate the UCSD Libraries’ collections, services, programs, staff and facilities as either ‘outstanding’ or ‘excellent’.” Obviously, a survey of some type was implied if the Library truly desired to evaluate its progress toward this goal. As the process moved forward, it became apparent that the same questionnaires and sampling plan would need to be utilized in subsequent surveys. This would allow the measurement of the degree to which change had taken place, the degree to which objectives were met, and the perceived value of new programs introduced to meet user needs. This required the survey questionnaires to be designed so that they would “age well” with time and be re-usable with only minor revisions. The Library was committed to collecting information using standard survey research techniques to ensure valid and reliable data that could be used in subsequent planning. Having relied heavily on market research surveys while working in the private sector, the University Librarian had a strong appreciation for the benefit of valid and reliable surveys. There was also a recognized need to gather data that would be accepted by groups outside the library, including campus administration, budget officers, and the Library’s users themselves. The Library also needed benchmarking data, only obtainable through a carefully administered research tool. These data would assess the current level of satisfaction with, awareness of, and expectations for, the Library’s collections, services, programs, staff, and facilities. As noted in the Vision Statement (Appendix A), the Library is a user-centered organization. The Library saw the survey as an opportunity to invoke this philosophy in practical terms: it wanted the users to help determine its future direction. Therefore, the Library wanted to hear directly from the users about their needs and concerns, rather than relying on anecdotal information from library staff. Although more costly to design, such a survey would yield the best results. This desire for user involvement greatly influenced the methodology used to develop the questionnaires. User focus groups helped determine the content of the survey.
September 1998 357
BACKGROUND TO THE UCSD SURVEY
Selling the Concept of the Survey
Many staff members in the organization did not initially support the concept of a survey. Some questioned why the Library was going to start “from scratch” when there were prior survey results that could be used. Some felt that the University Librarian’s comments about the need for a statistically valid survey devalued earlier survey efforts, given the amount of time that had been invested in these earlier projects and the pride with which staff viewed them. To obtain buy-in from the staff, the University Librarian promoted the need for a comprehensive valid survey through small group discussions, starting first with senior library administrators. This incremental buy-in strategy proved successful. There were two other important issues to address. First, the input from the survey needed to result in tangible change. Since the users were intimately involved in the process, their feedback needed to be not only taken seriously, but also acted upon. If not, the credibility of the organization would be significantly jeopardized. Second, the staff needed to understand that some of the feedback would be negative. They had to be open to constructive criticism.
Planning the Survey: The User Survey Team and the Consultant
During the restructuring of 1995, a new management philosophy was introduced. Called shared decision making (SDM) by the Library, this philosophy is based on self-managed teams, self-directed work groups, and other models emphasizing greater involvement of staff in operational decision-making activities. Consistent with SDM, the Library’s senior leadership felt strongly that a representative team would best lead its user survey efforts. A call for nominations (including self-nominations) was issued to all library staff; the Library’s senior management group then selected the team members. In July 1995, the University Librarian announced the formation of the first UCSD Libraries’ User Survey Team (UST). The team began with nine members, both librarians and support staff, and represented a cross section of the UCSD Libraries. The team was fully empowered to launch the survey process. It approved the survey methodology to be employed, drafted the survey, conducted all project administration, and facilitated the presentations of findings to library staff and the campus. This environment of full empowerment created a challenging arena for the team and dramatically affected how the group viewed its work: each team member approached participation with great seriousness, intensity, and a personal sense of pride. Patience, fortitude, and dedication were key requirements for each team member.
358 The Journal of Academic Librarianship
“The selection of a particular sample design should attempt to maximize reliability and external validity, subject to cost and feasibility constraints.”
Very early in this process the team decided to work with a consultant. The consultant had to be experienced in conducting user surveys, but not necessarily within the library community. The decision to work with a consultant from the marketing arena, rather than the library profession, was an attempt to bring a fresh perspective to the study, free of bias that library practitioners may unknowingly introduce. Duke University Library took a similar approach in its 1993 user survey.3 The UST selected the consultant after an exacting and competitive process. The project was not turned over to the consultant. Instead, the team worked closely with the consultant during each step of the process: as the focus groups were conducted, as the survey instruments were developed, as pre-testing was done, as survey instruments were refined, as the survey was released into the field, and as results were compiled.
The User Population
The UCSD Libraries primarily serve a research university community of approximately 15,000 undergraduates, 3,000 graduates, and 4,000 faculty and researchers. It is a multi-library system consisting of 11 subject- or format-specific libraries dispersed throughout the campus and at two off-site locations. Over 300 library staff members, including 50 professional librarians, serve this user population.
Human Subjects Program and Institutional Review Boards
One unforeseen hurdle that the Library encountered was the perceived jurisdiction of the UCSD Human Subjects Committee and Institutional Review Board. Although the jurisdiction of this board would normally have included user surveys, the senior administrator overseeing the Human Subjects Committee granted an exception. This exception was based on the view that “research,” as defined by federal regulations governing the use of humans as research subjects, was a “systematic investigation...designed to develop or contribute to generalizable knowledge.” The primary goal of the Library’s survey was to improve the services of the UCSD Libraries by determining customer preferences. This was deemed not to be “research” to contribute to “generalizable knowledge.”
METHODOLOGY
Survey Objectives
In designing the survey, an attempt was made to overcome some of the known obstacles to user surveys. Doris J. Schlichter and Michael Pemberton4 have identified a number of reasons why libraries, specifically, are reluctant to undertake user surveys: a lack of skilled staff to design and implement a large-scale survey, survey costs, difficulty of translating data into actionable items, and an inherent distrust of survey research methodology. Of particular concern to our study was that survey data would result in actionable items. A further objective for the UST was to ensure that library management recognized the “tacit contract” between the Library and its users.
Once a survey is commissioned, ultimately it will require action by management. In any survey, the respondents need to perceive a return for their invested time. If actions do not occur as a result of a survey, or if actions do occur but no feedback is provided, the desire of users to participate in future surveys will be seriously jeopardized. It is important to convey to users the changes made as a result of survey feedback and especially important to explain why some requested needs could not be met.
Timeline
The UST began meeting in August 1995. In the early planning process it was determined that spring quarter 1996 would be the optimal time for going into the field. This was driven primarily by the calendar affecting the undergraduate population. Spring quarter would avoid conflict with final examinations and would provide freshmen with two quarters of familiarity with the Library. Since the Library wanted to conduct the surveys of all populations concurrently, this set the timeline for all survey efforts. The consultant began working with the team in January 1996. Knowing that a spring quarter implementation was required, a detailed timeline was developed. This timeline, expressed as a Gantt chart, essentially counts back from the field release date to the present, detailing all the actions that must be completed for that field release date to be realized. This timeline would drive the survey process through the field testing period and beyond to the data entry, analysis, and final roll-out to library staff and the broader campus community.
Sample Design and Mode of Administration
A major objective for the UCSD Libraries Survey was that it include all primary clientele. Initially, the user populations were defined as faculty and researchers, key administrative staff, graduate and School of Medicine (SOM) students, and both upper- and lower-division undergraduate students. Key administrative staff were dropped from the study after their focus groups were conducted, primarily due to budget constraints.
To ensure externally valid5 results without incurring the time and cost of interviewing all members of the target population, researchers have developed protocols for drawing a sample of individuals that adequately represents the entire population and from which conclusions about the population may be drawn with a known level of reliability.6 The simplest of these protocols is simple random sampling, a type of probability sampling,7 in which each subject has an equal and known chance of inclusion, and the selection of one case does not influence the selection of another. To have a statistical basis for making statements about the population based on the sample, one must use a random sample; one cannot extrapolate to the population using non-random (i.e., non-probability) samples (e.g., convenience or accidental sampling of patrons within the library). Variations on random sampling such as cluster sampling (applied in the undergraduate sample design) and stratified sampling (applied in the graduate and faculty sample designs) add flexibility and cost savings, and reduce standard errors. The selection of a particular sample design should attempt to maximize reliability and external validity, subject to cost and feasibility constraints. Campus mail was the chosen mode of administration for both graduate students and faculty. Stratification8 by department for both the faculty and researcher group, and the graduate and SOM student group, was considered necessary because perceptions towards the UCSD Libraries were thought likely to differ among departments. Stratification would also permit the various UCSD Libraries to respond to the needs of the specific constituent group that each library served. While in principle stratification should allow one to make reliable statements about the characteristics of all individual departments, the relatively small size (and, thus, the number of completed survey questionnaires) in some departments precluded any meaningful statements about those departments. The graduate sample included 36 different departments, with departmental sample sizes ranging from 1 to 83 individuals and an average of approximately 16. The faculty sample included 69 different departments, with departmental sample sizes ranging from 1 to 93 and an average of approximately 7 individuals. For undergraduates, in-class administration was the chosen method. While compliance from the teaching faculty was expected to be difficult to obtain, in practice this was not the case.
Only one faculty member refused permission for the survey to be conducted during class time. The in-class method was inexpensive and guaranteed a high response rate. A mailed survey was ruled out for this group because of the high mobility of the undergraduate population, and after discussions with administrators of undergraduate user studies who reported that undergraduates did not respond well to either mail or e-mail questionnaires. Low response rates likely result in self-selection and non-response biases and, consequently, a decrease in the external validity of the
results. Telephone and in-person methodologies were rejected due to their high costs. As stated, the UCSD undergraduate population was approximately 15,000. Since the Library wanted to gain information about non-users as well as users, it could not utilize its in-house database of library patrons. Therefore, the enrolled class (e.g., English 101, Psychology 169) was chosen as the primary sampling unit. With this design a single student may be sampled more than once, if he or she is enrolled in more than one sampled class. This was considered acceptable since the design element of the survey was the “class” rather than the “individual student.” In order to stratify the large undergraduate sample, classes were divided into lower and upper division. The groups were further divided into five broad subject groups: Arts, Engineering, Humanities, Science and Mathematics, or Social Sciences. Crossing these two elements resulted in 10 strata. This stratification reduced the standard error9 and allowed conclusions to be drawn about sub-populations, for example, lower-division engineering students. The final 2,000-element sample was selected from the 10 strata by drawing classes randomly with probabilities of selection proportional to class enrollment (see Appendix B). The collected sample deviated in size from the sample design in some cases. This was mainly due to the fact that classes in the available clusters were of widely varying sizes. In addition, some students were absent from class at the time of the study. The customary manner of dealing with such deviations is weighting10 the data so that they better approximate the intent of the sampling plan. The population weights were designed to adjust the stratum proportions in the sample to the stratum proportions in the undergraduate population. Further analysis revealed that the undergraduate sampling weights had an insignificant impact on the results; hence, the ultimate survey findings were based on the unweighted data.
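The class-based design described above can be sketched in code. This is an illustrative reconstruction, not the Library's actual procedure: the strata, class names, enrollments, population shares, and helper names are all invented. The sketch draws classes with probability proportional to enrollment until a stratum's target student count is reached, then computes post-hoc population weights of the kind described above.

```python
import random

# Hypothetical sampling frame: (class name, enrollment) lists, grouped into
# strata by division and broad subject area. All values are invented.
frame = {
    ("lower", "Engineering"): [("ENG 10", 120), ("ENG 15", 60), ("ENG 20", 25)],
    ("upper", "Engineering"): [("ENG 101", 80), ("ENG 120", 40), ("ENG 151", 30)],
}

def draw_classes(classes, target, rng):
    """Draw classes without replacement, with selection probability
    proportional to enrollment, until the cumulative enrollment reaches
    the stratum's target sample size (or the frame is exhausted)."""
    pool, chosen, total = list(classes), [], 0
    while pool and total < target:
        i = rng.choices(range(len(pool)), weights=[n for _, n in pool])[0]
        name, n = pool.pop(i)
        chosen.append(name)
        total += n
    return chosen, total

rng = random.Random(1996)
for stratum, classes in frame.items():
    sampled, students = draw_classes(classes, target=100, rng=rng)
    print(stratum, sampled, students)

# Post-hoc population weights adjust each stratum's share of the collected
# sample to its share of the population; the shares here are invented.
pop_share = {("lower", "Engineering"): 0.12, ("upper", "Engineering"): 0.08}
sample_share = {("lower", "Engineering"): 0.15, ("upper", "Engineering"): 0.07}
weights = {s: pop_share[s] / sample_share[s] for s in pop_share}
```

A weight below 1 down-weights a stratum that was over-represented in the collected sample; a weight above 1 does the reverse.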
Development of the Survey Questionnaires
In order to achieve a user-centric focus, the UST drew on methodology used in market research. In an effort to avoid library staff biases in the survey questionnaires, focus groups were used to shape the issues on the final survey questionnaires. A focus group consists of 8-10 randomly selected persons with
homogeneous backgrounds. Participants usually come together for two hours and are led through a series of relevant survey topics by a trained moderator. Focus groups have been used in library research to garner users’ opinions on services, but in this scenario the focus groups helped identify the issues that were important to the UCSD Libraries’ populations, both users and non-users. Vicki Young has prepared a concise outline for conducting library focus groups.11 The Library conducted focus groups for each of the four natural populations identified as primary clientele. Tasks associated with the focus group design, recruitment, and execution were shared between the UST and the consultant. This allowed for a more efficient deployment of appropriate expertise. The UST was responsible for recruiting participants for the focus groups using a variety of methods suited to the population being recruited. Nominal monetary incentives were used to help recruit participants, and these differed according to the population group being studied. The focus groups served not only to shape the issues that would be included in the survey, but also to provide guidance on how the survey could be administered most effectively. Suggestions were also elicited for appropriate incentives to be used with the mailed surveys to achieve high response rates. The incentive used was a raffle ticket to win a $250 gift certificate to the University Bookstore. Short written surveys were also conducted within the focus groups, and the responses were discussed during the session to better understand how questions were being interpreted. An important distinction between data collected from focus groups and data collected by a survey concerns reliability and external validity. Focus group responses should be seen as a snapshot or an impression and not as a reliable sample.
One cannot extrapolate to the entire population from such data since these respondents represent but a small fraction of the population as a whole.12 The development of the actual survey questionnaires (one for each population under study) was iterative. A variety of issues, such as the number of questions, the length of the questionnaire, the clarity of the instructions, the terminology used, and the usefulness of the data each question would generate, were constantly reviewed to ensure that the survey objectives were being met. As a result of this tuning, many draft versions of the questionnaire were written before the final versions were ready for printing. In order to ensure a clear, well-designed survey instrument, it is important to pretest the developing questionnaires. One method used at UCSD was the one-on-one pretest or “think aloud” method. In this pretesting approach, randomly selected individuals from each of the population groups met with the consultant and worked through the draft questionnaire, talking through their responses aloud. This process revealed problems that respondents encountered while completing the survey. Particularly important here was the ability to discern issues that respondents had with the tone of a particular question or the manner in which a question was phrased.
“In an effort to avoid library staff biases in the survey questionnaires, focus groups were used to shape the issues on the final survey questionnaires.”
For the undergraduate survey an important “pretest” was to evaluate the in-class mode of administration. Permission was obtained to pretest the questionnaire during the last week of winter quarter in a class that would not be in the final sampling plan. The completed surveys were collected, the data entered, and the results analyzed to determine that the questions were measuring the issues of interest and that the respondents were interpreting the questions as intended.
Final Survey Questionnaires
The three questionnaires used by the UCSD Libraries to survey their users resulted from discussions with focus group participants, with the consultant and the members of the UST, and with Library management. While some may view the lack of involvement by the broader Library staff as questionable, the UST was a representative group of staff and the Library’s SDM principles were followed. The faculty/researcher and graduate/SOM questionnaires13 were very similar and thus allowed for cross-correlation of results. Within the faculty questionnaire was a unique section that evaluated how well the Library met faculty teaching needs with respect to their undergraduate students. Within the graduate/SOM questionnaire was a section evaluating the study areas, in the library each respondent used most often, with respect to both comfort and availability. In order to evaluate the Libraries’ mission statement, an overall satisfaction question was included. Design of this question proved difficult. The Libraries’ mission statement used the terms “excellent” or “outstanding,” but using such terms on a satisfaction scale was problematic. Ratings such as excellent or outstanding are too loaded for respondents to be comfortable using them. After much discussion, the scale selected for the overall rating question was a 1-5 point scale, with 1 being “very satisfied,” 2 “satisfied,” 3 “somewhat satisfied,” 4 “not too satisfied,” and 5 “not at all satisfied.” The overall satisfaction question was augmented by questions that dealt specifically with user satisfaction or dissatisfaction with facilities, staff, collections, and services. A second group of questions dealt with usage of the UCSD Libraries using a frequency-of-use scale. Within this category it was determined how often users went to the Libraries in person or used them remotely, which libraries were used most often for research or for studying, and which collections and services were used most frequently. Other areas studied were library instruction and service priorities. Various types of instructional programs were listed from which the respondents could indicate their interest. In order to develop future directions for the UCSD Libraries, respondents were asked to select their top three choices from a list of services and resources that the Library presently offers. They were then asked to select their top three choices for future programs, again from a list. Items on both lists were derived from focus group discussions as well as from UST input. Respondents could also write in items that were not on the list. The survey also collected demographic data.
For example, questions covered areas such as respondent’s field of study, which primary library was used, frequency of library use, respondent’s length of time at UCSD, information about respondent’s computer literacy, and, in the case of faculty, academic position. These data enabled action plans to be tailored to specific user populations. An open-ended question was also included in case the survey questions had not captured all possible
areas upon which respondents wanted to comment. The undergraduate questionnaire14 was far simpler in design, for two reasons. First, by necessity, the questionnaire had to be short because it had to be completed within 10 minutes of class time. Second, it was determined in focus groups that undergraduates, particularly lower-division students, have limited experience with using the Libraries; however, they were eager to appear diligent and responsive and would often answer questions when they had little or no experience with the service or resource in question. In an effort to collect data that could be compared with the other two surveys, the undergraduate survey covered in less detail the same five areas: evaluation of services, facilities, staff, and collections; usage; library instruction; existing resources; and future resources. One area of major interest in the undergraduate survey was the evaluation of the UCSD Libraries with respect to the study facilities and the staff. We were particularly interested in undergraduates’ use of the libraries as a place to study, as distinct from doing research. Another area of importance to the Library was the interest expressed by undergraduate students in library instruction. The questions on these topics followed the same format as used in the other two surveys previously described.
Data Collection and Response Rates
As mentioned above, in-class administration was used for the undergraduate survey. Seeking permission from faculty was critical to the success of this mode of administration. Once the list of classes to be included in the survey had been selected, the UST assigned one of its members to manage the faculty permissions. In only one instance was permission denied. The basic “in and out in 10 minutes” process had been pretested and refined into a set of procedures that the UST implemented in 17 classes (Appendix B). Team members distributed the questionnaires while one team member provided verbal instructions from a written script to ensure uniformity of the process. Most questionnaires were completed within the allotted time. Only a carefully orchestrated action plan, with many staff members on hand to distribute and collect surveys, especially in large classes, could have made the “in and out in 10 minutes” methodology successful.
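The response-rate and confidence-interval arithmetic reported for each population in this study can be sketched as follows, using the undergraduate figures given in this article (2,861 completed surveys out of 3,689 eligible respondents). The helper function is ours, and the margin of error uses the conservative p = 0.5 normal approximation for a proportion.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of an approximate 95% confidence interval for a
    proportion estimated from n completed surveys, using the
    conservative assumption p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

completed, eligible = 2861, 3689   # undergraduate figures from this study
rate = completed / eligible
print(f"response rate: {rate:.1%}")                             # about 77.6%
print(f"margin of error: +/-{margin_of_error(completed):.1%}")  # about 1.8%
```

The same arithmetic, applied to the graduate and faculty counts, yields the wider intervals reported for those smaller samples.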
The response rate for the undergraduate survey was 77.5%, calculated as 2,861 completed surveys divided by the 3,689 eligible respondents. This should be considered a lower bound on the actual response rate since the enrollment figures used in this calculation (from week one of the quarter) are probably an overestimate of the actual enrollment at the time of the survey administration (weeks 2-4). The number of completed undergraduate surveys yielded a 95% confidence interval of ±1.8%. Ramifications of class enrollees who were not surveyed, for reasons of non-attendance, late arrival in class, or because their class was excluded from the sample, are addressed in Appendix C. The graduate and faculty questionnaires were administered through the UCSD campus mail network. All surveys, with a cover letter from the University Librarian, were mailed during the first week of the 1996 Spring Quarter. The field period for these mailed surveys was 11 weeks. During this time and the three months previous, a moratorium was imposed in the Libraries on any other type of user survey. This was done to focus attention on the major user survey. During the field period a brief question-and-answer sheet was distributed to all public service points to help staff answer any questions they might receive from respondents. To maximize response rates and reduce the later non-response conversion efforts, a raffle ticket was enclosed with the survey questionnaire. The ticket was to be returned with the completed survey midway through the 11-week field period
in order to qualify for a gift certificate for the UCSD Bookstore. Much effort was put into non-response conversion. Exactly one week after the initial mailing, everyone was sent a postcard thanking those who had responded and reminding the others to complete and return the survey if they had not already done so. Two weeks later a second letter, including a replacement questionnaire, was mailed. The third and final non-response conversion effort entailed phoning non-respondents and was implemented in two stages during the later part of the field period. In stage one, members of departments with response rates below the overall response rate were targeted; in stage two, members of the remaining departments were contacted. The graduate student response rate was 71.06%, yielding a 95% confidence interval of ±4.8%. The response rate for faculty was 71.73%, yielding a 95% confidence interval of ±5.1%.
KEY FINDINGS
Although it is not our intention to discuss the data collected in detail, since they relate to a specific institution, it might be insightful to consider a few representative examples of both information collected and action plans. Understanding and acting on users’ comments and concerns were paramount since the UCSD Library seeks to be a user-centered organization.
Which Libraries Do They Use?
Respondents were asked which libraries they used as well as which library they used most often, i.e., their primary research library.
Table 1
Faculty and Graduate Student Usage of the UCSD Libraries

                                             Percentage who         Percentage who named this
                                             used this library      their Primary Library
Library                                      Faculty   Graduate     Faculty   Graduate
Art & Architecture                             7.4       11.0         0.6       1.9
Biomedical                                    68.8       50.3        48.7      35.7
Mag. Recording Research                        1.7        3.9         0.3       0.3
Special Collections                            3.1        4.9         0.2       0
Medical Center                                28.3       10.5         9.9       0.3
Music                                          3.4        7.6         1.5       1.7
Science & Engineering                         35.9       47.3        16.2      24.4
Scripps Inst. Oceanography                    24.6       22.3        10.1       8.4
Social Sci. & Humanities                      20.9       41.2        12.3      19.5
International Relations & Pacific Studies      4.3       13.9         0         3.2
Table 2
Undergraduate Usage of the UCSD Libraries

                                             Percentage who       Percentage who named this
Library                                      used this library    their Primary Library
Geisel                                            91.6                  70.4
Biomedical                                         2.7                   3.2
Medical Center                                     0                     1.1
Science & Engineering                             32.9                  14.8
Scripps Inst. Oceanography                         3.4                   0.2
Undergraduate                                     49.5                   8.5
International Relations & Pacific Studies          9.5                   1.6
For the undergraduate survey, the Libraries housed within the Geisel building were collapsed into one category since undergraduates were unable to distinguish one collection or library from another. The physical building-Geisel Library-was their reference point and included Art & Architecture, Music, Special Collections, and Social Sciences & Humanities libraries.
How Often Do They Use the UCSD Libraries?
Faculty and researchers tend to use the Library one or more times per week, and more often remotely than in person. Graduate students also use the Library on average one or more times per week, but in person rather than remotely. Undergraduate students use the Library most often as a place to study rather than to do research or to use the services. Over 47.5% of the undergraduates said the UCSD Libraries were very important to them as a place to study, and 75.2% use the Library one or more times per week for this purpose. Only 12.1% use the Library for research one or more times per week, with 48.5% using the Library once a quarter for research purposes.
Overall, How Satisfied Are They?
All the user groups rated their satisfaction with the UCSD Libraries highly: 85.5% of the faculty and 87.1% of the graduate students were satisfied or very satisfied, whereas undergraduate student satisfaction was lower, with 70.1% of respondents expressing satisfaction. Undergraduates showed the most dissatisfaction, with 3.7% noting they were not satisfied. The areas that drew the most criticism were either peripheral services or facilities. The single area that elicited the strongest response was the easiest to
362
The Journal of Academic Librarianship
improve: the photocopy service/vendor. All population groups agreed that this was the area that needed the most improvement, and the Library was quick to respond and improve this service. The value of differentiating the various population preferences was evident in examining the issue of operating hours. The undergraduate group was the most vocal with regard to increased hours because they use the Library as a study hall, whereas the faculty, who use the Library most often remotely, did not consider longer hours to be of primary importance to them. In addition, the in-house users, especially undergraduate and graduate students, found the physical facilities to be an issue, especially the poor lighting. The survey data showed very few real surprises; indeed, one would expect that a well-managed institution would generate few large issues in such an undertaking. However, this type of effort is aimed at uncovering either real problem areas or perceived problem areas that can be improved simply through better communication. It did so in at least two important areas. First, the team was surprised at the almost universal disinterest in any type of library instruction program-an area of perceived value by library staff, and, secondly, the strong confirmation that was expressed by each and every polled population, including the undergraduates, in the value of the print collections.
COMMUNICATING RESULTS

As mentioned earlier, an important goal for the UCSD survey was communication with the user population. Having solicited user input, library staff had a responsibility to provide feedback to the campus community. Campus and Library publications were used prior to the field period to heighten awareness of the project in order to garner a high response rate.

Once the data were analyzed, it became important to share the results, first with library staff and then with the broader campus community. For the initial rollout of results, the consultant made a presentation to the Library's management group. With the wealth of data gleaned as a result of this project, it was a challenge to present results in sufficient detail to be meaningful without losing the audience's attention. Where possible, results were presented graphically. Following this presentation, the UST presented sessions to library staff. The Library's Data Services group mounted the datasets on a Web site so that library staff could further manipulate the data on an as-needed basis. As use statistics from the Web site show, this has been an effective answer to the problem of how to make results of the survey easily accessible to a wide audience.

The UST used additional methods to communicate survey results to the broader UCSD community. It published key findings in campus publications, as well as through the Library's own publications. Several team members developed a portable display to rotate among the UCSD Libraries. A key component of presentations to the campus community was a focus on action items that the Library had initiated as a result of the survey findings. As the Library continues to find ways to meet user needs, the Web page will reflect these efforts.

CONCLUSION

A well-designed and well-administered user survey was an important part of the UCSD Libraries' vision of becoming a more user-centric organization. For the first time in the Library's history, there is a wealth of valid data about all elements of the UCSD Libraries, including collections, services, user preferences, and user demographics. Improvements are already visible in areas such as photocopy services, library hours, and physical facilities (e.g., lighting).
A strong focus on the value of the print collections is also obvious. In addition to user-driven changes, there is also a visible increase in overall staff awareness of the importance of the user to library operations and programs. Questions such as "How will this affect the user?" and statements such as "Every change we make should benefit our users" began to be heard more often as a result of the awareness brought by this survey effort. This benefit, gained from the process itself, was not predicted at the beginning of the project. Another unanticipated benefit was that members of the UST became experts at survey administration, and the team leader has taught classes in survey design and implementation as a result of the experience gained. There is also a better understanding among library staff of the trade-off between meeting user needs and balancing scarce resources. In this transitional period to more electronic content, there is a clearer picture of how the electronic arena is affecting in-person use of the UCSD Libraries.

The survey produced a wealth of findings, far more than could be dealt with in the short term. Each individual library is responsible for analyzing and evaluating feedback from the survey and recommending changes where appropriate. The Public Services Advisory Committee is also evaluating survey feedback for areas where cross-departmental changes may be required.

Originally, the Library thought that it would survey its user communities annually. After completing this first comprehensive survey, the Library has revised the timetable and will most likely survey users again after three years. An annual time frame was deemed too short, since there would be insufficient time to effect real, measurable change. Since this is a new and growing process, the Library will be in a better position to determine the interval between surveys after the second survey.

What will the Library do differently the next time it administers the user survey? Although there was satisfaction with the general methodology, the questionnaire design, and the types of data collected, the area of logistics needs significant improvement. For example, extending the time line from 9 to 12 months to include peripheral but necessary tasks, such as publication deadlines, would be useful.
More effort needs to be put into working with appropriate campus departments to produce better sampling frames. There should also be a better division of labor for time-consuming tasks, such as obtaining faculty compliance for the in-class administration.

Could the Library have gathered this information through less expensive mechanisms, given that the out-of-pocket costs of survey administration exceeded $44,000? A survey of this scope was the only means of obtaining a comprehensive, campus-wide view of how well the UCSD Libraries as a system were meeting user needs. The Library will certainly continue to rely on ongoing communication with its users to learn where the Library needs to make improvements, but through a comprehensive survey process the Library now has a much better understanding of needs, and the intensity of those needs, from a scientifically valid perspective.

So what about the mission statement goal of 90% of users ranking the Library as "outstanding" or "excellent"? As noted earlier, 85.5% of the faculty provided an overall ranking of "satisfied" or better, and 2.8% were dissatisfied. Graduate and School of Medicine students provided an overall ranking of 87.1% "satisfied" or better, with 1.7% dissatisfied. Undergraduates were not as satisfied: 70.1% provided rankings of "satisfied" or better, with 3.7% dissatisfied. So, although there is much to be satisfied with, the UCSD Libraries can continue to work towards its mission as a user-centric organization.
APPENDIX A
UCSD LIBRARIES VISION STATEMENT

During the UCSD Libraries' strategic planning process, a four-part Vision Statement was developed. It contained the organization's "Values and Beliefs," "Mission Statement," "Purpose," and "Vivid Description." The measurable and quantifiable mission statement would more typically be termed an institutional goal. The complete UCSD Libraries Vision Statement can be found at: http://orpheus.ucsd.edu/libnet/mission.html.

Values and Beliefs

We are a user-centered organization; services and patron satisfaction are key.

Our library is not confined; our users seek information from places located throughout the world.

We value access to information without censorship as fundamental to higher education and research.

We value convenient access to information. Our systems, policies, and programs should allow users to find the information they need without staff assistance, and should allow staff to provide assistance for those who need it.

We value an appropriately selected, managed, and preserved collection.

We respect all individuals; their diverse backgrounds, skills, and needs; and their right to privacy.

We value the expertise of our library staff. Each of us, as library staff, plays an important role and contributes to the success of the organization.

We value training and staff development.

Open and honest communication within our organization raises morale and promotes efficient teamwork.

Up-to-date technology provides one of the means to reach our goals.

Progress comes with discovery and with experimentation.

Mission

By 1998, when surveyed, 90% of the UCSD Libraries' primary users (faculty, staff, and students) will rate the UCSD Libraries collections, services, programs, staff, and facilities as either "outstanding" or "excellent."

APPENDIX B
THE UNDERGRADUATE SAMPLE DESIGN

The list of eligible classes used for the undergraduate sample was the 1996 Spring quarter enrollment file from the Registrar's Office. The classes were divided into lower-division undergraduate classes with course numbers 1-99 and upper-level undergraduate classes with course numbers 100-199. The undergraduate classes were also assigned to one of five groups: Arts, Engineering, Humanities, Science and Mathematics, and Social Sciences. For each stratum, classes were drawn randomly with probabilities of selection proportional to class enrollment until the class list for that stratum was exhausted. Each consecutive class was incorporated into the sample in the sequence of drawing until the threshold for drawing sufficient
sample in the respective stratum was satisfied. For each stratum, that threshold was based on the number of class enrollees representing that stratum's proportion of class enrollees out of the entire population of class enrollees (e.g., the proportion of the population in the lower-division Science and Math stratum was approximately 1/5; hence, this stratum contained 1/5 of the total sample).

APPENDIX C
UNDERGRADUATE DATA COLLECTION AND RESPONSE RATES

The sample included class enrollees from 22 classes with a total enrollment of approximately 4,910 in three distinct groups of individuals: (1) the 2,861 respondents; (2) class enrollees in sample classes to which access was not granted by the class instructors or which were excluded for administrative convenience; and (3) class enrollees in sample classes who were not present during the period in which the surveys were administered, who arrived late, or who, though in attendance, did not complete the survey.

Individuals in the second group are unlikely to be different from the respondents unless the failure to collect those classes related in some way to the likely responses of the enrollees in that class. The sampling plan assumed that the five classes in group two were outside the sample design. Individuals in the third group were unlikely to have known that the survey was to be administered during their class; thus, the probability of intentional self-selection seems minimal. However, their non-attendance could be related in less direct ways to the answers they would have given; that is, non-attendees may have different attitudes toward the library. The administration of the survey early in the quarter was intended to reduce losses through such non-attendance. Non-respondents who did not take, or did not complete, the survey because they were late for class are subject to this same analysis. More problematic are persons who did not complete the survey despite
being in attendance, since the reasons could be related in some way to the issues being studied in the survey.

Acknowledgment: The efforts of the User Survey Team, a self-directed, library-wide team established as part of new library management initiatives, are hereby acknowledged. The Team was composed of six library staff members who contributed equally to the success of the UCSD User Survey project: Renata Coates, Martina Cotton, Tami Echavarria, Diane Eells, Beverly Greene, and Lydia Ybarra, in addition to Dawn E. Talbot. The authors are grateful to Robert Molyneaux, Assistant Professor, College of Library and Information Science, University of South Carolina, for his review of the manuscript.

NOTES AND REFERENCES

1. The survey, due to its comprehensive nature, has resulted in a copious amount of information. Detailed findings are available at the UCSD Libraries survey Web site: http://orpheus.ucsd.edu/survey/.
2. James C. Collins & Jerry I. Porras, "Organizational Vision and Visionary Organizations," California Management Review 34 (1991): 30-52.
3. Kenneth W. Berger & Richard W. Hines, "What Does the User Really Want? The Library User Survey Project at Duke University," Journal of Academic Librarianship 20 (1994): 306-308.
4. Doris J. Schlicter & Michael Pemberton, "The Emperor's New Clothes? Problems of the User Survey as a Planning Tool in Academic Libraries," College & Research Libraries 53 (1992): 257-265.
5. External validity refers to how much the sample estimate differs from the true value of the population parameter being estimated, with repeated iterations of the sampling plan. Validity can be evaluated by looking at the bias of the estimate. Bias is defined as the difference between the population parameter and the sample estimate of that parameter. The only way to avoid bias entirely is to survey the entire population.
6. Reliability refers to how reproducible the sample estimate is over repetitions of the sampling process. The reliability of a sample estimate can be stated in terms of its sampling variance or standard error. Whenever a random sample is drawn, sampling variability is introduced; the only way to avoid it is to survey the entire population.
7. Probability samples, otherwise known as random samples, are those in which every element has a known, nonzero chance of selection and the elements are selected through a random procedure. While elements do not need to have an equal chance of selection, every element must have some chance of being selected, and that chance must be known. By fulfilling these two conditions and using the correct statistical formulas, values for the entire population can be estimated together with the margin of error for that estimation.
8. Stratified sampling is a sampling method in which a sample is drawn from a population that has been divided into groups, or strata, of individuals who are intended to be relatively homogeneous on some characteristics related to the study variables to be measured or estimated.
9. Standard error is the most commonly reported measure of a survey's precision. It is the standard deviation divided by the square root of the sample size.
10. Weighting is a procedure to correct for unequal probabilities of selection of sample members, unequal survey completion rates by demographic subgroup, or other factors related to producing unbiased estimates. See Ronald Czaja & Johnny Blair, Designing Surveys: A Guide to Decisions and Procedures (Thousand Oaks, CA: Pine Forge Press, 1996).
11. Vicki Young, "Focus on Focus Groups," College & Research Libraries News 54 (1993): 391-394.
12. Richard Widdows, Tia A. Hensler, & Marlaya H. Wyncott, "The Focus Group Interview: A Method for Assessing Users' Evaluation of Library Service," College & Research Libraries 52 (1991): 352-359.
13. The faculty/researcher and graduate/SOM questionnaires used in this study are available on the UCSD Libraries Survey Web site: faculty/researcher questionnaire at http://ssdc.ucsd.edu/lib_surv/codebooks/faculty.html; graduate/SOM questionnaire at http://ssdc.ucsd.edu/lib_surv/codebooks/graduate.html.
14. The undergraduate questionnaire used in this study is available on the UCSD Libraries Web site: http://ssdc.ucsd.edu/lib_surv/codebooks/ugrad.html.
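The stratified, enrollment-proportional class selection described in Appendix B can be sketched in code. The following Python sketch is purely illustrative: the class names, enrollments, and overall sample target are invented, and the draw-and-threshold bookkeeping is an assumption based on the appendix's description, not the team's actual procedure.

```python
import random

# Hypothetical enrollment data: (class_id, enrollment) per stratum.
# All names and numbers are invented for illustration.
strata = {
    "lower_sci_math": [("MATH20A", 300), ("CHEM6A", 250), ("PHYS2A", 200)],
    "upper_soc_sci":  [("SOC100", 120), ("ECON110", 90), ("POLI104", 60)],
}

TOTAL_SAMPLE = 400  # overall target number of enrollees to survey


def pps_draw(classes, target, rng):
    """Draw classes without replacement, with selection probability
    proportional to enrollment, until the cumulative enrollment of
    the drawn classes reaches the stratum's target."""
    pool = list(classes)
    drawn, covered = [], 0
    while pool and covered < target:
        weights = [size for _, size in pool]
        pick = rng.choices(range(len(pool)), weights=weights, k=1)[0]
        cls = pool.pop(pick)
        drawn.append(cls)
        covered += cls[1]
    return drawn


rng = random.Random(1996)  # fixed seed so the draw is reproducible
total_enrollment = sum(size for classes in strata.values()
                       for _, size in classes)

sample = {}
for name, classes in strata.items():
    stratum_enrollment = sum(size for _, size in classes)
    # Each stratum's share of the sample mirrors its share of enrollment,
    # as in the 1/5 Science and Math example from Appendix B.
    target = TOTAL_SAMPLE * stratum_enrollment / total_enrollment
    sample[name] = pps_draw(classes, target, rng)

for name, classes in sample.items():
    print(name, [c for c, _ in classes])
```

Larger classes are more likely to be drawn early, and each stratum stops accumulating classes once the drawn enrollment covers that stratum's proportional share of the total sample.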