Computerized surveys have all but replaced traditional paper-and-pencil instruments in many organizations (Good, 1997). The economic benefits of reduced printing and postage costs, the speed of data collection, and the wide availability of easy-to-use survey programs have fueled the surge in the popularity of computer-based surveys. A wealth of research has demonstrated the comparability of computer and written examination formats across achievement and aptitude testing, but few researchers have compared the results of satisfaction measures gathered using the different techniques (Comley, 1998).
The present study compares the results of satisfaction surveys administered in written and computerized formats in a large public organization, replicating my earlier investigation in a similar governmental organization. The sample included 832 employees who agreed to participate in the survey process. Two identical survey instruments were created and delivered to each employee over two consecutive weeks. In the earlier research, the individuals completing the written survey were different from those completing the computerized survey; in our study, each respondent completed both versions of the instrument. To overcome possible completion-order effects, half of the respondents completed the computer form first, while the other half completed the written form first. It was also hoped that this counterbalanced approach and the sizable number of participants would mitigate any real changes in employee satisfaction that might occur from one week to the next.
A set of 12 satisfaction items was created specifically for this experiment. The items covered a variety of satisfaction-related factors, including compensation, supervisory and collegial relations, and the work environment.
Data from the written survey were analyzed first to define baseline item difficulties in logits. Next, data from the computer-administered surveys were anchored to the written item difficulties (items 2 and 7) to generate a comparable set of item difficulties for computerized delivery.
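The article does not specify the software or estimation procedure used for these calibrations. As a minimal sketch of the anchoring logic only, the following Python code assumes dichotomized (agree/disagree) responses and hypothetical data matrices paper_X and computer_X: it calibrates item difficulties by joint maximum likelihood and holds items 2 and 7 fixed at their paper-based values so that both sets of difficulties share a common frame of reference.

    import numpy as np

    def rasch_jmle(X, anchors=None, n_iter=50):
        """Joint maximum-likelihood estimation for the dichotomous Rasch model.
        X       : persons x items array of 0/1 responses (extreme scores screened out)
        anchors : dict {item_index: fixed_difficulty_in_logits}, or None
        Returns person measures and item difficulties in logits."""
        n, k = X.shape
        theta = np.zeros(n)                      # person measures
        delta = np.zeros(k)                      # item difficulties
        if anchors:
            for j, d in anchors.items():
                delta[j] = d
        for _ in range(n_iter):
            p = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))
            # Newton-Raphson update for persons
            theta += (X - p).sum(axis=1) / np.maximum((p * (1 - p)).sum(axis=1), 1e-6)
            p = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))
            # Newton-Raphson update for items
            delta -= (X - p).sum(axis=0) / np.maximum((p * (1 - p)).sum(axis=0), 1e-6)
            if anchors:
                for j, d in anchors.items():     # hold anchored items at paper-based values
                    delta[j] = d
            else:
                delta -= delta.mean()            # identify the scale by centering the items
        return theta, delta

    # paper_X and computer_X: hypothetical 832 x 12 arrays of dichotomized responses
    theta_p, delta_paper = rasch_jmle(paper_X)
    anchors = {1: delta_paper[1], 6: delta_paper[6]}   # items 2 and 7 (0-based indices)
    theta_c, delta_computer = rasch_jmle(computer_X, anchors=anchors)

With polytomous rating-scale data the same anchoring logic applies; only the measurement model changes.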
Results were quite striking. Five of the ten non-anchored items were found to differ significantly by delivery format. Furthermore, there was a clear trend across the instrument: overall, respondents tended to rate themselves as more satisfied when responding to the computerized version.
Table 1: Items Manifesting Significant Differences: Computer vs. Paper Delivery
3) I am satisfied with the benefits I receive
4) Teamwork is encouraged
6) My supervisor allows me to contribute in managerial decision making
10) My immediate supervisor is friendly and helpful
11) I feel I have job security
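The article does not report which statistical test was used to flag the five items in Table 1. A common way to make such a comparison is a Wald-style contrast: the difference between an item's two difficulty estimates divided by the pooled standard error. Continuing the hypothetical sketch above:

    import numpy as np

    def item_se(theta, delta):
        """Approximate standard error of each Rasch item difficulty."""
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))
        return 1.0 / np.sqrt((p * (1 - p)).sum(axis=0))

    se_paper = item_se(theta_p, delta_paper)
    se_computer = item_se(theta_c, delta_computer)

    # |z| greater than about 2 flags an item as differing by format at roughly the .05 level
    z = (delta_paper - delta_computer) / np.sqrt(se_paper**2 + se_computer**2)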
To better understand the observed differences in satisfaction, three focus groups were convened. Attendees included 30 employees who volunteered to discuss the experience in a confidential forum. The single concern raised consistently across members and groups was comfort with the level of confidentiality. Respondents felt that their answers could not be traced to them in the paper format, but were not convinced of the same security when using the computer. As one female employee stated, "they tell us they monitor our computer use - you know, to stop us from playing games on the internet - and, well, just because they say our responses are confidential, who knows. I ain't risking my job for this thing."
In our discussions, the focus group employees did not report substantive changes in satisfaction from one week to the next. While this cannot exclude the possibility that real changes occurred, any such changes should have been mitigated by the counterbalanced method employed.
While the result of this single evaluation suggests that responses to computerized surveys may be skewed, the finding may simply be unique to this particular organization or others like it. On the other hand, the discovery of a difference does emphasize that, in a world of efficient survey administration, we cannot take for granted that delivery format is unrelated to outcome. Surveys are not examinations, and those conducted in settings with established hierarchies, such as job satisfaction surveys, may carry with them elements of discomfort that are more demonstrable in a computerized format.
Gregory Ethan Stone
The University of Toledo
Comley, P. (1998). On-line research: Some options, some problems, some case studies. In A. Westlake et al. (Eds.), New Methods in Survey Research 1998: Proceedings of the ASC international conference, a satellite meeting for COMPSTAT 98.
Good, K. (1997). A study of factors affecting responses in electronic mail surveys. Doctoral dissertation, Western Michigan University. Dissertation Abstracts International, 58-10A.
Taking a Byte Out of Job Satisfaction. G. Stone. Rasch Measurement Transactions, 2005, 18:4, p. 9.