December 1998 // Case Studies
A Multimedia Instructional Program in Survey Question Design
by Russell K. Schutt, Floyd J. Fowler, Jr., and Ray Melcher
Note: This article was originally published in The Technology Source (http://ts.mivu.org/) as: Russell K. Schutt, Floyd J. Fowler, Jr., and Ray Melcher, "A Multimedia Instructional Program in Survey Question Design," The Technology Source, December 1998. Available online at http://ts.mivu.org/default.asp?show=article&id=1034. The article is reprinted here with permission of the publisher.

Learning research methods requires mastering many terms and techniques and learning how to apply them in real-world settings. Although combining research experience with text-based instruction is the most suitable approach for achieving these goals, instructors often have to minimize the experiential component so they can cover more material or cope with large classes. Multimedia computer-based instruction provides a means for infusing a more experiential component into a classroom or laboratory setting.

Our CD-ROM program uses a multimedia computer-based approach to train students in the principles of survey question design and to introduce them to two tools used to increase adherence to the principles: cognitive interviews and focus group discussions (Fowler, 1995). Both of these tools help survey researchers understand how potential respondents interpret questions and how changes in wording can alter responses. Learning how to use these tools requires exposure to cognitive interviews and focus group discussions as well as practice in evaluating questions based on interview and group feedback.

What We Did and Why

The CD-ROM Training Program has three modules: (1) an introduction to guidelines for good survey questions and to the tools used for question design; (2) multiple examples of the tools in use; and (3) tests and feedback to increase understanding of the guidelines (Figure 1). All examples are based on an actual research project that involved, in part, revising an interview schedule used to assess applicants to an employment services agency.

In the introduction, we review the concepts of measurement reliability and validity, the guidelines for good survey questions, and the tools used to evaluate questions (Figure 2). Students then select a survey question from several used in a client assessment instrument. They view videotaped cognitive interviews and focus group discussions in which program clients and staff discuss the questions. They can also view pretest data collected with the survey questions and excerpts of graduate students discussing problems with the questions (Figure 3).

Students test their understanding of the question design guidelines by taking a quiz about each survey question and by rewriting each one. They can evaluate their answers by reading question critiques and by comparing their rewritten question to one that we provide. The entire process can be repeated for four different assessment questions. Supplementary information is also included for question design principles that are not illustrated with the questions in our instrument.

Discussion

The CD-ROM uses realistic and substantively interesting examples that encourage students to see both the relevance of the techniques outside the classroom and the complexities involved in applying them. It provides multiple forms of feedback and permits a student-directed instructional sequence, so that students can move on to new points when they feel prepared to do so. The graduate students in the videotape also model the process of developing expertise in question design.

We and several colleagues have used the CD-ROM with undergraduate students, mostly as a vehicle for enhancing presentations to an entire class but also as a basis for self-directed practice. The reactions we have received reinforce our impression of the training program’s value but also suggest several improvements. When the CD-ROM was used for class presentation, students enjoyed the experience and remarked that it improved their understanding of the question design guidelines. They felt that the instructional material was very clear, although some complained about poor audio quality in several excerpts.

Students who used the CD-ROM on their own gave positive reports. For example, one student said, “the program was outstanding … helped me gain a clear idea of how the process works when designing questions.” The videotaped cognitive interviews and focus groups added realistic examples that extended text-based coverage (Schutt, 1999). One student reported: “The CD gives a good definition of [cognitive interviews and focus groups] in the beginning; however real life examples … gave me a much better idea of how each … is supposed to work.” Students varied in their reactions to the opportunity to rewrite a survey question after viewing related cognitive interviews and focus group discussions. One remarked that “this is the part [where] I feel I learned the most from the CD. When you are forced to come up with your own questions, using what you have learned from the video segments, is where you actually learn the most.” However, another student suggested that the program should provide evaluative feedback on students’ revised questions. Several remarked that it should be easier to move around within the program. Some students also wanted more sample questions (the program provides four survey questions with associated cognitive interviews and/or focus group material).

We believe that our video-based approach to training in question design should be refined to include experiences with many more types of questions. We also conclude that piggybacking development of a training program like ours onto an ongoing research project is less effective than designing special “problem questions” in advance and scripting cognitive interviews and focus groups to show how the problems can be identified. Because our examples came from an ongoing project, we could not do “retakes” of our videotaped encounters or script participants’ comments in advance. We also had to add extra text to discuss problems in question wording that were not illustrated by the actual survey instrument we used. This revised approach would take more time and resources and would reduce the “real world” flavor of our illustrative material, but it would still give students the opportunity to see and hear the tools in use. We are also considering semi-structured approaches to requesting and evaluating student revisions of survey questions, so that students can receive individualized feedback.

The many tools that students of research methods must master lend themselves to the instructional approach explored in our CD-ROM Training Program in Question Design. With the refinements we have recommended, it can serve as a model for using computer multimedia capabilities to deliver more effective instruction.

Authors' Notes: Development of the CD-ROM was funded by the University of Massachusetts Instructional Technology Program. We are grateful for the assistance of Harriet Wilt, Greg Fitzgerald, Diane Elliott, students in our Graduate Program in Applied Sociology, and staff and consumers at Impact Employment Services.

Orders for the CD-ROM can be made by sending a $6 check made out to "Department of Sociology, Educational Sales & Service" to: Professor Russell Schutt, Department of Sociology, University of Massachusetts Boston, Boston, MA 02125, ATTN: CD-ROM.

References

Fowler, F. J. (1995). Improving survey questions: Design and evaluation. Thousand Oaks, CA: Sage.

Schutt, R. K. (1999). Investigating the social world: The process and practice of research (2nd ed.). Thousand Oaks, CA: Pine Forge Press.
