May/June 2002 // Assessment
Interactive Assessment and Course Transformation Using Web-Based Tools
by Celina Byers
Note: This article was originally published in The Technology Source (http://ts.mivu.org/) as: Celina Byers "Interactive Assessment and Course Transformation Using Web-Based Tools" The Technology Source, May/June 2002. Available online at http://ts.mivu.org/default.asp?show=article&id=1034. The article is reprinted here with permission of the publisher.

The search for excellence in teaching normally involves a large amount of complex work. Familiarity with and application of instructional design principles in the preparation and delivery of course material do not guarantee success; course assessment must also allow for the evaluation and, if necessary, modification of instructional design. To this end, course assessment must incorporate at least three dimensions of the educational process: the instructor's perception, the student's perception, and the student's performance. The instructor designs and delivers the course material and perceives the effectiveness of educational strategies by reading the student's reactions. This assessment can be formative, performed during the offering of the course, or summative, done at the end of the course (Worthen & Sanders, 1987). The most common method of obtaining the student's perception is summative, performed as a capstone activity at or close to the end of the course. The student's performance, or learning achievement, may be evident throughout the course in "homework, tests, and class discussions," but in many classroom activities learning "is fugitive, recordable only at great cost and inconvenience" (Keith, 1996, p. 179). However, Web-based tools can facilitate course assessment by transforming a dauntingly cumbersome task into a feasible one, thereby making possible an interactive approach to course assessment.

The examples in this article are based on a multimedia production course that I taught in the spring of 2000. One of the difficulties in teaching multimedia production is to get learners to view it as a process that involves much more than the hands-on, final phase of production. In fact, an understanding of the theoretical principles that underlie the production phase is the goal of most courses in this discipline. The purpose of this paper is to provide some food for thought about the roles Web-based tools can play in the active learning process, interactive assessment, and closing the feedback loop in course assessment.

Theoretical Framework

From the simple task of making a course syllabus available online to the complete delivery of instruction at a distance, Web-based environments are gaining popularity because they appeal to students, are flexible, and facilitate new kinds of learning (Owston, 1997). Used as support for face-to-face instruction, Web-based environments can be instrumental in enhancing student-centered approaches. The paradigm change from students as passive receptors of data to students as active learners, well explored in "Seven Principles for Good Practice in Undergraduate Education" (Chickering & Gamson, 1987), can be facilitated by a Web-based learning environment provided by a course management system like Blackboard or WebCT. In my own classroom, I use WebCT as the course platform.

Interactive assessment implies a dynamic process that is both formative and summative. It is based on examination of the three aforementioned dimensions: the instructor's perception, the student's perception, and the student's performance. Students are provided with instruments to let the instructor know what went well and what did not, both in terms of their performance and in terms of their perceptions of the teacher's performance. Constantly getting back to the students closes what Angelo and Cross (1993) call the "classroom feedback loop." And as they point out, when "this approach becomes integrated into everyday classroom activities, the communications loop connecting faculty to students—and teaching to learning—becomes more efficient and more effective" (p. 6).

There is some disagreement among educators about the efficacy, or even the desirability, of using tests and quizzes in a learner-centered context. My multimedia production students mirrored this controversy while expressing their concerns about the reading quizzes (Exhibit 1). Tests and quizzes give rise to affective arousal and anxiety, which if too intense can inhibit learning. Controlled levels of affect and anxiety, however, can enhance the inclusion of new material into both short-term and long-term memory (Hagedorn, Sagher, & Siadat, 2000). There is, then, a strong argument for making quizzes and tests part of a formative process of classroom assessment during an academic course.

Angelo (1991) defines classroom assessment as "a simple method faculty use to collect feedback, early and often, on how well students are learning what they are being taught. The purpose of Classroom Assessment [sic] is to provide faculty and students with information and insights needed to improve teaching effectiveness and learning quality" (p. 17). However, even simple methods of assessment involve data collection, analysis, and utilization of results. When these techniques are used often, they amount to extra work that faculty cannot always afford to include in their schedule. The most common practice, then, is to perform a summative assessment in a capstone exercise and use the results to improve future offerings of the course. An online course management system can change the situation by greatly facilitating the two initial phases of the process, data collection and analysis, thereby providing teachers with the necessary elements to complete the third step. The immediate and constant feedback from the students, in conjunction with consequent correction of course direction when necessary, constitutes "interactive assessment." The online course management system provides an environment where students can complete quizzes and surveys online and immediately receive the results and their interpretation.

Techniques

Students were assigned weekly readings to be completed before class meetings. At the beginning of each meeting they took an online reading quiz (Exhibit 2) through the course management system platform. They met in a smart classroom containing an Internet-connected instructor computer station with a screen projector and tables that allowed the students to work individually and in groups. This classroom was used in addition to a laboratory with computers available for the students. For the first six weeks of the course the students were taken to the lab at the beginning of class to take the reading quizzes and at the end of class to answer the class survey. All the other activities (e.g., reading discussions, flowchart, storyboard) were developed in the classroom. The authoring tool, Authorware, used for the final project production was introduced to them only in the seventh week of the course. From that point on, class time was divided between the classroom (for group discussion and theoretical work on the multimedia projects) and the laboratory (for lessons and practice with Authorware).

The online course management system was used to make available the course syllabus, the class assignment rubrics (guidelines plus evaluation criteria), and the weekly class agenda. The calendar tool was employed to inform students about on-campus events, conferences, and other resources that I thought might be of interest to them. E-mail and bulletin board tools were used for communication between the instructor and students and among the students themselves. Students could monitor their progress by accessing their grades for every graded activity. Reading quizzes, class surveys, and the final course evaluation were also made available online.

Data Sources

The question types of the reading quizzes included single-right-answer multiple choice, multiple-right-answer multiple choice, true/false, and matching. The online course management system graded the quizzes upon submission and released the score to the students. The system also provided me with simple and useful statistical analysis of each quiz (Exhibit 3). If questions arose about the reading quizzes, I could share the class results with the students, show the statistical analysis, and discuss the process to clarify doubts.
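The grading and item analysis described above happened inside WebCT, whose internals are not documented here. As a rough illustration of the kind of automation involved, the following is a minimal sketch; the question format, answer key, and scoring rules are assumptions for illustration, not the platform's actual mechanism.

```python
# Hypothetical sketch of automatic quiz grading plus simple item
# statistics, as a course management system might perform on submission.
from collections import Counter

# Answer key: multiple-right-answer questions use a frozenset and
# receive credit only for an exact match of all correct choices.
answer_key = {
    "q1": "b",                    # single-right-answer multiple choice
    "q2": frozenset({"a", "c"}),  # multiple-right-answer multiple choice
    "q3": True,                   # true/false
}

def grade(submission):
    """Return (total score, per-question correctness) for one student."""
    results = {q: submission.get(q) == key for q, key in answer_key.items()}
    return sum(results.values()), results

def item_statistics(submissions):
    """Percent of students answering each question correctly."""
    counts = Counter()
    for sub in submissions:
        _, results = grade(sub)
        counts.update(q for q, ok in results.items() if ok)
    n = len(submissions)
    return {q: 100 * counts[q] / n for q in answer_key}

submissions = [
    {"q1": "b", "q2": frozenset({"a", "c"}), "q3": True},
    {"q1": "a", "q2": frozenset({"a"}), "q3": True},
]
score, _ = grade(submissions[0])
print(score)                         # 3
print(item_statistics(submissions))  # q1: 50.0, q2: 50.0, q3: 100.0
```

The per-question percentages are the kind of result that can be projected in class to discuss a quiz item that most students missed.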

I released the weekly class agenda (Exhibit 4), containing the planned activities and objectives for the weekly meeting, five or six days before each meeting. The class survey (Exhibit 5), answered at the end of each class meeting, was based on the listed objectives and the means used to achieve them. These two tools played a vital role in the implementation of the interactive assessment approach. The meeting agenda helped the students to be better prepared for the weekly meeting. The class survey, containing objective questions and open fields for comments, gave the students a powerful means of informing the instructor about how the course was developing and how it was contributing to their learning. Protected by the anonymity that a Web-based survey tool grants, students felt empowered and participated actively. Based on the survey results (Exhibit 6, Exhibit 7), I could give feedback to the students, correct the course path when necessary, and amplify useful strategies. After discussing the general dissatisfaction with the reading quizzes, for example, the group suggested other activities—student-submitted questions, group discussions, or other creative alternatives (Exhibit 8)—to achieve the same objectives.
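Turning each week's survey into actionable feedback amounts to aggregating the objective ratings and collecting the anonymous comments. The sketch below illustrates that aggregation step; the objective labels, the 1–5 rating scale, and the response structure are assumptions for illustration, not the actual survey instrument.

```python
# Hypothetical sketch of summarizing an end-of-class survey: mean
# rating per objective plus the anonymous free-text comments.
from statistics import mean

# Each response: one 1-5 rating per listed objective, optional comment.
responses = [
    {"ratings": {"reading quiz": 2, "group discussion": 5},
     "comment": "More discussion time"},
    {"ratings": {"reading quiz": 3, "group discussion": 4},
     "comment": ""},
]

def summarize(responses):
    """Mean rating per objective and the non-empty anonymous comments."""
    objectives = responses[0]["ratings"].keys()
    means = {obj: mean(r["ratings"][obj] for r in responses)
             for obj in objectives}
    comments = [r["comment"] for r in responses if r["comment"]]
    return means, comments

means, comments = summarize(responses)
print(means)     # reading quiz: 2.5, group discussion: 4.5
print(comments)  # ['More discussion time']
```

A low mean on one objective (here, the reading quiz) is the signal that would prompt the kind of in-class discussion and course correction described above.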

The capstone course evaluation (Exhibit 9) exercise provided the students with the opportunity to self-evaluate their learning on major course elements, evaluate the efficiency of the instructor, evaluate the contribution of each course assignment, and evaluate other aspects of the course such as textbooks, the authoring tool, and available resources.

Conclusion

The results of the entire process gave me the satisfaction of being able to improve the course while it was happening and also to acquire a firm base for the design of later offerings. As a result, the course improvements had an advantage not available prior to my use of the interactive assessment approach. They were based not just on my own experience as an instructional designer, but on the collaboration of the group of which I was a part, a collaboration supported by systematically accumulated data. In addition, the next time I offer this course it will begin a step ahead; the changes that benefited this course, along with those suggested by the capstone evaluation, will be incorporated into my next version and, with the help of my next group of students, the course will be further refined. Needs change with the times, and no two groups are exactly alike, so it would seem that these refinements will always be required; yet this should add a freshness and sense of expanding my own learning each time I offer the course.

The learner-centered environment is widely accepted as the optimum educational paradigm. This paradigm implies that the students themselves are the primary learning resource, which means that the instructor, as the designer of the learning environment, must sincerely and proactively discern the students' needs and opinions about their learning, respond in a timely and effective fashion, and constantly inform the students about what actions are being taken and why. Conversely, and even more importantly, the students must complete the feedback loop by telling the instructor, accurately and immediately, to what degree they see their needs being met. Web-based tools facilitate this process by allowing students to record their opinions of the course online and producing immediate, error-free analysis data. Applying these data to course changes while the course is ongoing demonstrates to the students that their feedback has an effect, and makes manifest that their learning is a cooperative effort by themselves and their instructor. Continually applying interactive assessment provides the participants, both instructor and students, with real data upon which to base decisions about change, and provides this data as a steady flow of information so that changes can be made in real time.

[Editor's note: This paper is modified from a presentation at the 2001 WebCT Conference in Vancouver.]

References

Angelo, T. A. (1991). Ten easy pieces: Assessing higher learning in four dimensions. In T. A. Angelo (Ed.), Classroom research: Early lessons from success (pp. 17-31). San Francisco, CA: Jossey-Bass, Inc.

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers. San Francisco, CA: Jossey-Bass, Inc.

Chickering, A., & Gamson, Z. (1987, March). Seven principles for good practice in undergraduate education. AAHE Bulletin, 3-7.

Hagedorn, L. S., Sagher, Y., & Siadat, M. V. (2000). Building study skills in a college mathematics classroom. The Journal of General Education, 49(2), 132-155.

Keith, S. Z. (1996, June). Self-assessment materials for use in portfolios. Primus, 6(2), 178-192.

Owston, R. D. (1997, March). The World Wide Web: A technology to enhance teaching and learning? Educational Researcher, 26(2), 27-33.

Worthen, B. R., & Sanders, J. R. (1987). Educational evaluation: Alternative approaches and practical guidelines. White Plains, NY: Longman.
