Spotting an Elephant in the Dark

"Spotting an Elephant in the Dark," The Technology Source, June 1997. Available online at http://ts.mivu.org/default.asp?show=article&id=1034. The article is reprinted here with permission of the publisher.
The Flashlight Project is developing a constellation of survey items, interview questions, cost analysis methods, and other resources that educational institutions can use to study and steer their own uses of technology.
As a tool, Flashlight has several features that may be of wider interest to the field of assessment and program evaluation:
- focusing on the choices about learning and teaching made by students and educators (how they use technology), in order to illuminate the relationship between investment in technological infrastructure and improvement (or lack of improvement) in outcomes;
- focusing on the practices that tend to produce good learning, rather than on the (almost impossible) task of directly measuring changes in learning while the learning objectives are also changing;
- surveying and summarizing changes in teaching and learning practice across the large number of courses typically needed to create substantial improvements in programmatic outcomes;
- investigating negative hypotheses about technology as well as positive ones; and
- developing an evaluation tool kit that is easy enough to understand and use that the necessary numbers of faculty and staff can be involved in designing studies, gathering data, and using results.
Illuminating the Elephant
Educational institutions of all types, and their students, are investing enormous effort, money, and risk capital in computing, video, and telecommunications. They hope that these investments will change educational strategies and thereby improve educational outcomes. For example, institutions may invest in Internet connectivity partly to support more collaborative learning and greater use of off-campus information resources; this in turn may help achieve better retention, economies of scale, and graduates who are better able to apply what they've learned.
Each institution would usually like to know whether its investment is working. And, if not, the staff would like to know what the barriers to success might be.
The technology per se is relatively easy to "assess" -- it is usually obvious whether the e-mail system is operating, and it is sometimes feasible to measure its volume. But two years later, is there indeed more collaborative learning? Are graduates now working more competently in teams? If so, has the e-mail played any role in that success?
The act of program evaluation in education is like using a small, dim flashlight to decide what sort of animal might be in front of you in a pitch-black cave. (We'll assume for metaphorical purposes that you can't hear or smell!) The relative brightness (rigor) of the flashlight (evaluation) matters less than where one points the beam (asking the right evaluative question). Each evaluative question is the equivalent of pointing the tiny beam in a particular direction and waiting to see what walks into the light. It may seem a hopeless task -- a pitch-black cave, a narrow and wavering beam of light, and in that beam occasional flickering impressions of light and dark. What 'rough beast' is really out there?
Fortunately, in this case, the task is more feasible than it seems. Imagine that your curiosity is quite focused. You are vitally interested in knowing whether there is an elephant in front of you, perhaps because you are hoping to see an elephant and have reason to think one might be around. Whether a mouse is (also) around is of little concern to you -- just elephants. Because you have that specific question in mind, you would probably shine your flashlight high and look for signs of tusks or floppy ears in the narrow beam of light. Or you might have some other idea for how to use your light to identify an elephant.
As it happens, many technology-using educators are looking for "elephants" these days: elephant-sized technological revolution in their instructional programs. And the cave is indeed huge and dark: ordinarily we don't see major changes in who can learn, what graduates can do, or what education costs unless change is broad, deep and diffused into the fabric of the program. It is even harder to see whether there are widely diffused changes in the fabric of teaching and learning practice in an institution. The patterns are hidden in the relatively private activities of large numbers of students and staff.
The investigation might well be impossible but for one thing: different types of institutions and disciplines seem to be adopting similar technologies and using them in comparable ways for similar purposes. They also have similar anxieties about what might be going wrong out there in the dark. Schools, two-year colleges, research universities, and large-scale corporate training programs; geographers, psychologists, and chemists -- their wishes and worries about technology are strikingly similar. This consensus set of hopes and fears is what we have been referring to as the elephant. And the fact that so many educators are wondering whether this elephant is in their cave has given birth to the Flashlight Project.
This set of consensus wishes about technology, to be described in more detail below, includes support for good practices such as collaborative learning, faculty-student interaction, active learning (e.g., through work on realistic, complex projects) and increased student time on task as well as outcomes such as more extensive and equitable access to an education, graduates who can apply what they have learned, and costs that are under control.
The consensus worries include issues such as inadequate support, ineffectual pedagogy, and technology that might be hindering learning.
Step One: Describe the Elephant
The Flashlight Planning Project, 1994-95
The goal of the 1994-95 Flashlight planning project was to discover whether five very different postsecondary institutions had similar wishes and worries about technology. The work was supported by the Fund for the Improvement of Postsecondary Education (FIPSE). The leadership team included the author, Sally Johnstone and Robin Zúñiga of the Western Cooperative for Educational Telecommunications, and Trudy Banta of Indiana University Purdue University Indianapolis.
Five disparate institutions each delegated a two-member team -- one faculty member and one administrator -- to participate. The leadership team prepared an initial working paper and ran a two-round Delphi study through which the participants fine-tuned the model. The effort culminated in a two-day working meeting in which participants made final decisions about the elements of technology use, educational strategy, and educational outcomes that were of common concern.
These five distinguished and distinctively different institutions of higher education included:
- one of the largest community college districts in the country (Maricopa Community Colleges);
- a public institution that offers a state-wide, virtual community college program supported by a combination of video, computing, and telecommunications (Education Network of Maine);
- a major land grant institution with innovative programs exploiting technology for students on- and off-campus (Washington State University - WSU);
- an institute of technology with a national record in both distance learning and services for the handicapped (Rochester Institute of Technology - RIT); and
- a public university that exemplifies institutional partnership at virtually every level (Indiana University - Purdue University at Indianapolis - IUPUI).
Step Two: Create the Flashlight
Leading the development of the student, faculty, alumni, and alumni-supervisor survey items and interview guides is Robin Zúñiga of the Western Cooperative for Educational Telecommunications (WCET). Leading the development of the cost analysis measures is Joe Lovrinic of IUPUI.
With support from the Annenberg/CPB Projects, the WCET has been working with the author to develop a survey item bank and interview guide for each of four sets of potential respondents:
- students currently enrolled in a course of study,
- the faculty who are teaching them,
- people who have completed that course of study, and
- their current supervisors.
We may also develop a similar set of instruments later on for support staff involved in distance learning programs.
The Shape of the Elephant
It is not possible in this brief paper to describe in great detail the issues that Flashlight will track -- the shape of the "elephant" -- but here are the key areas.
The Technology. The first element of this consensus strategy is "worldware," i.e., hardware and software that was developed for use in the wider world but that is also used for teaching and learning (e.g., spreadsheets, the Internet, computer-aided design software). Courseware (i.e., software developed and marketed for specific instructional purposes) is in some use but worldware is far more prevalent. Flashlight tools will help educators learn what sorts of worldware students are using, where and how much, e.g., in course work, in their jobs, at home.
Changes in Teaching and Learning. One of the most important assumptions underlying Flashlight's design is that technology does not itself cause changes in learning, or access, or costs. Rather it is how the technology is used that matters.
Today's technologies, especially worldware, are empowering, i.e., they widen the options available to educators and learners. Thus three institutions might invest in the same computer conferencing software, with one achieving more collaborative learning for commuting students, another disrupting classes and increasing attrition, and the third experiencing no perceptible changes in process or outcomes. The difference stems from the choices made by faculty and students about how to use the opportunities offered by the conferencing system.
Flashlight focuses on whether faculty and students find the available technology useful (or a hindrance) when they try to implement each of the "seven principles of good practice in undergraduate education" (Chickering and Gamson, 1987; Chickering and Ehrmann, 1996):
- Interaction between the student and teacher (or tutor, or other expert);
- Student-student interaction;
- Active learning;
- Time on task;
- Rich, rapid feedback;
- High expectations of the student's ability to learn; and
- Respect for different talents, ways of learning.
Because so much research indicates that these practices support better learning, it would be significant to discover that they were being implemented and that technology was playing an important role. By the same token, these objectives are mentioned so often by technology-using educators (especially the first five) that it would be significant to discover that an institution investing heavily in technology was not implementing these principles.
Notice that our ability to focus on research-based conditions supporting good performance is a big win. Many people assume that evaluation of the outcomes of technology investments is easy: "Just see whether students are learning more!" But the real value-added from technology usually comes when instructional objectives change, which means tests change, too. It's useless to discover that students scored 80% on one test and 90% (or 80% or 75%) on a different test administered three years ago. On the other hand, if you can discover that:
- the conditions for good learning have improved (as measured by increased implementation of the seven principles); and
- the faculty and the students believe that their use of technology was substantially helpful,
then you've learned something important. Similarly it would be useful to discover that collaborative learning is down and that e-mail has been problematic. Or even that collaborative learning is extensive but e-mail is widely seen as alienating.
Flashlight has a myriad of focused questions about the most common hopes and fears about technology and the seven principles: particular conjectures about how specific technologies might be used in ways that help or hinder the implementation of each of the seven principles.
Access issues in this consensus strategy include student location (relative to the campus), time demands, and native language. In other words, many educators hope that the ways they use technology will open their instructional programs to students regardless of their location, regardless of their job schedules (so long as they have sufficient time to study), and regardless of their native language (so long as they speak English). Flashlight will also help institutions interpret retention at both the course and course-of-study levels.
Flashlight should also help institutions investigate reasons for retention and attrition: student engagement (or lack of it), barriers to access (or the lack of them) for various sorts of students, and the intellectual accessibility of the instruction (e.g., are our mediated courses providing a more equal opportunity for learning for students whose native language isn't English?).
Our teams also identified four learning outcomes for which there ought to be perceptible improvement, so long as the foregoing changes in technology and teaching/learning practices had been sufficiently widespread for enough years. These learning outcomes following completion of a course of study include the ability:
- to apply what was learned in the instructional program (i.e., what was learned was not sterile or shallow - the 'graduate' would be seen to use the learning in real situations after completing the instructional program),
- to work in teams,
- to use information technology appropriately and creatively in one's work, and
- to manage one's own process of continuing learning.
Cost outcomes of greatest interest include (for large distance learning programs) capital and operating costs relative to comparable programs on one's own campus, and savings in costs per graduate coming as a result of (hoped-for) increases in retention. Other issues of interest include how costs of education may vary for the student (e.g., costs of income lost due to time spent in commuting, costs of acquiring equipment and network connections used in part for education).
Project Status and Next Steps
At this writing (April 1997) the Flashlight Current Student Inventory has completed beta testing; 4,200 surveys were sent out to students at our five institutional partners in spring 1996. The data are being used to aid final revision of this part of the tool kit. Happily, the early results have already led to changes in policy at at least one Flashlight institution.
The cost analysis strategy is under active development, as is the faculty survey item bank. Release of the Current Student Inventory is expected in April 1997. Initial beta testing of the Faculty Inventory is planned at our partner institutions for fall 1997.
The project has also begun to offer workshops. The Flashlight Project is now affiliated with the Teaching Learning and Technology Roundtable program of the American Association of Higher Education. We offer a series of training workshops through the Roundtable's numerous regional events each year. Use of videoconferences and on-line seminars is also being explored.
Flashlight publishes a free electronic newsletter, F-LIGHT, with news about workshops, product releases, and changes to our Web site. We publish about six issues a year. To subscribe to F-LIGHT, address e-mail to LISTPROC@LISTPROC.WSU.EDU with the one-line message SUBSCRIBE F-LIGHT (your name).
Development of the Flashlight instruments has been supported by the Annenberg/CPB Projects with in-kind support from the five participating institutions. Support for the Flashlight Planning Project was provided by the Fund for the Improvement of Postsecondary Education (FIPSE) of the United States Department of Education. Finally, this project would not have been possible without the volunteer efforts of the five institutional members of the Flashlight Consortium and their staffs.
Chickering, Arthur and Zelda Gamson (1987) "Seven Principles of Good Practice in Undergraduate Education," AAHE Bulletin (March).
Chickering, Arthur and Stephen C. Ehrmann (1996), "Implementing the Seven Principles: Technology as Lever," AAHE Bulletin, October, pp. 3-6. Also available on the Web at http://www.aahe.org/ehrmann.htm
Ehrmann, Stephen C., "Gauging the Educational Value of a College's Investments in Technology," Educom Review, XXVI:3,4 (Fall/Winter, 1991), pp. 24-28.
Ehrmann, Stephen C., "Asking the Right Questions: What Does Research Tell Us About Technology and Higher Learning?" Change: The Magazine of Higher Learning, XXVII:2 (March/April, 1995), pp. 20-27.