February 1998 // Vision
Walking the Line:
Rectifying Institutional Goals with Student Realities
by Derek Maus
Note: This article was originally published in The Technology Source (http://ts.mivu.org/) as: Derek Maus, "Walking the Line: Rectifying Institutional Goals with Student Realities," The Technology Source, February 1998. Available online at http://ts.mivu.org/default.asp?show=article&id=1034. The article is reprinted here with permission of the publisher.

The scene opens with a brightly-colored background. The name of a product—perhaps familiar, perhaps not—flashes across the screen, often accompanied by a slogan about learning in the "next century," now only two short years away. As the camera zooms in, a classroom full of ethnically and racially diverse students of somewhat indeterminate, although usually middle school, age gathers around a number of computer screens (although never more than two students per screen, with the occasional exception of an impromptu cluster of wide-eyed youngsters gazing upon a particularly startling discovery—usually a video file of some kind). No teacher is present, except occasionally as a benevolent figure standing in a doorway, marveling at the collection of students doing independent "research" either on the Internet or using a particular brand of software.

This is the model of advertising that has been adopted, almost universally, among manufacturers of both software and hardware as they attempt to justify their products as tools for facilitating learning. And why not? The scenarios being displayed on the television screen—while often highly idealized both in demographic composition and in resemblance to actual classrooms, living or dead—are certainly attainable within the near future, at least in terms of the easy availability of technology within schools (especially if these companies continue and expand their philanthropy towards school systems that cannot otherwise afford to purchase their products). One can excuse the companies for being overly rosy in their outlook, as they are simply fulfilling one of the primary goals of advertising: to present the future in its improved form as a result of the use of whatever product is being marketed, whether deodorant or software.

However, what cannot be excused as readily is the wholesale acceptance by institutions of higher learning of this vision as an accurate picture of entering students' pre-collegiate experience with instructional technology. A number of fallacies concerning the technological acumen of the younger generation (almost all of which are implied by the kind of marketing that sells computers and software) are present in the strategies employed by most colleges and universities in their attempts to incorporate new technologies into the educational regimen. These fallacies must be addressed and corrected before the myriad tools placed before the eager and willing instructor can actually assist, rather than hinder or obfuscate, the process of teaching and learning effectively.

I come to this subject from the perspective of a teacher of composition and literature, arguably one of the more difficult disciplines in which to integrate technology smoothly while still achieving measurable improvement. Let me hasten to add that I believe wholeheartedly in the possibility of such improvement, lest I be branded a closet Luddite. While there are a number of issues concerning this subject that I find problematic, the three most striking contradictions between institutional assumptions and classroom realities that I have experienced are:

  1. Experience with technology is neither universal among students, nor uniform in its depth.
  2. Lasting interest in technology among students is widely disparate, even after extensive in-class use.
  3. Quality of work is not necessarily increased, and is often harmed, by application of technology tools.

Student Experience

The idea of techno-savvy youngsters showing their elders how to function properly amid new gadgetry—whether it is a preschooler connecting a VCR to a television or a younger employee teaching an older colleague how to design a Web page—is often perceived either as a "cute" joke or a threat. The fact is that it really is neither, as this whole concept springs forth from a misapprehension of the capabilities of the younger generations.

While nearly anyone born after 1945 has been subject to the breakneck pace of technological advancement, this fact does not guarantee (nor even imply) an understanding of, much less a working proficiency with, a particular innovation. Nor does the ability to manipulate one piece of technology necessarily demonstrate that a particular individual has a propensity for using others. Nintendo is not a stepping-stone to practical computer skills any more than toy cars are a preparation for real driving. In many ways, the technologization of toys has hindered the development of a mindset that views computers as a tool rather than an amusement.

Many colleges and universities have based the technological portions of their curricula around the notion that students already have, or will soon acquire, the ability to perform "basic" tasks associated with contemporary technology. Many introductory-level courses at these schools assume (or require) that students can use e-mail (often on less user-friendly UNIX systems), Web browsers, or word processing and spreadsheet programs. Furthermore, the card catalogs of most university libraries have been replaced with listings in an electronic format whose functions may or may not be familiar to even more seasoned users.

When one contrasts these institutional expectations with the actual experience that incoming students receive in high schools and grade schools of widely varying funding levels and curricular sophistication, a gulf rapidly appears between students where one may not have existed before. This issue is not unique to electronic instructional technology—the stereotypical, yet still extant, one-room schoolhouse cannot hope to provide the same preparation for the modern collegiate curriculum as the privately-funded and fully-appointed prep school, whether in terms of access to technological tools or the number of volumes in the school library. However, since most schools deal with students from both backgrounds, teachers are often expected either to presume, incorrectly, that all students are equally computer literate or to bring the "lagging" students up to speed.

This technological tutoring, while useful, is a distraction and added burden for both the instructor and the student, no matter if the subject is composition or chemistry. The instructor is stuck between the rock of teaching a course that is "too simple" in its use of technology (thus boring the more proficient students and not providing any substantively useful experience for the remainder) and the hard place of teaching a class that has the lofty goal of mastery of several technological skills but starts half the students off with an impossible handicap.

Furthermore, such "remedial" technology experience is often expected to be obtained by the students outside of normal instruction, either in supplementary tutorials or simply by going to a computer lab and familiarizing themselves with software on their own time. This often only compounds the problem by creating additional stress and time demands for the already harried freshman.

Student Interest

My experience in trying to help bring some students up to the level of technological proficiency expected of them is that many simply do not want to involve themselves wholeheartedly with computers. Whether this is symptomatic of a deep-seated technophobia or, more likely, simply the product of differences in personal temperament, it is an issue that must be addressed without the immediately dismissive answer that "it's required, therefore do it."

The oft-chanted mantra that computer skills are indispensable in today's job market is one that has made its way to the highest echelons of education and forms the cornerstone of the philosophical edifice that supports the adoption of technology into collegiate curricula. I will not argue that such skills are not potentially desirable in most occupations and even necessary in some, but the reality is that innumerable students still leave universities with bachelor's degrees and voluntarily go into jobs that require little or no prior technological training.

The considerable corporate/scientific bias that higher education has adopted in the past twenty years has skewed the perspective of many university administrators into believing that all their "clients" are pre-professional students destined for a climb up the corporate ladder with a laptop tucked under one arm. Given that a bachelor's degree doesn't carry the weight it once did and that a large percentage of the student populace exists outside of the technologically-based disciplines (you don't need technology to write a brilliant English paper, even if it can help), this vision is largely incommensurate with observable reality.

Thus, it should not come as much of a surprise that many otherwise highly-motivated and capable students simply do not want to add technology to their course of study beyond a certain point. Word processing and electronic research skills are necessary to function in most current campus environments, if only because typewriters and card catalogs are becoming increasingly scarce. Beyond that, administrators and instructors must decide if e-mail and Internet skills are truly necessary for mastery of an academic discipline or are simply possible facilitators to such mastery. The former dictates making such skills mandatory, regardless of particular students' tastes, but the latter does not; the value of an education lies in the end (the knowledge ultimately gained), not necessarily the means.

Quality of Work

Despite the well-publicized skeptical assertions of Thomas Russell and others regarding the possibility of "significant difference" resulting from the use of technology in education, I firmly believe that there is a place for technology in the classroom that not only makes the administration of the course simpler for the instructor but also increases the possibility for student learning, regardless of subject matter.

However, the reality of the situation in many schools is that technology is popping up all over the place without any overarching framework to guide or evaluate its use. While numerous steps have been taken in the recent past to rectify this (such as the National Learning Infrastructure Initiative that Educom has begun), much of students' use of technology in the classroom, even at those few schools that mandate a computer for all students, has been limited to window-dressing. Technology makes their work look better on the page (or the screen) and more technologically sophisticated, but the content shows no greater insight and is often even somewhat diminished. Students must be taught to move their understanding of the uses of these new avenues for research beyond mere wonderment at the novelty of a .mov file of Neil Armstrong walking on the moon.

If the philosophy behind the use of technology is clearly codified, and the more successful models of use are adopted on a larger scale, the criticisms of Russell and his devoted followers will fade into silence. However, as long as instructors are simply using technology by rote, either due to institutional policy that provides no guiding pedagogical principles or because of pressure to diversify their teaching portfolios by any means necessary, students will receive an introduction to technology along with their steady diet of required courses, but their understanding of neither will grow appreciably. Such a result does not—nor should it—justify the enormous expense that some institutions (and, correspondingly, their students) have incurred in order to remain technologically current.

Conclusion

Curricular engineers should be cognizant of the fact that practical technological knowledge is no different from any other academic skill required of students. If we expect our students to know how to use these tools, we must teach them how to do so, not by sending them to the lab or to two-hour night classes, but through standardized required courses that provide basic training in the general technological skills deemed necessary for the completion of an undergraduate degree in any major.

Most schools allow individual students to test out of math or composition classes if those classes prove to be redundant, so why not allow the same with such a technology requirement? There is no penalty for not testing out of first-year English if a student's high school background does not provide the same preparation that another student's does. As it stands, students whose training falls behind the inflated learning curve dictated by many college curricula are needlessly forced to undergo hardships that may or may not prove of any benefit to them in the long run.

Given the wide disparity between the opportunities for learning about technology that a child growing up in rural North Carolina has and those that a child growing up in suburban Boston has (just to use two examples familiar to me; the disparity exists between high schools in the same town in many cases), it is essential to keep the range of experiences in mind when designing college curricula around a set of tools that have essentially only been available for a decade, even on the leading edge. Technological skills should be evaluated according to the same sort of standards as other academic disciplines. Until every student has the kind of access that the rose-colored lens of the advertising camera would have us believe is the norm, it is a disservice to our students to demand that they make themselves conform to this distorted image.
