The debate over the use of computers in public education dates back to at least 1983, when the federally appointed National Commission on Excellence in Education issued its report A Nation at Risk, which harshly criticized the failures of the U.S. educational system and tied them to the nation’s economic problems: “Our once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world. . . . The educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a Nation and a people.” The report concluded, “We must dedicate ourselves to the reform of our educational system for the benefit of all.”
The warnings of A Nation at Risk came at a time when Americans were beginning to embrace computer technology. IBM had released its first personal computer, the IBM PC, in 1981; the company sold 6 million of them in 1983. In 1984, Apple Computer gained popularity after introducing the Macintosh, a personal computer that featured a novel, easy-to-use graphical user interface and a mouse in addition to a keyboard. The Internet and the World Wide Web would not gain popularity for another decade, but in the early 1980s Americans already felt that a social transformation was underway and that their children should be prepared for it. As the authors of A Nation at Risk put it, “Learning is the indispensable investment required for success in the ‘information age’ we are entering.”
By 1988 more than half of all workers in the United States were using computers. The nation’s school system followed this trend: According to American Prospect cofounder Paul Starr, “Between 1981 and 1991, the proportion of schools with computers rose from 18 percent to 98 percent, and the number of students per computer fell from 125 to 18.” Elementary and middle schools purchased mostly Apple computers, while high schools favored DOS- and, later, Windows-based machines. And as computer technology became more advanced, so too did the educational software that was marketed to schools. Computer-assisted instruction (CAI) was, and still is, touted as a potentially revolutionary new means of learning. “PCs and general applications software made computing more flexible and easily adapted to different subjects and styles of teaching,” writes Starr. He adds, “Unlike motion pictures, radio, and TV, computers were far more susceptible to both student-centered and teacher-defined activities.”
In the early 1990s, the movement to use computers in the classroom was reinvigorated by the explosive growth of the Internet and the World Wide Web. Many parents and educators hoped that the Internet would enrich CAI and the overall educational experience by connecting classrooms to the outside world. Responding to this enthusiasm, in 1996 President Bill Clinton proposed federal funding to help bring computer and Internet technology into all classrooms by 2000. Congress allotted $2 billion to the Technology Literacy Fund in 1997, and this money helped bring the percentage of K–12 classrooms connected to the Internet from 3 percent in 1994 to 77 percent in 2000. Grants for integrating Internet technology into education were also a major part of the No Child Left Behind Act that President George W. Bush signed into law in January 2002.
The case for integrating computers into the classroom is summed up by a 2002 Department of Education report:
The latest research and evaluation studies demonstrate that school improvement programs that employ technology for teaching and learning yield positive results for students and teachers. Given that many schools and classrooms have only recently gained access to technology for teaching and learning, the positive outcomes of these studies suggest a future for education that could be quite bright if the nation maintains its commitment to harnessing technology for education.
The adoption of new and emerging technologies by schools and classrooms offers even more reason to be hopeful. With sufficient access and support, teachers will be better able to help their students comprehend difficult-to-understand concepts and engage in learning, provide their students with access to information and resources, and better meet their students’ individual needs. If we take advantage of the opportunities presented to us, technology will enhance learning and improve student achievement for all students.
In addition, a 2002 National Policy Association report titled Building a Digital Workforce: Confronting the Crisis emphasizes the need to train students for a computer-dominated workplace. Echoing 1983’s A Nation at Risk, the authors of Building a Digital Workforce write: “America has a workforce crisis. It has a sufficient supply of workers, but they lack adequate 21st Century IT [information technology] skills to fuel the information age economy. . . . Unless the country acts now to fill this gap, its competitiveness may be threatened.”
Yet, from its very outset in the late 1970s and early 1980s, the movement to use computers in education has had its share of critics, many of whom simply don’t believe the claims of technology enthusiasts. For technology skeptics, zealous claims about CAI echo Thomas Edison’s 1922 prediction that “The motion picture is destined to revolutionize our educational system and . . . in a few years it will supplant largely, if not entirely, the use of textbooks.” Motion pictures became primarily a medium for entertainment rather than education, and critics charge that the same thing is happening to computers and the Internet.
The anti-technology viewpoint is presented in detail in books such as 1999’s High-Tech Heretic: Why Computers Don’t Belong in the Classroom and Other Reflections by a Computer Contrarian. Author Clifford Stoll argues against the conventional wisdom that students need to be computer literate to succeed in the workplace:
What jobs will be around in 2100? Surprise! They’re pretty much the same jobs available today: dentists, truck drivers, surgeons, ballet dancers, salespeople, entertainers, and schoolteachers. A century from now, there will still be movie stars, morticians, gardeners, forest rangers, and police officers. . . . Curious thing about all these jobs—none of them require computing.
But the main argument used by parents, teachers, and policy makers who oppose CAI is that the money that schools spend on computers could be better used for other things, primarily hiring more teachers. This view was expressed in 1998 by the National Science Board (NSB):
The fundamental dilemma of computer-based instruction and other IT-based educational technologies is that their cost-effectiveness compared to other forms of instruction—for example, smaller class sizes, self-paced learning, peer teaching, small group learning, innovative curricula, and in-class tutors—has never been proven.
The NSB is referring to the fact that widespread use of computers in education is less than two decades old, and research on its effectiveness is largely ambiguous.
Indeed, one thing that both sides of the computers-in-education debate agree on is that more research is needed to determine what educational techniques—computer-assisted or otherwise—are most beneficial for students. As Michael Dertouzos, author of What Will Be: How the New World of Information Will Change Our Lives, puts it, “We need to continually examine what succeeds and fails, and why. And we should do so before we deploy any technical approach on a grand scale.” America’s schools are the laboratories in which educators’ many different experiments with CAI are being conducted.
The debate over computers in education is constantly evolving based on new research and technological advancements. Ultimately, decisions about whether and how to use computers in education will be made largely by individual communities and schools, based on their particular resources, needs, and goals. The viewpoints in At Issue: Computers and Education highlight the main arguments in the debate about whether CAI is good for the nation’s school system as a whole.