(Originally written as a response to a display of "favorite educational web sites" by graduate students - many of which were interesting, but most of which also seemed to simply reproduce what schools have "always" done - the way PowerPoint simply mimics the old "filmstrip" projectors of the 1960s. I received little response from that audience to this challenge, so I thought I would offer it here...)
In January 1983 Apple Computer released the “Lisa” – one of the very first commercially available computers to use a “GUI,” to be operated with a mouse, and to work via on-screen “windows.” The Lisa allowed users to draw with the computer. It featured an on-screen calculator that looked like a calculator, with buttons that could be pushed, and word-processing via “LisaWrite Paper” or “LisaPad Paper,” which looked – for the first time – as if the user were typing on a sheet of paper.
For the first time a computer started without a “command prompt.” For the first time people were introduced to concepts such as “mouse,” “click,” “folder,” “drop-down menu,” “desktop,” and “multiple applications.” The Lisa could even produce graphical output, pushing the dot-matrix printers of the era beyond their limits.
The Lisa bombed in the marketplace. It was not just the cost, about 40% higher than the price of an IBM-PC of the time (over $20,000 in 2007 money), but a simpler problem – the Lisa did things that no one had yet imagined doing. It shifted functions humans then did with pencils and Rapidographs, early calculators and IBM Selectrics, Wite-Out and file folders, to the computer. But in doing so, it changed how every one of these functions was understood, and it changed the nature of expertise in a hundred fields of human endeavor.
It is important to remember that personal computers were not new in 1983. Offices and even homes were filled with IBM PCs and Apple IIs. But these computers were simple improvements on already existing office machines. They might mimic mainframe systems in data management, or they might mimic the standard secretarial typewriter, or they might be the fastest adding machine around, but they did nothing really new, nor did they simplify anything. Only the most advanced, the most senior personnel, were granted the complex training required to move up to work on these machines. Essentially, they reinforced office and even educational hierarchies.
As I said, the Lisa failed. The very ideas were mocked. Why would someone want 16, or even four, applications open at the same time? (“Who could pay attention to that?”) Why would you want to type on this complicated computer and make changes with mouse clicks? (“It will encourage laziness, people will write before they think.”) And how would you use that “mouse-thing” anyway? You would have to take your right hand away from the keyboard, which would slow down typing.
It was such a failure that when Apple introduced the Macintosh 18 months later a number of Lisa’s features had been pulled. Multi-tasking was gone, and keyboard “shortcuts” had been introduced to reduce mouse reliance.
This inability to see how a future technology might change how things are done remains an issue a quarter century later. When technology is used in most American schools today, it only very occasionally looks toward new forms of cognition and learning – the implications of social networking, for example, or language-learning sites that break through physical isolation. Far more often it simply reproduces what educators already believe – that IBM-PC thought structure – the same, just a bit “better” and faster. So we see classroom quizzes, but better; video libraries, but easier; typing, but more efficient. In other words, technology is used to maintain power structures and cognitive processes as they are.
I would argue that when we use technology in these conservative forms we actually move our students backwards. Not only do we fail to teach them to find their way in the future, we teach them that the technological world of today and tomorrow will not help them. If I still fail on digital quizzes, I am without hope. If all this technology cannot help me read and process, I may as well give up right now.
“Let’s imagine a country which we will call Foobar, where reading and writing don’t exist, but which despite this has managed to develop a sophisticated culture of science, the arts, philosophy and commerce. A bit of a stretch I know, but not entirely inconceivable. All cultural transmission in such a society would take place by oral means, and a good memory would undoubtedly be an invaluable asset. Education would probably consist of much rote learning and place a high value on memory work. Now imagine what the impact on such a society, and in particular on its education system, might be when someone finally invents the pen. Well, undoubtedly a politician somewhere will pound a table and insist that we need a ‘pen in every classroom’. An education administrator will say ‘no, we should have a pen room where children can go once a week to learn how to use these pens’. So, eventually schools will all have pens and teachers will have to figure out how to make use of them. The Foobarian Department of Education will ponder the issue. They will eventually write a ‘pen’ curriculum and issue guidelines on how the ‘pen’ may be used to support memory work and rote-learning in schools.” – Ken Jennings
This, to me, is the challenge. If we do not understand that technology – every technology – has a fundamental impact on cognition and learning, we are missing the point entirely. Gutenberg’s 15th Century technology changed how people learned and how they saw the world. So did the Edison/Lumière film technology of the 19th Century: once British audiences saw film of the Boer War they were never the same again – never thought in the same patterns, and never accepted learning in quite the same way. Marconi altered this again, providing a radical new idea of immediacy and authenticity. And this has proceeded with every technological leap.
Educators in particular have always struggled with this. In the late 19th Century they worried that students would not be able to distinguish between “literature” and the stuff of “dime novels” (the original popular paperback fiction). In the early 20th Century they were sure their youngsters could not distinguish between reality and cinema. Then they feared that radio was wasting their students’ time and destroying their imagination. In 1984 multi-tasking on a computer was cognitive overload, and editing on a computer made one lazy. And we hear all these threats, and more, today.
The goal, then, in my mind, is not to pander to these fears of the future, but to learn to force schools and educators into a place where they stop restricting the potential of their students because of their own discomfort with the world as it exists.
- Ira Socol