High Tech and Literacy

I often worry about how recent technological changes will affect society. I am particularly concerned about the impact of constant-feedback technology (BlackBerry, cell phone, email, voicemail, SMS, IM: you need input, input, input, like the robot in the movie Short Circuit). Will attention-span disorder (like I have) become a regular feature of society? Will today’s kids be able to think strategically and long-term?

Some worry that technology and the advent of the internet will decrease literacy: people will stop reading books and newspapers. But the article below offers a really interesting alternative angle, arguing that technological evolution will simply create different opportunities.


The End of Literacy? Don’t Stop Reading.

By Howard Gardner
Sunday, February 17, 2008; B01

What will happen to reading and writing in our time?

Could the doomsayers be right? Computers, they maintain, are destroying literacy. The signs — students’ declining reading scores, the drop in leisure reading to just minutes a week, the fact that half the adult population reads no books in a year — are all pointing to the day when a literate American culture becomes a distant memory. By contrast, optimists foresee the Internet ushering in a new, vibrant participatory culture of words. Will they carry the day?

Maybe neither. Let me suggest a third possibility: Literacy — or an ensemble of literacies — will continue to thrive, but in forms and formats we can’t yet envision.

That’s what has always happened as writing and reading have evolved over the ages. It was less than 100,000 years ago that our human predecessors first made meaningful marks on surfaces, notating the phases of the moon or drawing animals on cave walls. Within the past 5,000 years, societies across the Near East’s Fertile Crescent began to use systems of marks to record important trade exchanges as well as pivotal events in the present and the past. These marks gradually became less pictorial, and a decisive leap occurred when they began to capture certain sounds reliably: U kn red ths sntnz cuz Inglsh feechurs "graphic-phoneme correspondences."

A master of written Greek, Plato feared that written language would undermine human memory capacities (much in the same way that we now worry about similar side effects of "Googling"). But libraries made the world’s knowledge available to anyone who could read. The 15th-century printing press disturbed those who wanted to protect and interpret the word of God, but the availability of Bibles in the vernacular allowed laypeople to take control of their spiritual lives and, if historians are correct, encouraged entrepreneurship in commerce and innovation in science.

In the past 150 years, each new medium of communication — telegraph, telephone, movies, radio, television, the digital computer, the World Wide Web — has introduced its own peculiar mix of written, spoken and graphic languages and evoked a chaotic chorus of criticism and celebration.

But of the changes in the media landscape over the past few centuries, those featuring digital media are potentially the most far-reaching. Those of us who grew up in the 1950s, at a time when there were just a few computers in the world, could never have anticipated the ubiquity of personal computers (back then, IBM‘s Thomas Watson famously declared that there’d be a market for perhaps five computers in the world!). A mere half-century later, more than a billion people can communicate via e-mail, chat rooms and instant messaging; post their views on a blog; play games with millions of others worldwide; create their own works of art or theater and post them on YouTube; join political movements; and even inhabit, buy, sell and organize in a virtual reality called Second Life. No wonder the chattering classes can’t agree about what this all means.

Here’s my take.

Once we ensured our basic survival, humans were freed to pursue other needs and desires, including the pleasures of communicating, forming friendships, convincing others of our point of view, exercising our imagination, enjoying a measure of privacy. Initially, we pursued these needs with our senses, our hands and our individual minds. Human and mechanical technologies to help us were at a premium. It’s easy to see how the emergence of written languages represented a boon. The invention of the printing press and the emergence of readily available books, magazines and newspapers allowed untold millions to extend their circle, expand their minds and expound their pet ideas.

For those of us of a 19th- or 20th-century frame of mind, books play a special, perhaps even spiritual, role. Works of fiction — the writings of Jane Austen, Leo Tolstoy, Toni Morrison, William Faulkner — allow us to inhabit fascinating worlds we couldn’t have envisioned. Works of scholarship — the economic analyses of Karl Marx and John Maynard Keynes, the histories of Thucydides and Edward Gibbon — provide frameworks for making sense of the past and the present.

But now, at the start of the 21st century, there’s a dizzying set of literacies available — written languages, graphic displays and notations. And there’s an even broader array of media — analog, digital, electronic, hand-held, tangible and virtual — from which to pick and choose. There will inevitably be a sorting-out process. Few media are likely to disappear completely; rather, the idiosyncratic genius and peculiar limitations of each medium will become increasingly clear. Fewer people will write notes or letters by hand, but the elegant handwritten note to mark a special occasion will endure.

I don’t worry for a nanosecond that reading and writing will disappear. Even in the new digital media, it’s essential to be able to read and write fluently and, if you want to capture people’s attention, to write well. Of course, what it means to "write well" changes: Virginia Woolf didn’t write the same way that Jane Austen did, and Arianna Huffington‘s blog won’t be confused with Walter Lippmann’s columns. But the imaginative spheres and real-world needs that all those written words address remain.

I also question the predicted disappearance of the material book. When they wanted to influence opinions, both the computer giant Bill Gates and the media visionary Nicholas Negroponte wrote books (the latter in spite of his assertion that the material book was becoming anachronistic). The convenience and portability of the book aren’t easily replaced, though under certain circumstances — a month-long business trip, say — the advantages of Amazon’s hand-held electronic Kindle reading device trump a suitcase full of dog-eared paperbacks.

Two aspects of the traditional book may be in jeopardy, however. One is the author’s capacity to lay out a complex argument, which requires the reader to study and reread, following a circuitous course of reasoning. The Web’s speedy browsing may make it difficult for digital natives to master Kant’s "Critique of Pure Reason" (not that it was ever easy).

The other is the book’s special genius for allowing readers to enter a private world for hours or even days at a time. Many of us enjoyed long summer days or solitary train rides when we first discovered an author who spoke directly to us. Nowadays, as clinical psychologist Sherry Turkle has pointed out, young people seem to have a compulsion to stay in touch with one another all the time; periods of lonely silence or privacy seem toxic. If this lust for 24/7 online networking continues, one of the dividends of book reading may fade away. The wealth of different literacies and the ease of moving among them — on an iPhone, for example — may undermine the once-hallowed status of books.

But whatever our digital future brings, we need to overcome the perils of dualistic thinking, the notion that what lies ahead is either a utopia or a dystopia. If we’re going to make sense of what’s happening with literacy in our culture, we need to be able to triangulate: to bear in mind our needs and desires, the media as they once were and currently are, and the media as they’re continually transforming.

It’s not easy to do. But maybe there’s a technology, just waiting to be invented, that will help us acquire this invaluable cognitive power.

hgasst@pz.harvard.edu

Howard Gardner teaches cognitive psychology at the Harvard Graduate School of Education. He is directing a study of the ethical dimensions of the new digital media.
