Archive for September, 2007

Technology and the Future of the Overstressed Society

Have you noticed how “stressed” everyone is today? Professionals, white collar workers, tech workers, sales workers, even high school and college students all complain about being stressed or overstressed. Many older Americans dismiss such complaints as the whining of a younger generation, a group that just can’t take it… but are these complaints mere whining… or do they have a basis in fact?

One fact is fairly clear. Americans today, on average, have a better life than did Americans seventy-five or a hundred years ago. Very, very few in the work force today had to live through the Great Depression. Nor do they have to worry about children dying of polio and whooping cough. The statistics show that most people are living longer and doing so in better health. There is a greater range of choice in occupations, and more Americans are able to pursue [and do obtain] higher education. The size of the average house is larger, and most houses have conveniences hardly imaginable a century ago. Although the average American work week is now longer than that of any other industrialized western nation, it’s far less physically arduous than the work of a century ago.

So why the complaints about stress?

Technology — that’s why. It’s everywhere, and it’s stressing us out in more ways than one. Those scanners in supermarkets and every other store? They not only ring up the sales and feed into inventory calculations, but they also rate the checkers on how fast and efficiently they handle customers. I knew this in the back of my head, so to speak, but it was brought home to me when a single mother who was a checker at a local store told me she’d been demoted to the bakery because she didn’t meet speed standards.

Computers, especially those with color graphics and associated high-speed printers, are another source of stress. Why? Because they successfully invite revision after revision by overcareful supervisors and clients. Do it over… and over… and over.

Then, there are instant messaging, emails, and texting. IMs and texting, especially among the young, lead to carelessness in spelling and grammar, and that feeds back into the need for those endless document revisions, because, believe it or not, those grammar and spell-checkers just don’t catch everything. Then… emails… which encourage everyone to get in on everything, until at times, it seems as though everyone is watching and looking for ways to make completing anything difficult. On top of that, add bosses who feel slighted if one doesn’t answer emails quickly, and all that answering and justifying and explaining doesn’t get the projects done. It just takes up time that can’t be used to do real work, a problem that some supervisors just don’t get.

As for students, keeping in touch through the technology of cell-phones, emails, and texting seems to occupy their every waking, walking, and driving moment. Add to that the allure of the wonders of hundreds of cable or satellite channels, and the need to earn money for an ever-more expensive education — or vehicle payments — and they’re stressed out.

The impact of technology pervades everything. Computerized legal databases and software make litigation ever more complex — not to mention expensive and stressful.

Healthcare has even more problems. We have more than 47 million Americans without health insurance, and the number is growing faster than the population. Why? Because expenses are growing, thanks to a proliferation of medical technology and drugs that raises costs. When my grandfather was a doctor, diagnostic technology was essentially limited to a few blood tests, a stethoscope, and an X-ray machine. Today, the average doctor’s office is filled with equipment, and that equipment creates an expectation of perfect medicine. That expectation, combined with the opportunism of the technologized legal system, leads to far more litigation. That leads to higher malpractice insurance, more stress on doctors, and more and more expensive tests and procedures to make sure that nothing gets missed — or to cover the doctor against legal challenges.

It’s not uncommon for some medical specialties to have annual malpractice premiums in excess of $200,000. Assume that a doctor actually sees patients 5 hours a day in the office some 50 weeks a year, the rest of the time being spent on things like hospital rounds, reviewing charts, and the like. Under those conditions, the malpractice premium alone requires a charge of more than $160 per patient-hour [assuming a five-day office week]. If the doctor also has a million dollars in medical and office equipment [not unusual either], the amortization will be in excess of $100 per patient-hour seen. Needless to say, this creates stress and pressure, and for all the complaints about the medical profession, doctors have one of the lower life expectancies among professionals.
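To put some arithmetic behind those figures, here is a back-of-the-envelope sketch in Python. The premium, equipment cost, and patient-hours come from the paragraph above; the five-day office week and the eight-year straight-line amortization are illustrative assumptions of mine, not figures from any actual practice.

```python
# Back-of-the-envelope overhead per patient-hour for a medical practice.
# From the text: $200,000 annual malpractice premium, $1,000,000 in
# equipment, 5 patient-hours a day, 50 weeks a year.
# Assumed for illustration: a 5-day office week and straight-line
# amortization of the equipment over 8 years.

PREMIUM = 200_000        # annual malpractice premium, dollars
EQUIPMENT = 1_000_000    # medical and office equipment, dollars
AMORT_YEARS = 8          # assumed amortization schedule, years

hours_per_year = 5 * 5 * 50                   # 5 h/day x 5 days/wk x 50 wks = 1,250
premium_per_hour = PREMIUM / hours_per_year                      # $160
equipment_per_hour = EQUIPMENT / AMORT_YEARS / hours_per_year    # $100

print(f"Patient-hours per year:     {hours_per_year:,}")
print(f"Premium per patient-hour:   ${premium_per_hour:,.0f}")
print(f"Equipment per patient-hour: ${equipment_per_hour:,.0f}")
print(f"Overhead before salaries, staff, and rent: "
      f"${premium_per_hour + equipment_per_hour:,.0f}/hour")
```

Under those assumptions, the doctor starts roughly $260 in the hole for every patient-hour before anyone is treated, which is exactly the pressure described above.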

In higher education, computerization has led to ubiquitous on-line evaluations and anonymous ratings of professors, and the subsequent inevitable grade inflation, because tenure often depends on pleasing the students. It’s also led to a proliferation of policies and procedures, so easily printed on those handy-dandy computerized systems. In my wife’s university, the policies and procedures for rank advancement and tenure have been rewritten and changed once or twice every year over the past decade, with scores of drafts being circulated electronically before each revision was finalized.

In effect, the expectations of technology have created more stress for modern society than the wind, rain, and inconsistent weather ever did for our agricultural forebears — largely because technology also makes people more and more accountable, even when they can’t do anything about it. The way technology is used today also creates what my father called “being eaten to death by army ants.” No one wants to kill you, but everyone wants a little something — reply to these emails, revise that set of documents, change that phrase to please the attorneys, change this one for the boss’s supervisor — and when it’s all said and done, who has time to do actual new work?

Yet, if you ignore the army ants, everyone thinks you’re difficult and uncooperative, and you lose your job. Is it any wonder that American professionals are working longer and longer hours?

But… ah, the blessings of technology.

The "Literary Canon," Education, and F&SF

Roughly twenty years ago, Allan Bloom published an incendiary book entitled The Closing of the American Mind. In it, Bloom charged that abandoning the traditional literary canon in favor of multiculturalism and gender- and ethnic-based literary selections had effectively gutted the American liberal arts education. I’m oversimplifying his charges, but they run along those lines.

During the 1960s and 1970s, and thereafter, but particularly in those turbulent years, there were numerous and loud cries for “relevance” in higher education. Those cries reverberate today in such legislation as the No Child Left Behind Act and in the growing emphasis on institutions of higher education as a version of white collar and professional trade schools. Less than ten percent of U.S. collegiate undergraduates major in what might be called “liberal arts,” as compared to twenty percent in business, sixteen percent in health, nine percent in education, and six to ten percent in computer science [depending on whose figures one uses]. Less than three percent major in English and history combined.

As a writer who effectively minored in English, I’ve thought about the writers and poets I had to study in the late 1950s and early 1960s and those studied by students today. Back then, for example, there was a fairly strong emphasis on poets such as T.S. Eliot, W.B. Yeats, W.H. Auden, and Wallace Stevens, none of whom are now listed among the topmost poets assigned in college English classes. Now… times do change, but I realized that poets such as Eliot bring certain requirements that poets and writers such as Maya Angelou, Jane Austen, and Toni Morrison do not. For much of Eliot or Yeats to make sense, the student has to have a far wider grasp of literature and history. Much of the difference between those writers once assigned and those now assigned, from what I can tell, is that a far greater percentage of those now assigned are what one might call self-affirming writers. They affirm a set of values, either values explicitly contained in the work at hand or values current today. By contrast, poets such as Eliot and Yeats often question, and they use a wide range of references and allusions unfamiliar to most students, some current, some historical, and few of them “common” knowledge.

In that sense, the best of F&SF, in my opinion, is that which stretches the reader into considering old values in a new light and “new” values through the light of experience, accepting neither at face value. Many F&SF writers present the “new” in a way that proclaims its value uncritically, while others present and then trash the “new,” as Michael Crichton does so well. Then there are those who appear to believe that shocking readers is equivalent to making them think and stretching their horizons. Most of the time, it’s not.

According to Mark Lilla, a professor of political philosophy at Columbia, recently quoted in The New York Times, “What Americans yearn for in literature is self-recognition.” But struggling with unfamiliar themes and values and searching out allusions and references require work, can be alienating to students, and certainly don’t boost self-recognition.

Particularly in the late 1960s and early 1970s, it seemed to me, there was a concerted effort in the SF field to raise issues while adhering to some degree to the tradition of the “literary canon,” and this effort continues with at least some authors today. This melding represents, again in my opinion, one of the great strengths of the field, but paradoxically, it’s also another reason why F&SF readership tends to be limited, at least for these types of F&SF, because a reader either has to be knowledgeable or willing to expand his or her comfort zone.

This gets down to an issue at the basis of education, primarily but not exclusively undergraduate higher education: Is the purpose of higher education to train people for jobs or to teach them to think so that they can continue to learn? Most people would ask why both are not possible. Theoretically, they are, but it doesn’t work that way in practice. Job training emphasizes how to learn and apply skills effectively and efficiently. Thinking training makes one very uncomfortable; it should, because it should force the student out of his or her comfort zone. At one time, that was one of the avowed goals of higher education, and part of the so-called literary canon was chosen so as to provide not only that challenge but also a cultural history of values as illustrated by literature, rather than a mere affirmation of current values.

In addition, today, with the smorgasbord approach to education, a student can effectively limit himself or herself to the courses that merely reinforce his or her existing beliefs and biases. It’s comfortable… but is it education?

Future Fact? Present Fraud? Or…?

Once more, just the other day, someone said to my wife and me, “We never really went to the moon. It was all a fraud.” This person is not uneducated. In fact, the individual has an earned graduate degree and spent some fifteen years as an executive in the financial industry.

It doesn’t seem to matter to this individual — or to the millions who share such a belief — that scientists are bouncing laser and radio beams off the reflectors left on the moon by our astronauts. Nor do the photographs and records that could not have been obtained any other way count against this belief. Nor does the fact that ground-based and space-based evidence agree. Nor does the fact that we and other countries have put dozens of astronauts into space matter.

Nope. To such people, the moon landings were all a fraud.

Maybe this kind of belief has something to do with the brain. A recent study confirmed that there is indeed a difference between the way “liberals” and “conservatives” process and react to information, and that that difference goes far beyond politics. Liberals tend to be more open to new experiences, conservatives more entrenched and unwilling to move away from past beliefs. And, of course, interestingly enough, there are those who classify themselves as liberals who actually have a conservative mind-set, who will not deviate from what they believe regardless of evidence, and there are those who claim to be conservative who are very open to new evidence and ideas.

Neither mindset is necessarily “good” or “bad.” As many conservatives would say, and have, “If you don’t stand for something, you’ll fall for anything.” That can be very true. On the other hand, no matter how hard one wants to believe that the earth is flat, I’m sorry. It just isn’t. When new information arrives that is soundly and scientifically based, regardless of opinion and past beliefs, a truly intelligent person should be willing to look at it objectively and open-mindedly.

In a sense, I think, most people are basically conservative. We really don’t want to change what we believe without very good reason. In evolutionary, historical, and social terms, there are good reasons for this viewpoint. Just as in mutations affecting an organism, most changes in social and political institutions are bad. Only a few are for the best.

The problem occurs when the danger from an event is not absolute, or unitary, as some economists put it, but the event is still likely to occur, and when that occurrence would be catastrophic to the human race. Over the history of Homo sapiens, some hundreds of thousands of years, or millions, depending on one’s definition of exactly when our forebears became thinking human beings, this kind of situation did not occur until the past half century. While it might be unthinkable and improbable to most, a nuclear war would be devastating to the human race. So, it appears, would runaway global warming, regardless of cause.

The “conservative” view is to wait and let things sort themselves out. After all, hasn’t this worked throughout history? Well… not always, but in terms of survival and civilization, there was always someone else to carry on. When the Mayan civilization fell because its people hadn’t planned well enough for unforeseen droughts, other human civilizations carried on. The same was true of the Anasazi, and recent satellite measurements and photographs now suggest that the same occurred to the Cambodian people who built Angkor Wat, then a city complex of over a million people, when drought struck in the 1500s.

But what happens when we as a race face a potential climate catastrophe as devastating as global warming could be? One that affects an entire world, and not just a continent or subcontinent? Can we afford to be conservative? Or is it a situation where, in reacting, we could fall for anything?

Is global warming a fraud perpetrated by scientists, as the moon-landing deniers believe the landings were? Or is it a real and present danger? Or is it over-hyped, the way all the warnings about DDT appear to have been — a real danger in limited areas and to certain species, but truly not the harbinger of a universal silent spring? And how should we react, whether conservative or liberal?

Flash and Substance in F&SF

As some of you know, I’ve been involved in fantasy and science fiction for some time — otherwise known as “too long” by those who don’t like what I write and “please keep writing” by those who do. For almost as long as I’ve been writing, I’ve wondered why a number of good solid, inventive, and talented writers failed to be recognized — or when recognized, were essentially “under-recognized” or recognized late. That’s not to take away from some who were recognized, like Jim Rigney [Robert Jordan], but to point out that sometimes recognition is not necessarily fair or just.

One of them was, of course, Fred Saberhagen. Another, I believe, was Gordy Dickson, as was Murray Leinster. Among writers still living and writing who haven’t received their due, in my opinion, I might include Sheri Tepper. There are certainly others; my examples are far from all-inclusive.

But why has this happened, and why has it continued to go on?

One of the problems in the F&SF genre and, indeed, in every field of writing — and, as I discovered over nearly 20 years in Washington, D.C., also in politics — is that the extremists among the fans, reviewers, academics, and critics have a tendency to monopolize both the dialogue and the critical studies. And, for better or worse, extremists generally tend to praise and support, naturally, the extremes. In writing, from what I’ve seen, the extremes tend to be, on one end, extraordinary skill in crafting the individual sentence and paragraph, usually to the detriment of the work as a whole, and, on the other, incredible action and pseudo-technical detail and devices and/or magical applications in totally unworkable societies and situations.

While I can certainly appreciate the care and diligence involved in the construction of the Gormenghast trilogy, books whose “action” moves at the speed of jellied consommé, uphill — and that may overstate the pacing — that trilogy is not a work of literature, regardless of all the raves by the extremists. Likewise, month after month, I see blogs and reviews praising books which, when I read them, seem not to have much depth, relying on action and clever prose to disguise that lack; or on well-crafted words and not much else; or almost totally on humor, often at such basic levels as to be embarrassing; or… the list of sins is long. What I don’t see much of are reviews noting books with deep and quiet crafting, relying neither too much nor too little upon words, actions, inventions, or humor, but balancing all in a way that creates a realistic world with people and situations that draw in the reader, engaging both emotion and thought and provoking a reconsideration of some aspect of what we call reality.

Now… I have no problem with brilliant unrealism, or incredibly moving prose. I do have great difficulty with books being termed good or great solely on such criteria, particularly when critics at the extremes often tend to overlook excellent prose, plotting, and even incredibly credible devices and societies because the author has presented them so quietly and convincingly.

In a determined but comparatively quiet way, by creating Jim Baen’s Universe, Jim Baen and Eric Flint attempted to create a solid-paying market for good stories that appeal to a wide range of readers, and not primarily to the extremists. Will this effort work? I hope so, and it looks promising, but it’s still too early to tell.

Shock value and novelty do indeed attract readers. Sometimes they even sell books. I won’t contest that. Nor will I contest the fact that much of what doesn’t appeal to me is obviously very appealing to others. What I will point out is that work which engages readers on all levels and raises fundamental issues tends to sell and stay in print over the years [so… maybe I was wrong about Gormenghast… or maybe it’s the exception that proves the point].

Calling All Tenors, Baritones, and Basses

For those young men who have a good voice and the ability and desire to learn music… how would you like a job where you can travel the world — or at least the United States — and get paid for it, and where adoring young women often follow your every word and note? If so… have you considered being a collegiate-level professor of voice?

While full-time, tenure-track university openings for female singers with doctorates are almost non-existent, universities and colleges are always looking for qualified and talented tenors, baritones, and basses. “All” you have to do is become a classical singer qualified to teach on the university level. This does require work in addition to talent, and getting a doctorate in music is not for everyone, nor is it without some cost, but the very top positions in the field can earn close to $100,000 a year, and that doesn’t count fees for performing outside the university. Now… admittedly, a starting salary for a male tenure-track junior voice faculty member is “only” $35,000-$50,000, but a full-time position usually includes health care and one of the best and most portable retirement pension systems in the country.

More than a few times, when my wife has suggested that male students might have a future majoring in music, the usual response is “I won’t make enough money.”

And exactly how true is that? The latest data from the Census Bureau notes that the median income of men working full-time in the USA is slightly over $42,000. The median for men with professional degrees [law, medicine, science, MBA] is around $72,000. Of course, all young men and women will be above average, just as the children in Lake Wobegon are all above average, and all will make fantastic salaries.

But what is fantastic? The average veterinarian makes $65,000, the average architect $57,000, the average accountant $41,000, the average secondary school teacher $47,000. For every junior attorney making $130,000, there are many more making $40,000-$60,000. With the median salary of attorneys around $80,000, that means half are making less than that, often after years of practice.

So how unappealing is the prospect of a $75,000-a-year income after 15 years, for a nine-month contract, with all sorts of fringe benefits — such as health care, retirement, tickets to sports and cultural events, and even free or subsidized parking?
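For what it’s worth, the comparison is easy to sketch. The dollar figures below are the ones quoted above; the only added step, annualizing the nine-month contract to a twelve-month basis, is an illustrative simplification of mine.

```python
# Comparing the cited salaries on a common twelve-month basis.
# All dollar figures are those quoted in the text; scaling the 9-month
# contract to 12 months is added here for illustration only.

nine_month_salary = 75_000
annualized = nine_month_salary * 12 / 9      # = $100,000

cited_figures = {
    "median, men working full-time":      42_000,
    "median, men w/ professional degree": 72_000,
    "average veterinarian":               65_000,
    "average architect":                  57_000,
    "average accountant":                 41_000,
    "average secondary school teacher":   47_000,
    "median attorney":                    80_000,
}

print(f"Voice professor, 9-month contract: ${nine_month_salary:,}")
print(f"Twelve-month equivalent:           ${annualized:,.0f}")
for job, pay in cited_figures.items():
    print(f"  {job:36s} ${pay:,}")
```

On a twelve-month basis, that professorship comes out ahead of every median and average cited above.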

But don’t apply if you’re female. Because schools can legally discriminate by specifying voice type, there are on average at least twice as many positions for men, despite the fact that most voice students are female, and on average, you’ll make only 75% of what the men do.

Deception and Greed

A century or so ago, and certainly earlier, the general consensus, both among the public and in the scientific community, was that Homo sapiens was the only tool-using creature, and certainly the only one with self-consciousness. But recent studies of various primates, corvids [New Caledonian crows and scrub jays], and other species have proved that mankind is not the only tool-user, merely the most advanced of tool-users. More recent studies also suggest that some primates and jays, and possibly even elephants, have at least a form of self-consciousness.

What led to this conclusion? Experiments in the use of deception and self-imagery. In essence, certain species hide food and deceive others as to where they’re hiding it. The way in which they use deception, and the varying levels of deception depending on the closeness and relationship of those nearby, suggest that they are aware of themselves as individuals, and are also aware of others as individuals.

What I find intriguing about these studies is that there appears to be a link between intelligence and greed and deception. Now… a wide range of species accumulate food and other items, but only a handful exhibit what might be called “greed.” Greed can be defined as the drive to acquire and maintain possession of more physical goods or other assets than the individual or his family/clan could possibly ever physically utilize, often to the detriment of others.

One thing that’s interesting about human beings is that we also possess the greatest degree of concentrated greed and deception of any species. No other species comes close. This raises an intriguing question: To what degree is intelligence linked to greed and deception?

Are greed and deception by-products of intelligence, or are they the driving force to develop intelligence?

While the evolutionary/historical record suggests that species capable of greed and deception tend to be more successful in attaining control of their environment, what happens next? Intelligence develops tools, and individuals with greed and deception put those tools to use in varying ways to enhance their own power to the detriment of other members of the species. As the tools become more powerful, their use by those who possess them also tends to concentrate power and wealth, yet almost every successful society has also incorporated deception of some sort into its social framework.

Kurt Vonnegut made the observation in Slaughterhouse-Five — through a Nazi character, of course — that the greatest deception perpetrated by the American system was that it was easy to make money. Because making money was thought to be easy, income inequality was justified: anyone who wanted to work hard could “obviously” become wealthy.

Historical institutional “deceptions” include the divine right of kings, the caste system of India, Aryan racial supremacy, the communist “equality of all” myth, and on and on.

But what does this bode in an increasingly technological information age, where hacking, phishing, and all other manner of informational deception have increased, involving not just the criminal element, but industry, politics, and entertainment on all levels? Does it mean that the survivors will have to be even more intelligent, or that social structures will come crashing down because no one can trust anyone about anything? Or will we manage to muddle through? Will surviving deception be the ultimate Darwinian test of the fittest? Maybe… there’s an idea for a book…

The Weaker Sex… Revisited

Years ago, James Tiptree, Jr. [Alice Sheldon] wrote a novella entitled “Houston, Houston, Do You Read?” in which present-day male astronauts were catapulted into a future where there are no men. The implication of the story is that, despite their greater physical strength, men were indeed the weaker sex and perished, albeit with a little “help.” Some recent analyses of educational achievement by gender suggest that Sheldon just might have been on to something.

Over the past few years, sociologists, psychologists, teachers, and even politicians have been raising an increasing number of questions about the gender and educational implications of a high-tech society. Three generations ago, women students were a clear minority in higher education and in almost all “professional” occupations. Today, female students comprise the majority of undergraduate college students and graduates, with a nationwide ratio running in excess of 56% to 44%, a ratio that is becoming more unbalanced every year. So many more women are attending college that on many campuses, particularly at elite universities and liberal arts colleges, they’re being subjected to another form of discrimination. In order to keep a gender balance, many schools effectively require female students to meet higher academic standards than male students.

A recent report [Gender Equity in Higher Education: 2006] found that in college men spent more time watching television, playing video games, and partying, while women earned better grades, held more leadership posts, and claimed far more honors and awards.

The trend of women seeking professional education is also continuing in many graduate fields, such as law and medicine, where women outnumber men at many institutions.

In more and more state universities, women outnumber men, in some cases constituting sixty percent of the student body. Even at Harvard, the latest freshman class has more women than men. The only areas where men are numerically dominant are the hard sciences and engineering, but even there, a greater and greater percentage of science and engineering students are female. Not only that, but in the vast majority of institutions, the majority of awards and honors are going to women. Now, admittedly, this female expertise hasn’t yet managed to reach and shatter the “glass ceiling” prevalent in the upper reaches of the faculty in higher education or in corporate America, but it’s coming, one way or another. There are female CEOs, but many women are simply choosing to create their own businesses rather than play the “good old boy game.” Others become consultants.

Another factor that I’ve noted in my occasional university teaching, and one also noted by various family members, three of whom are professors at different universities, is that a decreasing percentage of college-age men are inclined to apply themselves to anything that requires more than minimal effort, whether physical or intellectual. This is a disturbing trend for society in a world where education and intellectual ability have become increasingly important, both to hold society together and to achieve a comfortable lifestyle suited to personal satisfaction and raising children. Even more disturbing is that this gender-based educational disparity becomes greater and greater as the income of the parents decreases. In short, men from disadvantaged homes are often only half as likely to get a college degree as women from the same cultural and economic background.

Both my wife [who is a full professor] and I have watched male students turn down full-tuition scholarships because the requirements were “too hard,” while talented women had to claw their way through the same program.

Do these trends represent a failure of our educational system… or is it that too many men can’t really compete with women once the playing field starts to get more level? Or that men need special treatment to achieve on an equal basis? After all, the real hunters are the lionesses, not the “noble” lion.

Reading… Again

In the headlines recently have been more stories about how little Americans read. According to an AP-IPSOS study, twenty-seven percent of all adults in the United States have not read a book in the past year. The remainder — those who claimed to have read at least one book over the past 12 months — averaged seven books. According to another earlier Gallup poll, some 57% of Americans had not read a book [besides the Bible] in the previous year.

I’m not troubled by the fact that there are those who haven’t read any books. In any society, there are people who just aren’t readers. But I am troubled by the numbers and the way they fall out.

I wasn’t surprised that the readers of fantasy and science fiction comprised less than 5% of all readers. Nor was I exactly astounded to discover that, over the past 15 years, book-reading percentages are down for the 18-25 age group, from close to 90% to less than 60%. More than half of all frequent readers are over age 50, and more than 55% of all books are purchased by those over 50. The highest concentrations of readers are among those who are older and college-educated.

Yet book sales are up. Exactly what does that mean? I’m reminded of a study done for the National Opera Association several years ago. Sales of opera tickets were up, and everyone was pleased until they looked closely at the numbers — which showed that while the number of tickets sold was up, the actual number of patrons was down, and that the average age of patrons was increasing.

The statistics on book reading seem to be following a similar pattern, and for years now, various pundits and social scientists have been worried that Americans are losing their reading skills — and that a smaller and smaller percentage of Americans are highly literate. Yet the U.S. economy still dominates the world stage, and, despite the difficulties in the Middle East, our military machine has no equal — even in situations where we have to deal with sectarian civil wars. So what’s the problem?

The problem is information-processing. To make intelligent decisions, human beings need information. They can obtain that information in one of three ways: direct personal experience, listening and watching, or reading. The first two means, while often necessary, share one basic problem. They’re slow, and the information flow is very restricted. Even slow readers generally can process written information several times faster than auditory information, and they can always refer back to it. That’s one reason, often forgotten, why stable civilizations did not emerge until written languages developed. The invention of the printing press in Europe provided a tremendous informational advantage to western European civilization, which, until that time, had lagged behind the Asiatic cultures, particularly the Chinese. The Chinese culture effectively used an elaborate written pictograph-based character language to restrict social and political access to a comparatively minute fraction of the population. Once western cultures combined alphabet-based languages with widespread use of the printing press, the result was a tremendous information gap and the comparative decline of Chinese power and influence.

In its own fashion, an auditory-visual media culture limits and shapes information flow, first by selectively choosing what information to promulgate and second by tying that information to visual images. Now, immediately, someone will question this by pointing out the multiplicity of media outlets and the different media channels. There are hundreds of cable and satellite channels; there are thousands of blogs and web-sites. How can I claim this is limiting? First, on every single cable and satellite station, the information flow is effectively limited to less than one hundred words a minute. That’s the top rate at which most people can process auditory input, and most facts have to be put in words. Second, while the internet remains primarily text-based, the vast majority of internet users are limited to what might best be called “common access” — and that is extremely limited in factual content. If you don’t believe me, just search for Mozart or Einstein or anything. In most cases, you’ll find hundreds, if not thousands, of references, but… you’ll seldom find anything beyond a certain “depth.” Oh… I’m not saying it’s not there. If you’re a university student, or a professor using a university library computer, or if you want to pay hundreds or thousands of dollars in access fees, or if you live in a very affluent community with integrated library data-bases, you can find a great deal… but Joe or Josephine on the home computer can’t.
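The speed gap is simple to quantify. In the sketch below, the hundred-words-a-minute ceiling for auditory input is the figure given above; the reading rates are rough ballpark estimates supplied here for comparison, not figures from the text.

```python
# Time to absorb a fixed amount of text by listening versus reading.
# 100 words/minute for auditory input is the figure given in the text;
# the reading rates are rough ballpark estimates, added for comparison.

WORDS = 100_000    # roughly one substantial book

rates_wpm = {
    "listening (cable/satellite ceiling)": 100,
    "slow reader (assumed)":               200,
    "average reader (assumed)":            275,
}

for mode, wpm in rates_wpm.items():
    hours = WORDS / wpm / 60
    print(f"{mode:38s} {hours:5.1f} hours")

# listening: ~16.7 hours; slow reader: ~8.3; average reader: ~6.1
```

And unlike the broadcast, the printed page can be re-read at will, which is the other half of the advantage.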

In reality, the vast majority of internet users circulate and re-circulate a relatively small working data-base… and one which contains far less “real” information than a very small college library, if that.

Then add to that the fact that close to 60% of college graduates, according to a Department of Education study published last year, are at best marginally literate in dealing with and understanding information written at the level of a standard newspaper editorial.

These facts lead to the next question: Why does all this matter?

I’d submit that it matters because we live in the most highly technical age in human history, where no issue is simple, and where understanding and in-depth knowledge are the keys to our future… and possibly to whether we as a species have a future. Yet the proliferation of an auditory-visual media culture is effectively limiting the ability of people, particularly the younger generations, to obtain and process the amount of information necessary for good decision-making, replacing those necessary reading and researching skills with simplistic audio-visuals and an extremely limited common informational data-base. This makes for great profits for all the media outlets, but not necessarily for a well-informed citizenry.

Like it or not, there isn’t a substitute for reading widely and well, not if we wish what we have developed as western semi-representative governments to continue. Oh… some form of “civilization” will continue, but it’s far more likely to resemble a westernized version of the pre-printing-press Chinese society, with a comparatively small elite trained in true thought and mental information processing, while the media and communications types enable “sound-byte” politicians with simplistic slogans, all the while trumpeting freedom of expression and banking greater and greater profits.

Come to think of it… I wrote a story about that. It’s entitled “News Clips from the NYC Ruins.” If you didn’t read it in The Leading Edge, you can read it in my forthcoming story collection — Viewpoints Critical — out next March from Tor.