The [Computer] Age of Illusion

I love my computers… mostly. But computers aren’t exactly what they seem to be for most people, and their widespread proliferation and omnipresence have had good effects and bad. One of the worst is that the “computer age” has generated a host of illusions that have, in general, had pervasively negative effects.

The first illusion is one I’ve mentioned before — the illusion of choice. The internet and the world-wide web — as well as satellite TV — offer an infinite array of choices, don’t they? No… not exactly. If, and it’s a big IF that often requires considerable expense, you have access to all the university libraries and research facilities through the web, there’s quite a bit to be found. The problem is that, first, most people don’t have universal access or anything close to it, and, second, even for those who do, the search systems are rudimentary, if not misleading, and often simply cannot find information that’s there. More to the point, for general users, the information resembles the “Top 40” in hundreds of different formats — the same two paragraphs of information in an infinite variety of presentations. The same is true of the software tools available. And it’s definitely true of all the varieties of TV entertainment. Yes, there’s great choice, and most of it’s in the packaging.

The second illusion is what I’d call the illusion of completeness. Students in particular, but also a huge percentage of those under thirty, have the illusion that all knowledge and information can be had through the internet. Just as an illustration, I did a search on Paul Bowles, the composer and writer, and came up with a theoretical 285,000 references, which boiled down to 480 discrete references, which further decreased to 450 after deleting the other “Paul Bowles.” Almost 20% of the references dealt with aspects of his most famous book — The Sheltering Sky. Something like 15% were different versions of the same standard biography. Three other books of his each received about 10% of the references. From what I could determine, more than ten percent of all entries were totally useless, and over 70 percent of the detailed references, which might provide unique information, were either about works for purchase or articles or studies not available online. That’s not to say that such an internet search doesn’t provide a good starting point. It can, but, unfortunately, the internet is exactly where most students and others looking for information stop.

The third is the illusion of accomplishment. Americans, in particular, feel that they’re working harder than ever, and the statistics tend to support that feeling. But what did all that work accomplish? With all the emphasis on reports and accountability, businesses and institutions are generating more reports and data than ever before in history. With email and cellphones, the majority of North Americans and those in the industrialized world are “instantly” available. With this instant access, supervisors, customers, and governments all want answers “now,” so more and more time is spent responding rather than “doing,” leaving ever less time available to “do.”

The fourth is the illusion of the “quick and perfect solution.” In the world of the mass media, entertainment, and computers, problems are resolved in an hour or by the judicious application of software [and if you can’t make the software work, that’s your problem, not the software’s]. Combine this with the niche-fragmentation of society, and each group has its own “perfect solution,” if only those “other idiots” would understand why what we’ve laid out is THE answer.

The fifth illusion is that of “reality.” Both the entertainment industry and the computer wizards are working hard to make the unreal look as real as possible and to convince everyone that everything can be handled electronically, and that there’s little difference between “electronic reality” and dirt-under-your-fingernails reality. That bothers me, more than a little, because electronics don’t convey the blood, sweat, striving, and agony that fill all too many people’s lives, and that lack of visceral appeal leads to more and more decisions based on image and profit — exactly exemplified in bankers who make million-dollar bonuses while police, firefighters, and teachers have their wages cut and/or frozen whenever the economy dips.

But then, of what use is reality when illusion is so satisfying?

The “Deserving” Dilemma

With a military escalation in Afghanistan estimated to cost one million dollars per soldier per year, a health care legislative package that could cost trillions, not to mention escalating costs in dozens of federal programs, both the Administration and the Congress are looking for ways to come up with more funding and/or reduce the cost of existing programs.

The easy target, of course, is the “undeserving rich,” such as Wall Street hedge fund managers and investment bankers, who can be taxed more heavily. And between the expiration of the Bush tax cuts and various administration and Congressional initiatives, it seems unlikely that the “rich” will escape greater taxation. The problem there, of course, as I’ve noted before, is that most of the “undeserving rich” will escape the majority of the proposed taxes, while the upper middle class will pay for most of it, because, for some strange reason, the political mindset is that anyone who makes over $250,000 is rich. Yet, in practice, most tax avoidance strategies don’t work for couples who make between $250,000 and $400,000. In fact, the more money you make above that range, the better they work. So these tax increases will fall hardest on couples in high cost-of-living cities, such as New York and San Francisco, particularly those where both spouses work long hours and who, facing high mortgage and schooling costs for their offspring, would laugh bitterly at the idea that they’re rich. Do they really “deserve” to be the most heavily taxed segment of the population? Certainly, they’re anything but poor, but the vast majority of their income is earned through long and hard work, and they’re not the ones with the mansions and the yachts and the fabulous vacations.

At the other end of the income spectrum are the “poor.” Like the “rich,” this catch-all definition includes all manner of people, except that the poor range from third-generation welfare recipients to hard-working minimum-wage families, from drug addicts to disabled individuals who need support services, either to work or just to live, from the able-bodied employable unemployed to the mentally disabled unemployable. And society has effectively said that all of them are deserving of governmental aid, in some form or another, if not in several forms.

But… exactly who is “deserving,” and of what are they deserving?

Both the Iraqi and Afghan people deserve better governments and lives, and we “deserve” not to be at the mercy of terrorists — but “deserving” or not, does spending all those resources and lives in the Middle East really make sense? On the domestic front, why should middle-class taxpayers subsidize families and single mothers who knowingly have child after child that they cannot support through their own efforts? Why should hedge fund managers who get multimillion-dollar bonuses for gaming securities in ways that continually threaten our prosperity pay lower effective tax rates than, say, primary care doctors or college professors? Why should taxpayers have to fund rehabilitation efforts for teenagers and adults who make bad choices and become addicts?

One answer, of course, that applies to the “poor” is the children. If society doesn’t maintain some living standards for the poor with children, the argument goes, then the cycle of poverty and violence is merely repeated generation after generation, and besides, the children “deserve better.” But with forty percent or more of the American population paying no income taxes at all, virtually all of them, despite the rhetoric, either working-class or poor, and something like 20% of the remainder paying 80% of the taxes, how long before the “needs” of the “deserving” — both internationally and domestically — overwhelm the American mid-middle class and upper-middle class?

Don’t they “deserve” some consideration as well?

I’m Sorry, But People Don’t Learn That Fast

A very recent review of Arms-Commander opined that the book was, as expected, essentially Recluce “comfort food.” I think it’s more than that, but as an author I can live with such a commentary. What really bothered me about the review was the opinion that Saryn’s opponents should have learned from others’ mistakes and adapted to her tactics.

The reason I’m bringing this up is that it’s far from the first time this criticism has been aired, both regarding my fantasy and my SF, and the critics don’t seem to have learned from their mistakes, either.

There are several points that the “critics” don’t seem to understand. First, there’s a vast difference between “receiving information” and “learning.” Learning requires not only assimilating the information, but responding to it and changing one’s actions and behavior. To learn from someone else’s mistakes, you have to know about them and understand why they were mistakes. In a low-tech society, word doesn’t travel fast, and sometimes it doesn’t travel at all. And when it does travel, you have to be able to trust the bearer of that news. You also have to have enough knowledge to be able to understand what went wrong. Now, in the case of Saryn, and Anna and Secca in the Spellsong Cycle, most of their opponents — and officers — who made the mistakes didn’t survive them. Even when a few did, they weren’t likely to be trusted, especially in Lornth, where most lord-holders don’t trust any other lord-holders.

Even if knowledge of such defeats reached others, the knowledge of how those defeats occurred didn’t. This isn’t unique to fiction. The conquests of Alexander the Great tend to follow the same pattern. He had a new way of waging war, and yet almost no one seemed to adapt. Even with more modern communications, how many generals in World War I sent hundreds of thousands of men to their deaths, essentially in exercises in futility, seemingly unable to understand that the infantry charges of the past didn’t work against barbed wire, machine guns, and deep trenches? Even the high-tech U.S. armed forces took almost a decade to switch from conventional war techniques to widespread counter-guerrilla tactics in Vietnam [and some critics contend we never did].

Even if you do understand what happened, to counter it you have to change the way you operate, usually the way you’ve trained your forces and your commanders and subcommanders. Even in the best of cases, this doesn’t happen quickly. I’ve been a teacher and a swimming coach, and my wife is a singer and voice teacher, and we both know it takes years to get most people to change long-held and incorrect techniques. It’s not something that happens overnight or even in weeks or seasons. And sometimes, after a certain age, people simply can’t change their way of dealing with matters.

Finally, even if you think you want to learn, if that learning requires letting go of long-held beliefs and biases, in many, many cases, it simply won’t happen. Instead, you’ll attribute the problem to other factors or ignore it totally because you hold those beliefs so dearly.

Yet the medieval-level holders and barons of Lornth, with no communications faster than horses, no understanding of what really happened, no trust in each other [which was what caused many of their problems to begin with], and no desire to change their tactics and way of life, should have understood what Saryn was doing, essentially before they even had word, and revolutionized everything they knew about warfare and fighting in less than a season?

The idea that any significant fraction of people, and particularly institutions, learn, adapt, and respond quickly is more fantastic than anything I’ve ever put in print.

The “Freedom” Naïveté

A website [Pat’s Fantasy Hotlist] just posted a quote from Arms-Commander, in which a character notes that “it is better to be a just tyrant who provides freedom than a dead ruler who tried to be fair in an unfair world.” Almost immediately an anonymous commenter observed that “a tyrant, even a just one, can never provide freedom. It’s antithetical to the very nature of the word.”

I was instantly torn between the desire to laugh hysterically, to go postal, or to sigh in despair. After having spent a lifetime studying government and politics, not to mention nearly twenty years in Washington, D.C., in a number of federal positions in both the executive and legislative branches, and after two tours in the U.S. Navy, I think I have a fair understanding of how governments do and don’t work. So…

First… NO government provides “freedom” in the absolute sense. All governments restrict certain practices and behaviors in order to maintain order, because without order, people literally do not have the “freedom” to walk down the streets safely. Even with such restrictions, the order created may not be anywhere close to desirable — except when compared to the state of no effective government at all, as one can currently see in Somalia. The degree of restrictions and how much order is created varies from country to country and system to system, but restrictions on behavior for the safety of others are anything but “freedom” in the ideal sense.

Second, the original meaning of “tyrant” was a ruler who seized power outside of the previous legal system. In that sense, the founding fathers of the United States were tyrants. Now… they justified that rebellion and seizure after the fact by creating a system that was superior to what preceded it. We can call them authors of democracy, founding fathers, and the like, but a significant number were, in fact, rebelling aristocrats who sometimes behaved like the popular conception of tyrants, and who insisted on enshrining the legality of slaveholding, certainly a local, if not a regional, tyranny. Yet they provided more “freedom” than the previous system.

Likewise, “freedom” and even “liberty” have been evolving terms. In the United States, originally “liberty” was effectively limited to white males, and predominantly property owners. Slaves, women, and children had few legal rights. In practice, and in law, “freedom” is the granting of certain rights to certain classes of people, and the less restrictive the conditions circumscribing those rights are, the “freer” those people are judged to be.

Given all that, in practice there’s no difference in the moral status of a “tyrant” and that of a “legitimate” government as rulers. How each attained power may have a moral connotation, but the “morality” or “ethics” of their regimes depends on the acts and laws by which they rule and on the results. Franco was a dictator and a tyrant of Spain, yet Spain today is a free and democratic nation that works, as a result of his “tyranny” and reforms. On the other hand, Salazar of Portugal operated a brutal secret police and impoverished that nation.

Tyrants aren’t, by definition, any more antithetical to freedom than any other class of ruler, because all rulers, democratic or otherwise, in order to maintain a civil society, restrict freedoms. Period.

Reader Reviews

“I couldn’t do it.” Those are my wife’s words every time I talk about reading through reader reviews of my books. Many authors won’t do it. I’m one who does, grudgingly, very grudgingly, because I’m still a reluctant optimist, and I believe that you can learn something from anything — even reader reviews.

Unfortunately, maybe those other authors are right, because I don’t much care for what I’m learning, and it doesn’t seem to be of much use, not if I want to keep trying to become a better and better writer. At first, I thought that I was imagining things, but then, because I do have a background in economics and analysis, I decided to apply some basic analysis — and I used The Magic of Recluce as the “baseline.” Why? Because it’s been in print continuously since 1991. It’s not a perfect baseline or template, because the reader reviews I used [Amazon’s] don’t begin until 1996, but it gives the longest time line of any of my books. Over that fourteen-year period, almost 35% of readers gave the book a five star rating; 25% gave it a four star rating; 18% gave it three stars; 8% gave it two stars; and a little more than 15% gave it a one star rating [and yes, that adds up to 101% because of rounding]. More interesting, however, were the timing of the ratings and the key words they contained.
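For anyone puzzled by percentages that sum to 101%, the rounding effect is easy to demonstrate. This is a minimal sketch using hypothetical rating counts, not the actual Amazon data, chosen only so the rounded shares roughly match those cited:

```python
# Hypothetical counts for 500 reviews (illustrative, not real data).
counts = {5: 173, 4: 124, 3: 89, 2: 39, 1: 75}
total = sum(counts.values())

# Exact percentage shares sum to 100 by construction...
raw = {stars: 100 * c / total for stars, c in counts.items()}

# ...but rounding each share to a whole number can push the sum past 100.
rounded = {stars: round(raw[stars]) for stars in counts}

print(round(sum(raw.values()), 6))  # 100.0
print(sum(rounded.values()))        # 101
```

Each share loses or gains a fraction of a point when rounded; when the gains outweigh the losses, the rounded total overshoots 100%, which is all the bracketed caveat in the post is saying.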

To begin with, for the first two years or so, comprising roughly 20% of all ratings, every rating was either four or five stars, and not until 1999, eight years after the book first came out, did it receive a one star rating. Not just coincidentally, I suspect, that was the first review to claim the book was “boring.” More than half the one and two star reviews have been given during the last five years, and virtually all of the one star reviews use terms such as “boring” or “slow.” From the wording of those reviews, I suspect, but cannot firmly prove, that most come from comparatively younger readers.

The fact that more and more readers want “faster” books doesn’t surprise me. Given the increasing speed of our culture, and the emphasis on “fast-action” movies and faster-action video games, it shouldn’t surprise anyone. What does bother me is the equation of “fast” with “good” and the total intolerance that virtually all of these reviews show for anything that takes thought and consideration. The fact that more than twice as many readers find the book good as find it bad, and that a majority still do, indicates that there are many readers who still appreciate depth, but the change in the composition of readers, as reflected in the reviews, confirms, at least in my mind, that a growing percentage of fantasy readers want “faster” books. Again… no surprise, but the virulence and impatience expressed is disturbing, because it manifests an incredible self-centeredness, with reader reviews that basically say, “This book is terrible because it didn’t entertain me in the way I wanted.” And terms like “Yech!”, “Yuck!”, “Such Junk?”, “its [sic] horrible”, and “total waste” certainly convey far more about the reader than about the book.

As an author, I understand all too well that not all authors are for all readers, and there are authors, some of whom are quite good, who are not to my taste. But there’s an unconscious arrogance that doesn’t bode well for the future of our society when fifteen percent of readers state that a book is terrible because it doesn’t cater to their wishes — or even throw the book through a window because it doesn’t [yes, one reviewer claimed to have done so].

I’d say that they need to grow up… but I’m afraid that they already have, and that they’re fast approaching a majority, at least among the under-30 crowd. Two recent articles in other publications highlight the trend. The latest edition of The Atlantic Monthly has an article explaining why newspaper articles are too long; it basically offers what amounts to a variation on the USA Today format as an answer — quick, juicy facts with little support or explanation. And what’s really frightening is the conclusion of an article in the “Week in Review” section of The New York Times last Sunday — that youngsters who are now 4-10 will make today’s young people seem like paragons of patience.

Newspeak, here we come.