Archive for January, 2010

The [Computer] Age of Illusion

I love my computers… mostly. But computers aren’t exactly what they seem to be for most people, and their wide-spread proliferation and omnipresence have had both good effects and bad. One of the worst is that the “computer age” has generated a host of illusions that have had pervasively negative effects.

The first illusion is one I’ve mentioned before — the illusion of choice. The internet and the world-wide web — as well as satellite TV — offer an infinite array of choices, don’t they? No… not exactly. If, and it’s a big IF that often requires considerable expense, you have access to all the university libraries and research facilities through the web, there’s quite a bit to be found. The problem is that, first, most people don’t have universal access or anything close to it, and, second, even for those who do, the search systems are rudimentary, if not misleading, and often simply cannot find information that’s there. More to the point, for general users, the information resembles the “Top 40” in hundreds of different formats — the same two paragraphs of information in an infinite variety of presentations. The same is equally true of the software tools available. And it’s definitely true of all the varieties of TV entertainment. Yes, there’s great choice, and most of it’s in the packaging.

The second illusion is what I’d call the illusion of completeness. Students in particular, but also a huge percentage of those under thirty, have the illusion that all knowledge and information can be had through the internet. Just as an illustration, I did a search on Paul Bowles, the composer and writer, and came up with a theoretical 285,000 references, which boiled down to 480 discrete references, which further decreased to 450 after deleting the other “Paul Bowles.” Almost 20% of the references dealt with aspects of his most famous book — The Sheltering Sky. Something like 15% were different versions of the same standard biography. Three other books of his received about 10% each of the references. From what I could determine, more than ten percent of all entries were totally useless, and over 70 percent of the detailed references, which might provide unique information, were either for works for purchase or for articles or studies not available online. That’s not to say that such an internet search doesn’t provide a good starting point. It can, but, unfortunately, the internet is exactly where most students and others looking for information stop.

The third is the illusion of accomplishment. Americans, in particular, feel that they’re working harder than ever, and the statistics tend to support that feeling. But what does all that work accomplish? With all the emphasis on reports and accountability, businesses and institutions are generating more reports and data than ever before in history. With email and cellphones, the majority of North Americans and those in the industrialized world are “instantly” available. With this instant access, supervisors, customers, and governments all want answers “now,” and more and more time is spent responding rather than “doing,” because the need to respond to all those inquiries limits the time available to “do.”

The fourth is the illusion of the “quick and perfect solution.” In the world of the mass media, entertainment, and computers, problems are resolved in an hour or by the judicious application of software [and if you can’t make the software work, that’s your problem, not the software’s]. Combine this with the niche-fragmentation of society, and each group has its own “perfect solution,” if only those “other idiots” would understand why what we’ve laid out is THE answer.

The fifth illusion is that of “reality.” Both the entertainment industry and the computer wizards are working hard to make the unreal look as real as possible and to convince everyone that everything can be handled electronically, and that there’s little difference between “electronic reality” and dirt-under-your-fingernails reality. That bothers me, more than a little, because electronics don’t convey the blood, sweat, striving, and agony that fill all too many people’s lives, and that lack of visceral appeal leads to more and more decisions based on image and profit — exactly as exemplified by bankers who make million-dollar bonuses while police, firefighters, and teachers have their wages cut and/or frozen whenever the economy dips.

But then, of what use is reality when illusion is so satisfying?

The "Deserving" Dilemma

With a military escalation in Afghanistan estimated to cost one million dollars per soldier per year, a health care legislative package that could cost trillions, not to mention escalating costs in dozens of federal programs, both the Administration and the Congress are looking for ways to come up with more funding and/or reduce the cost of existing programs.

The easy target, of course, is to aim at the “undeserving rich,” such as Wall Street hedge fund managers and investment bankers, and tax them more heavily. And between the expiration of the Bush tax cuts and various administration and Congressional initiatives, it seems unlikely that the “rich” will escape greater taxation. The problem there, of course, as I’ve noted before, is that most of the “undeserving rich” will escape the majority of the proposed taxes, while the upper middle class will pay for most of it, because, for some strange reason, the political mindset is that anyone who makes over $250,000 is rich. Yet, in practice, most tax avoidance strategies don’t work for couples who make between $250,000 and $400,000. In fact, the more money you make, the better they work. So these tax increases will hit hardest on couples in high cost-of-living cities, such as New York and San Francisco, particularly those where both spouses work long hours and who, facing high mortgage and schooling costs for their offspring, would laugh bitterly at the idea that they’re rich. Do they really “deserve” to be the most heavily taxed segment of the population? Certainly, they’re anything but poor, but for the most part, the vast majority of their income is earned through long and hard work, and they’re not the ones with the mansions and the yachts and the fabulous vacations.

At the other end of the income spectrum are the “poor.” Like the “rich,” this catch-all definition includes all manner of people, except that the poor range from third-generation welfare recipients to hard-working families at the minimum-wage level, from drug addicts to disabled individuals who need support services either to work or just to live, from the able-bodied employable unemployed to the mentally disabled unemployable. And society has effectively said that all of them are deserving of governmental aid, in some form or another, if not in several forms.

But… exactly who is “deserving,” and of what are they deserving?

Both the Iraqi and Afghan people deserve better governments and lives, and we “deserve” not to be at the mercy of terrorists — but “deserving” or not, does spending all those resources and lives in the Middle East really make sense? On the domestic front, why should middle-class taxpayers subsidize families and single mothers who knowingly have child after child that they cannot support through their own efforts? Why should hedge fund managers who get multimillion-dollar bonuses for gaming securities in ways that continually threaten our prosperity pay lower effective tax rates than, say, primary care doctors or college professors? Why should taxpayers have to fund rehabilitation efforts for teenagers and adults who make bad choices and become addicts?

One answer, of course, that applies to the “poor” is the children. If society doesn’t maintain some living standards for the poor with children, the argument goes, then the cycle of poverty and violence is merely repeated generation after generation, and besides, the children “deserve better.” But with forty percent or more of the American population paying no income taxes at all, virtually all of them, despite the rhetoric, either working-class or poor, and something like 20% of the remainder paying 80% of the taxes, how long before the “needs” of the “deserving” — both internationally and domestically — overwhelm the American mid-middle class and upper-middle class?

Don’t they “deserve” some consideration as well?

I’m Sorry, But People Don’t Learn That Fast

A very recent review of Arms-Commander opined that the book was, as expected, essentially Recluce “comfort food.” I think it’s more than that, but as an author I can live with such a commentary. What really bothered me about the review was the opinion that Saryn’s opponents should have learned from others’ mistakes and adapted to her tactics.

The reason I’m bringing this up is that it’s far from the first time this criticism has been aired, regarding both my fantasy and my SF, and the critics don’t seem to have learned from their mistakes, either.

There are several points that the “critics” don’t seem to understand. First, there’s a vast difference between “receiving information” and “learning.” Learning requires not only assimilating the information, but responding to it and changing one’s actions and behavior. To learn from someone else’s mistakes, you have to know about them and understand why they were mistakes. In a low-tech society, word doesn’t travel fast, and sometimes it doesn’t travel at all. And when it does travel, you have to be able to trust the bearer of that news. You also have to have enough knowledge to be able to understand what went wrong. Now, in the case of Saryn, and Anna and Secca in the Spellsong Cycle, most of their opponents — and officers — who made the mistakes didn’t survive them. Even when a few did, they weren’t likely to be trusted, especially in Lornth, where most lord-holders don’t trust any other lord-holders.

Even if knowledge of such defeats reached others, the knowledge of how those defeats occurred didn’t. This isn’t unique to fiction. The conquests of Alexander the Great followed the same pattern. He had a new way of waging war, and yet almost no one seemed to adapt. Even with more modern communications, how many generals in World War I sent hundreds of thousands of men to their deaths, essentially in exercises in futility, seemingly unable to understand that the infantry charges of the past didn’t work against barbed wire, machine guns, and deep trenches? Even the high-tech U.S. armed forces took almost a decade to switch from conventional war techniques to wide-spread counter-guerilla tactics in Vietnam [and some critics contend we never did].

Even if you do understand what happened, to counter it you have to change the way you operate, usually the way you’ve trained your forces and your commanders and subcommanders. Even in the best of cases, this doesn’t happen quickly. I’ve been a teacher and a swimming coach, and my wife is a singer and voice teacher, and we both know it takes years to get most people to change long-held and incorrect techniques. It’s not something that happens overnight or even in weeks or seasons. And sometimes, after a certain age, people simply can’t change their way of dealing with matters.

Finally, even if you think you want to learn, if that learning requires letting go of long-held beliefs and biases, in many, many cases, it simply won’t happen. Instead, you’ll attribute the problem to other factors or ignore it totally because you hold those beliefs so dearly.

Yet the medieval-level holders and barons of Lornth, with no communications faster than horses, no understanding of what really happened, no trust in each other [which was what caused many of their problems to begin with], and no desire to change their tactics and way of life, should have understood what Saryn was doing, essentially before they even had word, and revolutionized everything they knew about warfare and fighting in less than a season?

The idea that any significant fraction of people, and particularly institutions, learn, adapt, and respond quickly is more fantastic than anything I’ve ever put in print.

The "Freedom" Naivete

A website [Pat’s Fantasy Hotlist] just posted a quote from Arms-Commander, in which a character notes that “it is better to be a just tyrant who provides freedom than a dead ruler who tried to be fair in an unfair world.” Almost immediately an anonymous commenter observed that “a tyrant, even a just one, can never provide freedom. It’s antithetical to the very nature of the word.”

I was instantly torn among the desires to laugh hysterically, to go postal, and to sigh in despair. After having spent a lifetime studying government and politics, not to mention nearly twenty years in Washington, D.C., in a number of federal positions in both the executive and legislative branches, and after two tours in the U.S. Navy, I think I have a fair understanding of how governments do and don’t work. So…

First… NO government provides “freedom” in the absolute sense. All governments restrict certain practices and behaviors in order to maintain order, because without order, people literally do not have the “freedom” to walk down the streets safely. Even with such restrictions, the order created may not be anywhere close to desirable — except when compared to the state of no effective government at all, as one can currently see in Somalia. The degree of restrictions and how much order is created varies from country to country and system to system, but restrictions on behavior for the safety of others are anything but “freedom” in the ideal sense.

Second, the original meaning of “tyrant” was a ruler who seized power outside of the previous legal system. In that sense, the founding fathers of the United States were tyrants. Now… they justified that rebellion and seizure after the fact by creating a system that was superior to what preceded it. We can call them authors of democracy, founding fathers, and the like, but a significant number were in fact rebelling aristocrats who sometimes behaved like the popular conception of tyrants, and who insisted on enshrining the legality of slaveholding, certainly a local, if not a regional, tyranny. Yet they provided more “freedom” than the previous system.

Likewise, “freedom” and even “liberty” have been evolving terms. In the United States, originally “liberty” was effectively limited to white males, and predominantly property owners. Slaves, women, and children had few legal rights. In practice, and in law, “freedom” is the granting of certain rights to certain classes of people, and the less restrictive the conditions circumscribing those rights are, the “freer” those people are judged to be.

Given those definitions, in practice there’s no inherent difference in moral status between a “tyrant” and a “legitimate” government as rulers. How each attained power may have a moral connotation, but the “morality” or “ethics” of their regimes depend on the acts and laws by which they rule and the results. Franco was a dictator and a tyrant of Spain, yet Spain today is a free and democratic nation that works, as a result of his “tyranny” and reforms. On the other hand, Salazar of Portugal operated a brutal secret police and impoverished that nation.

Tyrants aren’t, by definition, any more antithetical to freedom than any other class of ruler, because all rulers, democratic or otherwise, in order to maintain a civil society, restrict freedoms. Period.

Reader Reviews

“I couldn’t do it.” Those are my wife’s words every time I talk about reading through reader reviews of my books. Many authors won’t do it. I’m one who does, grudgingly, very grudgingly, because I’m still a reluctant optimist, and I believe that you can learn something from anything — even reader reviews.

Unfortunately, maybe those other authors are right, because I don’t much care for what I’m learning, and it doesn’t seem to be of much use, not if I want to keep trying to become a better and better writer. At first, I thought that I was imagining things, but then, because I do have a background in economics and analysis, I decided to apply some basic analysis — and I used The Magic of Recluce as the “baseline.” Why? Because it’s been in print continuously since 1991. It’s not a perfect baseline or template, because the reader reviews I used [Amazon’s] don’t begin until 1996, but it gives the longest time-line of any of my books. Over that fourteen-year period, almost 35% of readers gave the book a five-star rating; 25% gave it a four-star rating; 18% gave it three stars; 8% gave it two stars; and a little more than 15% gave it a one-star rating [and yes, that adds up to 101% because of rounding]. More interesting, however, was the timing of the ratings and the content of key words in those ratings.
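
For those curious about how ratings can total 101%, here is a minimal Python sketch of that kind of tally. The counts are purely hypothetical, chosen only to reproduce the rounded percentages cited above, not taken from the actual Amazon data.

```python
# Hypothetical star-rating counts (NOT the real Amazon figures), chosen so the
# rounded percentages come out to 35/25/18/8/15 -- which sum to 101%.
counts = {5: 139, 4: 100, 3: 71, 2: 31, 1: 59}
total = sum(counts.values())

rounded = {stars: round(100 * n / total) for stars, n in counts.items()}
for stars in sorted(rounded, reverse=True):
    print(f"{stars} stars: {rounded[stars]}%")

# Each percentage is rounded individually, so the whole-number total can
# overshoot 100 -- exactly the rounding effect noted in the text.
print("Sum of rounded percentages:", sum(rounded.values()))  # prints 101
```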

To begin with, the first two years or so of reviews, comprising roughly 20% of all ratings, were all four or five stars, and not until 1999, eight years after the book first came out, did it receive a one-star rating. Not just coincidentally, I suspect, that was also the first review that claimed the book was “boring.” More than half the one- and two-star reviews have been given during the last five years, and virtually all of the one-star reviews use terms such as “boring” or “slow.” From the wording of those reviews, I suspect, but cannot firmly prove, that most come from comparatively younger readers.

The fact that more and more readers want “faster” books doesn’t surprise me. Given the increasing speed of our culture, with its emphasis on “fast-action movies” and faster-action video games, it shouldn’t surprise anyone. What does bother me is the equation of “fast” with “good” and the total intolerance that virtually all of these reviews show for anything that takes thought and consideration. The fact that more than twice as many readers find the book good as find it wanting, and that a majority still rate it highly, indicates that there are many readers who still appreciate depth, but the change in the composition of readers, as reflected in the reviews, confirms, at least in my mind, that a growing percentage of fantasy readers want “faster” books. Again… no surprise, but the virulence and impatience expressed are disturbing, because they manifest an incredible sense of self-centeredness, with reader reviews that basically say, “This book is terrible because it didn’t entertain me in the way I wanted.” And terms like “Yech!”, “Yuck!”, “Such Junk?”, “its [sic] horrible”, and “total waste” certainly convey far more about the reader than about the book.

As an author, I understand all too well that not all authors are for all readers, and there are authors, some of whom are quite good, who are not to my taste. But there’s an unconscious arrogance that doesn’t bode well for the future of our society when fifteen percent of readers state that a book is terrible because it doesn’t cater to their wishes — or throw the book through a window because it doesn’t [yes, one reviewer claimed to have done so].

I’d say that they need to grow up… but I’m afraid that they already have, and that they’re fast approaching a majority, at least among the under-30 crowd. Two recent articles in other publications highlight the trend. The latest edition of The Atlantic Monthly has one explaining why newspaper articles are too long, and its answer amounts to a variation on the USA Today format — quick, juicy facts with little support or explanation. And what’s really frightening is the conclusion of an article in the “Week in Review” section of The New York Times last Sunday — that youngsters who are now 4-10 will make today’s young people seem like paragons of patience.

Newspeak, here we come.

Successors

On the Locus online site, there’s a discussion about who might be considered a worthy successor to the “grand old man” of science fiction — the late Robert A. Heinlein. A number of names are mentioned, and those contributing all give reasons for their selections. Something about this bothered me when the discussion was launched weeks ago, and, slow as I can sometimes be about the obvious, “it” — or several “its” — finally struck me.

What’s the point of the discussion? For all his accomplishments and faults, and he had both, Heinlein was unique to his time and place. Many of those involved in the discussion acknowledge this, but what isn’t brought up is that the same is true of most writers with any degree of accomplishment and originality.

Although few have noted it, Heinlein’s greatest claim to fame was that he combined originality, ideas that were usually less than jaded, and solid writing with popularity. According to one of the most senior editors in the F&SF field, the number of his individual titles that approached or exceeded million-seller status is “remarkable.” As a new biography to be published by Tor indicates, he was a complex man, with an equally complex and involved personal life.

So… why is anyone looking for his “successor”? Can’t the man be appreciated, or attacked, or analyzed, or whatever, for what he was? Has “sequel-itis” so permeated the critical F&SF community that some writer or writers must be jammed into a designated place?

Everywhere I look these days in entertainment — whether in cinema, music, books, or even games — there’s a tremendous pressure to fit. If an author or a musician does something different, there’s usually far more negative pressure and comment than positive. Much of that pressure is financial. I’ve noted on more than one occasion that any one of my “series” fantasies earns far more than one of my few critically acclaimed SF books — and this is not by any means exclusive to me.

In this light, even the discussion about successors to Heinlein nags at me, because I see it, perhaps unfairly, as another aspect of trying to come up with easy categorization in a field where such categorization is anything but easy and where labels create false expectation after false expectation. For example, it’s fair to say that a “Recluce” book should be a “Recluce” book, taking place in that world and adhering to the rules of that world, with a similar style, but is it fair for readers and marketers to insist that every book I write follow that style?

Certainly, that is the pressure. Some authors actually have a different pen name for each “style” of book they write, but what does that say about readers? Are so many so rigid in their habits and mindsets that they can’t look at anything different by the same author? Or have the marketing mavens conditioned them that way?

How about accepting/rejecting Heinlein for what he was, and doing the same for the writers who have followed him, instead of looking for quickly identified niches and tropes? When reviewers and critics who are supposedly analytical and thoughtful do this, it bothers me even more, frankly. They should know better, but, then, maybe I’m just expecting too much.

Or does looking at each writer and book for what they are require too much thinking and depress the bottom lines of the industry?

Weapons and Technology from the Gaming Industry?

The Economist reported that the U.S. Air Force has put in a request to procure 2,300 Sony PlayStation 3 (PS3) consoles — not for personnel entertainment, but to hook together to build a supercomputer for ten percent of the cost of ordering one. This isn’t a one-time fluke, either. The USAF has already built and is operating a computer constructed from 336 PS3s. U.S. troops are using slightly modified off-the-shelf electronics for everything from calculating firing trajectories to controlling drone RPVs.

While I’m perfectly happy as a taxpayer to see cost-effective procurement, examples such as these give me a very uneasy feeling… for a number of reasons. First is the obvious fact that anything that is commercial and open can eventually be cracked, hacked, snooped, and sabotaged. Granted, for some applications that’s unlikely or doesn’t matter, but controlling drones? Second, even the example of the USAF procurement gives the “bad guys” new ideas and capabilities. And third, somehow the thought of our supposedly high-tech military having to rely on the gaming industry for the latest technology — and it is beginning to do so, thanks in part to complicated and Byzantine U.S. military procurement regulations — suggests that there’s something a bit askew in our national priorities, especially when a Nigerian national paying with cash, carrying no luggage, traveling alone, and already on the terrorist watch list can get to the point of nearly detonating an incendiary device on an aircraft about to land in Detroit.

We don’t seem to be able to carry through on relatively routine security measures; we rely on gamers for high technology; we haven’t been able or willing to build a supersonic replacement for the Concorde; we’re behind the entire rest of the world in implementing high-speed ground/rail transport; and our most profitable industries are financial manipulation and litigation.

Now… it also turns out that some of these gaming devices provide essentially the guts of supercomputers… and that they have far-reaching medical implications.

For all this, I have to salute the gaming industry… but what exactly does that say about the state of American drive, initiative, and technology in every other area?

Are we so into video gaming that the rest of our high-tech industry needs to subsist on the fruits and scraps of electronic entertainment?

The Trouble with Numbers

We live in a world that becomes, day by day, increasingly complex because of its ever-advancing technology and its still rapidly increasing population. One of the most obvious effects of both is that we have come to live in a world defined by, restricted by, and described by numbers.

For example, for most people, the date today is January 1, 2010. That’s effectively an arbitrary denotation of the passage of time since the attributed date of birth of the founder of a major religious belief system. Research, however, suggests that the actual birth date is off by six or so years and that the time of year was later manipulated for theo-political reasons to coincide roughly with the winter solstice. Because much of the world has based its chronology on that reckoning — and dates and chronologies are important for many political, economic, and social reasons — societies in general have accepted the modified Gregorian calendar for practical reasons and have resisted major changes for exactly those reasons. But most people never consider the background or the implications, and those who do quickly move on to more pressing issues, and ones about which they can do something.

Unhappily, the same lack of understanding lies behind so many of the numbers we use in society today, and the numbers tend to become “reality,” with little understanding of what actually lies behind them — until something goes wrong, and the blame is assessed everywhere except where it belongs: on the failure to understand what the numbers really mean… or, in many cases, what they do not mean or represent.

For example, everyone takes “for granted” that if someone runs a temperature over 98.6 degrees Fahrenheit consistently for several hours, that person is sick. Not necessarily. In some cases, subnormal temperatures signal severe illnesses as well. Also, the 98.6-degree figure is an average across large populations. It doesn’t hold for everyone, as I well know, because my wife’s “normal” temperature is consistently a degree and a half below that average. What that means for her is that what would be a mild or moderate fever for someone else is a severe fever for her. Yet the failure to understand the difference between “normal” for her and for the population as a whole could make a considerable difference to her in the case of a severe infection.

I’ve made the point about numbers earlier with regard to the side effects of vaccinations. Because some parents do not understand statistics, they fear side effects that occur in one case in a million and avoid vaccinations for “childhood” diseases, even though the serious complications of the diseases themselves are often hundreds of times more prevalent than the side effects of vaccination.
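
To spell out that arithmetic with a minimal sketch: the one-in-a-million figure comes from the paragraph above, but the infection probability and complication rate below are purely hypothetical, illustrative assumptions, not real epidemiological data.

```python
# Illustrative risk comparison only -- the disease numbers are hypothetical
# assumptions, not real epidemiological data; the point is the ratio.
vaccine_side_effect_risk = 1 / 1_000_000   # "one in a million," per the text

# Hypothetical assumptions for an unvaccinated child:
infection_probability = 0.10    # assumed chance of catching the disease
complication_rate = 0.003       # assumed chance of serious harm if infected

disease_risk = infection_probability * complication_rate   # 3 in 10,000

print(f"Risk from vaccination:    {vaccine_side_effect_risk:.7f}")
print(f"Risk without vaccination: {disease_risk:.7f}")
print(f"The disease risk here is roughly {disease_risk / vaccine_side_effect_risk:.0f} times higher")
```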

Failure to understand what the economic numbers meant in the several years before the last financial meltdown contributed mightily to the disaster. No matter what any “guru” preaches, you cannot have massive, even world-wide, run-ups in securities and real estate prices when real overall economic growth is slow or moderate — not without generating a “bubble” and a subsequent collapse.

Nor can every company realistically aim at 10-40% annual profit targets, and when large numbers of companies are posting such profits at a time when nominal inflation is low… something is wrong, either with the way those profits are calculated or with the way inflation is measured… or with the reporting of other data… or with the business practices of the companies involved.

Likewise, when more than forty percent of the grades given at universities in the United States are “As,” anyone with a modicum of understanding should realize the implications behind those numbers. In three generations, human beings don’t change from 10-15% of the collegiate population being brilliant to 40% plus being brilliant, especially when far larger numbers of less advantaged students are attending college. What it does mean, among other things, is that pursuit of “the almighty grade” has become as rampant as the pursuit of “the almighty dollar,” and that excellence in both academia and business has become secondary to numerical targets of dubious worth in assessing performance.

When “reader reviews” flood Amazon.com, what do they mean? Do they really judge excellence? While some may be accurate in that regard, in practice what those numbers reflect is popularity, not quality. There’s nothing wrong with that… so long as people understand it, but unfortunately, many don’t. More than a few readers have contacted me in surprise after reading one of my “less popular” SF novels to say that they thought the book was far better than the reader reviews suggested. That shouldn’t really be surprising. Often excellent books do not make a quick and easy read, and for some readers, who seek ease of escape and entertainment, an excellent book may not be a good read. That doesn’t mean the book is “bad,” only that it’s not suited to them, but handing out “stars” for popularity doesn’t reflect quality. In fact, one reader made the point that he looks for “bad” ratings among authors he knows are good writers to find the excellent books.

The same problem exists with the travesty of “student evaluations.” I’m sorry, but 18-20-year-old students do not know what they need to learn. Studies have shown that high student evaluations correlate directly with high grades given by the professor. There are always exceptions, but across thousands of professors that observation holds true. Thus, the numbers in student evaluations do not reflect the quality of teaching, but the degree of grade inflation. Yet university administrations routinely use these evaluations as a proxy for good teaching. What their use reflects is not excellence, but the need for “popular” teachers to fill classrooms, regardless of how well those teachers actually teach.

I could go on and on, but my opening thought for another numbered year is this: with more and more numbers flooding us, day after day… try, please try, to understand what they really mean and not what everyone else tells you they mean.