Archive for December, 2007

Gimmick or Tool?

I recently read a reader’s review of one of my books that complained that I’d used the same “device” in several Recluce books — a use of order/chaos and drugs that suppressed memories. Earlier, other readers complained that surely, in a high-tech future, there would be more fantastic weapons than space torps. These “reviewers” then concluded, on this basis, that the books were repetitive.

My first reaction was, “Come off it, idiots!” My second was, “Why do you bother reading when you obviously don’t understand much about human nature and culture… and clearly don’t want to?” My third reaction was to write this blog to attempt to clarify something that has come up more than a few times, not only in regard to my writing, but in regard to the work of more than a few other writers.

Let’s start out with one basic point that I’ve discussed before, and that Heinlein pointed out in print more than 35 years ago. There are no new plots. There are only differing ways of addressing the eternal basic plots.

The second point is that human beings use tools. We develop them; we use them; we keep using them so long as they work. Hammers have been in existence for as long as we have historical evidence, and for at least some 50,000 years, if not longer. They meet a need, and they aren’t going away.

Now… how does this apply to F&SF? It’s so simple that I’m almost embarrassed to put it in print, but it’s also so simple and basic that more than a few readers obviously haven’t thought about it. When a writer creates a fantasy world and its subcultures, assuming that these cultures are populated by beings with human or humanlike characteristics, these beings will use tools, techniques, and the like for replicable results. They will continue to use them so long as they work, or until they are supplanted by something else which they find better. That means that they will hone and use the “magic talents” that they possess that are useful. They will not throw them away or forget about them unless they are not useful. Thus, fantasy series that are true to societal nature will in fact — and should — present various techniques and tools used over and over again by those who can.

Likewise, these tools — whatever they may be in whatever books by whatever authors — will always be used in furtherance of human motives along one or more of the basic plots in human literature.

New gimmicks merely for the sake of introducing new gimmickry to avoid reader “boredom” are not only fraudulent, but bad writing. They may provide momentary excitement, like a sugar high, or other highs, but there’s not much behind them. And like those addicted to other highs, readers who continually desire new gadgets, gimmicks, and twists can seldom fully appreciate much beyond them.

Now… those who desire the continually “new” will and do argue against writing too many books in a given fantasy universe, but I consider that about as valid as saying writers should stop writing mainstream fiction because people use weapons to get their way in all cultures or because bribery is endemic, or asking why people all travel by one of the limited means of transport in a given culture.

By the same token, hewing to the “traditional” for the sake of the traditional and because the unfamiliar is unacceptable is just as much a fault. Neither new for the sake of new nor tradition for the sake of tradition makes for good writing.

Certain Blessings

At least in Western European cultures, we have entered the holidays, and much has been written about how the time has changed from a period of spiritual rejoicing to one of unbridled materialism, if a materialism leavened by those who still endeavor to do good and by that small minority who always do their best, regardless of season.

In that mixed light, I’d like to reflect on speculative fiction. Although I can scarcely claim to be impartial, given my occupation, I do believe that speculative fiction, certainly at its best, and even at its worst, does convey some blessings upon this troubled world, and, if more people read it, would convey even greater blessings. Am I saying I like all that’s printed in the field? Heaven forbid. I’m not certain I even like or agree with the majority of it. But what speculative fiction does that no other form of literature or entertainment [for the most part] does is speculate on cultures, ideas, likes, dislikes, prejudices, technologies, governments, sexuality and its variations, and much, much more. By doing so, the field offers readers the chance to think about things before they happen. Admittedly, most of what appears in print won’t happen, and much of it couldn’t happen, for various reasons. But that doesn’t matter. What does matter is that the ideas and the reactions and actions of characters to those ideas and places and events give readers not only an intellectual view of them, but a view with emotional overtones.

The emotional overtones are especially important because, for most people, an idea or a possibility has no sense of reality without an emotional component involving a feeling of how it impacts people. What speculative fiction does at its best is to involve readers with new ideas and settings in a context that evokes a range of feelings.

So often, when people or nations are confronted with a perceived danger, fear reigns, and thoughtful consideration is overwhelmed, if not submerged. And unscrupulous leaders and demagogues prey on that fear to enhance their own power and prestige. The most deadly fear is fear of the unknown. Speculative fiction explores the unknown, and the more people who read it and understand it, the smaller that sphere of the unknown becomes, and the less prone to political manipulation those readers become. To some degree, this is true of all fiction, but it is more true of speculative fiction.

And that is, I believe, one of the blessings the genre conveys, and one of which we who write it should always be mindful.

Truths and Untruths

The other day, as I was driving from one errand to another, I was listening to an NPR radio talk show where two independent budget analysts were discussing the federal budget and taking listener calls. One caller wanted to know why Congress didn’t stop all that wasteful foreign aid and use it to deal with the Social Security and Medicare problems. When the analysts both tried to point out that foreign aid is less than one percent of federal outlays [and they were absolutely correct], the caller insisted that they were wrong and that the government was giving foreigners money from other accounts hand over fist. Now, I spent nearly twenty years in and around the federal government, and I left Washington, D.C., some eighteen years ago. I started out as a legislative and economic analyst for a congressman, and I heard the same arguments and complaints about all that wasted foreign aid back then. Those arguments were numerically and statistically wrong in the 1960s and 1970s… and they’re wrong today.

Polls reveal that Americans believe that as much as ten to fifteen percent of federal spending goes to foreign aid, if not more. We’re talking about almost forty years of people believing in this total untruth. Why?

Despite the war in Iraq, the consistent trend in federal spending since WWII has been to spend a smaller and smaller percentage of the federal budget on defense [and foreign aid] and more and more on various domestic programs… and a majority of the American people still don’t know this, or the fact that domestic programs comprise roughly 75% of federal spending and defense spending just over 20%.

Various groups of people, of varying sizes, believe in other “facts” that are not in fact true, including, but not limited to, the beliefs that the moon landings were a hoax, that the United States is a democracy [for those interested, it’s technically a form of representative federal republic], that Social Security taxes are invested, that the line you’re not standing in always moves faster, that North America was a barely inhabited wilderness at the time of Columbus, and that the world was created in 4004 B.C. [or thereabouts]… or [pick your own example].

Moreover, if you ever attempt to explain, rationally or otherwise, why such “facts” are not so to those who deeply believe in them, you risk indignation, anger, or even great bodily harm.

And many well-meaning souls will say in defense of those believers, “Everyone is entitled to his or her own beliefs.”

To what degree? Is a man who “believes” that the federal income tax is unconstitutional free not to pay his taxes? Does he deserve the same benefits as do other citizens? Is the soldier who enlists free to refuse to fight in a war he or she doesn’t believe in?

On another level, what happens to public policy making and politicians when large groups of their constituents believe in such facts and demand more domestic programs and lower taxes because they “believe” that there’s enough in the budget for those programs so long as foreign aid and waste are eliminated? Or when one group believes that abortion is murder and starts murdering doctors who practice it, and another group believes it’s a woman’s right to control her own body and starts attacking, financially, verbally, and otherwise, politicians who insist on opposing abortion at all costs?

Just what is a “truth,” and how far can one go ethically in supporting it? And what does society do when that “truth” is an untruth? Or when large segments of the population believe in opposing “truths” and are willing to go to great lengths in support of their particular truth, as is the case in Iraq and other nations around the world, and as appears to be a growing trend in the United States?

Who’s Really in Charge?

In an earlier blog post, I intimated that at least some of those who espouse feminism in politics or science fiction were not so much interested in changing the structure of society as changing which sex had the socially dominant position. This leads to a related question: In any society, who’s actually in control?

Despite all the political scholars, the media talking heads who pontificate on the subject, the professional politicians, and the academics on both the left and the right and elsewhere, all of whom claim something along the lines of “Whoever it is that’s in charge, things would be better if we were,” the answer is far from that simple.

Today, most polls suggest that the war in Iraq is unpopular with more than half the U.S. population. Yet we live in what is technically termed a representative democratic republic, and those representatives seem unwilling or unable to bring the war to a halt. Less than a third of the population approves of either the President or the Congress, and yet both the President and the members of Congress have been elected democratically, albeit by an actual minority of qualified electors.

Those merely slightly less cynical than I would claim that “apathy” is really in charge, but I can only find it chilling that with each expansion of the electorate two trends have continued, if not accelerated. The first is that the intelligence of the average member of Congress has increased dramatically while the quality of decision-making has deteriorated equally dramatically. The second is that the numbers and scope of pork-barrel, earmarked, federally-funded projects have skyrocketed.

Could it just possibly be that the expansion of the electorate might just have resulted in a political system where ever-brighter politicians use increasingly sophisticated technology and techniques to pander to the wishes of a majority of their constituents, regardless of the long-term consequences or the overarching national considerations?

Could it be that the majority of those voting are actually in charge? How could that be? Surely, the astute citizens of our great land would not continue to vote into office politicians whose principal interest in maintaining position and office translates into an ever-increasing drive to funnel federal bacon into their states and districts, to the detriment of larger national interests. Surely, the desire to do right could not degenerate into merely doing whatever is necessary to perpetuate one’s self in office… could it?

Thoughts on “Good” Writing

After more than thirty years as a published professional author, I’ve seen more than a few statements, essays, comments, remarks, and unprintable quotations about writers and writing, and, as I noted in an earlier blog, I’ve seen the proliferation of lists of “bests.”

Just recently, Brian Aldiss published an essay in the Times of London that pointed out how neglected and overlooked so many good speculative fiction writers happen to be.

But… is what constitutes “good writing” merely a subjective judgment?

At the risk of alienating almost everyone who writes and who reads, I’ll go out on a limb and say that I don’t think so. I firmly believe that there are certain basics to good writing that could be measured objectively, if we had the tools, which, as of yet, we do not. But since those tools have yet to make an appearance, I’ll merely offer some subjective and scattered observations.

Some aspects of writing can already be measured objectively, such as basic grammar. When subjects and verbs do not agree, the writing is bad. When punctuation is lacking, the writing is certainly suspect. When six different readers come up with six totally disparate meanings for a passage, the writer’s skill is most probably lacking.

Beyond such basics, however, writers, English professors, reviewers, and editors can argue vociferously. Some believe that style is paramount, and that beautiful sentences, impeccably crafted, with each word sparkling like a gem in its own precisely placed setting, are the mark of good writing. Certainly, well-crafted sentences are indeed the mark of a good writer, but when the sentences take over from the meaning, the emotional connotations and overtones, and the plot, those beautiful sentences become purple prose, no matter how well-crafted.

Still others advocate the stripped-down Hemingwayesque style of short, direct, and punchy sentences and actions. My personal feeling, which I’ve discovered is shared by very few, is that in the best writing neither the reader nor the reviewer notices the writer’s style and sentences, because story and style become one. Put another way, the style becomes transparent in allowing the reader to fully experience the story. When the way in which a story is told is noticed more than the story itself, the writing is not as good as it could or should be.

Others cite originality in plot and the need for every book by an author to have a different plot. This particular fixation seems far more prevalent in F&SF; certainly mystery and romance readers don’t seem to mind the same basic plot time after time, and more than a few “great” writers have used a limited number of basic plots. In fact, Heinlein noted that there were only three basic plots.

Even today, there are editors who believe that any novel written in any tense or person other than third-person past tense cannot possibly reach the highest level of literary and artistic perfection. Unlike them, I believe that the choice of tense and person should be dictated by the story itself and represents an integral part of the novel or story, and that the default third-person past tense is only a general guideline and certainly not part of a set of objective criteria for excellence in writing.

Endings clearly vary from genre to genre. Certainly, very few “great” mainstream novels have happy or up-beat endings, while very few fantasy novels have endings leaving the main characters as miserable — or as dead or dysfunctional, if not both — as do those mainstream novels. The implication from the “literary” critics seems to be that a novel cannot be good or considered great unless it leaves the reader lower than a snake’s belly, while the fantasy critics tend to believe that a book cannot be good unless the supply of nifty magic “stuff” is endlessly innovative and unless the hero or heroine suffers and triumphs over hardships and difficulties so massive and entrenched that the efforts of entire societies had theretofore proved insufficient to surmount them. [And I confess that, once or twice, I have succumbed to this weakness, and I do hope that I will possess the fortitude to resist the temptation to go forth and do the same in the future.]

The human condition, in general, tends toward optimism in a world whose behavior tends to reinforce the reality of pessimism. For that reason alone, my personal feeling is that “good” writing should encourage and represent realistic hope.

The Instant Society… and the Rise of Stress and the Decline of Forethought

Final examinations are nearing at Southern Utah University, and student stress is building to incredible levels, as it does near the end of every semester these days.

Every day, my wife, who is a full professor at S.U.U., is deluged by students who are “so stressed” that they’re having trouble coping. They have great trouble dealing with the term papers, the projects, the juries, the performances, and the examinations that all come due in the last week of the semester. Now… such requirements aren’t exactly new. They’ve been a part of collegiate curricula literally for generations, and my wife and other professors continually warn students not to procrastinate and attempt to get them to think ahead. But very few of them do, and this generation seems to have far more difficulty in dealing with the situation than any previous generation. Yet the world that awaits them beyond school is filled with deadlines and pressures, and eliminating or reducing such pressures from college, as some institutions are apparently attempting to do, hardly seems a good way to prepare students for “real” life.

Why? Is it just that they’re more verbal about the pressures? No… I don’t think so. There are too many other indications that they actually do feel stressed out. But why? Why should these college students be so stressed? They have the highest standard of living of any group of students in history and the most opportunities. When I was their age, the country was in turmoil, there were riots about the Vietnam War, and a goodly percentage of young men faced the draft or military service in the branch of their “choice” before the draft claimed them for the Army. When my parents were students, it was the middle of the Great Depression, Germany was turning to Nazism, and World War II loomed. When their parents were students, the era of the Robber Barons was in full swing, and the nation was heading into World War I.

The vast majority of problems faced by today’s students are internal, arising out of their own chosen life-style and habit patterns. Yes, there is a drug problem, but they don’t have to use or abuse; that’s a matter of choice. Even war, for them, is a matter of choice, given that we have all-volunteer armed services. HIV, AIDS… those too are essentially a matter of choice, except in very rare cases. Whether one gets into the “right” university or graduate school is not a matter of survival, unlike being conscripted for WWI, WWII, Korea, and Vietnam. And while the “right” school may confer greater opportunities, those opportunities don’t come down to actual survival, but to a higher level of income and prosperity.

Yet “stress” and college counselors abound, and most students seem to complain about being “stressed out.”

I’d submit that this wide-spread epidemic of stress is the result of our “instant society.” Back before the age of computers, doing something like a term paper required a certain amount of forethought. Papers, strangely enough, were far longer then, and required more research, with extensive footnotes and bibliographies. Typing them required more time, and anything more than punctuation revisions could not be made without retyping the entire page. Tables had to be carefully measured and hand-typed. Graphs were hand-drawn. What can be done in minutes today on a computer took hours and then some.

Today’s students are used to getting everything “instantly.” When I was a student, unless you were wealthy, telephone calls required either lots of quarters and a pay phone [now nearly obsolete] or a recipient who would accept the charges. That necessitated at least some forethought. Today, it’s just flip open the cellphone and call. There was exactly one fast-food restaurant in the town where my alma mater is located, and it was a long walk from campus, and the college grill closed at 10:00 p.m. And late-night or Sunday shopping for paper or supplies… forget it.

Now… I’m not praising the “good old days.” I’m just saying that they were different, and that difference required a basic understanding that you couldn’t do everything at the last moment, because very little in society was “instant.” Even so, some students procrastinated… and flunked out. Today, they can procrastinate, and technology sort of allows them to throw something together… but it’s often a mess… and they end up stressed out.

No matter what anyone says, it just doesn’t occur to most of them to plan ahead. Why should it? Between watered-down high school curricula where last minute preparation usually suffices, especially for the brighter students, and a society that caters to instant gratification on all levels, very few of them have ever had to plan ahead in terms of dealing with day-to-day work and studies.

They’re intelligent; they’re incredibly quick at some things, like video and computer games and tasks and internet searches. What they aren’t good at is foreseeing the convergence of the mundane into a barrier that can’t be surmounted at the last minute. Nor are they all that good at seeing beyond the immediate visual superficiality and assessing how what they see may play out in the long run.

So… we have stressed-out students, many of whom will turn into adults who will end up even more stressed out when it turns out that neither technology nor the instant society has an instant solution for their lack of forethought… when they truly have run out of time.

The Commentator Culture

Last weekend, as with almost every weekend this fall, the college football pundits were proven wrong once more as Oklahoma upset Missouri and West Virginia lost. The commentators were wrong. All this got me to thinking about just that — commentators.

We have sports commentators, who are “experts” on everything from bowling, golf, and football to anything that appears on some form of television — and that’s anything that’s professional, in addition to the collegiate “money” sports. We have financial commentators. We have political commentators. We have news analysts and commentators. We have religious commentators. We even have F&SF reviewers and commentators.

Yet all too many of these commentators are really just dressed-up versions of Monday morning quarterbacks, with explanations of why things happened after they already did. Pardon me, but anyone with a certain amount of intelligence and knowledge about a field ought to be able to explain what did happen. But how many of them, particularly outside of sports, have anywhere near as good a record at predicting what will happen?

Besides, what about the old idea of thinking for one’s self? Doesn’t anyone think out their own views — by themselves — any more?

While it’s always been obvious that a certain percentage of any population is unable to formulate coherent and logical opinions about much of anything, I have to wonder whether many are even trying these days. Oh, I’m certain that people retain that capability, but with instant polls on everything from whether anyone agrees with what Celebrity X is doing to who leads in what Presidential primary state or whether the results of the Hugo voting are superior to the results of the World Fantasy Awards or whether some other writers and books really deserved the “award,” we’re inundated with commentary and interpretation of news, polls, and events, so much so that it’s often hard to find a complete set of facts by which one might, just might, have the opportunity to make a judgment based on facts, rather than on commentary.

It almost seems that, in more and more fields, commentary is replacing facts and news about the events, as if readers and viewers could not be bothered with learning the facts and deciding by themselves. I know that I have to take and read more and more periodicals, often more obscure ones, just to find information. Even news stories in the local papers are filled with speculations and commentaries on why something happened, so much so that it’s difficult, if not sometimes impossible, to discover the facts.

I’m dating myself, but I really miss the attitude of Jack Webb on the old Dragnet, when he’d say, “Just the facts, sir, just the facts.”

That’s one reason why I’ve been so pleased with the unpredictability of the college football season. At least somewhere, real life is destroying the false image of the infallibility of “professional” commentators.