Archive for November, 2011

Tolerance and Hypocrisy

Tolerance of the unjust, the unequal, and the discriminatory is anything but a virtue, nor is fiction that brings to light such problems in society a vice.  Yet among some readers and reviewers there seems to be a dislike of work that touches upon such issues. Some have even gone so far as to suggest that such fiction, in accurately portraying patterns of intolerance, inequality, and gender discrimination, actually reinforces support of such behaviors.  Over the past few years, I’ve seen reviews and comments denigrating my fiction and that of other writers because we’ve portrayed patterns of discrimination, whether on the basis of gender, race, ethnicity, or sexual orientation.  I certainly hope what I’ve seen are isolated incidents, but even if they are, I find them troubling, especially when readers or reviewers complain that illustrating in fiction what occurred historically, or continues to occur in present-day society, constitutes some form of discrimination, and that showing how it operates is hateful and insulting.

Discrimination is hateful, insulting, and degrading, but pretending it doesn’t exist while preaching tolerance is merely a more tasteful way of discriminating while pretending not to do so… and that’s not only a form of discrimination, but also a form of hypocrisy. It somehow reminds me of those Victorians who exalted the noble virtues of family and morality and who avoided reading “unpleasant” books, while their “upstanding” lifestyle was supported, at least in part, by child labor, union-breaking tactics that included brutality and firearms, and sweatshop labor in which young women were grossly underpaid.

Are such conditions better than they were a century ago?  Of course they are – in the United States and much of the developed world.  But gender and sexual discrimination still exists even here – it’s just far more subtle – and it remains rampant in much of the developing and third world.  So… for a writer to bring up such issues, whether in historical fiction, fantasy, or futuristic science fiction, is scarcely unrealistic, nor is it “preaching” anything.  To this day, Sheri Tepper’s The Gate to Women’s Country is often violently criticized – seldom in “respectable” print, but often in male-oriented discussion – because it postulates a future society quietly dominated by women and portrays men as driven by excessive aggression and sexual conquest, yet a huge percentage of fantasy has in fact historically portrayed men almost “heroically” in exactly that light. Why the criticism of writers such as Tepper?  Might it just be that too many readers, largely male, don’t like reading about and seeing historically accurate patterns of sexual discrimination reversed?  And how much easier it is to complain about Tepper and others than to consider the past and present of our own world.

There’s an old saying that what’s sauce for the goose is sauce for the gander…

 

Helpful Technology?

A week or so ago, my trusty and ancient writing computer bit the dust, and I replaced it with a brand-new version, equipped with the latest version of Word.  After a fair number of muttered expletives, I managed to figure out the peculiarities of the latest word processing miracle from Microsoft, or at least enough to do what I do.  Then I discovered that every time I closed the program, the new defaults for page setup and font that I’d established vanished by the time I opened it again.  My local techs couldn’t figure out why, but they did give me a support number for Microsoft.  The first tech was cheerful, and when we quickly established that I’d been doing all the right things and she couldn’t figure it out either, she referred me to another tech.  In less than five minutes, he’d guided me through things and solved the problem – and it wasn’t my fault, but that of a piece of software installed by the computer manufacturer.  Word now retains my defaults, and we won’t talk about some of the other aspects of the program [since I’ve dwelt on those before].

All that brings me to the next incredible discovery – and that’s the blundering idiocy known as a grammar checker.  Unfortunately, the Microsoft people didn’t retain a wonderful feature of my old Word 7.0 – the separation of the spell-check and grammar features.  So… if I want to spell-check a document – which I do, because my typing is far from perfect – I must endure a grammar check.  Now… I wouldn’t mind an accurate grammar check, but what passes for a grammar check is an abomination for anyone who writes sentences more complex than subject-verb-object, and especially for someone who likes a certain complexity in his prose. The truly stupid program [or the programmers who wrote it] cannot distinguish between the subject of the main clause and the subject of an embedded subordinate clause, and if one is plural and the other singular, it insists that the verb in the subordinate clause be changed to agree with the subject of the main clause.

[It also doesn’t recognize the subjunctive, but even most copy-editors ignore that, so I can’t complain about that in a mere program.]  There are also a number of other less glaring glitches, but I’m not about to enumerate them all.
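To make that main failure concrete, here is a minimal, purely hypothetical sketch – in Python, and obviously not Microsoft’s actual code – of the kind of naive rule that produces it: pair every verb with the first noun found in the sentence and flag any verb that doesn’t match it in number.

# Example sentence: "The programs that the committee reviews are flawed."
# Hand-tagged tokens for illustration; a real checker would run a part-of-speech tagger.
TOKENS = [
    ("The", "DET"), ("programs", "NOUN_PL"), ("that", "PRON"),
    ("the", "DET"), ("committee", "NOUN_SG"), ("reviews", "VERB_SG"),
    ("are", "VERB_PL"), ("flawed", "ADJ"),
]

def naive_agreement_errors(tokens):
    """Flag any verb whose number differs from the sentence's first noun."""
    first_noun_number = None
    flagged = []
    for word, tag in tokens:
        if tag.startswith("NOUN") and first_noun_number is None:
            first_noun_number = tag.split("_")[1]      # "PL" from "NOUN_PL"
        elif tag.startswith("VERB") and first_noun_number is not None:
            if tag.split("_")[1] != first_noun_number:
                flagged.append(word)
    return flagged

print(naive_agreement_errors(TOKENS))   # prints ['reviews']

The toy checker flags “reviews” – demanding that the subordinate-clause verb agree with the plural main-clause subject “programs” – even though its true subject is the singular “committee,” while the correctly matched “are” sails through. That, in miniature, is the behavior described above.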

For me, all this isn’t a problem, although it’s truly an annoyance. But for all those students learning to write on computers, it is a problem, especially since most of them have absolutely no idea about the basics of grammar, let alone about how to write correct complex sentences – and now we have a computer grammar-checking program that can only make the situation worse!

There are definitely times when “helpful” technology is anything but, and this certainly qualifies.

 

Good-bye?

When I returned to Cedar City after going to the World Fantasy Convention in early November, I was surprised – and appalled – to find merchants, especially our single “big-box” chain store, busy replacing the Halloween displays and immediately putting up Christmas decorations and sales promotions.  There was little space or mention given to Thanksgiving.  And I wondered if this happened to be a merely local phenomenon.  Then I went on my whirlwind tour for Scholar and discovered that in all the cities I visited, the same thing was happening.  In fact, more than two weeks before Thanksgiving, I didn’t see any commercial references to Thanksgiving, only to Christmas, and in most stores and malls Christmas music was playing.  Then I read that some merchants were pressing to begin the Christmas madness sales at midnight on Thanksgiving Day, forcing sales personnel to stay up all night or to make do with little sleep – all to cram in a few more hours of sales madness, pushing “Black Friday” into Thanksgiving Thursday.

Years ago, I remember reading a short story by Fred Pohl called “Happy Birthday, Dear Jesus,” set in a future where the “Christmas season” begins in September.  Of course, I’m sure that many readers found that delightfully exaggerated back in 1956, when the story was first published, but Fred certainly anticipated a point we’ve almost reached.

To say that I find this trend disturbing would be an understatement.  Halloween and Christmas squeezing out Thanksgiving?  A Christmas buying season now beginning in October?

Yet, on reflection, it’s certainly understandable.  Thanksgiving was a holiday originally celebrated for giving thanks for having survived hard times and having attained modest prosperity.  And how many people really give thanks today?  After all, don’t we deserve all the goods and goodies we have?  Aren’t we entitled to them?  Then, too, Thanksgiving doesn’t put that much loot in the pockets of the merchants.  It’s a time for reflection and quiet celebration at home.  It requires personal time and preparation to be celebrated properly.  You just can’t go out and spend money and buy love or assuage your guilt with material gifts.  You have to consider what your blessings are, and what you’re thankful for… and reflect upon those who don’t have much for which to be thankful.

Christmas and Halloween have much in common in current American culture.  They’ve become all about the goodies – both for the consumers and the merchants… and both our son, who manages an upscale men’s fashion outlet in New York City, and my editor have made the point that the comparative success or failure of the year depends on how much they sell in the “Christmas” season.  They’re certainly not alone, and many jobs, and the earnings of many workers, depend on such sales.  Yet the economic health of a nation depending on holiday conspicuous consumption?  That’s frightening in itself. Add to that the fact that such consumption is crowding out times of personal and family reflection, and an appreciation of what we do have, in favor of a frenzy devoted to what we don’t.

Economic necessity or not… couldn’t we still reserve a small space of dedicated time for Thanksgiving between the buying and selling frenzy?

 

Return to the Past?

After finishing a whirlwind tour – seven cities and some of their suburbs in seven days – I’ve seen a trend I noticed years ago becoming even stronger… and more than a little disturbing.  Once upon a time, books were so expensive and hard to come by that only the very wealthy possessed more than a few, and most people had none.  Libraries were few and effectively reserved for the well-off, because few of those who were less than well-off could read or could gain access to them.

What does that have to do with today or my tour?

Only the fact that, despite such innovations as ebooks and e-readers, in a subtle yet substantive way we’re on a path toward the past insofar as books are concerned.  Yes, millions of books are printed and millions are now available, or soon will be, in electronic formats, but obtaining access to those books is actually becoming more and more difficult for an increasing percentage of the population across the United States.  With the phase-out of small mall bookstores, more than 2,000 bookstores that existed thirty years ago are now gone.  While they were initially replaced by some 1,300 “big-box” bookstores, with the collapse and disappearance of Borders and consolidation by other chains, the number of chain bookstores has dropped at least 25%, if not more, in the last few years.  Add to that the number of independent bookstores that have closed, and the total shrinkage in bookstores is dramatic.

Unhappily, there’s another aspect of this change that’s far worse.  The overwhelming majority – over 90% – of large bookstores in the United States are situated in “destination” locations, almost invariably in or near wealthy areas of cities and suburbs, easily reachable only by automobile.  At the same time, funding for public and school libraries is declining drastically, and, in many cases, funds for books are slim or non-existent and have been for years.

But what about electronic books… ebooks?

To read an ebook, one needs an e-reader of some sort, or a computer.  In these economically straitened times, adults and children from less affluent backgrounds, especially those near or below the poverty level, have difficulty purchasing an e-reader, let alone ebooks. Somehow, this fact tends to be overlooked, as if access to reading were not even considered a problem for the economically disadvantaged.

In the seven cities I visited on my recent book tour, every single chain bookstore or large independent was located in or adjacent to an affluent area. Not a single major bookstore remains in the less affluent areas.  As I mentioned in a much earlier blog, this is not a new pattern, but an affluent location has apparently become almost an absolute requirement for new bookstores. Yet who can blame the bookstores? Small mall bookstores aren’t nearly so profitable as trendy clothes retailers, and most mall rents are based on the most profitable stores. Hard times in the book industry have resulted in the closure of unprofitable stores, and those stores are almost invariably located in less affluent areas. These economic realities affect the WalMart and grocery store book sections as well.  In particular, grocery retailers in less affluent areas are less likely to carry books at all.

But no matter what the reason, no matter what the economic considerations may be, when a city and its suburbs totaling more than two million people have fewer than ten major bookstores, with only one major independent, and all of those stores are located in economically well-off areas, I can’t help but worry that we are indeed on a road to a past that we shouldn’t be revisiting.

 

The Comparative Species

For all our striving as a species to find clear and absolute answers to everything, from what is “right” to the deepest mysteries of the universe, at heart, human beings remain a highly comparative species.  In its best form, this compulsive comparativeness can fuel high achievement in science and technology.  Whether we like it or not, competitive comparativeness fueled the space program that landed men on the moon, the early development of the airplane, even the development of commercial and residential electrification, not to mention untold advancements in many fields.

The worst aspects of comparativeness remind me, however, of the old saying that all comparisons are odious.

In personal affairs, comparisons tend to be subjective and unfair, particularly in politics and business.  The late Richard Nixon was pilloried for taping conversations in the White House, yet White House taping had gone on in several previous administrations.  He resigned under threat of impeachment for covering up the Watergate burglaries, yet cover-ups have occurred in government for generations.  The full extent of the naval oil reserve scandals in the Harding administration didn’t come out for decades, nor did the extent of Jack Kennedy’s philandering in the White House.  While both Kennedy and Nixon had grave faults, in point of fact, Nixon actually had many accomplishments as president, while Kennedy’s sole measurable achievement was averting nuclear war in the Cuban Missile Crisis, yet in popular opinion, there’s essentially no comparison.  The ballyhooed presidential debate between Kennedy and Nixon was another example of the fickleness of comparativeness.  Among those who heard the debate on radio, a significant majority felt Nixon had “won.”  Among those who watched it on television, a majority opted for Kennedy.  Same debate, same words – but physical appearance carried the day.

Likewise, study after study has shown that men who are taller, regardless of other qualifications, receive more pay and more respect than shorter men, even shorter men of greater ability and achievement, and, interestingly enough, in almost all U.S. presidential elections, the taller candidate has been the winner.

Another example surfaced with the recent deaths of Steve Jobs and Dennis Ritchie.  While the entire world seemed to know about Jobs, and mourn his early and untimely death, only a comparative handful of people seemed to know about Dennis Ritchie, the pioneer who developed the C programming language and co-created the UNIX operating system – work that made possible the later success of both Steve Jobs and Bill Gates. Yet coverage of Jobs’ death appeared everywhere, while Ritchie rated a small paragraph buried somewhere in the newspapers, if that.  Although Ritchie’s death was widely mentioned in technical and professional journals, it went almost unnoticed in the popular media.

In the end, the question may be: Is it that comparisons are so odious, or that the grounds on which we make those comparisons are so odious?

 

Unforeseen Results

Just before I left on this book tour [and yes, I’m writing this on the road, which I usually don’t do], I read an article on how lacking recent college graduates, and even those getting advanced degrees, happen to be in terms of what one might call personal preparedness.  The article, by a professional business recruiter, stated that most graduates had little idea of even what to wear to an interview, let alone how to get one.

Then, on one of the airplane jaunts, I read about how college students are moving out of engineering and science courses because “they’re too hard,” despite the fact that the average college undergraduate studies half as much today as the average student did thirty years ago.  To complete this depressing litany, I finished up with an opinion piece by a scientist who lectures occasionally and who cited figures to show that today’s students have trouble learning anything in science without repeated exposure to the material, because they don’t easily retain what they’ve heard in the classroom.

But to top it all off, last night I ran into an attorney who teaches part-time at a prestigious southern law school, and we got to talking after the signing at the bookstore.  What she told me was truly astounding.   She set up a class where attorneys in various fields came and discussed the actual practice of law and where the students, all in their final year of law school, were told to be prepared to ask questions and were given the time and opportunity to do so.  First off, all were buried in their laptops, and not a single one could ask a question without reference to the laptop or notebook.  Second, not a one could ask a follow-up question or one not already prepared on the computer.  Third, not a one engaged in extended eye-to-eye contact with the visiting attorneys, and fourth, not a single one asked any of the visiting attorneys for a business card, despite the fact that none of them had job offers and all would be looking for positions in six months.  Considering the fact that almost all law firms are becoming very picky about new hires and that many have actually laid off experienced attorneys, none of these law students seemed to have a clue about personal interaction or personal networking.  Oh… and almost none of them actually wore better clothes to that class.

If this is what the new, computerized interactive net-based society has to offer, we’re all in big trouble, and those of us who are approaching senior citizen status may well have to keep working a lot longer for more reasons than economic necessity.

 

No Objective Truth?

The other day, one commenter on a blog asked if I wanted to write about the growth of a belief structure in American society that essentially denies the existence of “objective truth.”  Actually, I’ve written about aspects of this before, particularly as a subset of the selective use of information to reinforce existing confirmation bias, but I find the growth of the feeling that there is no objective truth, or that scientifically confirmed “facts” remain a matter of opinion – and that everyone’s opinion is equal – to be a disturbing but almost inevitable outcome of the fragmentation of the media along lines corresponding to existing belief structures, as well as of the increasing role that the internet and social media play in the day-to-day life of most people.

The basic ground rule of any successful marketing effort is to get the target audience to identify with your product.  Usually that’s accomplished by positioning the product to appeal to biases and beliefs.  Information – which, outside of stringently peer-reviewed scientific journals, is largely no longer news or factual, objective reporting – apparently need have no more than a tangential relationship to facts or objectivity; instead, its content is manipulated to appeal to its desired target audience.  Now… this is scarcely new.  Modern yellow journalism dates back more than a century, but because the economics of journalistic production limited the number of perspectives that could be specifically pandered to, because the law did have an effect insofar as actual facts were concerned, and because there remained a certain basic integrity among at least some media outlets until comparatively recently, facts were not quite so routinely ignored or distorted in quite so many ways.

One of the mixed blessings of technology is that millions and millions of people in every high-tech society have access to and the ability to use comparatively sophisticated media techniques (particularly compared to those available even a generation ago) to spread their views and versions of the “facts” in ways that can be appealing and often compelling.  In turn, the multiplicity of ways in which verified facts can now be presented and distorted creates the impression that such facts are not fixed, and the next step for many people is the belief that facts are only a matter of opinion… and since everyone’s opinion is valid… why then, “my” view of which fact or interpretation is correct, or can be ignored, is just as good as anyone else’s.

This “personalization of truth” leads to some rather amazing results.  For example, as the scientific consensus on global warming has become almost unanimous – first, that such warming is occurring, and, second, that there is a strong anthropogenic component to it – the share of popular opinion agreeing with those findings has dropped almost twenty percent.

Unfortunately, occurrences such as global warming, and mechanisms such as oncoming vehicles combined with high-volume earbuds, famines and political unrest, viruses and bacteria, and high-speed collisions, are all present in our world. Consequently, rising sea levels, violent weather changes, deaths from disease among the unvaccinated, starvation, and fatalities from the failure to wear seatbelts will all take their toll, regardless of the beliefs of those who ignore the facts.

Belief is not a viable defense or preventive measure against climate change, biology, oncoming heavy objects, or other objective impingements upon subjective solipsistic unreality… no matter what or how you believe.

 

“Downers,” Stereotypes, and Literary Quality

Yesterday, when I was flying back from the World Fantasy Convention in San Diego, I read through two “best-of-the-year” anthologies… and landed in Cedar City thoroughly depressed… somewhat disgusted… and more than a little irritated.  No… I wasn’t irritated that the anthologists hadn’t picked one of my infrequent short stories.  In the first place, I know that it’s unlikely anything I write will find favor with most anthologists [although there have been exceptions].  In the second place, I hadn’t published any short fiction for the year in question.

My anger, irritation, and depression came from the same root cause.  Out of something like fifty stories, all but perhaps ten were downers.  Out of the ten that weren’t, eight were perhaps neutral or bittersweet, and only two could be called upbeat.  Now… I don’t have anything against downbeat or depressing stories.  I don’t even have anything against them being singled out as good stories.  Certainly they were all at least better than competently written, and some were indeed superbly written.  And I definitely don’t think a story has to be upbeat to be great or “best-of-the-year,” but after many, many years of writing professionally, and even more of reading all manner of books and stories, ranging from “genre” fiction to the acclaimed classics, it’s clear to me that the excessive citation of depressing stories as classics and examples of excellence is hardly a mark of intellectual distinction, let alone of impartial judgment.

All this, of course, reinforces my feelings about those critics and anthologists who seem to dismiss anything with any upbeat feel or positive aspects… or anything that isn’t “literary mainstream.”

The latest New York Times book section contains a review with an opening along the lines of “A literary novelist writing a genre novel is like an intellectual dating a porn star.”  Supposedly, reviewers who write about books should be able to transcend stereotypes, not reinforce them, but then, snobbery is often based on the enshrinement of stereotypes about whatever is contrary to the snob’s world view, and all too many critics, reviewers, and even some anthologists are little more than snobs.

A good story is a good story, and a bad story is a bad story, whether it’s “literary” or genre.