The World of “Now”

Once upon a time – and I suppose a fairy-tale beginning is appropriate – when young people were asked what they wanted to be when they grew up, the questioner would receive a plethora of answers:  president, fireman, police officer, astronaut, baseball star, and so on.  Today… the most common response is: “I want to be famous.”

The pop art icon Andy Warhol once said something to the effect that everyone would have fifteen minutes of fame, and whether or not “everyone” will ever get that fame, Warhol was certainly right about the fifteen-minute part.

As I see it, though, never has fame been so short-lived, and the fleeting nature of present-day fame has also tarnished the value of experience, which is a far different thing.  Fame results from the praise of others; experience is a combination of knowledge, skills, and understanding gained over time.  Yet an older practitioner in almost any field is usually regarded as old-fashioned and less able than a younger, more “vital” person, and frankly, outside of the limited field of athletics, that’s a total fallacy.  The fame and media culture has sold this image, and people, especially the American people, seem to have bought it lock, stock, and barrel.

Of course, human beings have always acknowledged that fame is fleeting.  The Romans had a slave whisper to a conqueror, over and over during his triumphal chariot ride through Rome, “Fame is fleeting.”  A.E. Housman wrote the poem “To an Athlete Dying Young” with the lines:

Runners whom renown outran
And the name died before the man…

I can remember a time when there was at least grudging respect for age and experience, before scorn for anything not current was expressed in phrases such as “that’s so yesterday.”  Once, American students actually had to know who the past presidents were and what they did.  Once, most actors, not just a fortunate few, had careers lasting more than a handful of years.  Once, executives had to have experience in the business they were running.  As I noted earlier, I doubt it was coincidental that Borders Books failed, given that the company, in its later and declining years, kept hiring CEOs and other executives who had no experience in the book industry, although they were semi-“famous” for accomplishments elsewhere.

Studies of CEOs have shown that, in general, the most effective CEOs are the least famous, while the highest-paid ones [who are seldom the best] are the tallest and best-looking.  And yet, with the growing cult of “fame,” companies go for “big names” and impressive appearance, whether or not those candidates have the experience and the talent for the position, and at least one major financial company is headed by a big name whose lack of competence has already been publicly demonstrated.

The problem with the “now” culture is that it’s the culture of the moment, and that’s the culture of lemmings, where everyone follows the current fad, the “flavor du jour,” and current fads never last.  Because they don’t, and because they tend to exclude those whose experience reaches beyond the present and the accepted, when times change, and they always do, those in control make mistakes that older and wiser heads would have cautioned against.  Or put another way, there’s a very good reason why Warren Buffett is one of the richest men in the world… and why Donald Trump has had to be bailed out of the majority of his projects, despite the “celebrity apprentice” un-reality reality show.

Fame and public personality are all too often just a flash in the pan, fool’s gold.  So why do so many people seek fame and try to emulate the merely famous, all too often ignoring the people who’ve actually accomplished something lasting more than minutes or months?

 

Fiction

There’s a certain amount of accuracy in the old saying “Truth is stranger than fiction.”  Every professional writer also knows, whether he or she will admit it or not, that the best fiction is far more “true” to life than life is.  Seemingly “impossible” coincidences and occurrences happen in life.  Almost everyone knows of one or has experienced one, but, especially if it’s a happy or fortunate one, it won’t ring true in fiction.

Is that because, at heart, we all know that the impossible doesn’t happen most of the time?  There’s a rule of thumb in writing that the only kind of coincidence or improbable happening that will work is one that goes against the protagonist… and even that’s iffy.

The word “fiction” comes from a form of the Latin verb “fingo,” meaning to shape or make, but, in Latin, there was usually a connotation of falsity attached to its use, as in putting on a brave face, and even a statue was an “imago ficta,” a made image, if you will.  So how has fiction, or at least “serious” fiction, come to the point where it has to be truer than true, or at least seem more true to life than life is?

For that matter, why is “uplifting” fiction seldom considered “great” literature?  Does fiction have to be depressing, with a dark ending, to be considered “great”… or is it just the critics who think so?

Then, the other day, I ran across a forum entitled something like “Worst/Most Overrated Books.”  I couldn’t help myself – sometimes I do have a morbid streak – and I read through the entries.  I was amazed, because only two of the entries I read described what I would have thought were really “bad” books.  Interestingly enough, both those entries came from librarians.  Almost all the complaints were about books that someone, and sometimes lots of someones and critics, had suggested were good books, books by people like J.D. Salinger, Iain M. Banks, Orson Scott Card, Stephen R. Donaldson, Ernest Hemingway, Victor Hugo, Nathaniel Hawthorne, John Steinbeck, and a number of others.  Two entries even mentioned the Bible and Shakespeare as vastly overrated.  Even the very worst works by any of these writers are vastly superior to much of the true garbage being published today… and yet… why do readers pick on what they think of as the weakest work of good writers?  Because those works don’t meet the readers’ expectations?

And how many of those expectations come from the readers’ perceptions of what “reality” is, and from how far the writer falls short of portraying the “reality” the reader wants?

Of course, if that’s so, it does suggest that most “professional” critics either lead pretty dismal lives or have a rather poor opinion of life and the world in general.

Miscellaneous Thoughts

There’s more than one kind of wisdom.  One way of classifying wisdom is by category: what to do; how to do it; when to do it.  But there’s also the other side:  What NOT to do [i.e., bad idea]; how not to do it [i.e., bad implementation of a good idea]; and when not to do anything [i.e., when to leave well enough alone].

One of the biggest problems in politics today is the fixity with which both politicians and voters hold their ideas.  Those on the far right insist that cutting taxes and spending is always the right thing to do, while those on the far left are all for the opposite. At times, each has been correct, but it’s not just knowing what to do.  It’s knowing how and when to do it… and when to leave well enough alone.  Yet the ideologues insist that there’s only one “right” answer, and that, essentially, it’s right all the time.

There’s also the tinkerer’s philosophy:  If one idea doesn’t work, try something else.  That’s even before asking whether the implementation or the timing was good.  Unfortunately, while there are times when it works, it’s often corrupted into a version in which, even when things are going well, the tinkerers decide that things could be better if something else were tried.  I’ve seen all too many organizations, from government to education to private industry, where goals and missions and organizational structure changed so quickly that nothing was ever going to work.  What’s so often forgotten is that the larger the structure, the longer it takes to change course, just like a massive ocean liner.  Why?  Because any organization that has survived has developed practices and procedures that work.  They may not work as well as other practices, but because they do work in most cases and for most people, changing them takes time and explanation, and Americans, in particular, are often far too impatient.

One of the ideas behind the American government is that power must be shared, that the party in power gets the chance to implement its ideas, and that, if those ideas don’t work, the people can vote it out.  For most politicians, though, the idea of sharing anything is anathema.  Congressional districts need to be gerrymandered so that the seat always remains with one party.  Nominees of the other party must be kept from the positions to which a president of that party has appointed them, no matter what.  By using a “hold,” a single senator can keep a nomination from ever even being voted on by the Senate, yet, so far as I can tell, that particular procedure appears nowhere in the Senate’s parliamentary rules.

What’s almost fatally amusing about this is that, over the past generation, neither party has been exactly effective at improving either government or the living conditions of anyone but the wealthy, and yet each holds to both its ideas and as much power as it can, claiming that if it only had more seats and power, it would fix things.  If asked exactly how, each side falls back on generalities, and when the few politicians who actually want to do something come up with specifics, such as adding a year to the retirement age some ten years from now, or eliminating tax subsidies for billion-dollar corporations with record profits, or spending federal funds on concrete improvements in infrastructure, the entire political system turns on them.

Looking at it from where I sit, it seems as though most people aren’t happy with things as they are, but they’re even less happy with anyone who wants to change things, and when they do want change, they want it their way, or no way at all… and that’s no way to make things better, at least not in a representative democratic republic.

 

The Charity “Model”

Before and during the holiday season, we were inundated with supplications from various charities, especially the ones to which we’ve given in the past.  We’ve managed to gently request that most of them stop calling – which has to be done on a charity-by-charity basis, because charities are exempt from the blanket provisions of the “do not call” list – and we’ve also informed them that we will not EVER pledge or respond to telephone requests for funds.  Even so, the postal and internet supplications continue ad infinitum… or so it seems.

No matter what one gives, it’s never enough. There are more homeless orphans, political prisoners, third world inhabitants needing medical care, starving refugees, endangered species, abandoned and homeless pets… the list and the needs are truly endless.  I understand that.

What drives me up the wall is that many of the charities and causes in which I believe and which I support seem to increase their petitions – even though my wife and I give to them only once each year and request that they not bother us more often than that.  Now… I know that almost all fundraisers are taught to “develop” their clientele and press for more funds from those whose donations show they are sympathetic.  For what it’s worth, I’ve served notice that pestering us for more support is more likely to get them less… and that other worthy and less obnoxious causes may well get what they used to receive.

There’s also the question of “gratitude.”  One state university with which my wife and I are acquainted has adopted a de facto policy of not acknowledging “small” contributions, those under $1,000.  Apparently, the development office can’t be bothered.  Interestingly enough, the small “Ivy League” college from which I graduated responds to donations of any size with not only a receipt but a personal letter, often with a hand-penned notation – and in the early years after my graduation, some of my contributions were modest indeed.  Just guess which institution has been more successful in raising funds and has an alumni participation rate of over 70%.

In her time as the head of several local non-profit arts/music organizations, my wife has had to raise funds, and she made it a policy to hand-write thank-yous to every single donor.  In every case, the organization was in debt when she took over, and in every case, the number of donors rose, and she turned the organization over to her successor with a healthy surplus.  She’s adopted a similar policy as the chair of a national educational music association… and again, recognizing donors has resulted in a significant and healthy increase in donations and support.

Yes, in economic hard times, people often cannot contribute as much or as often as they once did, even though the needs are often greater, but those who give don’t like to be pestered and guilt-tripped, and they would like a little personal recognition for their concern and generosity.

It’s something to think about anyway.

 

More Musings on Morality

What is morality?  Or ethics?  The simple answer is “doing the right thing.”  But the simple answer merely substitutes one definition for another, unless one can come up with a description or definition of what “right” or “ethical” or “moral” might be.  A few days ago, a reader (and writer) asked what would seem to many to be an absurdly abhorrent question along the lines of, “If morality represents what is best for a culture or society, then isn’t what maximizes that society’s survival moral, and under those circumstances, why would a society that used death camps [like the Nazis] be immoral?”

Abhorrent as this type of question is, it raises a valid series of points.  The first question, to my way of thinking, is whether ethics [or morality] exist as an absolute or whether all ethics are relative.  As I argued in The Ethos Effect, I believe that in any given situation there is an absolute, objectively correct moral way of acting, but the problem is that in a universe filled with infinite combinations of individuals and events, one cannot aggregate those individual moral “absolutes” into a relatively simple and practical moral code or set of laws, because every situation is different.  Thus, in practice, a moral code has to be simplified and relative to something.  And relativity can be used to justify almost anything.

Granting, however, that survival on some level has a moral value, can a so-called “death camp” society ever be moral?  I’d say no, for several reasons.  If survival is a moral imperative, the first issue is on what level it is an imperative.  If one says individual survival is paramount, then, taken admittedly to the point of absurdity, that would in theory give the individual the right to destroy anyone or anything that might be a threat.  Under those circumstances, there is not only no morality, but no need of it, because that individual recognizes no constraints on his or her actions.  But what about group or tribal survival?  Is a tribe or country that uses ethnic cleansing or death camps being “moral” – relative to the survival of that group?

Again… I’d say no, even if I agreed with the postulate that survival trumps everything.  Tactics and practices that enhance one group’s survival by the forced elimination or reduction of others within that society – particularly if the elimination of other individuals is based on whether those eliminated possess, or fail to possess, certain genetic characteristics – are almost always likely to reduce the genetic variability of the species and thus run counter to species survival, since a limited genetic pool makes a species more vulnerable to disease or to other global factors, from climate change to all manner of environmental shifts.  Furthermore, the use of “ethnic cleansing” puts an extraordinary premium on physical/military power or other forms of control, and while that control may, in effect, represent cultural/genetic “superiority” in the short run, or in a specific geographic area, it may actually be counter-productive, as it was for the Third Reich, when much of the rest of the world decided they’d had enough.  Or it may result in the stagnation of the entire culture, which is also not in the interests of species survival.

The principal problem with a situation such as that created by the Third Reich and others [where so-called “ethnic cleansing” is or has been practiced] is that such a “solution” is actually counter to species survival.  The so-called Nazi ideal was a human phenotype of a very narrow physical range, and the admitted goal was to reduce or eliminate all other types as “inferior.”  While there’s almost universal agreement that all other types of human beings were not inferior, even had they been so, eliminating them would still have been immoral if the highest morality is in fact species survival.

Over the course of primate/human history, various characteristics and capabilities have evolved and proved useful at different times and in differing climes.  The stocky body type and small-group culture of the Neanderthals proved well-suited to glacial times, but did not survive massive climate shift.  For various reasons, other human types also did not survive.  As a side note, the Tasmanian Devil is now threatened with extinction, not by human beings, but because all existing Tasmanian Devils are so genetically alike that all of them are susceptible to a virulent cancer – an example of what can happen when all members of a species become too similar… or “racially pure.”

Thus, at least from my point of view, if we’re talking about survival as a moral imperative, that survival has to be predicated on long-term species survival, not on individual survival or the survival/superiority of one political or cultural subgroup.