Archive for the ‘General’ Category

The Bell Curve Revisited

A number of years ago, a book called The Bell Curve was published and immediately became the center of an intellectual firestorm. In retrospect, one could almost say that it was a case of “While I don’t like your statistics, I don’t have any better figures, but because your statistics conflict with what I believe (or have seen on an individual basis), they can’t possibly be so.”

As Murray and Herrnstein, the authors, stated, statistics are not valid for individuals, but well-developed statistics are almost always accurate for large populations. Their statistics appeared to raise disturbing implications in two areas: (1) individuals with higher IQs — on average — are more successful in our society, and (2) certain minorities, notably blacks — on average — have lower IQs. The authors also claimed that IQ does not change significantly for most people after an early (pre-school) age. Recent research has raised some issues with that last point, but only about the threshold age after which IQ seldom changes; it seems clear that IQ does not usually change significantly after puberty, and it may be determined considerably earlier.

Whether the authors are correct or not should be assessed, not by philosophical predilections or by anecdotal evidence, since exceptions make both bad law and bad policies, but by a broad-based study which addresses such specific issues as:

(1) Is IQ a valid predictor of economic/societal success [not whether it should be, but whether it is]?

(2) If IQ does have validity as such a predictive tool, to what degree is IQ genetically determined, and what other factors can scientifically and effectively be determined to change IQ [i.e., do prenatal care, maternal nutrition, very early childhood education and support, etc., play a significant or a minor role]?

Finally, regardless of causal factors, the authors addressed one simple and basic problem: the fact that, in an information-based hierarchy, those who show higher IQs are more likely to be successful than those who do not. Even if methods and techniques can be developed to ensure all individuals realize their maximum potential IQ, in our society those with higher IQ levels will continue to become an increasingly powerful and self-selecting elite. Isn’t that really the controversy? That we have developed a culture where some individuals, no matter how hard-working, will never be among the most successful so long as success is measured by hierarchical power and economic success and that such success requires the skills measured by higher IQs?

We also seem ready to reject any “scientific” method that may indicate some groups will be either more or less successful than others in areas requiring mental prowess, even while we readily acknowledge such inequality in athletic areas. Why? Is it because we are unwilling to admit that most individuals cannot alter their basic mental capacities, and that such capacities are fixed by outside factors and the actions of others?

In the end, much of the controversy over The Bell Curve seemed to have been generated by individuals — on both sides — with deep stakes in the outcome: those who wished to use the statistics presented to justify their already-existing negative feelings and actions toward minorities, and those who rejected the findings because the findings were antithetical to their very beliefs.

Yet, more than ten years after the publication of The Bell Curve, I have yet to see any evidence whatsoever addressing the authors’ point that, like it or not, economic and professional success in the present-day United States can be predicted largely on the basis of IQ. I have to emphasize that I am not saying this is as it necessarily should be, but the fact that this finding has been quietly buried and remains unrefuted is more than disturbing in itself.

Thoughts on Music

The in-depth and devoted study of music is perceived by many as either fluff or irrelevant to today’s education and world. It is neither. Archeological excavations have discovered various musical instruments that predate recorded history, and every human culture, without exception, has some form of musical expression. Music, in particular classical music, is a discipline based entirely upon rigorously applied mathematics, requiring intellectual and physical abilities developed over a period of years. Music has been a key element in culture and politics for at least 50,000 years, and cultural musical achievements are inseparable from a culture’s political, economic, and even military power.

Yet, even today, some politicians and educators question the value of music as a subject of educational study, assigning higher priorities to everything from driver education to athletics. After all, with American Idol, the message is that anyone can sing. With such skepticism and ignorance about the disciplined study of music, one must ask the basic question: Is music important to a culture, and if so, why, and to what degree?

The music enjoyed, played, and composed by a culture defines the soul of that society, and how music is taught in that culture, and to whom, not only illustrates the strengths and weaknesses of its education system, but foreshadows the fate of that education system — and of the society itself.

Aristotle called music the keystone of education. In practical terms, more than any other single discipline, music improves intellectual functioning, emotional intelligence, and the understanding of and ability to integrate multiple intellectual and physical activities. PET brain imaging studies show that sight-reading and performing music engage more areas of the brain than any other activity.

As noted in a number of scholarly presentations over the past decade, music increases emotional intelligence, and as pointed out by the neurobiologist Antonio Damasio [Descartes’ Error], intelligence devoid of emotional content is an impaired and reduced intelligence. It is not exactly happenstance that Thomas Jefferson and Albert Einstein were both violinists, or that a high percentage of physicians have musical talents and abilities.

Ensemble musical performances also require cooperation and coordination under time pressure — a useful skill in a society that exalts individual success at any cost, particularly since we live in a complex society that rests on cooperation. One has only to look at various third-world societies or Middle Eastern cultures — or even western situations such as Northern Ireland or Basque Spain — to see the devastating impact of societal divisiveness.

Although it is scarcely politically correct to declare so publicly, all music is not equal, either within a society, or in comparing music from different societies. Because almost every human being can do something that can be called music, all too many humans equate what they like with excellence. Such popular personal taste does not necessarily recognize or reward technical expertise and genius. As in many fields, understanding and appreciating excellence in music takes education and talent.

In terms of the larger implications for American society, all too often overlooked is the obvious fact that for the past 600 years western European music has been the most advanced, most technically diverse, and most multifaceted… and that western European culture has dominated the world — politically and in terms of economic and military power — ever since its music developed in its present form. The only cultures that have been able to challenge western-European-derived ones economically, politically, and militarily are those that have adopted — if only by adapting — western European music.

Music is indeed complex. Like all of the most worthwhile disciplines, it requires study, long hours of practice, and is expensive to teach. But… as in all matters, what is cheap and popular does not survive. In that sense, it is far too expensive for the future for universities, especially state universities, NOT to teach music. Americans live in a nation that is increasingly polarized by two opposing straight-line, single-value camps of thought. Americans also live in a nation whose popular music has been degenerating technically and compositionally as this polarization has increased. This is scarcely coincidence or mere happenstance correlation.

Likewise, music teaches its students how to handle multiple, often competing, values and inputs — a skill more and more valuable in a complex and multifaceted world. Because music does increase intellectual and practical abilities, eliminating or reducing the study of music at state schools is another critical factor in effectively limiting, if not destroying, the position of the United States as the principal dominant society of the world.

That is because music will only be taught at elite state and private universities, and, when taught at other schools, educators are increasingly pressured to simplify and dumb-down the curriculum, because true musical education on the collegiate level is anything but easy, and difficult courses are less popular and have lower enrollments. This combination of exclusivity and content degradation will only help to increase the division between the privileged and the rest of the population at a time when the economic gap between these groups is already increasing. In addition, it will contribute to other trends already reducing the proportion of the population with the range of skills necessary to analyze, manage, and innovate in a complex world society.

Our Cheating Credentialed Society

From all the articles and cases, there’s clearly a problem in U.S. schools with cheating, and another one with grade inflation. There’s also a problem with too many students not mastering skills. All three problems are linked to a single societal perceptual problem — the false equation of credentials with skills.

In music, for example, mastery of an instrument or the voice is not demonstrated by how fast a musician can get through the piece, nor how many works can be quickly learned, nor by a piece of paper that says the student has a B.M., M.M., or D.M.A. Mastery is singing or playing on key, in tempo, with flawless tone and/or diction, and precise emotional expression.

In recent years, time after time, various studies have trotted out statistics to show that people with degrees make more money than those without degrees. Seldom, if ever, has anyone addressed first, what the studies actually show, and second, their actual applicability to life. The initial studies reflected the difference in earnings between those with a college degree and those without one. And the key term remains “degree.” Once upon a time, a degree signified mastery of a certain set of skills, and the degree was the certification of those skills. Today, a degree is viewed by students and society alike as either a passport to a better job or the credential for another degree, which is a passport to an even better job. The emphasis is on the credential, not on the process of education, not on learning the skills necessary to do the job. Given this emphasis on the symbols of success — the grades, the honors, the degrees — is it any wonder that students — and their parents — cheat?

Those teachers who try to emphasize the need to learn fundamentals well, to master skills, and who grade rigorously, are overwhelmed by a society that wants quick results and easy-to-verify credentials and that has lost its understanding of the true basics. The “answer” to a test is only a small part of the learning process. The idea behind learning is to gain the abilities and understanding necessary to find answers on one’s own, especially in new and different situations. This emphasis is being lost behind the demands for testing and accountability.

Students are far from stupid. They see that only the result matters in most cases. The answer obtained on-line or through cheating, if done successfully, counts as much as the one sweated out the hard way. The well-publicized Kansas case of several years ago was not an exception, but far more common than most politicians and school boards want to admit. Just talk to the teachers — well off the record.

This emphasis on the credential, rather than the skills, is everywhere. High school students want to get into the prestigious college so that they can get good grades there in order to get into the prestigious graduate school in order to get the best job and the highest compensation. More and more money and effort are being poured into testing students on what they are learning. Here, again, we run the risk of focusing on “credentials” — the good test score. Tests like the SAT, the ACT, the GRE, the LSAT, and the MCAT all purport to measure two things — a certain level of knowledge and the ability to recall that knowledge in a short period of time. Individuals who know their subject matter in great depth, but who do not recall the material either swiftly or under time pressure, will score less well than those with lesser knowledge but greater test-taking skills.

There are certain occupations where time is of the essence and one must act in seconds or minutes, but most high-level occupations don’t — and shouldn’t — require such haste. Most occupations are those where thoughtful, complete mastery of the subject and skills is far preferable to incomplete knowledge and speed. We don’t need an architect who can design a building quickly; we need one who designs it well and safely. We don’t need medical researchers who experiment quickly, but ones who do so thoughtfully and thoroughly. We don’t need financial analysts who can design new financial instruments that magnify credit and the money supply nearly instantly — and then crash and plunge us into financial and economic chaos — but analysts and “quants” who fully understand the ramifications of their work and who can also explain it clearly and concisely… and who will.

There is an old proverb that seems to have been forgotten in our desire for easy credentials, quick measurements, and instant gratification: Haste makes waste.

Never has this been more applicable than in education today. “Accountability” and all the other buzzwords being used are in danger of creating an even greater charade in education than the present sad situation. Universities tout the percentage of their faculty with a Ph.D. Can all those highly degreed professors actually teach? How many actually do? Which ones are effective? Is there any serious effort to evaluate whether candidate A with a master’s degree is actually a better and more effective teacher than candidate B with a Ph.D., or candidate C with a mere bachelor’s degree but twenty years of practical experience?

A number of studies and articles have also appeared recently suggesting that student evaluations of professors at universities have become omnipresent and are focused more on the grades that professors give than upon their teaching effectiveness. That is, in general, the more high grades a professor gives, the better the student evaluations. Once more, both the students and the administrations that rely on such evaluations are focusing on the “credential” — the grade given by students largely ignorant of the requirements of the discipline they are learning — rather than on the process of learning and the skills attained by the students. Yet when such elite schools as Harvard set the example by giving half the student body As in all courses, it becomes increasingly difficult for others to go against the example. In the state of Utah, the governor and the legislature have been pressing the universities to graduate students more quickly so that they can get into the work force more quickly, and presumably pay taxes more quickly. Yet, even as the number of students swells, the resources available on a per-student basis decrease, and the buzzword “efficiency” gets bandied about wildly, as if the only important measure is how quickly students get a piece of paper in hand — a credential.

All of these examples have one factor in common — the failure to understand that education is a process, and that mastery of the skills involved is what leads to eventual long-term success for the student — not merely a credential that, without the skills mastery it is supposed to represent, means little. Most Americans understand that a basketball or football coach cannot merely have players attend three practices a week for nine months a year for four years, give them high grades without rigorous examinations, and then graduate them all to a professional sport, saying that they are all equivalent. Yet, in many ways, this is exactly what the American public is asking of its undergraduate colleges and universities.

Unfortunately, the problem doesn’t end with graduation. It goes on. Credentials take the place of judgment in the business and academic hiring world. The recommendations of the highly credentialed analysts at the Wall Street brokerage houses were accepted unquestioningly in the cases of Enron, Tyco, Global Crossing, and all the other high-level corporate disasters. So were those of the accountants at Arthur Andersen, AIG, Lehman Brothers, and innumerable banks. Everyone focused on “credentials” — reported profits — rather than on the actual processes of the businesses at hand, just as the education world seems prepared to do.

Credentials have become more and more divorced from the abilities and results they were once supposed to measure and have in fact become almost a substitute for ability and accomplishment. So long as this continues, we as a society will continue to pay a high price for that practice.

The Illusion of Permanence

A week or so ago, a number of Facebook users got extremely irritated when Facebook tried to change its terms of service to claim rights to all content posted there in perpetuity. On the surface, that seems to be a bit extreme and might warrant an outcry.

Except… is anything electronic and on the web really permanent? Just look at how fast sites change. Exactly where is the record of what was there yesterday… or last week… let alone last month or last year?

I got to thinking about this yet again when I considered my Boeing Graph program. It was a wonderful graphing tool back when I was doing computer graphics for various businesses. It still might be, except that I never bothered to convert the 5¼-inch floppies into another format, and I haven’t had a computer with that capability for years, nor have I seen a version of the program for sale in an updated format. In fact, I still use 3½-inch disks, and I’ve been informed that they’re nearly obsolete. And I’m still using Word 7.0 to write books, because it will also access all the older WordPerfect files, so that I don’t have to convert some twenty years of writing and notes. And besides, it doesn’t require as much use of the mouse, which is an advantage for someone who likes the keyboard. Yes, I know, I could program or learn all the alternative keystrokes for the current version of Word, at least until there’s another newer and improved version. But it’s not just me. There’s all sorts of NASA data that’s virtually lost because the electronic systems have changed and because no one thought to convert it — or perhaps they didn’t have the budget to do so.

That’s the thing about paper. We still have books that are hundreds of years old. They may be fragile, but just how much of all the electronic data we’re archiving right now is really going to be accessible in a decade or two, let alone a century? My wife has pointed out that all the old letters in her grandmother’s trunk were priceless. They showed how people thought and felt. Somehow, I don’t see my grandchildren being able to even find my emails. More than a few times, I’ve been able to go back and dig out data from my old consulting reports — those that I was smart enough to print out. I’d be surprised if much of that data exists anywhere else.

And, by the way, there are a few institutions and even one religion that keep revising their tenets. You can see this when you compare print versions, but such comparisons get harder and harder when everything’s electronic.

I haven’t mentioned the problem of servers and their impermanence, either. Or electronic worms and viruses. The old-fashioned book worms took months, if not years, to destroy a single book. The electronic variety can wipe out entire databases in an instant. Something like ten years ago, a movie called The Net came out, and it showed exactly what could happen in a society with too great a reliance on electronic systems and too few safeguards. Certainly, there are greater safeguards today than people envisioned back then, but think about the President’s proposal to set up universal electronic medical records. Yes, those records can be accessed from anywhere, but that also means they can be altered or destroyed from anywhere. With paper records in each hospital, someone intent on destroying large amounts of records would have to visit every hospital. Not so once everything’s electronic.

The most obvious price for easier electronic access and convenience is potentially greater vulnerability. There’s also another price, and that’s mandatory standardization, because standardization also increases vulnerability.

It’s certainly a lot more convenient to manipulate electronic text, and it’s been a boon to all of those of us who write, but I would note that all my contracts with my publisher specify that I’m supposed to keep a “hard” copy of every book… just in case.

What will happen if we end up going to E-books, because paperbacks and hardcovers are too expensive?

We can still read Sumerian, Babylonian, Hittite, and Egyptian texts thousands of years old, especially those inscribed on clay. I have my doubts about the survival of much current and future “literature” disseminated as electrons on a screen, but then, given where entertainment is headed, that might just be a blessing.

Taxes and Taxes

Over the years, various commentators have compared the taxes paid, or the tax rates paid, in the United States and in European countries. Unfortunately, most of these comparisons are anything but “apples to apples.” For example, regardless of what the various tables say, U.S. tax rates on personally earned income range effectively from 0% to somewhere above 50%. A self-employed person who lives totally on cash, with an income below $20,000, and who doesn’t report that income can often get away with paying no taxes [yes, I know it’s illegal, but it still happens]. On the other hand, an individual who makes, say, $450,000 in salary and lives in New York City might easily pay more than 40% in taxes and face a marginal tax rate of close to 60% when one includes FICA and state and local income taxes. And those rates are actually higher than those in many European countries that Americans consider “high tax.”

How does this happen? First, because the United States is a federal representative republic, the states can levy income taxes, and in some states, such as New York and Maryland, so can local jurisdictions. That means as much as 8% – 10% on top of federal income taxes. Add to that FICA, which can amount to 17% on approximately the first $100,000 of income earned by self-employed individuals [which averages out to more than an additional 4% on $400,000 of income].
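The effective-rate arithmetic in that bracketed aside can be sketched in a few lines. The 17% rate and the $100,000 cap are this essay’s round figures, not actual tax-table parameters:

```python
# Rough sketch of the FICA averaging arithmetic above.
# The 17% rate and $100,000 cap are the essay's round numbers,
# not actual self-employment tax parameters.

def effective_fica_rate(income, capped_rate=0.17, cap=100_000):
    """The tax applies only to income up to the cap, so the
    effective rate falls as total income rises above the cap."""
    tax_paid = min(income, cap) * capped_rate
    return tax_paid / income

# 17% on the first $100,000, averaged over $400,000 of income:
print(f"{effective_fica_rate(400_000):.2%}")  # 4.25%
```

At $400,000, the $17,000 of capped tax works out to just over 4% of total income, matching the “more than an additional 4%” figure in the text.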

Then add property taxes and sales taxes on top of those.

Canada has provincial taxes, but from what I’ve been able to dig up, most European countries don’t have the equivalent of state taxes, but a number do have the equivalent of FICA/Medicare taxes. Some include that in the individual income tax rate. On the other hand, in places like France, they also have occupation taxes [paid whether you own or rent property] and “wealth taxes” based on net worth, which are also paid annually.

According to an Australian study, Australia has the highest rates for high earners, but the USA isn’t all that far behind if you factor in state taxes [unless you live in one of the few U.S. states that doesn’t have an income tax].

At the same time, the United States now has the highest tax rates in the world on corporate income, not that many corporations are likely to be paying much of it after deducting last year’s losses.

All told, there doesn’t seem to be that much difference between those so-called high foreign tax countries and the United States, not in terms of the tax rates. What the taxes are spent on, though, is another question, and one that would take far more space and time than I have at the moment.

The Name of the Game Is… Over-Reaction

For months all the major stock exchange indices have been plummeting… until the past few days. So have employment numbers, and the unemployment rate is higher than it has been in 25 years. In the publishing field, not all that large to begin with, more than 1,000 jobs have vanished in the last few months. So have at least half a dozen “name” imprints, as well as one of the major wholesale distributors. The second largest retail book chain — Borders — is teetering on the brink of financial and sales disaster. Some of the larger newspapers across the country have either closed — like Denver’s Rocky Mountain News — or are threatening to do so.

And everyone seems to be speculating on just how bad things will “really” get. Will the drop in stock prices rival the percentage decline of the Great Depression? Will oil prices get so low that oil companies will stop drilling? Will and should General Motors go bankrupt? What about Citicorp? Will publishers stop buying debut novels for the next year or so… or longer?

Yet, some two years ago, financial pundits were talking about the possibility of the Dow reaching 30,000 [instead of plummeting toward 6,000, as it was early last week], and seasoned homebuilders were hammering out new houses at a record rate. New publishing imprints seemed everywhere.

Now… while I was trained as an economist, and while I do follow the economic indicators and the economy very closely, I’m not about to predict how bad matters will get… or when. The one thing I did learn during my years of doing such things for a living was that the only thing you can be sure of is that the more economists agree on something, the less likely they are to be right. In fact, I actually wrote a short study on that subject at the behest of one employer.

What I am convinced of, however, is that, just as people followed trends “upward” far, far longer than made any rational sense, so too will they follow trends downward far, far longer than makes any rational sense. Already, thousands and thousands of investors are buying Treasury notes which yield almost nothing, because they feel Treasuries are “safe.” In a sense, they are, because not much will be worth anything if the government collapses, but buying them at such yields all but guarantees a loss. If interest rates go up, the value of the notes goes down, and when the yield is close to nothing, the only directions rates can go are sideways or up. So the best all those investors can do is end up with the same amount of money they started with. In the meantime, the short-sellers in the stock market are betting heavily that investors will overreact and continue to sell on the downside, exacerbating the pressures for overreaction. The current “up” bounce may very well be another over-reaction, because the basic economic news hasn’t really changed.
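The inverse relationship between rates and note values noted above can be illustrated with a toy present-value calculation. This is a zero-coupon simplification with made-up numbers, not a model of actual Treasury pricing:

```python
# Why a near-zero yield leaves only downside: the resale price of a
# fixed future payment falls as market rates rise. Simplified
# zero-coupon math with illustrative numbers.

def note_price(face_value, market_rate, years):
    """Present value of a single payment of face_value due in
    `years`, discounted at `market_rate`, annual compounding."""
    return face_value / (1 + market_rate) ** years

bought = note_price(1000, 0.002, 5)  # purchased when yields are ~0.2%
resale = note_price(1000, 0.03, 5)   # resale value if yields rise to 3%

print(round(bought, 2))  # 990.06
print(round(resale, 2))  # 862.61 -- a capital loss on any early sale
```

Held to maturity, the buyer gets the face value back and essentially nothing more; sold after rates rise, the note fetches less than was paid. Either way, there is no upside.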

The previous “up” and “down” trends, as well as the present short “up” trend, illustrate, to my mind, the fact that human beings in groups always over-react. Teen-aged girls in crowds at rock concerts over-react. Young men in gangs over-react. Bankers in groups over-react. So do groups of politicians linked by party symbols. Young and middle-aged male investors linked together by the internet over-react.

And while I’ve observed this for years, I don’t have an answer… except to consider that to over-react is clearly human.

The Vampire…A Continuing Trope?

While there are a number of variations on the theme, the basic vampire plot is that an old and ageless vampire preys on a young and beautiful woman, or women, whose blood keeps him forever young. From that point on, the vampire can be indifferent, vicious, fall in love, etc. At the moment, vampire novels are, it’s fair to say, the very big thing in the publishing world, and some readers are under the impression that the theme is relatively new. More sophisticated readers know that’s not true, and that the best-known early vampire novel in English, Bram Stoker’s Dracula, was first published in 1897, while legends of vampires date back at least some six hundred years to Vlad the Impaler, if not before that.

Interestingly enough, the first “modern” vampire novel was not Dracula but Carmilla, published in 1871, about a lesbian vampire who preys on young women. Carmilla has generally been forgotten, largely, I suspect, because it does not play on the basic trope underlying the Dracula-style vampire myth, even though there are historical antecedents for a female vampire as well, such as the reputed blood-bathing of the Countess Elizabeth Bathory of Hungary. Throughout western history runs a consistent theme, particularly prominent in patriarchal societies, that various forms of contact by an old man with young women improve his health and re-create his youth. The vampire myth, thus, is merely a variation on that theme.

Even the Bible notes in the first book of Kings that the aging King David slept with a virgin, without intimate relations, in order to keep warm and preserve his health. Today in Africa, the superstition that sex with a young virgin can cure a man of AIDS is not only a variation on this theme, but one deadly to millions of young women, just as a vampire might be.

Polygamy can be understood as another variation on this theme, particularly when one notes the age differential between the “husbands” and most multiple wives in the recent cases involving Utah, Arizona, and Texas polygamists. Another manifestation of this trope might well be the current trophy wives of older men of wealth and position, women who bolster the men’s self-esteem and create an illusion that such men are younger than they are. But, in a real way, aren’t such men in fact a form of vampire?

Also interesting is the fact that the underlying trope appears not only in strong patriarchal cultures but also at times when there is a struggle over gender roles and power, as was the case at the transition from the 19th to the 20th century. The most recent example of this might well be the “Twilight” books, written as they are by a woman educated in one of the last bastions of male supremacy in higher education and a member of a faith that has institutionalized and rationalized a “traditional” and subservient gender role for women.

And so the vampire trope lives on.

The Difference between Theoretical High Tech and Working Applications

A number of years ago, I found myself in an on-line discussion with a reader who insisted that there was absolutely no need for a manned military perimeter on a colony world to defend against invaders landed from space. As I recall the “discussion,” his point was that either the invader had the high ground of space and could use orbital bombardment to destroy the defense perimeter or that the defenders could do the same. A number of years before that, another author wrote about the idea of smart rocks as an effective military weapon and defense. Just recently, I saw another discussion on the idea, and all of it left me shaking my head.

Orbital bombardment is a wonderful way to destroy a planet or the culture on it. I’ve done just that in one or two of my books. But so far as I can figure, it’s a terrible and cost-ineffective way to conquer or defend anything. To begin with, even small planets are big, and inhabited planets have atmospheres, gravitational fields, and probably magnetic fields. From what we’ve so far determined, for carbon-based life they also need oceans and liquid water. Now that might not be necessary for an extraordinarily high-tech civilization, but any civilization with that level of technology likely wouldn’t be worth conquering. Destroying, perhaps, if it were viciously inimical, but not conquering. And all of these characteristics, plus a few others, make dropping anything from orbit, particularly to a small point, anything but easy or simple… or cheap.

So how does one put together a “targeted” orbital bombardment? First, you need mass, and if that mass is to survive atmospheric re-entry it needs to be compact and dense, and you need enough separate chunks of that mass to reduce the objective, again assuming you’re interested in merely taking out military targets and not leveling and churning whole sections of the planet. In a planetary orbit, exactly where does one get such mass? If there’s a moon, the mass has to be mined and broken into the right sizes and shapes. If there’s no moon, such mass must be lifted off the surface [highly unlikely, because if invaders control the orbital area, the locals won’t get that far, and if the invaders need bombardment to control the planetary surface, they obviously can’t get to the surface to obtain the mass required for bombardment]. That leaves asteroid or other out-system mining, all of which require yet more equipment and transportation methods, adding time, cost, and yet more technology.

Second, the bombardment “projectiles” need to be of almost identical size and composition in order for there to be any chance of dropping them into a re-entry path that will get them anywhere near the target. There’s also the need to compute such paths, and against a series of objects, such as defense installations, that requires considerably different computations for each target… and the equipment and software to do so. Even so, in all probability, given all the variables involved, even precisely engineered objects will spread or shift in re-entry and descent so that they’ll be unlikely to land within a kilometer of the target or targets, let alone within yards. An independent guidance system, with the equivalent of steering jets, is most likely required to assure impact near the target — but that’s effectively the definition of a missile, and would require rather large factories somewhere, plus miniature AIs and fuel, etc.
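To put rough numbers behind the “sharp stones” problem, here is a back-of-envelope sketch, assuming Earth-like orbital parameters and a one-ton dense projectile. All of the figures are my illustrative assumptions, not anything from the discussion above:

```python
import math

# Back-of-envelope figures for an Earth-like planet (illustrative assumptions).
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_PLANET = 5.97e24       # planetary mass, kg (Earth's)
R_ORBIT = 6.371e6 + 4e5  # low orbit: surface radius plus 400 km, in meters

# Circular orbital velocity: v = sqrt(G * M / r)
v = math.sqrt(G * M_PLANET / R_ORBIT)

# A dense 1,000 kg projectile moving at orbital velocity:
m = 1000.0
ke_joules = 0.5 * m * v**2
ke_tnt_tons = ke_joules / 4.184e9  # one ton of TNT = 4.184e9 joules

print(f"orbital velocity: {v / 1000:.1f} km/s")
print(f"kinetic energy:   {ke_tnt_tons:.0f} tons of TNT equivalent")
```

The answer works out to only a handful of tons of TNT equivalent per ton of projectile, and that is before re-entry heating bleeds off much of the energy. A full ton of painstakingly mined, shaped, and guided mass delivers roughly the punch of a few tons of conventional explosive, which rather supports the cost-ineffectiveness point.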

In short, orbital bombardment with “sharp stones” doesn’t look too likely as a candidate for precision ground targeting, either for practical or technical reasons. And if you want to destroy the planet or the culture, you only need one smallish asteroid or comet.

While I’ve oversimplified somewhat, the point is that a number of so-called high-tech solutions advocated to replace more “conventional” weapons really won’t work in practice. Some, of course, do, but that’s when the economics, the technology, and the battlefield environment go hand in hand. When a modern jet costs upwards of $50 million, and when it costs $5 million plus to train the pilot, you can afford to lose a great number of far smaller and less expensive RPVs for the most dangerous missions, but you still can’t afford to use them against individual soldiers or terrorists on a wide-spread basis.

Recent wars and conflicts, including the drug war in Mexico, continue to illustrate the same dichotomy as the orbital bombardment issue I outlined above. Focused military high-technology is extraordinarily good at annihilating discrete objects, often quite large objects, but it is expensive to develop and deploy and has considerable limitations in dealing with smaller targets, particularly those mixed in with objects and people you don’t want to destroy. Also, using expensive high tech indiscriminately against multiple and numerous low-tech targets has a tendency to bankrupt the high-tech user.

While times and technology change, they change equally over time for the attacker and the defender, and several thousand years of military history suggests that every technology runs into limits and that both conquest and resisting conquest require soldiers with weapons, and that many wonderful ideas like targeted orbital bombardment remain wonderful ideas… and little else.

The Not-So-Free World-Wide Web

There are several underlying assumptions that all too many internet users have. Actually, there are more than several, but I’m going to discuss one aspect that is both tacitly accepted… and erroneous.

That’s the belief that the content on the web largely is and should be “free.” None of it is truly free. It can’t be, by definition. Now some content is obviously and effectively “pay-to-view.” If you want to access certain services, certain libraries, and the like, someone has to pay. I can’t access the scholarly articles on JSTOR, not without subscribing, but my wife can, IF she accesses them from her university computer, because the university has paid for that service for its faculty and staff. Likewise, because I’ve written a number of stories for Jim Baen’s Universe magazine, I can access the stories there, but she can’t, not without my password and ID. Some library systems have also paid for access to otherwise “pay-restricted” content, and if you use their computers, you also can access that material.

But…doesn’t the rest of the web offer a wide range of “free” content?

Not on your life, it doesn’t. First, there are all the ads, pop-up or otherwise. Every time you access a site with such ads or banners, that site is being supported in part or in whole by advertising, and you’re paying either with delays or by reading or watching, even if momentarily, that ad content. Even on this site, which has no overt ads, Tor is paying for the site and the technical maintenance, and I’m probably devoting entirely too much time to trying to intrigue and entertain you so that you will read and buy more of my books. Just how long do you think Tor would do that if no one bought my books? Most sites by professional writers, or writers trying to be professionals, are set up and maintained for the purpose of selling the writer and his or her works. They’re “free” only in the sense that the viewer doesn’t have to come up with payment on the spot.

When the Bush Administration asked for Big Brother powers, and Congress granted them under the Patriot Act, at least some Americans rose up and asked why. Some protested the erosion of long-held civil liberties. But it seems like many of those who did so now have surrendered to the commercialized versions of Big Brother. Yes, indeed, give this advertiser or web merchant your sales profile and your tastes. Provide your address here, and your birthday here. Post all your friends and preferences there…

It wasn’t the government that destroyed the American financial system; it was the banking and commercial interests, as well as the average American, all seeking something for nothing, or for far less than it was worth. Do we have to fear that it will be the government that destroys personal privacy and possibly civil liberties? Or will we do it to ourselves for the lure of “free” content, wanting to “join our friends or online communities,” or for apparent ease of communications and shopping?

Free? Think again.

Not Everyone Can…

Over the past few years, there have been endless commentaries about the younger generations, and those have ranged from praising them as the most capable, most intelligent and most promising generation yet to total condemnation as spoiled brats who believe that they’re entitled to whatever they want by virtue of their mere existence. One problem with such assessments is that they all tend to be blanket judgments, and each generation is made up of a range of individuals with differing abilities and capabilities. Another difficulty with such judgments is that each up-and-coming “generation” reacts to the societal environment in which they grow up.

I see a certain amount of truth in the observation that at least a significant portion of the young people who are entering the work force or who will do so in the next few years do in fact feel “entitled” to privileges, income, and position for which they are not yet equipped. But I don’t see too many of those who are already in the work force asking, “Why do they feel this way?”

The answer, to me, at least, is that a large number of parents, and, unfortunately, also a large number of teachers who have felt forced by parental pressure, all have conveyed the message that these young people are “special,” not by dint of achievement, academic superiority, or sheer perseverance, but simply because they exist. With this has also come the totally erroneous idea that self-esteem must precede competence. Coupled with these messages is another insidious idea — that each of them can be anything he or she wants to be. Then, the third leg of this proposition is the underlying assumption that “my” child isn’t like all the others. My child is special; the others aren’t. This leads to the assumption and incredible legal pressures to bend, break, or discard the rules.

My wife is a college professor, and she has been threatened with legal action on several occasions, all of which occurred when she insisted on applying exactly the same standards to one student as those applied to others. These were not extreme standards, but the issues of attending classes and rehearsals, of turning in work on time, of being in class and taking tests. In all cases, the students involved, and/or their families, were incensed that she did not see that family picture-taking sessions took priority over dress rehearsals or tests, whose dates were announced in writing months before, or that students who illegally used university copying equipment to print defamatory material with sexual implications against other innocent students should face disciplinary actions, or that dropping a class three-quarters of the way through the semester without doing the work resulted in failing the class. More and more often, parents are insisting that “special” circumstances apply to their children, and they’re threatening teachers and schools that don’t grant such exceptions, so much so that a number of universities have begun to hold sessions for professors, briefing them on such possibilities.

In a functioning society, there are limits. If you can’t pay the mortgage, sooner or later, you will lose the house. If you commit violent acts repeatedly, eventually you will get caught and punished. If you don’t do your job the way it needs to be done, before long you’ll be working somewhere else… or not working.

Yet many of these very same parents fret about the “entitlement generation.” And just who raised that generation?

Song? What Song?

I did break away from the computer on Sunday night to watch some of the Academy Awards ceremony… and one part of the awards absolutely appalled and floored me, and that was the award for the best original song.

It could be that I was so negatively astonished because I’ve always been under the impression that songs were supposed to have more than a dozen words, rather than the same ten repeated endlessly. Or it could be that I believe that they’re supposed to have melodies longer than a commercial jingle, and that those melodies should be better than those of a commercial jingle.

Of the three “nominees,” I don’t think any single one had a phrase longer than about four bars [maybe six?] before repeating. All had simplistic and droning repetitive rhythms, as well as equally simplistic lyrics. All three “songs” were the song-writer’s equivalent of classical music minimalism crossed with repetitive rhythm… and people claim opera is boring? Compared to what I heard on Sunday night, I’d far rather sit through all sixteen hours of Wagner’s “Ring” cycle than spend fifteen minutes listening to such so-called songs… and supposedly these were the “best” of 2008?

In this light, one of the most ironic aspects of the ceremony was the musical number performed by Hugh Jackman and a hundred or so others that proclaimed that “the musical is back.” Oh… virtually all the lyrics were from older musicals, cannibalized or perhaps zombie-ized, for the production. No… it appears that musicals may be resurrected by Hollywood, but they’re not being created or born because no one seems able or willing to write songs that are actually songs… or, if they can, those who produce the movies seem unwilling to include anything that’s actually an original song.

Now, I’ve seen a number of very good new movies over the past year, with a number of good songs as part of the soundtracks, or even performed. BUT…none of those songs were new or original. Where are the modern equivalents of Love is a Many Splendored Thing, White Christmas, Over the Rainbow, Moon River, You Light Up My Life, The Way We Were, The Windmills of Your Mind, or The Shadow of Your Smile? Even the losers of past years — such as Nobody Does it Better, On the Road Again, I’ve Got You Under My Skin, They Can’t Take That Away From Me, Our Love Affair, That Old Black Magic — are far better than recent winners.

Just what happened to actual song-writers? The ones who knew what melody and lyrics were and who could move listeners without the aid of gyrating dancers and near-lethal percussion?

Observations on Book Recommendations for 2008

It’s now time for the usual scramble for F&SF readers — or at least the hard-core and devoted ones — to put forth their nominations for the Hugo Awards, the annual reader popularity contest of the World Science Fiction Convention, which, by the way, will be held in Montreal in early August. As a prelude to this exercise, all sorts of “best” lists and “recommended” lists have popped up everywhere.

In looking over the reading lists that I’ve so far seen for the year, several things stand out.

First, most of the novels — well over 80% — appearing on most “recommended” lists come from “big” publishers, but most of the works that are shorter than novel length don’t come from the larger name print magazines, but from a variety of sources.

As others have observed, there’s also an increasing “British” flavor to the recommended novels. Whether this is a result of changing reviewer/recommender tastes or better British writing or just happenstance… who knows?

Another interesting fact was that, although there is no Hugo category for author collections, several recommended lists do mention them, and almost no recommended single author collection was listed as from a “major” publisher, although a number of authors with recommended collections from smaller presses are in fact novelists who are published by major publishers. This, along with the proliferation of stories recommended from smaller sources, tends to suggest that the story market is a far different market than the novel market. I can certainly recall when far more story collections were issued by major publishers.

This trend is also reinforced by the recent demise of several long-running “best of” anthologies, as well as the movement of other “best” anthologies from major publishers to smaller presses. I fretted about the decline of short fiction reading in an earlier post, and this trend seems to be continuing, with short fiction retreating to various lower-volume/total revenue venues, such as online or limited print-run magazines, so that the number of short stories published may actually have increased in recent years, without the total volume of readership increasing significantly… or perhaps even declining.

Since I clearly couldn’t read all of the books listed, my overall impressions are based on the comparative few I did read, the authors, and the summary recommendations, but I have the feeling that many of the recommended books listed as science fiction verge on adventure science fantasy, while of those listed as fantasy, few would actually fall into the “popular” definition of fantasy. I suspect this illustrates a trend more among reviewers than among readers.

All in all, it’s an interesting time in the world of F&SF… and I just hope it’s not “interesting” in the sense of that ancient Chinese curse.

The Self-Deceptive Society

With all the publicity about the greed in the financial and auto industries, everyone’s asking how it all happened, and pointing the finger… but very few are pointing it in the right direction. I’d like to suggest where the finger belongs can be determined by looking at a few numbers and what they represent… and what they don’t.

To begin with, the “real” U.S. growth rate over the past half century has seldom exceeded 3.5% annually. Over the past 20 years or so, productivity growth has generally been around or slightly below 10%. Over the 1900-2005 period, corporate profits averaged around 5%, yet since 2002, corporate profits ran more than 50% above that average, and in 2007, they were almost 60% above that average, even as inflation was reported as “nominal.” How could this possibly be? Productivity wasn’t up that much, nor were costs down. In fact, until 6 months ago, energy costs were skyrocketing. What caused the reported profit increases was leveraged liquidity, since the costs of all those derivative-based funds weren’t shown as costs on anyone’s books, and even turned up as assets on many companies’ accounting ledgers.

Yet, the stock market kept climbing, largely because the analysts kept pointing out that the P/E ratios [price/earnings ratio] of stocks were far below historic highs and at a 10 year low. The only problem with that was that the earnings were in all too many cases deceptively inflated.
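A toy example shows how phantom earnings flatter a P/E ratio. The numbers here are invented purely for illustration; they aren’t drawn from any actual company:

```python
# Hypothetical figures illustrating how overstated earnings make a stock
# look historically cheap on a price/earnings basis.
price = 100.0
reported_earnings = 8.0  # per-share earnings, including derivative "gains"
phantom_gains = 4.0      # the portion that later proves illusory

reported_pe = price / reported_earnings                 # what analysts quoted
true_pe = price / (reported_earnings - phantom_gains)   # what was really there

print(f"reported P/E: {reported_pe:.1f}")  # looks cheap against historic norms
print(f"true P/E:     {true_pe:.1f}")      # actually expensive
```

With half the earnings illusory, a stock quoted at a “bargain” P/E of 12.5 is really trading at 25, which is how a market can look cheap all the way into a collapse.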

Housing prices continued to inflate, based on demand-fueled, statistically flawed lending models that resulted in far too many people being given loans that could never be repaid.

Add to that the feeling that inflation was low and under control, based on government statistics. According to those figures, except for a period in the 1970s, inflation has been below 4.0%… except… the measurements for inflation, as I noted in an earlier blog post, have been changed to eliminate such key aspects of daily living as housing, food, and energy, and including those would increase the number by as much as 40%, and be far more realistic. What this meant was that Americans on the lower side of the income scale were getting squeezed, because so many of their benefits, from Medicare to Social Security, weren’t keeping up with real price increases, since those benefits were indexed to the “core” inflation numbers. Then, just before everything turned sour, consumer debt peaked at an all-time high.
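The mechanics of that squeeze can be sketched with a hypothetical household basket. The weights and price changes below are invented for illustration only; they are not official CPI data, and which items the official “core” measure excludes is simplified here to match the post’s argument:

```python
# Hypothetical basket illustrating the gap between a full price index and a
# "core" index that drops housing, food, and energy. All numbers are invented.
basket = {
    # item: (share of household budget, annual price change)
    "housing": (0.30, 0.06),
    "food":    (0.15, 0.04),
    "energy":  (0.10, 0.12),
    "other":   (0.45, 0.02),
}
excluded = {"housing", "food", "energy"}

# Full index: weighted average of price changes over the whole basket.
full = sum(w * dp for w, dp in basket.values())

# "Core" index: reweight whatever remains after the exclusions.
core_weight = sum(w for k, (w, dp) in basket.items() if k not in excluded)
core = sum(w * dp for k, (w, dp) in basket.items() if k not in excluded) / core_weight

print(f"full-basket inflation: {full:.1%}")  # about 4.5%
print(f"reported 'core' rate:  {core:.1%}")  # about 2.0%
```

A benefit indexed to the 2% “core” number loses ground every single year against the 4.5% the household actually faces, and the gap compounds.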

Throughout this entire situation, I doubt that any number produced by anyone was essentially accurate, or that, no matter how hard any analyst tried, any statistical assessment could be more than an approximation.

For various reasons, ranging from out-and-out greed to misguided altruism, we’ve created a system where few if any of the metrics industries and government use are accurate, and some are so far from such accuracy as to be laughable… if the results weren’t so tragic. Yet, in case after case, when those few analysts who did understand and had the nerve to speak out tried to point this out, they were ignored, if not pushed out, because the deception was so much more comfortable to so many people. Just look at the analyst who tried for years to get the SEC to investigate Bernard Madoff. He couldn’t “prove” it through evidence; he only knew that such reported returns were impossible in practical terms, just as the returns on all the derivatives turned out to be.

No one can save us from our own self-deceptions… except us.

The Opiate of…The Capitalists?

A recent edition of The Economist graphed popular belief in the theory of evolution on a country by country basis and noted that, with one exception, belief in evolution tended to follow prosperity. The exception to this finding was, of course, the United States. While the Scandinavian nations were listed as those with the greatest percentage of the population [75% and over] believing in evolution, believers in evolution in all of the western European nations surveyed exceeded 60%, while only 38% of Americans believed in evolution as a true description, and over 45% of Americans surveyed believed in Biblical versions of human creation, as opposed to evolution.

I looked at the numbers slightly differently. To me, it appeared that, in those nations where belief in evolution was highest, the social and political infrastructures were more “socialistic,” in that health care and income security were near-universal, whereas in the United States, there is greater variation in both income and in health-care security than in all the other industrialized nations surveyed. Put more bluntly, this suggests that people turn to religion when they perceive their lives as less certain or economically secure.

This correlation might well suggest that the more “free-market oriented” and less economically regulated a nation’s economy is, the more religious its population tends to be. This might well lead to other disturbing corollaries. Do the arbitrary nature of most deities and the rules of their faith come from a culture’s economic inequality and social uncertainty? Or does the arbitrariness of religion essentially justify income inequality? Certainly, there seems to be a correlation, because the nations where the people have democratically rejected gross income inequality also seem to be the ones who have turned most from religion.

Could it be that religion is really a covert apologist for inequality? And that Marx just might have had it right?

Distribution System Failures

Many years ago I spent two years as a Congressional staffer who, among other duties, followed the deliberations of the House Appropriations subcommittee dealing with the U.S. Post Office [yes, that long ago]. I tried to get my boss to push for a change in the way the Post Office calculated costs, because from what I figured then, and what I still figure with the U.S. Postal Service, the cost model they used and continue to use undercharges for bulk and pre-sorted mail: it doesn’t fully take into account the need for extra capacity created by lower rates on huge volumes, but instead prices bulk mail at its marginal cost. I never had a problem with magazine rates, but I’ve always had a problem with catalogues and advertising or “junk” mail, which has a far greater volume than do magazines. That Postal Service model doesn’t take into account the additional capital and equipment investment required for all that bulk. Now, as a result of current and projected deficits, the Postal Service is recommending that mail deliveries be reduced to five days a week, yet the bulk advertising mail rate is low enough that all too many mail order operations can still afford to send my household multiple copies of their junk rather than clean up their mailing lists. Anytime it’s cheaper to print and send excess copies than to streamline internal mailing lists, the “distribution costs” are too low.

I may be unusual, but I’m more than willing to pay more for a paper copy of the periodicals I value, and if their prices go up, I still pay for them [so far, at least] because the paper copies allow me more efficient time-allocation in a crowded day.

Now… over the past week, it has come to light that Anderson News unilaterally decreed a seven cent a copy surcharge on magazines and books it distributed. Since Anderson serves as the wholesale distributor for something like half the magazines distributed in the United States, as well as half the paperback books that aren’t sold to the bookstore chains directly by the publishers and book wholesalers such as Ingram, this surcharge represents an enormous additional cost to the publishing industry, and some publishing companies, notably Time, Inc., reportedly refused to pay the surcharge. As a result, Anderson “temporarily suspended” its entire distribution business and informed the bulk of those employees that they were on personal or vacation time… if they had any.

While I confess that I don’t know all the details of the Anderson finances, I do know that some twenty years ago, the wholesale magazine and book distribution business imploded from more than 1,000 local and regional distributors into an increasingly consolidated distribution system that now essentially consists of less than a handful of national distributors and leaves most of the country with no effective choice of distributors. This system has also resulted in an increasing restriction of choice for those accounts serviced by the major distributors, while generating a surfeit of waste paper in terms of magazines and paperback books being pulped, rather than being returned.

So, with Anderson, we have a service that, over the years, has gotten more and more centralized, with less and less competition, offers less and less choice, and creates significant wastage because consumers do have less choice. Yet, I’m told by those in the business that the magazine side, which has even more wastage than pulped paperback books, is the most profitable, and that any attempt to create greater choice on the book side is highly resisted.

In the case of both Anderson and the Postal Service, effectively, those who depend on their services are faced with monopolies. In the case of the Postal Service, first class users, no matter what the economists claim [and having been one, I don’t buy their number crunching], pay more than they should while advertising usage pays too little. In the case of Anderson, there’s effectively no alternative.

Economic history has shown, time and time again, that monopolies have serious drawbacks… as these two cases illustrate. So… why are all the banking bailout plans concentrating so much on reducing the number of banks?

Sand Castles… and the Fascination with Destruction

From the time I was very small, when the opportunity arose, I built sand castles, and always just above the wave line of the ocean. And always, eventually, they succumbed to the tides and the water… as I knew they would, even as I built them to attempt to withstand the oncoming water as well as possible. Occasionally, I just happened to build one at high tide and above the high water mark, yet, invariably, if I came back to look at it the next morning, someone had always stomped it flat. Over all the years… and on all the beaches, those that the water didn’t get, humans did.

Then there are the hunters, and there are essentially two kinds, those who like or need the meat and hides and use most or all of the animal and those who trophy hunt. I understand and respect the first type, but I don’t respect or really understand trophy hunting. It’s a form of destruction to prove that the hunter can do it. To me, that’s a lose-lose game. After a million years of evolution and sophisticated tool-making, if you can’t eventually kill an animal that has no high-powered weapons, no clothing to conceal itself better than natural camouflage, and no concentrated food-stuffs to allow itself to hunt [or evade hunters] continuously, you’ve wasted your evolution. And if you do kill the animal, it’s dead, and you really don’t have any use for it, except to prove that you’re not a failure. Destruction to prove you’re not a failure? But isn’t that failing?

One of the latest aspects of destruction to prove one isn’t a failure seems to be the hackers, or whatever the latest term is for individuals who go to great lengths to invade and destroy other people’s computers and networks. Exactly what’s the point? Destruction is far, far easier than construction. Look at all the effort it takes to develop the hardware and software for computers and networks, and with a bit of skill and cleverness, or theft of codes and ideas, millions of computers get crashed. Again… for what? To prove that you can destroy something? To ruin people’s work and lives? To cost them time and money?

I’ve always wondered why so many human beings have such a fascination with destruction. Is it instinctual on some primal level to want to destroy anything anyone else built? Is it an instinct to prove one is stronger by destroying someone else or their creation?

One could argue that destruction forces greater achievement by the doers. It does, but so much of that “achievement” is wasteful. How much in resources do societies spend on security? On multiple levels of computer systems? On home-security systems? On guards and police?

To me, willful destruction is the mark of the second-rater, or of the failure. Those who can’t build, or who don’t respect those who do, resort to destruction, character-assassination, back-biting, and all the lesser forms of destruction.

Compared to creation and building, destruction is easy… and, one way or another, all the rest of us pay for it.

Books and Movies… Never the Twain

While movies often are inspired by a book, and once in a great while a book is written based on a movie, they’re very different creatures. Now this is clearly obvious on the surface, but the implications go far beyond the easily apparent.

For one thing, there are far fewer movies released each year than books. I won’t even try to get into all the figures, but some comparisons are indicative. There might be twenty F&SF-related movies released in one year, and fewer than a handful in an off year, but according to the latest Locus survey, over 1,600 new F&SF titles were released in 2008, and that total was down slightly from 2007. Major studios release somewhere between a hundred and a hundred-fifty “significant” pictures, those with production and marketing budgets over $100 million, and perhaps another three hundred “lesser” films every year, compared to 50,000 new fiction book titles released every year.

Then there’s the complexity. Most movies have a plot line that’s the equivalent of either a short story or a very simple graphic novel. Books range from age-two cardboard picture books to the incredibly complex multi-volume series.

Movies, obviously, are visual, and that means that no visualization or imagination is required of the audience, unlike a book, which requires far more effort on the part of the reader. What’s more interesting is that the growth of CGI and other special effects has further reduced the “imagination quotient” of movies. In turn, this tends to reduce subtlety and/or to replace it with easily recognized visual hooks that tie into easily recognized cultural tropes and references.

One of the greatest differences lies in the marketing. The average major studio picture costs around $110 million, and approximately a third of that is marketing and advertising. In effect, the marketing budget for two weeks’ worth of new film releases is most likely more than the total annual marketing budget for the entire fiction publishing industry. The vast majority of books receive essentially no publicity, except in the publisher’s catalogue and in a few targeted “trade” publications, if that. That’s why so many writers not only do, but must, resort to self-publicizing, going to conventions, blogging, etc.
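The rough arithmetic behind that comparison runs as follows. The $110 million average cost and the one-third marketing share come from the figures above; the pace of roughly three major-studio releases per week is my own assumption:

```python
# Rough arithmetic for the film-marketing comparison. The average cost and
# marketing share come from the post; the release pace is an assumption.
avg_picture_cost = 110e6
marketing_per_film = avg_picture_cost / 3  # about $37 million per picture

films_per_week = 3                         # assumed major-studio release pace
two_weeks_marketing = marketing_per_film * films_per_week * 2

print(f"marketing per film:  ${marketing_per_film / 1e6:.0f} million")
print(f"two weeks' releases: ${two_weeks_marketing / 1e6:.0f} million")
```

Even under these conservative assumptions, two weeks of film releases carry on the order of $200 million in marketing, a sum that dwarfs what fiction publishing spends on promotion in a whole year.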

There is one similarity. Reviews, whether of movies or of books, play a minimal role in determining the success of either. Movie-makers aren’t interested in the views of 50-year-old critics when most movies are emotionally targeted at the under-25 audience. On the book side, however, I suspect that reviews play a small part because there are so few compared to the number of titles published every year and the diversity of the reading audience, and the number of “official” review outlets is dwindling, although online reviews, by pros, semi-pros, and everyone else, are increasing rapidly.

Another interesting comparison is market segmentation. Essentially, cinema marketers see their market in four quadrants plus one: men under 25, women under 25, women over 25, and men over 25… and children. I suspect those quadrants refer more to emotional ages than chronological ages, but that’s still the targeting. Needless to say, in general, males under 25 prefer action and more action, neat gadgets, and sex, while women prefer romance and only hints of sex. A movie needs to appeal to at least two quadrants to be guaranteed of a chance at profitability, but… men over 25 don’t go to nearly as many movies as viewers in the other three quadrants, and women over 25 tend to be much more choosy about what they see. So… guess what audiences most movies are designed to attract? By comparison, books have a wide range of genres, which in turn have great numbers of subgenres.

In summary… we have in books thousands and thousands of titles on every subject for every taste at every level of literacy, but in a form that requires a certain amount of thought, concentration, and imagination and with numbers of individual titles that preclude wide-scale and intensive marketing and result in overall low profit margins, while movies are a high profile, exceedingly heavily advertised and marketed, visually in-your-face medium released in limited numbers and largely marketed to the least sophisticated under-25s.

And people ask, why are movies profitable and books struggling?

Not the Strongest… or Even the Brightest

More than 200 years after the birth of Charles Darwin, most people, even some very intelligent individuals, don’t fully understand “natural selection” or evolution. Now… I’m not about to try to explain all the fallacies inherent in most popularized misconceptions, but there’s one that’s so blatantly misunderstood that I just can’t help myself.

Most people equate the term “survival of the fittest” with survival of the strongest individuals, or occasionally, survival of the most intelligent. The problem with this equation is that individual survival and success don’t necessarily translate into species survival. One of the most fearsome predators of our time is the tiger, and it’s endangered and may not survive. The equally well-equipped polar bear faces similar problems. Recent studies indicate that Neanderthals had every bit as much brain power as Homo sapiens, perhaps more, and they were physically stronger to boot. The most muscular and intelligent human being, stripped down to a loin-cloth, wouldn’t last more than a few minutes against most large predators.

But humans have brains and tools, and we no longer have to face predators bare-handed or with crude tools that one person could make. That’s absolutely true, but that also points out a corollary. What makes us deadly as a species is not that we are stronger, which we are not, nor that we are more intelligent, which we generally are, but that we cooperate. No human, no matter how brilliant, has the ability to make the sophisticated tools and weapons we possess by himself or herself. Even an individual placed in a Robinson Crusoe situation who creates tools and survives does not do that by himself or herself, because the knowledge required to create such tools is a product of the human culture that has facilitated cooperative learning.

We tend to pride ourselves on our species’ accomplishments, but we’re newcomers to the world. The world has been around some four billion years, and human beings are lucky to be pushing a million years as a species. Cockroaches aren’t particularly strong on an absolute scale, nor are they particularly bright as individuals, but they’ve been around for over 200 million years. Virtually all other species on the planet have been around longer than humans, and dinosaurs lasted for hundreds of millions of years.

Yet, day after day, in forum after forum, people extol the “survival of the fittest” to justify oppression of those weaker, less intelligent, or less fortunate by individuals who are stronger, brighter, and more fortunate. This overlooks the fact that the fittest aren’t those who are the best predators; they’re the ones who are best at dealing with the predators… and that’s another reason why we developed customs, rules and laws, because not all predators are from other species.

That brings up another corollary. Throughout human history, the most successful cultures have been those that dealt most effectively with both external and internal predators. Over time, there’s close to a direct negative correlation between the percentage of a culture that dies violently and its degree of “civilization” and success. That is, measured over time, the percentage of violent deaths goes down as the culture becomes more successful. Some anthropologists suggest that prosperity reduces violence. I doubt that strongly. Reducing violence increases prosperity, but only through the application of cooperation and of social, and sometimes physical, force exercised with minimal violence. One doesn’t reduce violence by relying strictly on violence to do so.

So… let’s have a little less rhetoric and indirect glorification of the abuse of power disguised as “survival of the fittest.”

The Image Culture

Over the years I’ve been bothered by the fact that, in so many areas, from job interviews to popular entertainment, western culture, particularly in the United States, has moved more and more toward making judgments and decisions based essentially on image. This trend, unfortunately, seems to be accelerating despite growing popular, detailed, and statistical evidence illustrating the faults of such an approach, in large part, I suspect, because of the excessive intrusiveness of the media in all aspects of our lives.

The complement to “image” is “ego-stroking” or flattery. We all like to feel good about ourselves, and most of us respond positively to those who seem to go out of their way to bolster our self-images.

The financial crisis that has besieged the United States and threatened to swamp the entire world financial system was generated primarily by the interplay of image and ego-stroking. “Everyone deserves a home of their own” — whether they can afford to pay for it or not. “You can have it all” — with a home-equity line of credit to use inflated real estate as a personal piggy bank… at least until the bottom drops out. “Of course, you can trust the derivatives of Lehman [or Merrill Lynch, AIG, Citigroup, or any other well-known investment bank or insurance conglomerate]” — even though we can’t even explain them accurately to our own CEO. “We’ve figured out the stock market, and there’s no way the Dow won’t hit 36,000” — just ignore the fact that such a gain would require either tripling the US GDP or 400% inflation. “Bernard Madoff has an impeccable reputation, and pays incredibly good returns” — with new investors’ money, just like every other Ponzi scheme.

Even the McDonald’s commercials get into the act with “you deserve a break today.” Perhaps you do, but that doesn’t have much to do with whether you really should, given the state of your finances. More to the point, it creates an atmosphere and attitude that what you “deserve” is far more important than what you can afford. It’s blatant ego-stroking, and it’s so obvious and prevalent that very few people even consider the society-wide implications.

But… as is often the case, there’s an even darker side to image and ego-stroking, and that’s a societal turn away from the recognition of and appreciation for ability and competence — unless those qualities also come with a great image. Unfortunately, in real life, they usually don’t. The accountant or product analyst who tells the CEO that the product isn’t that good, despite the image, is more likely to be fired than praised. The critic who suggests that the singers on American Idol are less than fifth rate will be pilloried or ignored. The job-seeker who’s shy or tongue-tied under interrogation, but who’s brilliant at analyzing or writing or developing new products, usually loses out to the candidate who’s better-looking and glib, even if the substantive skills of the good-looker are weaker or non-existent. That, too, is a cause of the financial meltdown, because the CEOs of the big brokerages and investment banks all had great images and a lousy understanding of what they approved… or a lack of ethics, if they did understand.

Put another way, as a culture we’ve come more and more to reward flash over substance, to demand ego-stroking over honest evaluation, and to value the shallowness of quick rewards over long-term substantive accomplishments.

And now we’re paying for it… and most people still don’t understand why. American Idol and all the ego-stroking wish-fulfillment shows still top the popularity charts, and Toyota just became the biggest car manufacturer in the world by spending years building better cars, even as the well-groomed Detroit auto executives in their ego-stroking private jets beg for more federal handouts while trying to keep producing gas-guzzling behemoths that most buyers choose for their own ego-stroking reasons. How many drivers really need 400-plus horsepower — except to make themselves feel better?

A hundred years ago, the popular fable was Horatio Alger and how hard work led to success. Today, the most popular books are the Harry Potter novels, about how magic… and wishing… can make things better.

Doesn’t that say something?

Dividing Lines… and Judgment

What’s the difference between obtaining timely information, products and services and an obsession with instant gratification? Or between being able to obtain a hot meal quickly and suffering fast-food frenzy? Or being able to get in touch with family or job quickly and having an unseen cellphone umbilical cord that’s really a communications chain that leaves you at the mercy of everyone else?

For that matter, what’s the difference between a “reasonable” editorial and an editorial rant? Or a good book and a great one?

For a traffic cop, is it reasonable to ticket a driver going three miles an hour over the speed limit… or is the dividing line five or seven… or where the possible violation takes place? How many school assessment tests for pupils are too many? Or too few? Is it reasonable to apply the same standards for achievement to immigrant children as to upper middle class students?

For tax policy… who’s rich? Where are the dividing lines on income when lawmakers decide to increase or decrease taxes on the wealthy? How can they be fair when the cost of living differs so markedly from one part of the country to another?

For a publisher, how many copies of a first novel must be sold in order to consider buying an author’s second book? If the book is right on the edge of profitability, what tips the decision one way or the other?

All these questions aren’t meant as examples of the unsolvable, but as examples of the daily judgments people in all areas of life must make, one way or another… and what I’ve written touches only the barest fraction of the complexity of human society and life. I raise them because, again, I tend to get tired of the proliferation of rules and laws that try to answer every single injustice or odd situation.

In the United States, in particular, we seem to have this idea that when there’s a wrong, another rule is just the thing to right it. Except… it doesn’t seem to work that way. The USA probably has more rules dealing with regulation of the financial sector than any nation on the face of the earth… and we’ve just created the biggest misuse of funds in human history, carried out by what is most likely the most corrupt [I didn’t say illegal, just corrupt, although some did break various laws] group of executives ever, at least in dollar terms, all of them trying to find “legal” ways to accomplish the unethical, and, from what I can tell, not a single one ever asked whether what they were doing was “right.”

We have among the strictest laws against bribery of public officials, and yet during the past session, Congress enacted the greatest number of payoffs ever in U.S. history with public funds — through the entirely legal process of “earmarking” — often in response to perfectly legal, if significant, campaign contributions.

Recently, in a number of locations across the industrialized world, various cities have taken the step of removing speed limits, traffic lights, stop signs, and in some cases even sidewalks from streets. At first thought, this seems like a recipe for disaster… except average automobile speeds have dropped, as have fatalities and accidents. Apparently, when people have to use their judgment, rather than rely blindly on rules, they behave better. Part of that also might be that the “rebels” have nothing against which to rebel.

I bring this up because it’s an illustration, at least to me, of how the proliferation of laws and rules causes people to focus on legality rather than upon ethicality, and what can happen when people have to rely on their individual judgment. What if the law just declared that misrepresentation of facts to obtain funds constituted fraud, and the greater the misrepresentation and the greater the funds obtained, and the harm created, the greater the crime… and left the sentencing in cases where guilt was proven to the judge and jury?

Ah… but that wouldn’t work. We really can’t trust human judgment, and we need all those hundreds of thousands of pages of regulations and laws… don’t we?