Archive for November, 2009

Failure of Economics and the Market System?

Recently, there have been a number of articles about the failure of economics and its metrics to predict and reflect the “health” of the United States. Much of the criticism has focused on the use of GDP [Gross Domestic Product] as a leading indicator. Unfortunately, such criticisms, while they have statistical and economic validity, have the result, whether intended or not, of shifting debate away from the larger problem.

The larger debate has been around for a very long time, but with the growth and power of the “market economy” and those who benefit directly, and often excessively, from it, those earlier misgivings tend to be buried in the detritus of history. There was a reason why William Jennings Bryan rallied millions behind his 1896 presidential campaign against what he saw as the Republican plutocrats with his declaration that “you shall not crucify mankind upon a cross of gold.” Although technically part of a speech for bimetallism, the slogan reverberated through the West, the laboring class, and poor farmers. In his poem “The Gods of the Copybook Headings,” Rudyard Kipling pointed out exactly what happens when the “gods of the marketplace” become paramount. The Russian revolution, while it might have been directed by goons and disaffected intellectuals, was paid for by the blood of the poor and disadvantaged, as was the more recent Cuban revolution and the rise of Fidel Castro. The current American rage against investment bankers and the “market” also reflects a gut-level feeling on the part of most Americans that valuing everything in dollar terms is somehow wrong, even as we react to commercial after commercial insisting that happiness and success come from acquiring this and that — and more in general.

Yet the reaction of the financial, economic, and political leaders has been merely to address the shortcomings of the “market system.” Too many of these leaders, and too many of both the critics and those who feel that the “GDP Problem” is resolvable, are ignoring the critical assumptions that lie behind the use of economic statistics to define, for want of a better term, “national prosperity.” The first assumption is simply that, given modern methods, anything of value is commercial and can and should be valued and quantified accurately. The second is that, in economic terms, those things that cannot be valued and quantified in hard and measurable terms are of lesser or no value. Now, I’m well aware that my statement of the second assumption will scarcely go unchallenged, but in economic and public policy terms, there shouldn’t be any dispute. For example, almost no business or corporation put an economic value on acts that degraded the environment until governments stepped in and assigned values, essentially by fiat, in the form of fines and regulations. At that point, and only at that point, did the environment become valued in economic terms. The same sort of reaction occurred with regulations on child labor and wages.

Before going on, for those who may think that I am being excessively “liberal,” I want to make several basic points. First, like it or not, every working nation or region needs to maintain over time a viable market-based economy. You cannot trade, purchase, or sell goods or services without a societal mechanism for doing so, although there are many variations on “market economies,” some better and some worse. Second, market systems work best for goods and services that can be easily quantified and valued, and the harder and more removed such quantification and valuation are from the day-to-day ebb and flow of commerce, the less accurate and the less reliable any valuation is. Third, because market systems are imperfect, large systems need various restraints or rules. Too few restrictions, and one has the worst excesses of the American robber barons or the current Russian commercial oligarchs. Too many restrictions, and one eventually has no market system at all, but a government-run and badly administered [because it cannot be administered well by anyone, given the complexity involved] command-and-control system, which usually results in a black market, if not several.

The rush to find better quantification of everything in life effectively presupposes that everything can be quantified absolutely. But can everything of value and worth really be quantified in economic terms? By adopting a market-based approach to everything in society, as we seem well on the way to doing, we seem to have forgotten, at least in terms of laws and national policy, that when we try to place a dollar [or euro or yuan or yen] value on everything, that which cannot be quantified accurately, or quantified at all, tends to be undervalued or not valued at all.

Recently, I’ve been seeing ads on television citing the fact that the United States has one hundred years’ worth of undeveloped natural gas — with the implication that this is some vast reserve that should be immediately exploited. The question that comes to my mind is: “And then what?” What energy sources will be available to my children’s grandchildren? The ad demonstrates, effectively by example, that we place little or no value on preserving resources for future generations. It ignores the costs to future generations of having to use more expensive fuel sources — or perhaps having none at all.

What few policy-makers seem willing to admit is that there are whole sectors of life and the world that the market system cannot value accurately, nor will it ever be able to do so in cold economic terms. Some of these are: the value of an individual life; the value of the survival of the human species; the value of an integrated and functioning world eco-system; the good health of an individual; the pursuit of happiness; freedom; freedom from hunger… That list is far longer than any policy-maker wants to consider in realistic political or legal terms — and none of the items on it can be valued accurately in economic terms.

For example, how does one value a human life? Some economists will say that we have established a de facto value for human life through the terms of either life insurance or the health and safety regulations we have put in place over the past century or so. But consider the terms of those regulations, or of life insurance. In life insurance, the death benefit is based strictly on the level of premium one is willing to pay. In government regulations, the value of a human life is determined by taking the cost of implementing the regulation and dividing that cost by the estimated number of “lives saved” by the regulation. In addition to the very real difficulty of estimating the number of people who might have died, there is also the problem of, if you will, quality control. Are all lives the same? Are all lives of people of the same age even the same? Will a child grow up to be a drug addict who is a drain on society or a Nobel prize-winning scientist? If all lives are valued the same, then the process says that human accomplishment means nothing. If they are not, how does one determine what makes one life more valuable than another?
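To make the arithmetic concrete, the regulatory calculation amounts to nothing more than a simple ratio — the figures below are purely hypothetical:

\[ \text{implied value per life} = \frac{\text{total cost of complying with the regulation}}{\text{estimated number of lives saved}} \]

A rule projected to cost $700 million and credited with saving 100 lives thus implies a value of $7 million per life, and the entire exercise rests on how confidently those 100 lives can be estimated in the first place.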

Geology and science suggest rather emphatically that at some time in the future, a moderate or rather large chunk of rock or other cosmic debris will slam into our lovely planet, and millions, or billions, or all of our species will die. It’s not a question of if, but only of when — or of whether we do ourselves in before that occurs. We have yet to come up with the comparatively few millions of dollars necessary to scan our solar system for everything that might be headed our way. And why is the value of species survival quantified at less than the cost of a few bridges to nowhere?

As a culture, we seem unable not only to grasp, but to act on, the fact that there is real value, perhaps greater value, in aspects of life that cannot be quantified than in those to which a dollar figure can be firmly pinned. Yet dollar certitude remains what we as a society hold to. Is that because Madison Avenue has told us so… or because the majority of us are unable to say that some things are more important than the no-longer-so-mighty dollar?

The Consensus-Driven Society

Have you noticed that very few teenagers actually “date” any more? Instead, they just “hang out” with their friends. From what I can tell, the “consensus” is that this is more “natural.” Well… it’s more in line with the patterns of our simian relations and ancestors, but “natural,” contrary to current pop culture, isn’t always better. Dating, as practiced by previous generations, required the male to request that a particular female accompany him to a predetermined event, such as a movie or a dance or dinner, for a limited period of time. This required advance planning, preparation by both parties, conversation between the two — or at least an attempt at it — and an expenditure of time by both parties, as well as resources on the part of the male. While some women contend that the expenditure of resources by the male was an attempt to gain sexual favors, that attempt did cost the male resources. “Hanging out” achieves the same results, perhaps even more easily for males, from what I can see, without any commitment on the part of the male. It also dispenses with advance thought, planning, and the other similar activities required of adults in society.

Along a similar line, the “consensus” appears to be, in general, that the single-sex college dorms of the past are outmoded, and that college students are better prepared for life by co-ed dorms. While this view has not been universally adopted by all universities, most appear to have given in to providing at least some co-ed dorms. Yet a study published last week indicates that co-ed dorms result in nearly twice the rate of binge drinking among their inhabitants.

Then, there’s clothing. The teen-aged consensus, in recent years, appears to have been to minimize personal appeal and maximize bad features. Low-slung pants, too tight above exposed midsections, create an impression of corpulence in all but the anorexic woman. Baggy trousers drooping to the backs of the knees give even the trimmest young men an impression of bad personal sanitation and slovenliness. Backwards baseball caps not only don’t shade the face, but also heighten the vacancy of expression in the eyes of all too many young men. Watching the results of teen-aged girls’ consensus decisions on what to wear is frightening, because so very few of them ever choose clothing that is attractive and tasteful or that flatters their attributes and features. Yet… they talk about what “looks good” when they really mean that they want to wear what everyone else is wearing, no matter how awful it appears on them. It reminds me of an ancient SF story in which the men come out of the latest “fashion show” green and nauseated, unable even to approach the women wearing the latest “high fashion” — later revealed to have been designed by aliens to stop human reproduction.

Bad consensus-driven decisions aren’t limited to teenagers, by any means. Wall Street exemplified that with its thoughtless consensus agreements to leverage capital to the hilt through excessive reliance on financial derivatives and similar Ponzi-like devices. The heads of all too many firms embraced instruments they didn’t understand because everyone else on Wall Street was doing the same thing — another form of consensus.

Another consensus is the American idea that every teenager should get a college education. The problem is that possibly as many as half of those young adults either aren’t capable of doing true college-level work or aren’t interested enough to do so. Rather than debunk this “consensus” idea, American society has pressured public institutions to water down higher education, although they don’t call it that. The terms that are used include making education “more accessible” or “more relevant” or “more appealing” or “adapted to individual learning styles,” etc. One result is that something like half of entering college freshmen cannot write a coherent and logical essay entirely on their own, and that obtaining a true higher education now requires additional years of post-graduate study. The other result is that society wastes an incredible amount of resources on individuals who benefit little — except in getting a paper credential that has become increasingly devalued.

The consensus problem isn’t new to society, although it’s more pervasive in the U.S. today. There is a famous line in Handel’s Messiah — “All we like sheep have gone astray.” Handel set those words in 1741, but they’re even truer today, because consensus is based on comfort and agreement, just as in a herd of sheep, and in difficult times the best decisions are seldom developed through consensus. There’s a tremendous difference between forging consensus and deciding through consensus. The consensus of the British people in 1938 was that appeasement was the best way to handle Hitler and that Winston Churchill was a warmongering firebrand. The consensus of the American people in 1940 was that the United States could avoid war. The consensus in the U.S. in 2006 was that prosperity would continue indefinitely.

In these cases, consensus was wrong, with disastrous results.

Obviously, every society needs to reach consensus on its laws, customs, and political practices and decisions, but that doesn’t mean that sheep-like group-think is the way to reach that consensus. In the past, hard issues were debated, legislated, and modified, to a large degree by those with considerable knowledge of the subject. Today, in all too many groups and organizations, for all the talk of innovation, ideas that are unpopular are too often dismissed as unworkable.

The problem here is a failure to distinguish between workable ideas, which are unpopular because they impose a cost on members of the group, and popular ideas that are technically unworkable. “Taxing the rich” is always popular because few in any society or group are rich. It’s also generally less effective in practice, because the truly rich have enough resources to avoid taxation or leave the society, and it is almost always detrimental, because the tax burden falls most heavily on the productive upper middle class or lower upper class [depending on definition] — the group that determines the course and success of a society. Taxing everyone at a lower rate works better in raising revenue and in allocating resources, and is actually “fairer,” because taxes fund general services used more intensively by the non-rich. Unfortunately, flatter tax rates are highly unpopular, and so the general compromise consensus is to keep tax rates at a point where the upper middle class doesn’t scream too much, while not taxing the majority of the populace enough to adequately support the services that they demand. The result is that government barely squeaks by in times of prosperity and faces either ruinous deficits or drastically reduced services, if not both, in economically hard times.

Then, add in our modern communications technology, as I’ve previously discussed, with niche marketing and self-identification, and we’re getting massive societal polarization as various group consensuses harden into total intractability, in effect creating social and political group anarchy without even the benefit of individual creativity.

All those for “natural” consensus…?

Favorite Books?

Recently, I was asked, as part of a profile article that will appear in a “genre” magazine in February, to name my five favorite F&SF books. Usually, I resist the “top five” syndrome, but the article writer and her editor insisted that they weren’t asking for my rating as to the “best” five books, but my favorite five books. So I gave them a list, but when I looked at the list a day later, I discovered that the “newest” book was something like ten years old. However, in my defense, I must add that I didn’t read it until three years ago. So it’s not just that I’m stuck in the past, or not entirely. And no, I’m not going to reveal the list, at least not until the article is published, out of courtesy to the publication, but I will discuss the entire business of favorites.

Are “favorites” works that strike us when we’re younger and more impressionable and never let go? That’s a simple and easy answer… and like all simple and easy answers, I don’t think it’s accurate, although there may be a tiny grain of partial truth buried there. Why do I think that? Well… first off, I’m not one of the younger readers or writers in this field. Without totally giving away my age, I will point out that I read my first “real” SF book, at least the first I remember reading, in 1955 [and for those who want to know, the book was A. E. van Vogt’s Slan, and it wasn’t one of my listed “favorites” because it has too many impossibilities and improbabilities]. After that, I read science fiction and then fantasy fairly voraciously for the next 20-25 years, not that I didn’t also read mysteries, history, poetry, and other works avidly as well. Now, while three of my F&SF favorites were published in the late 1960s and early 1970s, I didn’t read them then, because I happened to be occupied in other endeavors in Southeast Asia. I didn’t get to them until a good ten years later, when I was a political staffer in Washington, D.C., already cynical, and anything but an impressionable young reader.

Still… I don’t find that many books published in recent years resonate the way those favorites do. Occasionally, one does, as did the “newest” one on my list, and as do others that I find good and enjoyable, but not quite in the top five. Part of that is clearly that I’m a curmudgeon of sorts who doesn’t much care for action for the sake of action, shock for the sake of shock, or newness for the sake of newness, but part of it is that, in my personal opinion, for too many current writers in the field story-telling trumps writing. By that I mean that, for me, a truly memorable book is one where the style and the story-telling are both good — and seamless. That’s certainly what I strive for as a writer, and what I look for as a reader. But it’s also clear to me, particularly in reading the reader reviews thrown up [and I use that term advisedly] by many younger people, that anything resembling grace, style, and depth is unwanted if it slows down the action or the sex or the bloodshed. This viewpoint reflects a society that values degrees, credentials, prestige, and money over education, actual accomplishment, and understanding. I certainly can’t change a changing society, except perhaps through my writing, which reaches only a comparatively limited number of readers, and generally the more educated ones at that, but I don’t find those superficial values rewarding, and comparatively fewer books are being written that exemplify the values I do find rewarding.

So… I’m left to conclude that favorites reflect values, and that’s often why the favorite books, movies, and the like of older people are weighted disproportionately toward older works — and not merely because they read or saw them when they were young and “impressionable.”

The Nation of More

The divisive debate over the health care bill reveals a certain culturally inherited and continually propagated commonality that most Americans refuse to acknowledge.

What exactly is that “commonality”? Nothing other than a burning desire for “more.”

To begin with, in the current American culture, “better health care” really translates into “more health care,” but the way in which the partisans on both sides of the debate are arguing sheds an unpleasant light on a certain aspect of our “national character,” in so far as any country has a “national character.”

From what I can discern, those who might be characterized as “haves” are attacking the recently unveiled versions of the health care bill as adding to the national deficit, reducing individual choice of doctors, penalizing those who don’t buy adequate insurance, failing to rein in the depredations of the ambulance-chasing trial lawyers, raising taxes on those who already pay the vast majority of federal and state income taxes, and in general penalizing those who’ve been successful through hard work. All this amounts to a statement that government is going to “take” from them, or, if you will, reduce their share of “more.”

On the other hand, those who would not generally consider themselves “haves,” and their supporters, are insisting that health insurance is essentially a “right” for all Americans, that every American should have affordable [i.e., cheap] health insurance, that the insurance companies have padded their profits through practices that shut tens of millions of working Americans out of health care by denying care and coverage through every legal [and sometimes not so legal] means possible, that the cost of health insurance and medical procedures should not drive people into bankruptcy, and that doctors and health care providers reap enormous profits while failing to improve the overall health care system.

All of these points on both sides have some degree of validity, and I’m not about to assess the comparative merits of each point. I will note, however, that almost all of them bear on the issue of who gets “more.”

Now… whether Americans like it or not, the current nation is built on immigrants who traveled here in order to get more. Whether they were failed aristocrats or second or third sons of old-world nobility, crafters who saw no hope of advancement, Irish and other ethnic immigrants fleeing starvation or worse, debtors, or those leaving behind a myriad of other problems, the vast majority came seeking “more,” whether it was more freedom, greater prosperity, more land, or better opportunities for their children…

The endless and continual striving for more has its good and its not-so-good sides. The good that has resulted from this drive for “more” is considerable, including a political system that over time has managed to offer a wide range of political freedoms and to transfer power with less disruption than most advanced nations, a level of technology and prosperity for the majority of Americans that is unprecedented in world history, an openness to social and technological change, and a culture that allows those with great abilities to prosper, usually without regard to their social and economic position at birth.

Unfortunately, the evil is also significant, if less obvious, and less talked about, even by so-called liberals. We have spawned a culture of consumption that equates well-being with possession and use of an ever-higher level of goods, possessions, services, and personal space in housing. We have come to measure success almost entirely in terms of the material. We have increasingly come to devalue those who are less able or less fortunate, to the point where we have the greatest discrepancy in income between the poor and the wealthy of any industrialized and technically advanced nation on the planet. We have increased the debt that must be paid by our children and their children to unbelievable levels. We have equated excellence with popularity and material prosperity.

But… the furor over the current health legislation underscores what might be called a sea change for the culture of “more.” In the past, the culture of “more” was based largely on “undeveloped” and cheap land, advances in technology and in means of production, and the greater and greater use of energy, almost exclusively from fossil fuels. All of these are now running into the inexorable law of diminishing returns. For example, we communicate instantly, and there’s nothing faster than instantly. The energy and technology costs of traveling faster seem to preclude much improvement in the current speed of transport. Production efficiencies result in fewer jobs required for each unit of output, and this has certainly contributed to an economy that economists claim is recovering even as unemployment increases.

As for the health care issue, we now possess the technology and knowledge to allow “more” in terms of health — more procedures to extend and improve life — but what we lack are the resources, under our current socio-political customs and procedures, to apply those procedures to a population of over 300 million people.

For the first time in U.S. history, it appears that we have reached a point where we can’t have “more” of everything, where technology and energy cannot meet all the needs and wants we as a society demand be fulfilled — and the health care legislation represents the first political presentation of this conflict… or the first one that clearly impacts every single American in some way… and almost none of us like the options.

So… which “more” will prevail — that of better health care and life-style or that of bigger and better consumerism? Will we find some sort of compromise? Or will the struggle deteriorate into an undeclared conflict between the haves and have-nots? Or will the result be a stand-off that amounts to a collective burying of heads in the sand?

The Human Future

Where exactly is the human species headed? How will we get there? Is any great improvement in human culture and technology really possible… or are we close to the end of the line? Throughout history, various authorities and pundits have suggested as much, perhaps most famously at the end of the nineteenth century, when some reportedly proposed closing the U.S. patent office because significant new discoveries would be impossible. We all know how accurate that prediction was. And yet… are there limits to what we as a species can do?

A perhaps apocryphal statement attributed variously to General Hoyt Vandenberg or Senator Arthur Vandenberg supposedly doubted the feasibility of developing the atomic bomb because such a project would require doubling the electrical power generation capacity of the United States in wartime. In fact, such a doubling was required and did take place, largely based on the TVA project. Whether or not either man made such a statement, the underlying truth is that large advances in technology have always required or resulted in — if not both — an increase in the use of power. The industrial revolution was effectively supported by widespread coal mining; the technological developments of the twentieth century by the massive use of oil and natural gas.

Currently, the United States, with roughly five percent of the world’s population, consumes more than a quarter of the world’s energy and resources, yet most experts in the fossil fuels field believe that significant increases in oil and gas production are not possible and that sustaining current production levels for more than a century, at the outside, is highly unlikely. Given that world population shows no signs of rapid decrease and that major powers such as China and India are becoming increasingly industrialized and technology-driven, with increasing demand for energy and goods, it doesn’t take much intelligence to realize that the human species either has to become far more efficient in energy usage and production or face increasing conflicts over energy supplies… OR develop new science and technology to utilize far vaster energy sources. The problem here is that renewable sources, such as wind and solar power, do not provide energy that is easily concentrated — and concentrated energy is necessary for high technology and our current society, not to mention mass and long-distance transport.

Yet each advance in power sources has required a greater energy input. It takes more energy to mine coal than to gather and cut wood, and more energy to drill oil wells, especially now, and refine the product than to mine coal. Fission power plants cost far more than natural gas-, coal-, or oil-fired power plants. The next apparent step in concentrated energy production is fusion power, but even the research into developing fusion power is hideously expensive… so expensive that only a comparative handful of research projects are pressing forward.

The next related problem is that, without something like fusion power, and with the current world population levels, maintaining a standard of living even remotely close to the present level of industrialized nations will not be possible for longer than a few generations, if that. Over the long term, the prognosis is even less rosy.

With all our species’ eggs, so to speak, in the basket that is Earth, we’re not only vulnerable to energy depletion, but to species extinction, sooner or later. But there are no other habitable planets in our solar system, not without massive terraforming — and that also requires huge amounts of technology and energy. So… what about interstellar travel?

At the moment, with what we know now, travel to even the nearest star systems will effectively take generations, because current physics doesn’t provide any ways around the apparent limitations of the speed of light in terms of attaining speeds conducive to what one might call real-time interstellar travel. The one possible loophole might be the creation of something along the lines of a Hawking wormhole, but preliminary calculations suggest that the energy necessary to create such a tunnel through space/time would approximate that used/radiated by a black hole. And that leads us back to the energy problem once more… and to the question that no one seems to want to ask.

Given what lies before us, why aren’t we devoting more research resources to high-energy power generation possibilities?

Knowledge, Education, and Mere Information

I’ve heard or read innumerable times, including at least once in the comments to this blog, that younger Americans don’t need to learn as much as older generations did because the young folks can find information quickly on the web. I’m certain that they can find “information” quickly, but that argument ignores a number of basic points.

The first is the assumption that these younger Americans will always have instant access to the web, via their iPhones or BlackBerrys or whatever. Perhaps, but there are many times and places where accessing those devices is difficult, impossible, or even dangerous. It can also be time-consuming, particularly if the young American in question doesn’t know very much, because in more complex areas of learning and life, a wider knowledge base is necessary in order to know what to look up and how to apply such information. My wife has watched scores of supposedly intelligent students — they tested well — have great difficulty in “looking up” simple quotations about musical subjects. Why? Because their subject-matter vocabularies didn’t contain enough synonyms and related terms, and because computers only search for what you ask for, not everything you should have asked for, had you known more. The more complex the subject, the greater this problem becomes.

The second problem is that trying to evaluate a mass of newly acquired information leads to more mistakes than if the acquirer already has a knowledge base and is merely updating it.

Third is the fact that operating on an “I can look it up” basis tends to postpone dealing with problems until the last moment. In turn, planning skills atrophy, a fact to which all too many college professors and supervisors of recent graduates can testify.

Fourth, the “look it up” attitude does not distinguish between discrete bits of information and knowledge. For example, one blog commenter made the point that much of the information handed out by teachers and much of the required reading was “useless.” In the context of the comment, “useless” translated into “it wasn’t on the test.” Speaking as a former college instructor, I have to point out that only a fraction of the material that should be learned in a college-level course could ever be tested for, even if every class period were devoted solely to testing. These days, all too many college professors are either giving up or over-testing in response to a student — and societal — attitude that seems all too often to say, “It’s not important to learn anything except to pass tests.” In addition, tests that merely require regurgitation of information or the plugging of values into formulae do nothing to enhance thinking and real-life problem solving. In short, what’s overlooked by all those who rely on tests is that test results do not equate to education, nor do they build a wider professional knowledge base for the student.

Fifth, without a personal knowledge base, how can you evaluate the accuracy of the information you’re seeking? With every day, the amount of information available increases, and with wider access the amount of misinformation increases — to the point where a substantial amount of erroneous information is being promulgated on subjects where the facts have been scientifically established beyond any doubt, such as vaccination, as I noted earlier. Without a personal knowledge base, either a greater amount of cross-checking is required, which takes time, or more errors will likely result.

Sixth, as noted in earlier blogs, continual reliance on instant information access dulls memory skills, and there are many, many occupations where reliance on instant “outside” information is not feasible and could be fatal. Pilots have to remember air controller instructions and procedures. Paramedics need to know emergency medical procedures cold. While rote memorization is not usually required in such occupations, a good memory is vital if one is to learn the skills to be highly professional… and looking up everything doesn’t help develop memory or skills.

Finally, lack of a broad knowledge and information base, one firmly anchored within one’s own skull, leads to narrow-mindedness and contributes to the ongoing societal fragmentation already being accelerated by our “bias-reinforcing” electronic technology.

But… of course, you can always ignore these points and look “it” up — if you can figure out how to get the precise information you need and whether it’s accurate, if you have the time to assimilate it… and if you can remember it long enough to use it — but then, you can just plug it into the iPhone… and hope you’ve got access and sufficient battery power.

Rationalized Irrationality

Recently, there’s been a fair amount of resentment expressed in the media and elsewhere, if in a scattered manner, about the “bonuses” still being paid to the already high-paid and most likely overpaid senior executives in the financial industry. Here in Utah, one state agency dealing with trust lands paid bonuses to senior personnel early, just in order to avoid the legislature’s pending ban on such bonuses. I not only understand, but also share, a certain amount of the public outrage at monies above and beyond salaries going to those who have created the financial catastrophe the world is trying to muddle through, as well as at all sorts of maneuvers to keep such extravagant pseudo-compensation.

But… very few of those professing the outrage are looking beyond the obvious sins of the financial, real estate, and other malefactors to the even larger underlying problem. Exactly how rational is a society that pays — or allows to be paid — tens and hundreds of billions of dollars to a relative handful of people who manipulate paper, while underpaying and laying off those who are the backbone of a functioning society?

Everyone professes that education is essential to an information/high-tech society. So why are legislators and their constituents allowing teacher layoffs and salary freezes for educators at all levels at a time when school enrollments are growing — particularly college enrollments? Again, here in Utah, college enrollments increased almost ten percent this year, while the higher education budget was cut by something like fifteen percent. Next year, enrollments are projected to increase another fifteen percent, and more budget cuts are already before the legislature, while faculty numbers are declining. As a result, because many students cannot get into already overcrowded required classes, some may take as long as six or seven years to graduate. Some faculty are so overloaded that they literally have neither the time nor the space to take on more classes and students. This problem isn’t confined to Utah. Similar problems face other localities, including states like Virginia and California.

Law and order are another support of society, and yet more than a few police forces have laid off personnel or stopped hiring and let attrition reduce their numbers. Prisons are so overcrowded in state after state that even dangerous felons are being released early.

Over the past several decades, governments at the federal, state, and local levels have neglected infrastructure maintenance, to the point that we’ve had bridges and highways collapse. While a few of these problems are being addressed, most are not… and, by the way, such maintenance problems resulted in the closure of the Bay Bridge in San Francisco for nearly a week — because five thousand pounds of metal dropped out of the bridge and onto the roadway.

On the other hand, the federal government can hand out billions so that Americans can buy new cars — another bailout for the incompetent automakers.

As the retired senior corporate vice president of a large high tech firm once put it, “You can tell how people are valued by what they’re paid.”

So… why, exactly, are we as a society continuing to pay excessive millions to those who’ve already endangered us while underpaying and laying off those who support our society? By what logic do we rationalize the irrational?

Why Can’t They Remember?

The other day, my exhausted wife the professor came home from the university, late again, and collapsed into a chair. After sipping some liquid — and not non-alcoholic — refreshment, she asked, “Why can’t they remember anything? Why can’t they remember to open their mouths?” Now… my wife is a professor of voice and opera, and she teaches singers. One of the very basic rules of singing is very simple: open your mouth. It’s difficult to project sound with your mouth closed or barely open, especially if you’re trying to sing opera.

It’s a basic, very fundamental, point. And it’s not just my wife. Last week, I heard another voice instructor complaining about the same thing. So why is it that these young students, who love nothing more than to open their mouths to use their cellphones, won’t do so when they’re supposed to? And this is after months, if not years, of instruction.

Unfortunately, it goes beyond that. A good third of the students in her literature and diction class tend to forget when assignments are due… or ask in class, “When is that due?” Of course, they got a syllabus with all their assignments on the first day of class, and one page even listed the “important dates.” So… not only can they not remember, but apparently many of them can’t read, either, or they can’t remember what they read. My own suspicion is that they can’t remember because they can’t concentrate and weren’t really listening. Or they immediately lost their syllabus.

There’s been much debate over the past year about the problems of so-called multi-tasking and how all tasks are done poorly when people attempt to do more than one at a time. Ask any good voice teacher about it. They can testify to the problem. Most undergraduate students can’t handle remembering words and music and keeping their mouths open at the same time until they’ve had several years of training… if then. Given this, why, exactly, do we as a society think that these same individuals are able to handle automobiles and cellphones simultaneously?

For several years, I taught writing and literature courses on the college level. I occasionally still do, and I learned early on that a considerable proportion of students don’t truly listen unless threatened with pain, i.e., tests, lowered grades, or embarrassment. Even then, the results are mixed. They all want good grades, and the better jobs that tend to follow higher education, but it’s apparently a real chore to remember the little things that comprise good grammar, such as the fact that adverbs aren’t conjunctions, or that independent clauses can’t be joined just by commas, or that spell-checkers don’t pick out wrong word choices spelled correctly… or that plagiarism has some very nasty consequences.

But they don’t have much trouble remembering idiotic lyrics sung off-key by models pretending to be singers… or the rules and strategies for a dozen video games. And why is it that so many teenagers and young adults, when corrected, immediately say, “I know.” If they know so much, why are so much repetition and reminding required?

And this is the generation that so many pundits have claimed will save the world from the sins of the baby-boomers?