Archive for February, 2016

One Person’s Waste [Part I]

During my years in government, then as a consultant dealing with government regulations and environmental and energy issues, and even afterward, I’ve heard thousands of people say that we could just solve the budget problem by getting rid of the “waste” in government.

And when I hear that tired old phrase, I want to strangle whoever has last uttered it, because “waste” – at least in the way it’s commonly used – is a tiny fraction of federal or state spending. Now… before you start screaming, let me at least try to explain.

First, I’m defining waste as spending that serves no purpose and accomplishes nothing. Second, I do believe that government spends a great deal of money on programs and projects which have little to do with the basic objectives of government as outlined by the Founding Fathers… and I suspect most intelligent individuals believe something along the same lines.

The problem is that one person’s waste is all too often another person’s gain or livelihood. For example:

The Georgia Christmas Tree Association got $50,000 from the Department of Agriculture for ads designed to spur the buying of natural Christmas trees. To the Christmas tree growers of Georgia, this was not waste, but advertising designed to help them sell trees and make money.

The Department of Agriculture spent $93,000 to “test the French fry potential of certain potatoes.” Do you think the potato growers objected to this?

$15,000 from the Environmental Protection Agency to create a device that monitors how long hotel guests spend in the shower. Is this so wasteful, given the water crises in the west and southwest?

And then there’s Donald Trump’s use of a $40 million tax credit to renovate the Old Post Office in Washington, D.C. into a luxury hotel. I’m certain that the city would support another tax-paying and revenue generating hotel.

The Department of Agriculture’s Market Access Program provided $400,000 to the liquor lobby, which used part of those funds to transport foreign journalists to different breweries and distilleries in the southeastern United States. The liquor industry doubtless feels that this will boost liquor exports.

At the same time, there is definite out-and-out waste. According to the Government Accountability Office, in 2014 the federal government spent $125 billion in duplicative and improper payments. GAO made 440 recommendations to Congress for fixing these problems. To date, it appears that Congress has addressed none of them.

One waste-watching outfit came up with $30 billion in supposedly wasteful projects for FY 2013, including studies of the threatened gnatcatcher bird species. The only problem with the gnatcatcher “waste” was that such a study is mandated by federal law when an endangered or threatened species may be adversely affected by building or expanding a federal facility.

More to the point, however, is the fact that these self-proclaimed waste-finders only came up with $30 billion worth of waste out of federal outlays totaling $3.5 trillion – so their waste amounted to less than one percent of federal spending. Even if Congress addressed the GAO’s much more sweeping findings, such actions would only reduce federal outlays by less than 4%.

Now… I’m not condoning waste in any amount, but when the federal deficit has ranged from $440 billion to $670 billion in recent years, it doesn’t take much brain power to figure out that merely getting rid of even all the obvious waste isn’t going to do much to constrain federal spending. That assumes Congress would agree, which, as an institution, it doesn’t, despite the scores of politicians who claim they’re against waste.

And all those who support a strong national defense should be appalled at some aspects of defense spending. Right now, DOD has stated that as many as 20% of the 523 U.S. military installations are unneeded. This doesn’t even count the more than 700 U.S. bases and facilities outside the United States, yet the present Congress has enacted specific language in the appropriations bill for the current fiscal year that absolutely forbids base closures.

What about my “favorite” airplane, the oh-so-lovely-and-over-budget F-35? A recent report cited DOD officials stating that “essentially every aircraft bought to date requires modifications prior to use in combat.” A plane that isn’t yet ready for combat for which the government has already committed $400 billion? An aircraft that was outmaneuvered by a much older F-16?

DOD also wants to build a new long-range strike bomber with full stealth capabilities, 100 of them at a projected cost of $565 million each.

As a former Navy pilot, I don’t object to better planes; I do have problems with very expensive aircraft that don’t seem to be better than their predecessors, and especially attack aircraft that can’t defend themselves. I also have problems with politicians who decry waste, but won’t allow DOD to reduce it because that “waste” is in their districts. Those are far more expensive examples of waste than $50,000 studies on laughter or Christmas tree promotions. It reminds me of shell game misdirection – look at these ridiculous examples of waste, and, for heaven’s sake, don’t look at that man over there behind the curtain… or at the pork in my district. And yet politicians, especially Republican representatives and senators, continue to attack “waste” while doing absolutely nothing meaningful about it… and they get re-elected.

The Religious Selfie

One of the basic underpinnings of religion, almost any religion, is the worship of something or some deity bigger than oneself, and the overt acknowledgment that the individual worshipper is less than the deity worshipped. Some religions even incorporate that acknowledgment as part of liturgy and/or ritual. Such acknowledgments can also be part of “secular religions,” such as Nazism, Fascism, and Communism.

Today, however, there’s a totally different secular religion on the rise, with many of the old trappings in a new form, which might be called the “New Narcissism,” the elevation and exaltation of the individual, or the “self,” to the point where all other beliefs and deities are secondary.

Exaggeration? Not necessarily. What one believes in is reflected in the altars before which one prostrates oneself. Throughout history the altars of the faithful have either held images of a deity, perhaps accompanied by those of lesser deities, or no images whatsoever. While images of private individuals have also existed throughout history, those images or sculptures were created for posterity, or for the afterlife, so that others would have something to remember them by… or to allow them to remember themselves as they were. At one point in time, only the wealthy or the powerful could afford such images. Even until very recently, obtaining an image of one’s self required either the cooperation of others or special tools not particularly convenient to use. This tended to restrict the proliferation of self-images.

The combination of the personal communicator/camera/computer and the internet has changed all that. Using Facebook, Instagram, Twitter, and the internet, individuals now have the ability to create themselves as virtual deities – and tens, if not hundreds, of millions of people are doing just that, with post after post, selfie after selfie, proclaiming their presence, image, and power to the universe [with all three possibly altered for the best effect].

It’s the triumph of “pure” self. One no longer has to accomplish something for this presence and recognition. One can just proclaim it, just the way the prophets of the past proclaimed their deity. And given what positions and in how many ways people have prostrated themselves before their portable communications devices in order to obtain yet another selfie, another image of self, it does seem to resemble old-fashioned religious prostration.

Of course, one major problem with a culture obsessed with self and selfies is that such narcissism effectively means self is bigger than anything, including a deity or a country, and I have to wonder if and when organized religions will see this threat to their deity and belief system.

Another problem is that selfies have to be current; so everyone involved in the selfie culture is continually updating and taking more selfies, almost as if yesterday’s selfie has vanished [which it likely has] and mere memory of the past and past actions means nothing. All that counts is the latest moment and the latest selfie. That, in turn, can easily foster an attitude of impermanence, and that attitude makes it hard for a society to build for the future when so many people’s attention is so focused on the present, with little understanding of the past and less interest in building the future… and more in scrambling for the next selfie.

All hail Narcissus, near-forgotten prophet of our multi-mirrored, selfie-dominated present.

Cultural Appropriation

Over the past several years, there’s been a great deal of talk about the need for “diversity.” So far as I can tell, this means stories set in cultures other than those of white, Western-European males and told by protagonists other than white males. I certainly have no problem with this.

I do, however, have some misgivings about the idea that such stories must always be written by authors from those cultures, and the equally disturbing idea that when someone other than a member or a descendant of those cultures writes about them, even when projected into the future, or into a fantasy setting, that is “cultural appropriation,” and a literary sin of the first order. The rationale behind this judgment appears to be that no one who is not a member of a different or a minority culture can do justice to representing that culture in a fictional setting.

Setting aside that fallacious premise, what exactly is the point of fiction? Is it just to be culturally accurate? Or to entertain? To make the reader think? And for that matter, how does one determine “cultural accuracy,” especially when there are significant social and even geographic differences within most cultures?

Taken to extremes, one could classify Rob Sawyer’s hominid series, about an alternate world populated by Neandertals, as “cultural appropriation,” since most of us only have a tiny fraction of Neandertal genes. Roger Zelazny’s Lord of Light could easily be classed as cultural appropriation of Hindu beliefs and myths. For that matter, Tolkien certainly used the Elder Edda of Iceland as a significant basis of Lord of the Rings. And I wrote The Ghost of the Revelator even though I wasn’t born in Utah and I’m not LDS [although I have lived here for more than twenty years].

Obviously, writers should take seriously the advice to write what they know, and know what they write, but “non-members” of a minority or another culture may well know and understand that culture as well as or even better than members of that culture. Should they be precluded from writing fiction based on those cultures because editors fear the charge of “cultural appropriation”?

This concern, unfortunately, isn’t just academic. I’ve heard editors talk time and time again about how they want more diversity, but… In one case, the significant other of a Chinese-American born and raised in Hawaii wrote and offered a YA fantasy novel based on Hawaiian myth to a number of editors. When several agents and editors found out that the writer was not Hawaiian genetically, they decided against considering the book. Several well-known authors have also told me that they wouldn’t have considered the book either, because dealing with Hawaiian beliefs would be too controversial.

Shouldn’t it just be about the book…and not the genetics/cultural background of who wrote it?

Teachers

In yesterday’s local paper, there was a front page article headlining the coming teacher shortage in Utah, to which I wanted to reply, “How could there not be?”

The beginning salary for a Utah teacher in most systems is not far above the poverty level for a family of four, and the average Utah teacher’s salary is the lowest in the United States. Utah spends the least money per pupil in primary and secondary schools of any state in the United States. Nationwide, anywhere from twenty to fifty percent of newly certified teachers drop out of teaching in five years or less [depending on whose figures you trust], and that rate is even higher in Utah. In 2015, half of all newly hired teachers in Utah quit after just one year. Yet studies also show that the longer teachers teach, the more effective they become. Add to that the fact that Utah has on average the largest class sizes in the United States. The academic curriculum leading to a teaching degree has also become more demanding [at least at the local university], and it often takes even the best students more than the standard four years to complete a course of study that leads to teacher certification, especially if they have to work to help pay for their studies.

Despite the often dismal salaries, study after study shows that comparatively poor pay ranks well down the list of reasons teachers walk away from teaching. Almost all prospective teachers know that teaching isn’t a high-paid profession. What they don’t know is just how effectively hostile the teaching environment is to a healthy and balanced life.

Here in Utah, for example, there are state legislators who complain about pampered and lazy teachers. They’re obviously unaware of the unpaid after-school, weekend, and evening workload required to support an eight-hour teaching day. Or of the number of parents who complain about their darling children’s grades – such as the one who wanted to know how his son could possibly flunk an art class [it turned out that said son had failed to attend most of the classes and never completed a single art assignment]. Or about the increasing reliance on testing to determine teaching effectiveness [when the testing itself reduces instructional time, when the test results determine teacher retention and ratings, and when the tests tend to measure factoids and fill-in-the-blank skills, rather than thinking or being able to write even a coherent paragraph].

It also doesn’t help when the local papers are filled with pages and pages about the sports activities of the local high schools, with seldom a word about academic achievements or other successes, such as plays, concerts, engineering competitions, and the like.

Nor is it exactly encouraging when school administrators offer little understanding or support of their teaching faculty. That’s more commonplace than one might realize, although national surveys show it’s a significant factor in contributing to teacher drop-out/burnout. Certainly, a number of former students of my wife the university professor have mentioned this as a difficulty in their middle school or high school teaching positions.

And finally, in the end, what’s also overlooked is that it’s actually more expensive to continually replace a high number of departing teachers than to take the necessary steps to cut the teacher drop-out rate. But based on the current public view of education and the unwillingness to make meaningful changes, I don’t see this problem changing any time soon. In fact, it’s only going to get worse… far worse.

There’s Always Someone…

I admit it. I did watch the Super Bowl. How could I not when my grandfather was one of the first season ticket holders back in the days when the Broncos were truly horrible? I can still remember him taking me to a game, and he went, rain, shine, or snow, until he was physically no longer able. I wasn’t able to go with him, unfortunately, because by then I was working in Washington, D.C.

And yes, I was definitely happy that the Broncos won, particularly since I’ve always felt that Peyton Manning is a class act, but that brings me to the point — Cam Newton’s postgame interview, if it could be called that, which was anything but a class act. Yes, he was disappointed, and he wasn’t the first great quarterback to be disappointed, and certainly won’t be the last.

Newton’s real problem is that he is so physically gifted and also has a mind good enough to use those gifts that he’s never considered a few key matters. First, in anything, no matter how big you are, how fast you are, how strong you are, how intelligent you are… there’s always someone bigger, faster, stronger, and more intelligent. Second, football is a team game, and the team that plays better as a team usually wins. Third, sometimes you get the breaks, and sometimes you don’t. Fourth, you don’t win just because you have the better record or the better offense – as Denver found out two years ago. Fifth, it is a game, if a very serious one played for high stakes.

Newton also needs to realize that he’s paid extraordinarily well to do exactly the same thing that every writer does, except few of us, indeed, are paid as well as he is. He’s paid to entertain the fans, and while that means winning as much as possible, it also means not pissing everyone off and coming off like a spoiled kid. This is also something writers need to keep in mind.

Given his talent, I’m sure Newton will be a factor for years to come, but it would be nice to see a bit more class when things don’t go well. You don’t have to like losing, but in the end, as even the great Peyton Manning has discovered, we all lose… and the mark of the truly great is to show class both when things go well and when they don’t.

High Tech – Low Productivity

The United States is one of the high-tech nations of the world, yet our productivity growth has hovered around a measly two percent per year for almost a decade. In the depths of the great recession that made a sort of sense, but the “recovery” from the recession has been anemic, to say the least. With all this technology, shouldn’t we be doing better?

Well… in manufacturing, productivity has to be up, whether the statistics show it or not, considering we’re producing more with fewer workers, and that has to mean greater output per worker. Despite the precipitous drop in the price of crude oil, the oil industry is almost maintaining output with far fewer rigs drilling and far fewer workers.

But perhaps what matters is what technology is productive and how it is used. I ran across an article in The Economist discussing “collaboration” with statistics indicating that electronic communications were taking more than half the work-week time of knowledge workers, and that more and more workers ended up doing their “real work” away from work because of the burden of dealing with electronic communications such as email and Twitter. And, unhappily, a significant proportion of the added burden comes under the “rubric” of accountability and assessment. But when you’re explaining what you’re doing and how you’re accountable, you’re not producing.

This is anything but the productive use of technology, and it may provide even greater incentive for businesses to computerize lower-level knowledge jobs even faster than is already happening. It just might be that, if you want to keep your job, less email is better. But then, if your boss doesn’t get that message as well, that puts you in an awkward position. I suppose you could console yourself, once you’re replaced by a computerized system, that your supervisor will soon have no one to badger with those endless emails demanding more and more status reports… before he or she is also replaced by an artificial intelligence.

We’ve already learned, despite the fact that too many Americans ignore the knowledge, that texting while driving runs a higher risk of causing fatalities than DUI. Will the supervisory types ever learn that excessive emailing may just lead not only to lower productivity, but eventual occupational suicide?

They Can’t Listen

Some of the complaints that the older generation has about the younger generation have been voiced almost as far back as there has been a way of recording those complaints, and they’re all familiar enough. The young don’t respect their elders; they don’t listen to their elders; they have no respect for tradition; they think they deserve something without really working for it, etc., etc. And, frankly, there’s some validity to those complaints today, and there always has been. That’s the nature of youth: to be headstrong, self-centered, and impatient with anything that hampers what they want.

But being adjacent, shall we say, to a university, I’m hearing what seems to be a variation on an old complaint, except it’s really not a variation, but a very troubling concern. What I’m hearing from a significant number of professors is that a growing percentage of their students can’t listen. They’re totally unable to maintain any focus on anything, often even visual presentations, for more than a few seconds – even when they seem to be trying. When they’re asked what they heard or saw, especially what they heard, they can’t recall anything in detail. We’re not talking about lack of intelligence – they do well on written multiple-guess tests – but an apparent inability to recall and process auditory input.

Unless there’s something of extraordinary interest, their attention span darts from one thing to another in a few seconds. Whether this is the result of a media driven culture, earlier teaching methods pandering to learning in sound-bites, a lack of discipline in enforcing focus, or some combination of these or other factors, I can’t say. But, whatever the reason, far too many students cannot focus on learning, especially auditory learning.

Unfortunately, the response of higher education has been to attempt to make learning “more interesting” or “more inspiring” or, the latest fad, “more experiential.” Learning through experience is an excellent means for attaining certain skills, provided the student has the background knowledge. But when a student hasn’t obtained that background knowledge, experiential learning is just meaningless and a waste of time and resources. And, generally speaking, learning has to begin with at least some listening.

Furthermore, in the “real world,” employers and bosses don’t provide “experiential learning.” They give instructions, usually vocally, and someone who can’t listen and assimilate knowledge from listening is going to have problems, possibly very large ones.

Despite all the academic rhetoric about students being unable to learn from lectures, lectures worked, if not perfectly, for most of human history. That suggests that much of the problem isn’t with the method, but with the listener. And it’s not just with professors. They can’t listen to each other, either. That’s likely why they’re always exchanging text messages. If this keeps up, I shudder to think what will happen if there’s a massive power loss, because they apparently can’t communicate except through electronic screens.

The “Federal Lands Fight”

The state legislature here in Utah has proposed setting aside $14 million for legal action against the federal government to “force” the United States to turn over all public lands to the state. This is just the latest effort in Utah to grab federal lands.

There are several aspects of this hullabaloo over federal lands that neither the legislature nor the Bundyites seem to understand… or want to. First, the Constitution vests public lands in the federal government, and numerous court cases have upheld that reading of the Constitution. Second, a 2012 study calculated that managing those lands would cost the state of Utah something like $278 million a year, and while much of that cost might initially be recouped through oil, gas, and coal leases, once the resources were extracted, the costs of management would remain, and the lands would have even less value. Third, if the grasslands were leased to ranchers, either grazing fees would have to increase, since the BLM charges only about a third of what management actually costs it [and one of the problems now is that the BLM doesn’t have enough money to manage the wild horse problem, among others], or the state would have to pick up the difference, which it can’t afford.

In short, not only is what the legislature proposes illegal and unconstitutional, but the federal government is actually subsidizing the ranchers and the state of Utah, something the legislators don’t seem able to grasp.

The ranchers here in southern Utah are furious that the BLM doesn’t round up essentially all the wild horses so that there’s more forage for their cattle. But even if the BLM had the resources to do that, which it doesn’t, because Congress has insisted on underfunding the BLM and keeping grazing fees low, that still wouldn’t solve the problem. Not only were western water rights predicated on the climate of the early part of the nineteenth century [which geologists have discovered was one of the wettest periods here in the west in something like 10,000 years], but so were grazing rights. That is why the BLM has cut the number of animals allowed per acre, which is yet another rancher complaint.

In short, the ranchers, the legislature, and the Bundyites are precluded from doing as they please by the Constitution, the climate situation, and the Congress, and they’re so unhappy about it that they think the Second Amendment is the only answer. So, despite all their railing about their Constitutional rights, I guess they really mean that they intend to comply with only those parts of the Constitution with which they agree, and that they’ll continue to insist that the Supreme Court has been wrong about what the Constitution means for over a century.