Archive for the ‘General’ Category


As I mentioned in an earlier post, the local university is transitioning to a trimester system so that students can theoretically obtain their bachelor’s degree in three years. That claim is somewhat misleading, because there are certain degrees for which a three-year finish is likely to be impossible, given the technical content and other requirements, and the university is soft-pedalling those possibilities for the moment.

One of the other aspects of this “degree speed-up” that bothers me, and more than a few in the higher education community, is that the push for faster degrees represents the commodification of higher education, with the emphasis falling more and more on the degree as a credential. In turn, these forces reflect a growing mindset that having that degree is a guarantee of a better job and higher income.

While statistics show that this correlation was true in the past, there’s an old saying that correlation is not causation.

The fact is that higher education represents an opportunity for economic and personal improvement, but even in the past it was not an absolute guarantee of either, or necessarily of economic success. Today, and in the years to come, with the growing glut of college degrees among the younger generation, blanket economic opportunity for degree holders is certainly not guaranteed. Some studies indicate that, today, there are twice as many college graduates each year as there are jobs for them that require a college degree. After WW II, by comparison, only about ten percent of high school graduates received a degree, while now seventy percent of all high school graduates go to college, and four in ten Americans have at least an undergraduate degree.

The one guarantee that does exist is that those Americans without either higher education or specialized technical or trade training will be largely frozen out of decent-paying occupations. But with the growing number of college graduates and the increasing computerization and automation of many former white-collar jobs, the number of higher-paying jobs is not growing nearly as fast as the number of graduates seeking those jobs.

And that means that a college education isn’t nearly the guarantee of economic and professional success it once was, and far less of a guarantee than people now believe. It’s more like a high-priced gamble where the odds are only slightly in favor of the degree holder.

If This Keeps Up…

Trump will win re-election. What do I mean by “this”? It’s not any one thing, but a combination of factors.

First, Trump is solidifying his base into an immovable monolith. Admittedly, that “monolith” amounts to “only” 36-46% of the electorate, depending on how it’s measured, but for voting purposes, I’d submit it’s 45-48% of likely voters, and a large percentage of this group consistently votes, and that’s more than enough to win with a “fragmented” electorate.

Second, the Democrats are splintering and arguing over which liberal “theme” is most important and which will galvanize their supporters.

Third, anything that’s radical enough to electrify or galvanize a particular group of supporters won’t be of paramount interest to the rest of Democratic voters and will alienate some, while galvanizing Trump supporters in opposition.

Fourth, with something like fourteen different candidates with considerably different priorities, the media will have a field day targeting or trumpeting those differences, which will in turn create an impression of incompetence and disorganization among Democrats, and that will reduce support and interest by independent voters.

Fifth, the proliferation of specific interests will make it difficult, if not impossible, for Democrats to come up with a single, meaningful, and unifying theme and to unite enthusiastically behind a single candidate.

Sixth, unless the Democrats in the House can come up with factual, indisputable, and hard evidence that Trump, or someone in the Trump family, did something that was criminal and major – and soon – any continued pursuit of Trump and his family will eventually be regarded as the “witch hunt” Trump claims it already is and will discredit the Democrats.

Seventh, because of the Democratic takeover of the House of Representatives, Democrats believe Trump is so vulnerable that any Democratic candidate can win. What they don’t want to acknowledge is that they won because most of their candidates were tailored to win in those specific districts. While this is a viable and successful Congressional strategy, it doesn’t translate to a national presidential strategy, especially since too many of those running assume they have or can obtain a national mandate, which they don’t have and most likely won’t obtain.

Eighth, none of the Democratic candidates seem to realize any of the above, and while Nancy Pelosi does, her influence over who runs and what they say is limited to advice and counsel, which none of the young hotshots are likely to heed.

So brace yourselves for a second Trump term, unless his health fails suddenly… and that might not even keep him from winning, so long as he’s alive and able to tweet.

Not Listening, Not Being Taught… or Not Caring?

The 1960s, and especially 1968, were a tumultuous time in U.S. culture and history. In the middle of the Vietnam War, there were continual protests, flag burnings, and draft card burnings across the country. Students attacked nearly 200 ROTC buildings on college campuses, and there were violent protests against the war at more than 250 colleges. There were protests everywhere, especially in Washington, D.C. At one protest at Kent State, actually in 1970, National Guardsmen shot student protesters, killing four and wounding nine others. Between 30,000 and 40,000 young men fled to Canada rather than be drafted into the army and fight in Vietnam.

There were more political killings and attempts than at any other time in U.S. history. President Kennedy was killed; Texas Governor John Connally was wounded in the same attack; Senator Robert Kennedy was killed while running for the Presidency. George Lincoln Rockwell, head of the American Nazi Party, was assassinated. Numerous black leaders were killed: Martin Luther King, Medgar Evers, Malcolm X, and James Chaney are the most notable, but the Civil Rights Memorial in Montgomery, Alabama, lists 41 civil rights workers who were killed because of their efforts to obtain civil rights.

After Martin Luther King’s assassination in 1968, riots erupted across the nation in more than 100 cities, including Washington, D.C., Chicago, Baltimore, Kansas City, Detroit, Louisville, New York, Pittsburgh, and Cincinnati. More than 40 people died, over 1,500 were injured, and more than 15,000 people were arrested.

The arrest of an African American motorist by a California Highway Patrol officer in August 1965, and the scuffle that followed, set off riots in the Watts area of LA that lasted six days, with 34 deaths, over a thousand serious injuries, more than 3,400 arrests, and property damage in excess of $40 million [roughly $300 million in today’s dollars].

When my wife the professor brings this up to her students, most of them look blank. When she pointed out to her female students that little more than a century ago women were essentially property, that women couldn’t vote in most of the U.S. until 1920, or that women couldn’t get credit cards without the approval of a husband or father until the mid-1970s, they didn’t believe her initially. Then they just shrugged.

The other day, I got an email from a young woman, an educated young woman in her late twenties, who asked me why I’d said the 1960s were a more turbulent time than the present. So I started asking other educators I know about this, wondering if what we’d seen was just an outlying oddity. It might still be, but the half-dozen other educators I talked to had similar stories.

From what I’ve seen, it’s almost as if the younger generation doesn’t know recent U.S. history, and, to me at least, this seems to be a recent phenomenon. I was taught about World War I, the Great Depression, and other historic events that occurred in the generation before I was born. What bothers me is that there seems to be an assumption on the part of the younger generation that progress is a given. A study of history shows it’s not, but those who don’t know history won’t see what can happen… and what already has. Rome did fall. So did the Eastern Roman Empire and the Ottoman Empire, not to mention the British Empire (although it didn’t fall so much as it was relinquished because of economic and political pressures), and a lot of others. Germany descended from a comparatively open and free nation into Nazism. For centuries, Europe was racked by wars and uncounted deaths because religion dominated politics.

In some ways, there’s nothing new under the sun, that is, if you know what came before. If not, you’ll get what you deserve. The problem is that so will those of us who saw what could happen and were ignored because the majority believed progress would continue without work and without an understanding of the past.

So You Want to Be A College Professor?

Once upon a time, being a college professor was thought to be an existence of intellectual pursuits and the imparting of knowledge to students who truly wanted to learn. Like all fairy tales, or nostalgia for the past, it has never been that ideal or idyllic, but the impact of the current world on collegiate teaching has been significant… and often brutal.

A generation ago, and certainly two generations back, if you were financially and intellectually able to obtain a doctorate, your odds of obtaining a full-time, tenure track position were far, far better than now, given that in the immediate post-World War II period, close to eighty percent of teaching faculty were in full-time positions. Today, 73% of all college instructors or professors are part-time adjuncts without benefits, a high percentage of whom have doctorates and are unable to find a full-time position with benefits. Part of the reason for this is that more and more students have gone on to gain terminal professional degrees – far more than there are full-time academic positions. At the same time, the massive demand for college degrees has coincided with a growing reluctance of state governments to support higher education. Two very predictable results have been the massive hiring of cheaper adjunct instructors and the burgeoning amounts of student debt.

Then there’s the problem of how students have changed. Undergraduate degrees are now regarded as “credentials,” particularly by politicians, parents, and even students. The combination of skyrocketing tuition, the consumerism of student evaluations, and the need for credentials has taken a huge toll on academic rigor. For their money, most students expect to receive a grade of A, and they’re disappointed, if not angry, if they don’t get it, and they’ll take that anger out through evaluations of any professor who denies them the grade they think they deserve. All too many of them are also ultra-sensitive, and any professor who uses sarcasm, particularly in written form, risks disciplinary action at many universities. And in this age of educational consumerism, colleges and universities are factoring student evaluations into decisions on faculty raises, tenure, and promotion. The predictable result is less academic rigor and a gradual dumbing down of course content.

Recent studies have also shown that students now entering college have a social and emotional maturity some 3-5 years behind that of students of a generation ago, which is why teaching courses taken in the first two years of college is often more like teaching high school used to be – especially in state universities and colleges. In addition, because of the proliferation of electronic devices, especially iPhones, most college students today have difficulty concentrating and maintaining focus on anything – except electronics – for more than a few minutes. This combination, along with increased student fragility and sensitivity, is another reason why university after university has had to hire more and more counselors and psychologists. Too many of these students literally do not know how to learn on their own or how to handle the smallest adversity, and they’re overwhelmed.

To cope with all of this, administrators and politicians keep looking for the Holy Grail of education, trying new methods, new means of teaching, reinventing a new wheel, so to speak, even before they can determine whether the last wheel they tried really worked.

One local university here announced just last week that it is going to a new “trimester” system, starting next January, so that students can graduate in three years. This will shorten each semester from 15 academic weeks to 12, which will likely result in more dumbing down of course content, because teaching is not like high-speed automation. Cutting out roughly twenty percent of teaching time will mean less will be taught, and less will be learned. The university faculty is aghast at the timetable, because none of this was discussed with faculty. Higher-level courses aren’t developed in cookie-cutter fashion. It takes time to develop an effective way to present material, and there isn’t time to carefully redo every course in just a few months, especially while teaching a full load at the same time. The impact will be even worse for adjunct faculty, because they don’t get paid for course development, and most are barely making ends meet anyway.

The result will likely be a disaster, and will take several years to straighten out, if it even can be, but the university president is clearly responding to parental and political pressure to make education quicker and more affordable so that students can get that “credential” sooner and cheaper. No one is talking about whether they’ll learn as much.

Now… do you really want to be a university professor?

Priority By Budget

In early March, President Trump released his budget proposal for the 2020 fiscal year, a proposal that would set federal research spending at $151 billion, or roughly 3% of total federal spending, which would cut overall federal research spending by 11%, or almost $17 billion. Now, that’s only his proposal, and the final say on federal spending lies with the Congress, but proposals do indicate the President’s priorities. Under Trump’s priorities, the National Institutes of Health, the National Science Foundation, and the Department of Energy’s Office of Science would all face cuts of more than 12%, while science funding at the Environmental Protection Agency would drop by 40%.

After World War II, the U.S. accounted for almost 70% of research and development spending world-wide. Today, that figure is 28%, and while the shift can be partly explained by the growing ability of the rest of the world to fund research, the fact is that the U.S. is being badly outspent, particularly in the area of basic research.

At present, total U.S. spending on basic research comprises less than 17% of all U.S. R&D spending. About three-quarters of U.S. basic research is funded by the federal government (44%), state governments (3%), institutions of higher education (13%), and other non-profits (13%).
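A quick back-of-the-envelope check of those shares, using the figures from the paragraph above (the script is purely illustrative arithmetic, not data from any official source):

```python
# Funding shares of U.S. basic research, in percent, as given above.
shares = {
    "federal government": 44,
    "state governments": 3,
    "higher education": 13,
    "other non-profits": 13,
}

total = sum(shares.values())
print(total)  # 73 -- i.e., "about three-quarters"
```

The remaining quarter or so is the business share discussed next.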

To make matters worse, the majority of R&D spending by U.S. businesses goes toward product development, with only about six percent of business R&D funds going to basic research, and over the last four decades, the contribution of U.S. corporations to new basic research has dropped from 30% of published research to less than 10%. This isn’t surprising, because basic research is unpredictable and often expensive, but without basic research, in time, product development will slow dramatically, if not come to a virtual halt. That’s why federal support of basic research is absolutely necessary if U.S. industry is to continue to compete in a global market.

Then add to that the fact that climate change and its environmental effects are a persistent and real future problem… and Trump wants to cut environmental research by 40%?

All that suggests that the President’s priorities are anything but for the future.

Not So Fast!

A great many people in the United States feel that progress is an unmitigated good. I’d even agree that real progress is indeed good, BUT… these days so much of what is considered progress is either a commercial scam or someone putting out a product that’s really not progress at all… or a marginal improvement to an existing product or system, and all too often those “new” products that are really incremental improvements are rushed to market with bugs in them.

When it’s a case of computer software, such bugs can be an annoyance, as in the case of a personal computer, or far worse, if that software is part of something much larger.

The latest tragic example of this is the Boeing 737 Max 8, an aircraft that, in terms of passenger convenience, travel time, and maintenance time, is at best a marginal improvement, BUT it’s nine feet longer than the 737 Max 7, carries 21 more passengers in standard configuration, and is fourteen percent more fuel-efficient. The cabin design also has to be less expensive and lighter, because Boeing removed all the passenger seat consoles in favor of “streaming entertainment,” meaning that passengers have to carry their own computer, cell phone, or tablet.

Boeing also installed software intended to be more “pilot-error”-proof, except that, in the rush to get the 737 Max 8 into service in the competition against Airbus, Boeing apparently went “light” on pilot re-training, claiming that not much more was needed since the aircraft was the same “type” as the most recent 737 predecessors. That’s largely true… except that when certain sensors malfunction, the aircraft software drops the nose, even on take-off, and the pilot has to know exactly which three switches to turn off… and know that in seconds. In the Ethiopian and Lion Air crashes, the pilots didn’t know that, even though the Ethiopian captain had more than 4,000 hours in earlier versions of the 737.

Now, with over four thousand 737 Max 8 aircraft sold and hundreds delivered, the probabilities of such a malfunction are low… but the consequences can be brutal if and when they occur. In this case, as a result of this rush to market, two airliners crashed and killed everyone on board, and Boeing has admitted that the grounding of the 737 Max 8s will cost the company $150 million in the first quarter alone, and possibly over $1 billion before all the glitches are fixed.

Was the rush really worth it? To anyone but Boeing, that is?

After Mueller?

Once again, the anti-Trump factions have underestimated the man. Now, as anyone who’s followed my blog knows, I’m anything but a fan of the President, but his opponents are making the same mistakes that so many have made regarding Trump over the years. First, because Trump makes statements that seem stupid to his opponents, and that are often factually inaccurate, they conclude that Trump is stupid. Second, because they think he’s stupid, they underestimate him.

Trump is extremely dangerous because he ignores the “conventional” rules of politics. In general, conventional politicians either use correct facts in the general ballpark of “truth,” or use correct facts incorrectly… or they appeal to the beliefs and ideals of their constituents without directly invoking the facts.

Trump will use falsehoods and misstatements in a continuing flurry of tweets and public utterances without any regard for the truth. He doesn’t care in the slightest about the factual truth. He cares only about encouraging his supporters and continually reassuring them that he’s on their side. And because he has the “bully pulpit” of the Presidency [as Teddy Roosevelt once called it] and because the media covers that pulpit non-stop, he has effectively overwhelmed truth and accuracy in energizing his supporters.

He’s also used various powers to have others do the dirty work, as Michael Cohen’s testimony revealed, and, so far as anything disclosed so far has revealed, he’s been careful not to leave his own fingerprints on anything. He’s excellent at suggesting that others should do the dirty work, but he appears to do it in a fashion where he never directly orders them to do something he knows is against the law. It’s not against the law to buy silence from a porn star. It may well be against the law to do so with campaign funds, but Cohen bought off the porn star with his own funds, and then apparently was reimbursed from a non-campaign account.

He suggested that the Russians look into Hillary’s emails and other matters, but he apparently never met with Russian agents to discuss anything specific. Others may have, thinking that was what Trump wanted, but it’s doubtful that Trump ever ordered anything that specific.

In the meantime, with his continual barrage about immigration, more and more Americans are feeling that immigration is a huge problem. While immigration is not an urgent national emergency, and not one that a wall will solve, it is in fact a problem, because the U.S. hasn’t spent, and likely won’t spend, the money to deal with immigrants humanely and effectively, and that shortcoming will only increase the problem, especially if the Democratic-led House of Representatives doesn’t do something besides oppose the wall.

With Trump’s insistence that he did not collude with the Russians [and why would he have needed to, when they were working on his behalf without any meetings] and the likely conclusion of the Mueller report that there’s no proof of such collusion, more and more Americans are going to believe that the Democrats have in fact been “witch-hunting.” Unless the prosecutors of the Southern District of New York come up with solid and indisputable evidence that Trump personally did something not only illegal, but significantly illegal, it’s very likely that Trump will run for and win a second term, because every minute of his presidency he’s been selling himself to his base and their friends, while the Democrats are united only in their dislike and disdain for Trump… and still fail to understand exactly what he’s doing.

What he’s doing follows the formula of every successful dictator. It’s very simple. Demonize and minimize your opponents while reassuring your base – emotionally, because facts count for little – that they’re special, that you’re the only one they can trust… and that the “elites” have sold them down the river. And the Democrats also have the “small” problem that much of the “old” middle class has indeed been sold down the river, if not directly by elites, then indirectly through technology, computers, and automation. Protesting that “you” didn’t do it doesn’t address the fears and needs of Trump’s base.

Nor does it matter that Trump has rewarded the rich far more than his base, or that he seldom tells the factual truth. He’s selling “emotional truth,” what people want to believe… and he’s anything but stupid.

The stupid ones are those who think he is.

Corporations and “Limited Liability”

As I pointed out in the earlier post about PG&E, the corporate structure shields corporate executives from personal responsibility and effectively allows the corporation to pay large sums of money as recompense or as fines, even for felonious conduct that, if attributed to an individual, could well result in prison time. In the San Bruno pipeline explosion of 2010 that killed eight people, injured 58 others, and destroyed 38 homes, PG&E was found guilty of six felony counts of violating pipeline standards, and not a single individual was held responsible. Damages and fines exceeded $2 billion, but Peter Darbee, the chairman and chief executive of PG&E Corp., the utility’s parent holding company at the time, retired a year later with a golden handshake of some $35 million. Christopher P. Johns, who was president of Pacific Gas & Electric Co., the utility subsidiary, in 2010, retired as its vice chairman in December 2015 with a pension package of $17.8 million.

BP [formerly British Petroleum] has literally pages of environmental and safety violations, including the Deepwater Horizon explosion that killed 11 people and injured 16 others, not to mention fouling much of the Gulf of Mexico with crude oil. While the company pleaded guilty to 11 counts of felony manslaughter, two misdemeanors, and one felony count of lying to Congress, and agreed to pay more than $4.5 billion in fines and penalties, not a single individual was held responsible. Before that, in 2005, the BP Texas City refinery explosion killed 15 people and injured 180 others, and was followed two years later by toxic chemical releases that injured another 143… and again no one was held personally responsible.

The three largest creators of toxic waste Superfund sites are Honeywell, Chevron, and General Electric. General Electric so polluted the Housatonic River in Massachusetts and the Hudson River (some 200 miles’ worth) that both were classified as Superfund toxic waste sites, and despite lawsuits and EPA action, GE still hasn’t completed the clean-up more than 30 years later. Honeywell (through its subsidiary, Allied Chemical) dumped mercury into Onondaga Lake for over sixty years and has so far spent over half a billion dollars on remediation. Chevron has acknowledged that it’s a “responsible party” at 180 Superfund sites, and it has been assessed over 20 multimillion-dollar fines for environmental violations.

In addition to the issue of no executive being personally responsible for criminal environmental violations and felonies, there’s another large problem with the corporate liability structure. That’s the fact that none of the money paid in fines, damages, and remediation comes out of the pockets of corporate executives. It comes out of corporate revenues, and that means that the executives are not only shielded from criminal charges, but they’ve passed off the costs to others.

While some form of limited corporate liability is likely necessary, letting the CEOs and other executives off scot-free is one of the principal reasons why corporations try to pay their way out of trouble with what amount to shareholder funds… and why ethics mean so little to them. They really don’t answer to anyone.


Back in the dark ages when I wrote my first story, the few computers that existed were generally refrigerator-sized, if not larger, and extremely rare. I’ve never been against technology, and back then, the most advanced off-the-shelf technology for a writer was an IBM Selectric typewriter, non-correcting. I later upgraded to a correcting Selectric with an electronic spellchecker of sorts. Finally, when the 286 processor was developed, I shifted to using a computer to write. That made me a little later in adopting computers than some other writers, but the 286 was the first processor that fit my writing style. It also meant that, for the first eleven years of my professional career, I typed out every page of every draft of my stories and novels.

There’s one effect of the shift to electronic production of manuscripts that’s seldom noted, except by those of us who had to struggle with the need to turn in a clean typescript, laboriously typed out manually, because there was no real alternative. We had to be careful, because, even with correcting tape or Wite-Out, too many mistakes meant rejections or retyping the entire page by hand – especially if you were submitting your first novel. Unless, of course, you were wealthy enough to hire a secretary, which very few struggling writers were… or are, even today.

We had to be careful. There were no electronic spellcheckers and no grammar checkers, and one of the unspoken requirements for a real editor to look at your work was submitting at least a moderately clean manuscript with correct grammar, except where required in dialogue. Also, redrafting a novel took a LONG time. In that time period, one of the great advantages Isaac Asimov had was that he could type well, accurately, and moderately fast and that his understanding of grammar was good enough that he usually only needed to type one draft.

Today, far too many would-be writers don’t really understand grammar, and they leave the “details,” such as spelling, to the computer… and it shows. Unfortunately, this excessive reliance on computers extends far beyond the mechanics of writing. Too many young people don’t understand the limitations of Google or other search engines, and they’re used to multiple-choice tests, instant answers, and instant satisfaction.

The result is all too often sloppiness in all aspects of their work… and what’s worse, all too many of them don’t see that sloppiness… or care.

Despite all protests to the contrary, technology amplifies everything, including sloppiness.

Cultural Appropriation?

So far as I know, as an author I haven’t been attacked for cultural appropriation, but we’re seeing a continuing barrage of such charges in many areas in the U.S., sometimes justified, sometimes not, and sometimes… it’s just hard to tell.

For example, take Gilbert and Sullivan’s operetta The Mikado. It was written as a spoof and critique of upper-class British manners and mores through the use of “faux-Japanese” culture. There’s no doubt that it does a certain amount of violence to Japanese historical culture, but it was neither meant to be accurate about that culture nor meant to demean it. And, of course, it was written over 130 years ago. But, depending on how one defines cultural appropriation…

Then, in the opera world, there’s a trio of well-known operas that have been at the center of the cultural appropriation debate – Aida, Madama Butterfly, and Turandot – set, respectively, in ancient Egypt, 1880s Japan, and an unnamed year in China, most probably after the twelfth century. All three operas are, as operas usually are, a mish-mash of sources that take great liberties with history. Turandot, for example, is actually based on a story taken from the 12th-century Persian poet Nizami, but transferred from Persia to China. For all that, there’s been an outcry in some quarters about whether these operas should be performed and who should sing what roles. But the music is Italian in nature and doesn’t appropriate the music of the countries in which the operas are set, and two of the stories [Aida and Turandot] are totally fictional, including much of the supposed “culture,” while the other is based, if loosely, on an actual occurrence.

I think it’s fair to say that, in the case of the operas themselves, neither Aida nor Turandot qualifies as cultural appropriation, simply because about the only thing we can determine was “appropriated” was the name of the country. Everything else is effectively fiction without specific definable cultural roots. For Madama Butterfly, however, there is a definite aspect of appropriation in the portrayal of the heroine, simply because, for the most part, she’s portrayed as a demeaning stereotype that was widely accepted in the West at the time Puccini wrote the opera, more than a century ago.

The question of whether it’s cultural appropriation if a singer doing the role isn’t of the ethnicity of the role strikes me as largely irrelevant. In recent years, singers doing these roles have been from multiple ethnicities; in my opinion, the question should be about how well they sing and perform, not their ethnicity. The larger question is whether individual opera companies are preferring singers based on ethnicity, as opposed to ability.

In F&SF, the cultural appropriation issue seems to center on whether a writer should write about cultures not his or her own. Now, as my readers know, I’ve never depicted a present-day or near-future culture that’s of a totally different ethnicity from my own. I have speculated at times about where various cultures might end up in the future, based on what I’ve studied and observed, but I don’t believe that constitutes cultural appropriation, and I wouldn’t call it that even if a writer from a different ethnicity speculated on a dismal future for the United States or Caucasians in general.

Speculation about the future is what F&SF does, and no writer should be criticized for the culture she or he writes about, but, by the same token, we should make a concerted effort to be accurate, and to be open to balanced and thoughtful criticism, painful as it sometimes can be. I would hope, of course, that such speculations be done with care, but then, all writing should be done with care.

Newspapers and “News”

I’m beginning to wonder how long print newspapers will last. I happen to like print newspapers, at least the way newspapers used to be printed and distributed. These days, however, I’m finding them of less and less value.

For example, take our local daily newspaper, which theoretically serves Cedar City and St. George. It has a smattering of national news, largely garnered from the USA Today news feed, as well as perhaps two local stories, and a few more St. George stories. It used to cover cultural and entertainment events in Cedar City as well as the local university and high school sports teams. I’ve seen exactly one university sports story in the past several weeks, one high school sports story, and no cultural stories for Cedar City… and maybe one story a day about local Cedar news. And the coverage of the university in St. George isn’t much better, except about the scandal that occurred when the university president tried to fire tenured professors on various trumped-up charges.

And for 20% of the content that the paper once delivered, the national conglomerate that owns the paper has now doubled the subscription price.

I also take the Salt Lake Tribune, except I can only get the “paper” edition on Friday, Saturday, and Sunday, and most of the time the paper isn’t delivered before 11:00 A.M.; sometimes it’s after noon. Like the local paper, the Tribune has cut back on staff and coverage. It seldom covers events south of the Salt Lake metro area, and, from what I can tell, it’s almost given up on covering books, with one Sunday edition dealing with books every month or so.

We do have a weekly “county” newspaper, and it does a far better job on local news and local high school sports, with spotty coverage of the university sports, and minimal, if any, coverage of cultural events at the university.

Electronic media, at least so far, hasn’t filled in the gap, and as a result, attendance at cultural events is down markedly, unsurprisingly, since they’re not being covered, and one concert series that’s been in existence for over 80 years may well phase itself out in the next year or so.

One of the problems, especially in smaller towns and rural areas, is that there are really no “general” electronic news/communication networks. While I hate doing it, I can cobble together national news in some depth from internet news services, but for local news… forget it.

Burgeoning Corporate Irresponsibility

On January 29th of this year, the mega-utility Pacific Gas and Electric Company (PG&E) filed for bankruptcy, citing billions of dollars in liabilities stemming from California wildfires, including the 2018 Camp Fire that destroyed the town of Paradise, California, and killed 86 people. PG&E has acknowledged likely responsibility for triggering the Camp Fire.

Although PG&E has assets estimated at around $70 billion and liabilities of approximately $50 billion, the tort claims alone are estimated at around $35 billion, and PG&E filed what is termed a defensive bankruptcy to protect its assets and to cap the amount it may have to pay in claims against it. In addition, PG&E canceled the planned distribution of $130 million in bonuses to rank-and-file employees.

Some have stated that PG&E is the first major corporate “victim” of global warming, but while global warming was certainly a factor in exacerbating the damage caused by the fire, what tends to get overlooked is that PG&E has a half-century long history of environmental irresponsibility.

Not only did PG&E fail to build and insulate its power distribution system adequately, which led to the Camp Fire disaster, but way back in 1952, PG&E began dumping chromium-tainted wastewater into unlined wastewater spreading ponds around the town of Hinkley, California, located in the Mojave Desert (about 120 miles north-northeast of Los Angeles). From 1952 through 1966, PG&E dumped some 370 million gallons. To date, PG&E has spent over one billion dollars on damage claims and remediation. The movie Erin Brockovich told the story of the damage claims and how PG&E denied and tried to cover up what it had done. Today, Hinkley is almost a ghost town.

Then PG&E filed for bankruptcy in 2001 because it was caught in a cash-flow squeeze when the wholesale price of the power the company purchased on the open market rose above the rates the company was allowed to charge by the California Public Utilities Commission. Prior to the 2001 filing, PG&E could see in advance what was happening, and, as part of its “bankruptcy estate planning” process, “pushed” extensive cash holdings into subsidiaries for distribution to shareholders, effectively “ring-fencing” those profits to “protect” them from creditors. As a result, the 2001 PG&E bankruptcy resulted in a settlement that would ultimately cost its ratepayers approximately $7 billion.

In none of these disasters was any PG&E executive ever held criminally liable or found negligent. The executives collected hefty compensation while the costs of dealing with the problems were foisted onto the rate-payers, rather than falling on either the stockholders or the executives themselves.

The original idea of limited corporate liability was to limit liability claims to the corporate body, and not to those who ran the corporation, on the grounds that no one would undertake building massive steel mills, factories, or the like when a single disaster could destroy them personally. That rationale, unhappily, still makes sense, but the evolution of law and the corporate structure means that even when corporate actions, as in the case of PG&E, kill people and destroy entire towns in violation of laws and standards, no individual is ever held responsible, and the corporation shifts the costs of those violations onto its customers, rather than to the stockholders or executives.

Until executives are personally held responsible for such violations, nothing will change, and interestingly enough, right after PG&E declared bankruptcy, the stock price went up almost twenty percent.


A headline in the morning paper caught my attention, largely because it shouldn’t have. The headline? “Border Vote Tough for GOP Senators”

And why is this tough for Republicans? Apparently, a great many of them believe that the President’s declaration of a “national emergency” infringes on the rights of Congress under the Constitution. Now, I happen to agree with them. The Constitution is rather specific in declaring that the Congress controls the federal purse strings, but these Republican senators apparently fear that voting their principles isn’t a good idea if it just might “upset” their beloved [or feared] President… and, of course, his far-right supporters.

As I recall, we had eight years of Republicans protesting Presidential “overreach” by the last Democratic President, and he didn’t go nearly so far as to declare a non-existent national emergency to build a wall because Congress hadn’t given him the money he wanted. His action that most upset the far right was to declare he wouldn’t deport teenagers who’d lived the vast majority of their lives as Americans.

Now we have a Republican President who’s gone against the Constitution and against a principle that Republicans claimed for years that they hold dear… and they don’t want to vote for their principles? After years and years of protesting about Executive Branch overreach?

As one fictional movie protagonist said, in protesting McCarthyism, “People are their principles.” But only if they act in accord with those principles. Otherwise, they’re just self-serving hypocrites.


In 2014, David McCullough, Jr., published a book entitled You Are Not Special, in which he took dead aim at society’s and the education establishment’s efforts to make students feel “special.” McCullough was dead right, and very little has changed since then, especially not for the better.

But, unhappily, it’s not just students who are demanding to be treated as special. It’s pretty much everyone in the United States, or so it seems.

Most dictionaries define special as “distinguished by exhibiting unique, superior, or outstanding characteristics” or in similar terms.

According to the U.S. Census Bureau, the current U.S. population is 327 million people. I’ll grant that everyone is “unique,” in the same meaningless way that every snowflake is said to be unique, but not everyone, not even a majority of people, is outstanding in any way, except, of course, to ourselves and the handful of people who truly care about us.

Being, or not being, of a particular race, creed, ethnicity, or gender does not make one “special.” Believing in a particular creed or religion does not make one special. Having great innate intelligence or athletic ability does not make one special. What makes anyone outstanding is not that a person exists, but what that person has done with that existence, particularly what they have done that makes the world, or a part of it, a better place in some fashion.

That view is, of course, somewhat Calvinistic, and definitely at odds with the idea that merely believing in a deity is enough to obtain some sort of stature or theological grace. In the end, what gets things done, especially for the better, are focused and consistent actions to that end.

You’re not special…except through your actions.

“Ethics” and Hypocrisy

For those of my readers who don’t know this, I am and always have been a Republican.  I also have rarely voted for a Republican candidate in the last 15-20 years, except in the primary, where I’ve cast my ballot for the least reactionary candidate [there are no moderate Republican candidates in Utah, nor any with any “liberal” traits].

Today, I only see a handful of Republican office-holders who are actually willing to call out both parties on their self-serving propaganda and who are promulgating positive and workable solutions… and they’re getting scarcer with every election. I support them… and keep hoping.

Back before I was involved directly in politics, the Republican Party had elected officials who ranged from the conservative to the moderately liberal, and even “Mr. Conservative” – Barry Goldwater – was pro-choice.  Back then, the GOP endorsed fiscal moderation, and was far less in favor of subsidies [except for those to farmers].  The party was for a strong national defense, but had a president who bluntly warned against the “military-industrial complex.” Most Republicans were perfectly happy to welcome the brains and bodies of bright foreign students who came to the U.S. to study and who wanted to stay.  The GOP believed in “God and Country,” but also in separation of church and state, and felt that NATO and other allies were important in opposing communist adventurism.  There were extreme “rightists” back then, such as the John Birch Society, but ultra-conservative members of Congress were a definite minority.

That began to change about the time when I became the legislative director for a conservative Republican congressman after the 1972 election and came to Washington, D.C.  That was the time period when the far-right Republican Study Committee and equally conservative Heritage Foundation were created, largely in reaction to a Democrat-dominated House of Representatives and Senate. Over the next two decades, more and more liberal and moderate Republicans were defeated, and the GOP became more and more stridently conservative on social and religious issues, tacitly [and sometimes more than that] opposing the Equal Rights Amendment, and opposing as much as possible environmental and civil rights issues.

At the same time, any pretense of fiscal conservatism vanished with the Reagan administration and the idea that tax cuts for the wealthier Americans would bring prosperity to everyone, but, in the end, all that meant was that the Republicans wanted welfare for businesses and the wealthy and the Democrats wanted welfare for the poor and underprivileged…and both kinds of welfare were funded increasingly through deficit financing.  Both parties cooperated in adding to the Defense budget by keeping unneeded military bases open, by micromanaging defense procurement in order to maximize defense jobs in the districts of influential members of the House and Senate, and by often legislating the procurement of weapons and equipment not requested by military leaders.

In recent years, Republicans have pushed for more “deregulation,” especially financial deregulation, and tax cuts for the wealthy; they’ve effectively cut back on antitrust enforcement and environmental protection, and failed to fund VA hospitals and health care for all the veterans wounded in combat assignments all over the world.  They’ve also pushed for “religious” provisions of all sorts in health care and education.

In short, they’ve abandoned fiscal prudence, and rewarded the rich, and created all sorts of indirect subsidies for businesses. They have tried to gut the separation of church and state. They’ve pushed for measures to make it harder for minorities and the less affluent to vote and be politically active.  They’ve tried to overturn and roll back air quality standards affecting the poorest Americans, and they’ve turned over public lands to mining companies. Most of all, they claim that they’re for “working Americans,” when almost everything they espouse these days will hurt those working Americans.   

The Democrats want to spend far too much, they go too far in the area of political correctness, and they don’t understand that “culture” isn’t the same as “race” or ethnicity. But they’re trying, most imperfectly, to make life better for the majority of Americans, and they have plans to pay for what they want, plans which, imperfect as some of them are, are far better than the proven unworkable trickle-down economics of the Republicans. What the Republicans support, for all their rhetoric to the contrary, are measures designed to make life better for those who already have the good life, along with vague promises to dissatisfied workers that will do absolutely nothing for those workers, not to mention wasting money on a wall across the southern border that won’t deal with the real immigration problems and will create severe environmental difficulties.

If I’ve counted correctly, there are something like 37 individuals connected with the Trump campaign who have either been indicted or pled guilty to various charges of corruption, and they’ve been charged by a Republican prosecutor.  I’m fairly sure that’s a record for such charges, but then, the last time we had such a scandal was Watergate… and, funny thing, that was a Republican campaign and administration, too.  And, oh, yes, the last big Presidential corruption problem before that was the Teapot Dome scandal, in which Republicans tried to sell off, at cut-rate prices, U.S. naval oil reserves to oil moguls.

But I guess that “ethical” for Republicans these days means cutting back on rights, benefits [including breathing clean air], and health care for the disadvantaged while providing subsidies and tax cuts to businesses and the wealthy and claiming that all those “new” jobs, most of which are “service” jobs that pay far less than the old manufacturing jobs, are a great benefit.

Hypocrisy, anyone?

The “Outsider” Danger

What too many people accept without truly understanding is that society – any society – is held together by two sets of rules – those set forth in law and those adopted and accepted through custom and habit.  Some societies rely far more heavily on religious and historical customs than on laws, and other societies, such as the United States, rely more heavily on written laws and written constitutional frameworks.

But even in the United States, there are customs that have the force of law – until they don’t.  For example, George Washington set an example of a President serving only two terms… and that custom had the force of law for some 144 years… until Franklin D. Roosevelt decided to run for a third term.  Eleven years after that, the 22nd Amendment, limiting Presidents to two terms, became law, because, once the custom was flouted, anyone else might flout it as well.

We’re now seeing another “custom” flouted.  Until Trump decided that immigrants were cause for a national emergency, no U.S. President would have considered unarmed and often starving refugees or illegal immigrants as a national emergency.  In fact, at one point in our history, no immigrant was “illegal.”

What Trump did in proclaiming a national emergency for political purposes was break the custom that a national emergency be a truly national emergency. No matter what rhetoric is employed, those refugees and illegal immigrants do not threaten the nation’s overall economy or safety, which the “customary” and accepted definition of national emergency historically required. The failure to fund a southern border wall is not an “emergency,” and no historical U.S. politician would ever have considered it so.

Trump is not a politician.  He is a self-serving and narcissistic demagogue who has neither the knowledge nor the understanding of American political culture… and he represents the true downside and danger of investing power in an “outsider.”   Outsiders don’t care about customs, or about why certain practices have been followed. Sometimes, for a nation, that is useful, but it is always dangerous.

Unless Congress acts, as it did in the case of the 22nd Amendment, or unless the Supreme Court rules against Trump, any subsequent President will be able to define on his or her own terms what a national emergency is.  Will it be the widespread possession and use of firearms?  Or perhaps strikes by public employees? Or “unfair tariffs” by other nations? Or possibly the “epidemic” of abortion?

Electing “outsiders” is seldom the best way to break “gridlock” because lots of other things can get broken in the process, often disastrously, as we’re seeing not only here in the United States, but elsewhere in the world.

But then, too often when people get angry or frustrated, habits or customs and the reasons for them get discarded without any thought for what will result.

A Little History, Please

And no, I’m not talking about politics this time.  I’m talking about newer F&SF writers who should know better… not about the history, if there even is any, in their books, but about what’s been written before and by whom. 

I’m beginning to get weary of newer writers or F&SF critics or columnists writing articles or giving interviews or blurbing books who talk about the “new” way they or other writers have addressed an issue or a problem, or how this or that issue hasn’t really surfaced before… or how these books open new vistas… or some similar cliché.

Roger Zelazny was writing SF about cloned bodies and mental transplants in Lord of Light fifty years ago, and also about men becoming women and vice versa.  Ursula K. Le Guin explored basic gender issues and preconceptions pretty thoroughly, also fifty years ago, in The Left Hand of Darkness, as well as environmental issues in The Word for World is Forest.  It’s also been overlooked, until recently (although it’s still not that widely known), that her protagonist in the Earthsea books was a person of color.  J.G. Ballard’s The Drowned World is an earlier take [1962] on global warming and rising seas.

In 1909, E.M. Forster wrote a novelette entitled “The Machine Stops,” a tale of what happens when the mechanical entity that runs all of earth’s civilization fails.  Fred Saberhagen wrote about malevolent AIs [the berserkers] well before the “Terminator” movies.  Frederik Pohl and Cyril Kornbluth wrote about the takeover of the world by advertising executives in The Space Merchants back in 1952.

So… please be careful using phrasing like “new” or “fresh” or “unexplored.”  I know no one wants to admit that what they’ve done is a different approach to an old theme or a perspective from a slightly different angle, but, for the most part, that’s exactly what most writers who are cited as “new” or “fresh” actually do… and there’s nothing wrong with that.

For the writers who truly do something different and unique… well… most of them are ignored because most readers are uncomfortable with something truly unique.  A few manage to do the unique in a way that conceals how unique what they do is… and about one in a million turns out to be J.R.R. Tolkien.

A Different Approach to Balancing the Federal Budget?

Recently, various Democratic politicians have been pushing a range of tax options.  From what I can determine, and from what many experts are saying, most of them would cause more harm than good.  I’m not saying that we don’t need more federal revenue.  I’m saying that everyone is looking in the wrong places.

Let’s go back to basics.  First, you can’t tax people who have no income.  Second, despite the political rhetoric, it’s highly unlikely that a “wealth tax” is constitutional.  Third, a wealth tax would destroy a lot of entrepreneurs while only marginally inconveniencing financial types.  That’s because the wealth of entrepreneurs is usually tied up in stock, and having to sell large blocks of it to pay taxes could destroy the company, or at least damage it.  The same thing could happen to family-held companies without large cash reserves.  Fourth, extremely high marginal tax rates would cause either creative tax evasion or tax flight, both of which would leave the upper middle class shouldering the burden, not the wealthy.

BUT… there is another source of untapped revenue that has several advantages.  First, it targets the financial community, and that’s where most of the money is.  Second, it’s about time that Wall Street started paying the bill.  Third, it’s not really that onerous a tax when you think about it.  And fourth, some states already use it.

I’m talking about a transfer tax on every share of stock sold on every stock exchange in the U.S.  The tax would be levied on the seller, since the seller gets the money. With computers, keeping track shouldn’t be that hard.

I did some back-of-the-envelope calculations, which astounded me. The other day, which wasn’t extraordinary, the 100 stocks in the NASDAQ-100 had a daily volume of over 400 million shares traded.  Only one of those stocks sold for less than $6 a share, and the rest looked to average a hundred dollars a share.  The sales value of just those 100 stocks looks to exceed $10 trillion annually.  A one percent transfer tax on the sales of the shares of just those one hundred companies would yield about $100 billion annually… and there are over 4,000 publicly traded companies on U.S. exchanges.

Now I know that there are also some public start-up companies whose shares are valued in cents, and some sort of sliding scale would be necessary for them, but given how much Wall Street has benefitted, is a one percent transfer tax, or even half of that, too much to ask of the large established firms… and the algorithm-driven trading computers used by market profiteers?
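The back-of-the-envelope arithmetic above is easy to check. Here is a minimal sketch using the figures from the text (400 million shares a day at an average of roughly $100 a share); the 250 trading days per year is my own assumption, typical for U.S. exchanges:

```python
# Sanity check of the transfer-tax estimate. The share volume and average
# price are the article's rough figures; 250 trading days/year is assumed.

TRADING_DAYS_PER_YEAR = 250

def annual_transfer_tax_revenue(daily_shares: float, avg_price: float,
                                tax_rate: float) -> float:
    """Yearly revenue from a tax levied on the seller of each share."""
    annual_dollar_volume = daily_shares * avg_price * TRADING_DAYS_PER_YEAR
    return annual_dollar_volume * tax_rate

revenue = annual_transfer_tax_revenue(400e6, 100.0, 0.01)
volume = 400e6 * 100.0 * TRADING_DAYS_PER_YEAR
print(f"Annual sales value: ${volume / 1e12:.0f} trillion")   # → $10 trillion
print(f"Revenue at one percent: ${revenue / 1e9:.0f} billion")  # → $100 billion
```

Which matches the article’s numbers: roughly $10 trillion in annual sales value, and about $100 billion a year at a one percent rate.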

Corruption, Wealth, and Consequences

Everyone’s heard about the “Golden Rule,” but there’s a variation on it that I often heard from the boss of the consulting firm I worked for years ago in Washington, D.C. – “Those who’ve got the gold make the rules.”

Along those same lines, the sixteenth-century poet Sir John Harington wrote, “Treason doth never prosper, what’s the reason? For if it prosper, none dare call it treason.”  That quote was co-opted and often cited by the far right in the 1960s as the result of a book entitled None Dare Call It Treason, but unfortunately the principle still applies today, in a far different context, as a result of the “second” golden rule: call it Law by and for Affluent White Males.

By that, I mean that whether something is a crime or not, legally speaking, depends on what the law says, and the laws in the United States have been drafted, almost entirely, by affluent white men, except, to some degree as one reader pointed out, recently in California.  Now, it’s definitely a crime if someone takes a gun and steals your wallet.  Likewise, it’s a crime if someone physically assaults you for no reason, or if someone hacks your bank account and drains it.

But what about laws and a legal system that affect poorer people more than richer ones?  Or ones that use laws to transfer money from poorer people to richer ones?

It’s a crime for two parties to collude to force people to pay more for a product, unless one of those parties is a pharmaceutical company and the other is the federal Medicare program, because the government, thanks to a special law passed some fifteen years ago, is expressly forbidden from negotiating lower drug prices.  In effect, it’s a form of price-fixing, and it’s taking money from the pockets of every health care consumer and putting it in the bank accounts of all the pharmaceutical companies… and adding to executive and CEO bonuses.  Those higher costs pose a far higher burden on the poor and elderly than on the rich and famous. Right now, hundreds of thousands, if not millions, of diabetics are having trouble paying for insulin because the pharmaceutical industry keeps jiggering the formulations so that a basic medication more than fifty years old can cost diabetics almost $6,000 a year, up from $2,800 seven years ago. The cost of one Advair asthma inhaler has gone from $316 in 2013 to $541 this month.
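Those two price series imply steep compound growth rates. A quick sketch, using only the figures quoted above (and assuming the Advair span is about six years):

```python
def implied_annual_growth(start: float, end: float, years: float) -> float:
    """Compound annual growth rate implied by two price points."""
    return (end / start) ** (1.0 / years) - 1.0

# Insulin: roughly $2,800/year seven years ago to almost $6,000/year now.
insulin = implied_annual_growth(2800, 6000, 7)
# Advair: $316 per inhaler in 2013 to $541 now, taken as about six years.
advair = implied_annual_growth(316, 541, 6)
print(f"insulin: ~{insulin:.1%}/yr, Advair: ~{advair:.1%}/yr")
```

That works out to roughly 11.5 percent a year for insulin and about 9.4 percent a year for Advair, both far above general inflation over the same period.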

Robbery and embezzlement are both forms of theft.  Most robberies are committed by poorer members of society, and most embezzlement by those better off.  According to Federal sentencing reports in 2016, the average sentence for embezzlement was eight months, but half of those convicted were sentenced to no jail time at all; the average sentence for robbery was six years. By comparison, another study showed that the average sentence for fraud of less than $5,000 was five months, while sentences for greater amounts averaged twenty-two months; yet burglary/breaking and entering, which also doesn’t involve violence to others, carries an average sentence of 4.6 years.

So, the poor and disadvantaged criminals steal less money and spend more time in jail.  Part of that is, of course, that more affluent defendants can afford better attorneys.  Now, if the embezzlement actually hits the rich and famous, as in the case of Bernie Madoff, then the gloves come off.  That’s why Madoff is serving a 150-year sentence.

But my former boss’s “Golden Rule” also affects how new laws are made as well.

For example, recently the U.S. Supreme Court ruled that former Virginia Governor Bob McDonnell didn’t break corruption laws when he pulled political favors for a big donor in return for lavish gifts and personal loans. That kind of “you scratch my back, I’ll scratch yours” behavior was not barred by current corruption laws, the Court decided, despite the tens of thousands of dollars McDonnell and his wife received in gifts, cash, and loans.  This decision was the result of small changes in the laws over the years that now allow corporations and the rich to literally buy politicians legally, as also reflected in the Citizens United Supreme Court decision.

And conservatives wonder why minorities are often skeptical of the laws and the legal system?

States’ Rights?

I’ve often said that I live in the semi-sovereign theocracy of Deseret, and in the last month or so, the state legislature has decided to prove that.  As background, voters in the state voted two initiatives into law.  One legalized various uses of marijuana for medical purposes; the other expanded Medicaid coverage as allowed under federal law.

The medical marijuana initiative was largely supported because, in the last two sessions, the legislature had voted down every measure to legalize medical marijuana, and many felt that was because of the views of the LDS Church.

Why might people suppose that?  It just might be because the Republicans have a super-majority in both the state house and senate, and, interestingly enough, 81 of the 82 Republicans are members of the LDS faith, even though only about 63% of the state population is LDS.  The Democrats, all 22 of them, are, as best I can determine, roughly 60% LDS and 40% other faiths, which is far closer to the belief structure of the state.

Once the marijuana initiative passed, immediately after the election, the Republicans called a special session, declaring that, as law, the initiative was unsuitable, and immediately went to work to pass legislation to water it down and eliminate certain provisions.  They were successful in doing so, not surprisingly when you consider their faith and majority status.

The second initiative was to expand Medicaid coverage to the additional level allowed, but not required, by federal law. Now that the legislature has convened, the state Senate has passed and sent to the state House legislation to significantly cut back that coverage, on the grounds that, some five years from now, it will cost the state some $10 million a year to maintain.  But the point of the initiative was to cover all of those eligible but not covered, not just some of them, and the additional cost to the average taxpayer, beyond funds already in the state budget, would have been less than $10 per year.

The House speaker has indicated that the measure will pass, and the governor will sign it, and all the Republicans claim that it’s necessary for budgetary prudence, even though the state is running a budgetary surplus, and the legislature is mulling tax cuts… and, oh, yes, the state spends less per student on public education than any state in the union, by a wide margin.

But then, perhaps all this might, just might, have something to do with the fact that the LDS Church insists on a 10% tithe on gross income, and it doesn’t want its members overtaxed.

But… all this might also provide an example of why I’m just a bit leery when people trumpet “states’ rights.”