Archive for the ‘General’ Category

The Wrong Healthcare Issue

Right now, the House Republicans are fighting to get enough votes to pass their bill to repeal and replace the Affordable Care Act, aka “Obamacare.” The Democrats are staunchly opposed. Both sides are arguing over the affordability of healthcare and access to healthcare insurance.

As far as I can see, they’re both circling around the wrong tree, chasing each other’s tails. Insurance is only a symptom of the greater problem, and trying to deal with symptoms is not only expensive, but will also postpone dealing with the real problem, which continues to worsen. That problem? Healthcare costs. People need insurance because healthcare costs in the U.S. are effectively the highest in the world, and the vast majority of Americans don’t get healthcare as good as that of nations spending far less.

In 2015, U.S. health care costs were $3.2 trillion, making healthcare one of the largest U.S. industries, at nearly eighteen percent of Gross Domestic Product; fifty-five years ago, healthcare comprised only five percent of GDP.

Part of the reason for the cost increase is emergency room treatment, the most expensive single aspect of current healthcare, making up one-third of all health care costs in America. And a significant proportion of emergency room care occurs because people can’t get or afford other treatment for various reasons.

Another component of rising costs is the continuing increase in the costs of drugs and medical devices. According to Forbes, the healthcare technology industry was the most profitable U.S. industry sector of all in 2015, notching an average profit margin of 21%, with the most profitable company of all being Gilead Sciences with a 53% profit margin. And no wonder, given that the list price for the top-20-selling drugs in the U.S. averages more than twice as much as the prices for those same drugs in the E.U. or Canada.

While the pharmaceutical industry pleads high research and development costs, a GlobalData study showed that the ten largest pharmaceutical companies in the world in 2013 spent a total of $86.9 billion on sales and marketing, as opposed to $35.5 billion on research and development, almost two and a half times as much on marketing as R&D. Those ten companies had an average profit margin of 19.4%, ranging individually from 10% to 43%, with half making 20% or more. And since Medicare is prohibited by law from negotiating drug prices for its 55 million beneficiaries, the program must pay whatever price drug makers set.

The U.S. medical technology market exceeds $150 billion a year in sales, and in 2015 the gross profit margin for the medical equipment and supplies industry averaged 12.1%, according to data from

Studies of doctors’ compensation show that, over the past twenty years, physician compensation in general has increased far less than all other components of healthcare. In fact, annual earnings actually declined for the typical physician between 2000 and 2010. Annual earnings for physician assistants and pharmacists have increased at a greater rate. More to the point, as a percentage of total national healthcare costs, U.S. physician wages are small – approximately 9% – a number among the lowest in the developed world.

Hospitals’ costs have increased significantly, but not because they’re making money. A Health Affairs study analyzed hospital income and costs of more than 3,000 hospitals nationwide and found that fifty-five percent of hospitals lost money on each patient they served in 2013. This does raise the question of whether non-profit hospitals are paying more and more, possibly too much, for high-priced administrators apparently required by the bureaucratic and legal maze generated by the interweaving of private and public medical systems, government regulations, and insurance company requirements. Studies indicate that administrative costs make up twenty to thirty percent of the United States health care bill, far higher than in any other country. American insurers, meanwhile, spent $606 per person on administrative costs, more than twice as much as in any other developed country and more than three times as much as in many, according to a study by the Commonwealth Fund.

Then add to that the skyrocketing costs of malpractice insurance and often excessive court judgments in medical tort claims cases. While the amount is subject to dispute, it’s not inconsiderable and also adds to costs.

Unfortunately, neither the Affordable Care Act nor any proposed Republican replacement will do anything to deal with what I’ve mentioned, and what I’ve mentioned are only the most obvious causes of ever-increasing health care costs.

Political Messaging

Everyone who follows media anywhere in the world likely knows that President Trump sends messages via Twitter. What’s been almost lost in the Twitter-storm, and in the swirling claims and counter-claims about Russian influence in one form or another, is another, far more ominous message.

In the United States, indeed anywhere, one form of “reality” is not necessarily what is, but what people believe is so. If people believe that foreign aid comprises twenty percent of federal spending or that public television and radio constitute five percent of the budget, then for them, that is reality, regardless of the facts. Unfortunately for actual reality, a majority of Trump supporters hold such beliefs, despite hard dollar figures to the contrary. So when Trump’s proposed budget calls for massive cuts to federal programs whose total budgets in reality comprise less than five percent of federal spending, Trump’s followers truly believe that he is trying to make a significant cut in federal spending. Meanwhile, those who understand the numbers and the federal budget observe that such cuts cripple worthwhile programs while not really addressing the actual debt and deficit, and their observations are largely ignored or minimized as just politics as usual.

The problem is that Trump has no interest in confronting reality. His interest is, as is the interest of all promoters and snake-oil salesmen, to sell people on his version of reality, or to affirm their illusory version of reality to increase his own power and image. He also understands, as apparently the mainstream media doesn’t, that repetition turns anything into popular “truth.”

This is something that the mainstream media still doesn’t seem able or willing to confront. It’s one thing to argue about what national spending priorities should be. It’s another to put forth a spending plan designed solely to appease and appeal to one’s supporters, as Trump has, while totally ignoring fiscal reality. Unfortunately, even a sizable fraction of the GOP members of Congress seems unwilling to come to grips with this, and that’s understandable because Trump will turn on “defectors” and because a majority of Republicans also want to believe in Trump’s version of reality.

The media should be pointing out, daily, and loudly, that the numbers don’t add up. Have you seen a headline claiming “Trump Budget Based on Lies”? Or: “EPA Head, Oil Industry Cause 5,000 Earthquakes.” Or: “Trump Buys Off McConnell.” All of those are legitimate headlines, but you haven’t seen them, and you likely won’t, because if they show up, Trump will accuse them of being crooked liars, or the equivalent.

There seems to be a media assumption that people will see the truth on their own. Really? In a nation that demands remote controls for its televisions, ten-second sound bites, and news by Twitter, at 140 characters a message? Add to that the fact that most media outlets publish Trump’s proposals without strong critical analysis and worry more about his criticism than about letting the public know what is occurring.

Establishing “truth” by repetition is currently winning… and all of us are losing.

Taking Credit

There are people who accomplish good or great things, and there are those who take credit for those accomplishments. As most intelligent individuals know, often the person who gets credit isn’t the one who actually did the work. Also, sometimes more than a few individuals take credit for something that was never accomplished or completed.

Over the course of my life I’ve certainly seen a lot of such instances. One of the best things – or the worst – about being a writer is that when a book is published you get the credit – or criticism. In my case, either, depending upon your point of view, is warranted, because I personally write every word that’s published, except for the few words corrected by my editor. I have been known to borrow/steal ideas from my wife, but the words are my own.

Not all books, however, are necessarily written by the name on the spine of the book. While the original “Ellery Queen” mysteries were written by Frederic Dannay and Manfred Bennington Lee, more than twenty of the later Ellery Queen novels were ghost-written by others, including SF author Jack Vance.

Likewise, particularly in politics and often in business and academia, credit or blame is often taken by or placed on the wrong people. President Herbert Hoover didn’t cause the Great Depression, nor did Franklin Roosevelt end it [although he did make great efforts and did his best to mitigate its effects until the economic recovery caused by WWII kicked in]. Bill Clinton got credit for the economic recovery actually primed by the first President Bush.

Then there are the people who labor long and hard and slowly build something from virtually nothing, such as Fred Adams, who created the now well-known Utah Shakespeare Festival [which was good enough to win a Tony several years ago as the best regional theatre in the U.S.]. His wife Barbara did half the work, but only those who knew Fred and Barbara know that because Fred was not only a great builder, but a great showman. There’s also a well-known fantasy author whose wife contributed to every book, but whose name only appeared on the last few.

In more than a few cases, those who build an organization, a cause, or a business just aren’t self-promoters, and often, because of that, others take the credit… or the individual never gets it.

More often than not, someone gets credit, deserved or undeserved, because they’re a good self-promoter, and there’s nothing wrong with that in itself – unless the self-promoter steals the credit from someone else. What is equally wrong is when the rest of us reward deceptive self-promoters who steal credit from those who actually deserve it.

National Identity and Anger

A recent poll from The Associated Press-NORC Center for Public Affairs Research showed that seventy percent of Americans felt that the country was “losing its identity.” Unfortunately, what the poll also revealed was that Americans couldn’t agree on what were the important components of that “identity.”

Although there are some points of agreement among Democrats, Republicans and independents about certain aspects of what makes up the country’s identity, such as a fair judicial system and rule of law, the freedoms enshrined in the Constitution, and the ability to get good jobs and achieve the American dream, recent political developments make it clear that the consensus on these points is overshadowed by the differences.

Fifty-seven percent of Republicans thought one of the most important parts of the national identity was a Christian belief structure, as opposed to twenty-nine percent of Democrats. On the other hand, sixty-five percent of Democrats thought that the mixing of global cultures in the U.S. was important, compared to thirty-five percent of Republicans.

According to the poll, seventy-four percent of Democrats say that the ability of immigrants to come to the U.S. to escape violence and persecution is very important, as opposed to fifty-five percent of Republicans. Forty-six percent of Republicans agreed the culture of the country’s early European immigrants was very important, versus twenty-five percent of Democrats.

Putting these findings together suggests that, in general, Republicans think that the national identity should be based on an enshrined Christian faith and the Anglo-centric patriarchal culture of the first immigrants, while Democrats emphasize a more global culture, welcoming to immigrants and more concerned with the present than the past. Obviously, that’s an oversimplification, but there’s still a basic conflict, almost between the past and the present.

That conflict was definitely revealed in the last election, with the Republicans essentially claiming that the country was turning from its white, European, and totally Christian roots, and that such a turn was destroying and/or diminishing not only the United States, but the position of middle-class white American males.

As both the AP-NORC poll and the Women’s March on Washington [with millions of women in hundreds of cities and towns across the country] showed, this Republican “traditional” society is not endorsed by a significant percentage of the country.

Yet the Founding Fathers attempted to hold together thirteen colonies of very different belief structures, some with the [to me] abhorrent idea that slavery was morally acceptable, and they crafted a government based on shared principles that did not require a specific religious belief, or indeed, any belief in a supreme deity at all. For the time, this was an extraordinarily radical enterprise, so radical that the American Revolution equally merits the title of the Anglo-American Civil War.

So why is there so much disagreement about national identity and national priorities?

The election results and the vitriolic rhetoric from the right reflect, among other things, the fact that there are fewer and fewer well-paid unskilled and semi-skilled jobs, and that the jobs already lost to out-sourcing and technology – but mainly to technology – removed some eight million largely white men from the middle class. Those men and their families and relatives look back to a past of more secure and prosperous employment and believe that the country has lost its way… and its traditional identity, and they’re angry.

On the other hand, there are over forty million African Americans in the U.S., and while the Civil War that resulted in their freedom ended over 150 years ago, they still face discrimination and other barriers to rights equal to those of white ethnic groups. After 150 years they’re angry, and getting angrier, especially given the number of young black males killed and incarcerated, particularly when study after study shows discrimination still exists and that blacks receive harsher jail sentences than whites do for the same offense… among other things.

Educated women of all ethnicities are angry that they do not receive even close to equal pay for the same jobs as men and that the male-imposed glass ceilings in business, government, and politics still remain largely unbroken.

Since women and minorities are getting more and more vocal, and since minorities are becoming a bigger and bigger share of the American population, I foresee some very “interesting” years ahead, and I’d suggest that the largely white male Congress consider those facts very carefully.

The “Other” Culture

There are several definitions of “culture.” One is the development of microorganisms in an artificial medium. Another is “the refinement of mind, morals, or taste.” A third is “the cultivation of plants or animals.” But there are two other definitions that tend to get overlooked: (1) the specific period or stage in the development of a civilization and (2) the sum total of the attainments and learned behavior patterns of any specific period or group of people regarded as expressing a way of life. The second of those definitions is the one that tends to get overlooked in government and politics, and yet the problems caused by the “learned behavior patterns” of smaller groups within a society represent one of the principal reasons for societal unrest.

That is largely because quite a few nations, including the United States, are in fact composed of various subcultures. In the U.S., those subcultures, especially those disliked by the majority, are often minimized or denigrated in racial or religious terms. An important point, and one consistently ignored by ideologues, businesses, and particularly politicians, is that “culture,” as exemplified by learned patterns of behavior, trumps “race” or religion. By that I mean that the good or bad traits of a group or subgroup of people have virtually nothing to do with their religion or their skin color or ethnicity. What determines how people act is their “learned patterns of behavior.”

And while religion is definitely a learned behavior, how people of a certain religion act can and does vary enormously from cultural group to cultural group. It also varies over time. Some 500 years ago, good “Christian” countries in Europe were slaughtering each other in a fashion even more brutal than the Sunni and Shia factions of Islam are now doing. Yes, religion is a critical part of “culture,” but it ranges from being the primary determinant of a culture to being merely one of many factors, and in the history of certain civilizations, the impact of a religion can change and has changed the culture drastically.

As I’ve also noted before, likely more than a few times, history is filled with examples of both great and failed societies and nations identified as being predominantly of one race or religion. There have been great empires in all parts of the world – except, so far, Antarctica, and there have been failed societies everywhere in the world, regardless of race or religion.

Certain cultural practices seem to work better than others, one of which is that cultures that allow religion to control society tend to stagnate and become ever more brutal. Cultures with great income inequality tend to be more likely to be oppressive, and a greater percentage seem to have either de jure or de facto polygamy. A good sociologist could likely carry this much farther, but the basic point is that it’s not only morally wrong to claim that a given race or ethnicity or religion is “stupid” or “inferior” (or any other number of pejorative terms), but also such unthinking “type-casting” totally misses the point. Culture – not race, genes, skin color, or religion – determines how people behave. More to the point, one can change a toxic culture [although it takes time] and a beneficial culture is always only a cultural change or two away from becoming toxic.

The Threat from Radical Islamic Terrorists

I’m fed up with the propaganda from the White House about the “overwhelming” danger to U.S. citizens from radical Islamic terrorists. Yes, there are radical Islamic terrorists, but here in the United States, radical Islamic terrorists are far less of a threat than home-grown right-wing terrorists. Overseas, that’s another question, which is why so many law-abiding members of the Islamic faith want to get to the U.S. – or did before Donald Trump became President.

While consensus on hard numbers is difficult to come by, and numbers vary by source, whatever the source, those numbers suggest that radical Islamic terrorists are not the major threat to Americans – not even close. Other Americans are.

In terms of terrorist attacks in the United States, the numbers are lopsided, to say the least. According to a study by the United States Military Academy’s Combating Terrorism Center, domestic right-wing extremists averaged 337 attacks per year in the U.S. in the decade after 9/11, accounting for 254 fatalities, while, depending on the study and the definitions, a total of between 20 and 24 terrorist attacks in the U.S. were carried out by Islamic radicals, with between 50 and 123 fatalities.

In the last several years, the vast majority of police officers killed by extremists have been killed by domestic extremists. An ADL report states that of the forty-five police officers killed by extremists since 2001, ten were killed by left-wing U.S. extremists, thirty-four by right-wing U.S. extremists, and one by a domestic Islamic extremist.

As far as Trump’s proposed travel ban goes, there has not been a single terrorist attack on U.S. soil in the last four decades carried out by citizens of any of the seven countries on Trump’s ban list. Of the 240,000 Americans who have been murdered since the attacks on the Twin Towers in 2001, exactly 123 of those deaths were linked to Muslim-American extremists. In other words, roughly 0.05 percent of the murders in the United States over a sixteen-year period were carried out in the name of radical Islam. Even figures from the right-wing National Review list only 88 deaths in the U.S. from radical Islamic terrorists since 2009.
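That fraction is easy to verify from the totals in the paragraph above; a minimal check in Python, using only the two figures quoted:

```python
# Figures quoted above: total U.S. murders since the 2001 attacks,
# and the deaths linked to Muslim-American extremists in that period.
total_murders = 240_000
extremist_deaths = 123

# Share of all murders, expressed as a percentage.
share = extremist_deaths / total_murders * 100
print(f"{share:.3f}% of murders")  # → 0.051% of murders
```

The exact value is 0.05125 percent, i.e., about one murder in two thousand.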

Yet at the same time that Trump is citing the danger from radical Islamic terrorists, reports have surfaced that he plans to shut down Homeland Security’s watch list for domestic extremists. Not only that, but bowing to the NRA, he decided to void an Executive Order by former President Obama that would have put people declared to be mentally incompetent by a court on a do-not-buy list for firearms. The NRA argued that mentally incompetent people should have the same right to firearms as anyone else.

And we’re worrying about Islamic terrorists?

Education and the Business Model

More and more state legislators across the country are demanding that education be run in a more business-like fashion. While greater efficiency is likely necessary in many educational systems, running higher education “like a business” is not only counter-productive but likely to create more problems than it purports to solve – and business-like approaches so far haven’t shown much success at the university level.

One of the tools employed by both business and educational systems run by the “business model” is to reduce costs by reducing the number of employees and their compensation in relation to “output.” In the business area, this has given us outsourced manufacturing or high-tech automated manufacturing or, on the other hand, in retailing, lots and lots of underpaid, part-time employees without benefits. In education, a similar change is occurring, particularly in higher education, where university faculties have shifted from those primarily comprised of full-time dedicated professors to faculties where the majority of teaching faculty are part-time adjuncts, many of them far less qualified or experienced than seasoned full-time faculty. At the same time, administrations spouting the “business model” mantra have burgeoned.

At virtually all public universities, administrative overhead and full-time administrative positions have increased two- to threefold over the past twenty-plus years, while full-time faculty positions have decreased. The exceptions are some smaller state universities that have expanded so massively that their full-time positions have increased somewhat, even though the percentage of positions that are full-time has decreased to the same level as at other state universities, if not below it.

The chief reason for this emphasis on part-time teaching positions is cost. As I’ve noted before, fifty years ago, on average, state legislatures supplied the bulk of higher education funding. In 1974, states provided 78% of the cost of educating a student. Today, although total funding is actually higher, because almost four times as many students attend college today, the amount of state funding per student averages around 20%, although it varies widely by state, and in some cases it is around 10%.

For example, until almost 1970, California residents could attend the University of California [Berkeley] tuition-free. Today, tuition and fees for in-state students are around $15,000 a year. This trend, if anything, is accelerating. Just since 2008, state funding of higher education has dropped by 20% per student.

The response by legislatures, predictably, is to push for more efficiency. Unhappily that has translated into “get lower costs however you can.” The problem with this is that the emphasis, no matter what administrators say, is to turn out the most graduates at the lowest cost. Universities also tend to phase out departments with small enrollments or high costs, and expand departments with large enrollments and low costs, even if students who major in those areas have difficulty getting jobs.

In addition, political pressure, both to “keep” students in school for budgetary reasons and to graduate a higher percentage of students, has inexorably decreased the academic rigor of the majority of publicly funded universities and colleges. This, in turn, has led to more and more businesses and other employers demanding graduate degrees or other additional qualifications, which further increases the tuition and debt burden on students. That’s scarcely “economic” on a societal basis because it pressures students to aim for high income professions or high income specialties in a profession, rather than for what they’re good at doing and what they love. It’s also created an emphasis on paper credentials, rather than the ability to do a job. On top of that, it’s meant more highly qualified individuals are avoiding professions such as teaching, library science, music, art, government civil service, and others; and those professions, especially teaching, are being filled by a greater percentage of less highly qualified individuals.

The end result of the combination of stingy state legislatures and the “business model” is less rigorous academic standards and watered down curricula at the majority of public colleges and universities, skyrocketing student debt, a smaller and smaller percentage of highly qualified, excellent, and dedicated full-time professors, and a plethora of overpaid administrators, the majority of whom heap even more administrative requirements on full-time teaching faculty.

No efficient business actually operates this way, and why higher education gets away with calling what it’s doing “the business model” has baffled me for more than two decades.

Economics, Politics, Business, and Regulation

To begin with, economics and business are not the same, although they share much of the same terminology. The economics of business centers on business, while the study of economics, at least in theory, encompasses all of society, not just business, even though some business types have trouble comprehending that a nation’s economy consists of more than business, or more than government and business.

And no matter what they claim, most business people really don’t understand economics, or choose not to. Likewise, very few economists really understand business. Politicians, for the most part, understand neither, and most Americans understand even less than the politicians. This is, I submit, one of the fundamental problems facing the U.S. today.

Let’s just look at why in terms of fundamentals. Supposedly, the basis of economics and business rests on the interaction of supply and demand. In general terms, “supply” means the amount of a good sellers are willing to provide at a given price. Demand is what buyers will purchase at a given price. In a relatively free market [there are no totally free markets, and never can be, a point too many business types fail to acknowledge publicly], the going price of a good or service is set when supply and demand meet. If there is greater demand or a lesser supply, usually prices rise. If demand falls, or supply increases significantly, prices usually fall, again in a relatively free economy.
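The price-setting mechanism described above can be made concrete with a toy model; this is a minimal sketch with hypothetical linear curves, in which every coefficient is invented purely for illustration:

```python
# A toy model of a market-clearing price. All curves and numbers
# here are invented for illustration, not drawn from any real market.

def demand(price):
    # Buyers purchase less of the good as the price rises.
    return max(0.0, 100 - 2 * price)

def supply(price):
    # Sellers offer more of the good as the price rises.
    return max(0.0, 10 + 4 * price)

# The going price is set where supply and demand meet:
# 100 - 2p = 10 + 4p  ->  6p = 90  ->  p = 15.
assert demand(15.0) == supply(15.0) == 70.0

# Greater demand (the whole demand curve shifts up) raises the
# clearing price, just as the text describes:
def shifted_demand(price):
    return max(0.0, 130 - 2 * price)

# 130 - 2p = 10 + 4p  ->  6p = 120  ->  p = 20, a higher price.
assert shifted_demand(20.0) == supply(20.0) == 90.0
```

The same sketch run in reverse – shifting the demand curve down or the supply curve up – shows prices falling, again only in a relatively free market.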

Of course, no economy is completely free because to have a working economy requires a working society, and human beings have yet to create a working society without government, and government, for various reasons, always imposes restrictions on the market. Some of those restrictions, given human nature, are necessary. Why? Because of the intersection of the way business operates and human nature.

As some have pointed out, the price of a good or service is not necessarily its cost plus remuneration to the supplier, but over time, price has to consist, at the least, of the amount necessary to cover costs of production plus enough above that to keep the supplier or business going. But the devil is in the details, and one of those details is how one defines “costs of production.”

There are all sorts of costs – fixed costs, marginal costs, operating costs, external diseconomies [otherwise known as negative externalities], etc. The cost that matters most to a business is whatever costs the business is required to pay by both the demands of the marketplace (i.e., supply and demand) and the government. If a business has to pay taxes, that’s a cost imposed by government. So are wage, benefit, safety, and environmental standards.

So… by what right, in a supposedly free market economy, is government imposing those costs on business?

The reason for government action is that (1) the marketplace doesn’t include all the costs of production, and (2) a totally “free” marketplace creates wage levels and working conditions that virtually all western governments have declared unacceptable; therefore, governments have set minimum standards for wages, safety, and worker health conditions.

In addition, some of those government taxes provide for the highways and airways on which business goods are transported; for the national defense, which protects business and everyone else from enemies coming in and seizing businesses and properties, and which allows U.S. businesses to conduct operations elsewhere in the world; for the regulation and continuance of a stable banking system; for public safety; and so forth – all of which make the operation of businesses possible.

One of the reasons that, years ago, the Cuyahoga River next to the Republic steel mill in Cleveland caught fire was that the marketplace cost, and thus the price of a good, didn’t include costs passed on to others in society in the form of polluted air or water; thus, any manufacturer who did restrict the emissions of pollutants incurred higher costs compared to producers who didn’t. Consequently, marketplace “discipline” effectively encouraged pollution, or at the very least, certainly didn’t discourage it. Costs inflicted on others are usually termed negative externalities [the older term is external diseconomies], but such terms tend to gloss over the fact that pollution and other environmental degradation caused by manufacturing is not reflected in the cost of production unless government requires it to be.

So, when a manufacturer claims that environmental or worker safety regulations are stifling the economy, what that manufacturer really is saying is that he or she can’t compete with manufacturers in other countries that have fewer environmental regulations, and thus, often lower costs of production… and when that manufacturer demands less regulation, it is a demand to allow more pollution so that the manufacturer can make more money – or even stay in business.

Balancing economic output and worker and environmental health and safety is a trade-off. Although some regulations have been ill-thought-out, in general, stricter regulations result in a better environment for both workers and society, but if the rest of the world has lower levels, those U.S. industries competing in a global market will suffer higher costs, unless they have other cost advantages, such as better technology or far more productive workers. Because environmental control technology is expensive, most industries tend to oppose regulations requiring more technology.

In certain industries, workers, such as coal miners, often oppose environmental rules because those rules raise costs, and higher costs may result in the loss of their jobs. The question in such cases is whether continuing such jobs is worth the environmental and health damage, both to workers and to others. The Trump administration is working to remove an Obama administration rule that put stricter limits on how close to watercourses coal mining and chemical wastes could be placed, claiming that the rule will cost jobs, which it likely would to some degree. But the rule would also cut the number of coal and chemical industrial storage and waste disposal sites near rivers and streams in an effort to eliminate slurry and waste accidents such as the one along the Elk River in West Virginia in 2014 that fouled miles of streams and rivers, poisoned hundreds of people who drank the water unknowingly, and left more than 300,000 people without drinkable water for months.

History has shown convincingly, for all who are willing to look at the facts – actual deaths, poisonings, and worse – that, without government regulation, a significant proportion of the manufacturers in an industry, sometimes all of them, will commit unspeakable wrongs in the search to maximize profit. Remember when the Ford Motor Company tried to cover up the faulty design of the gas tank in the Ford Pinto? Ford decided that it was cheaper to pay legal costs for deaths [which it estimated at $49 million] than to produce a safer gas tank, which would have cost $113 million. Having decided against the fix on a cost-benefit basis, Ford ended up paying out far more in legal settlements, in addition to a costly recall.

This kind of business cost-benefit analysis continues today, and that’s why the “business model” can’t be allowed to operate without oversight and regulation. The question is not whether to regulate, but how much regulation is appropriate in which circumstances. Or, put another way, is your business more important than my health? To which business owners reply: an increase in regulation will kill my business and probably won’t measurably improve your health. Both sides are likely exaggerating, and that’s why verifiable facts – scientific, financial, and economic – are critical, and why political slogans and political pressure from outside interests have no place in determining whether a regulation is necessary, and, if so, the degree of regulation required.

Bookstore Idiocy

Last weekend, I attended a fantasy and science fiction literary symposium in northern Utah, called LTUE (or, after a noteworthy writer, Life, The Universe, and Everything). As more of a literary symposium than a standard convention, LTUE attracts a great number of writers and editors, and an even greater number of would-be or beginning F&SF writers. Over the years, the guests of honor have included best-selling authors, F&SF publishers, and noted editors in the field (and, yes, I’ve been a GOH twice).

One of the highlight events of LTUE is a “mass signing” by all the attending authors on Friday night, facilitated by a book-selling venue set up in the same enormous room. That way, attendees who realize at a panel or discussion that they really want to try an author’s work can run over, buy a book, and have the author sign it.

For over twenty years, the Barnes & Noble in Orem operated this on-site temporary book-selling venue, and, from what I’ve observed in the years I’ve attended, they seemed to do very well indeed. I know that my books have always sold moderately well at LTUE, and often the works of bigger name authors sold in the hundreds of copies over three days.

This year, however, the B&N store was unable to continue this activity, not because it didn’t sell books and make money, but because B&N recently adopted a chain-wide policy that banned “satellite events.”

To me, such a blanket policy makes no sense. I could understand a policy declaring that satellite events must cover their costs, or come close, but a blanket ban? This reeks of bean-counting. The business of a bookstore is, at least ostensibly, to sell books. If LTUE gets readers to try authors new to them, at least a proportion of those readers will buy more books by those authors. That increases sales, and since B&N is the only large bookstore chain in Utah, at least some of those sales will come from B&N. What’s not to like about making a bit of money at the symposium and increasing overall sales?

Tom Doherty, the publisher of Tor, came up through the sales ranks, and he’s told me more than a few times about the role the small mall stores – Waldenbooks and B. Dalton, now both defunct – played in developing readers, because they were convenient places for people to pick up books, not destination stores like the current B&N megastores or the vanished Borders stores. Now, most of those convenient places are gone, whether the rack in the drugstore, the small mall bookstore, or the like, and the bookselling venues that are left are stocked by computer based on national sales figures that often have little to do with the community where the sales outlet is located. Along the same line, the ability of B&N store managers to customize their inventory has been reduced, if not eliminated.

I know B&N is having financial problems, but focusing on almost mindless cost-cutting when the effect of that cost-cutting is to reduce sales is counter-productive. Success is measured by increasing sales in a cost-effective manner, not by cutting costs and doing less. You don’t turn around a financial downturn just by cutting costs; you also have to increase sales, and a blanket ban on satellite events cuts into sales. It also leaves symposium regulars and organizers – all of them heavy readers – with a bad taste in their mouths where B&N is concerned. Does this really make sense, economically or in PR terms?

By the way, a small book vendor did step up at the last moment and set up an on-site book store, and she certainly sold a number of my books, as well as those of quite a few other authors.

Political Appeal and Innumeracy

U.S. federal spending in 2016 was roughly $4 trillion, and revenues were slightly over $3.4 trillion, leaving a deficit of around $600 billion. Out of total spending, $2.6 trillion was mandatory spending on programs such as Social Security, Medicare, and Medicaid. Spending on these programs cannot be cut without major changes in federal law, and since 77% of all Americans oppose such cuts, it’s highly unlikely that major cuts will occur any time soon. Then add to that some $260 billion in mandatory payments on the federal debt, and essentially 72% of federal spending cannot be effectively cut, at least at present. That leaves $1.1 trillion in discretionary spending, that is, spending that can be increased or decreased by Congress.
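For anyone who wants to check that arithmetic, here it is as a quick Python sketch, using only the rounded FY2016 figures cited above:

```python
# Back-of-the-envelope check of the budget figures above
# (billions of dollars, FY2016, rounded as in the text).
spending = 4_000           # total federal spending
mandatory = 2_600          # Social Security, Medicare, Medicaid, etc.
interest = 260             # mandatory interest payments on the debt

locked_in = mandatory + interest       # spending that can't readily be cut
share = locked_in / spending
discretionary = spending - locked_in

print(f"{share:.1%} locked in, ${discretionary} billion discretionary")
# -> 71.5% locked in, $1140 billion discretionary
```

Which matches the “essentially 72%” and “$1.1 trillion” figures in the paragraph.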

Unhappily, the vast majority of Americans have no real understanding of even these basic numbers, especially Fox News viewers, 49% of whom declared in a recent poll that cutting “waste and fraud” would eliminate “the national debt” [which now stands at $14.4 trillion]. A number of polls over the years have shown that most Americans believe that 25% of the federal budget goes to foreign aid [it’s less than one percent] and that five percent of all federal spending goes to PBS and NPR [in fact, roughly a tenth of one percent does].

The real numbers are more daunting. The largest component of discretionary spending is defense, and while the DOD “official” budget is slightly under $600 billion, various contingency funds and defense activities funded in other forms and through other agencies [for example, the Coast Guard is funded through the Department of Homeland Security rather than DOD] bring the total annual cost of U.S. defense much higher – as high as $900 billion, according to some sources. But even assuming $600 billion for defense, that leaves $500 billion for everything else, including agriculture, energy, education, transportation, federal lands management, national parks, environmental protection, veterans benefits, welfare payments, and a whole lot more.

Trump’s proposed tax cut would reduce federal revenues by $500 billion, according to the Tax Foundation, on top of that $600 billion deficit. So even if he could persuade Congress to cut non-defense discretionary spending by 50% – in essence gutting most federal agencies – the deficit would increase to nearly $900 billion. And that doesn’t count the additional infrastructure spending he’s proposed, which initial estimates suggest would range from $500 billion to over a trillion dollars over ten years, or $50 billion to $100 billion a year.
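The same back-of-the-envelope approach shows where the “nearly $900 billion” comes from; the 50% cut is the hypothetical from the paragraph above:

```python
# Rough deficit projection using the rounded figures cited above
# (billions of dollars; the 50% cut is hypothetical).
revenues = 3_400
spending = 4_000
deficit = spending - revenues          # ~$600 billion baseline

tax_cut = 500                          # Tax Foundation revenue-loss estimate
non_defense_discretionary = 500
savings = non_defense_discretionary * 0.50   # a 50% across-the-board cut

new_deficit = deficit + tax_cut - savings
print(new_deficit)                     # 850.0 -- i.e., "nearly $900 billion"
```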

Proponents of the Trump plans claim that all the new investment and jobs will increase tax revenues, and some probably will, but not anywhere close to enough to deal with the federal deficit that increases the national debt – and the interest that must be paid on it – each year.

Based on a 2014 study by Standard & Poor’s, if Congress were to pass a $50 billion a year infrastructure bill, that legislation would create an additional 1.1 million jobs. Construction workers make an average of around $35,000 a year, and, under the best estimate of the Trump tax plan, those million workers would pay around $4,000 in federal income taxes each, thus adding up to an additional $4.5 billion. Economists like to point to the multiplier effect, i.e., how many additional jobs are created by one new job. According to the IMF, under present conditions, the multiplier effect is hovering around one, one additional job created somewhere in the economy for each new job created by investment. So… fifty billion dollars of infrastructure investment might create somewhere over two million jobs and possibly add $10 billion in tax revenues while costing $50 billion. Even if the multiplier effect is five times as much as the IMF says, the infrastructure proposal is at best a break-even proposition, and, as such, might be a good idea. BUT… it won’t do much for reducing the current deficit, let alone the increase in the deficit that will be occurring as a result of more federal spending on defense, and the likely coming increase in interest rates.
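The jobs-and-revenue arithmetic above can be sketched the same way; the multiplier and per-worker tax figures are the rough estimates cited in the paragraph, not precise forecasts:

```python
# Sketch of the infrastructure jobs/revenue arithmetic above
# (the post's own rough estimates; purely illustrative).
investment = 50e9            # $50 billion a year of infrastructure spending
direct_jobs = 1.1e6          # S&P estimate of jobs created
multiplier = 1.0             # IMF: roughly one indirect job per direct job
total_jobs = direct_jobs * (1 + multiplier)    # ~2.2 million jobs

taxes_per_worker = 4_000     # rough federal income tax per new worker
new_revenue = total_jobs * taxes_per_worker    # ~$8.8 billion

print(f"jobs: {total_jobs:,.0f}; revenue: ${new_revenue / 1e9:.1f}B "
      f"against ${investment / 1e9:.0f}B of spending")
```

Even rounding generously toward the post’s “$10 billion,” the revenue gain is a fraction of the $50 billion outlay.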

The other bottleneck in increasing jobs is the mismatch between available workers and the available jobs. According to research from human resources consultancy Randstad Sourceright, a survey of more than 400 U.S. executives found a skills gap impacting their businesses. Four-fifths of those executives said that a shortage of sufficiently skilled workers will affect their companies in the next 12 months. Complaints of hard-to-fill factory jobs are backed up by Bureau of Labor Statistics data: 324,000 manufacturing spots were open in November, up from 238,000 a year earlier.

Another problem that the Trump approach doesn’t address is that job creation isn’t equal. Right now, employees of high-tech companies receive almost 12% of all employee compensation, but there are only seven million of them, and the average salary is close to $105,000 – more than double the salary of the average industrial or manufacturing employee, or triple that of a construction worker. In addition, the tech industries are adding only about 200,000 employees a year. That doesn’t do much for the nearly 15 million unemployed or underemployed Americans, or for the roughly three million college graduates each year. The largest numbers of jobs are in the lower-paid service industries, yet most of the investment money putatively freed up by the tax cuts will be going to tech-heavy companies, whose jobs comprise less than 5% of total U.S. employment.

Massive tax cuts, more defense spending, a major infrastructure initiative… all to be paid for by new jobs and cuts in such federal programs as PBS, NPR, the Endowments for the Arts and Humanities, foreign aid, and the like? The numbers don’t add up, even if the political appeal does, perhaps because most Americans don’t seem to understand the numbers, or care to.

The Education/Business Fallacy

Recently, a semi-prominent president of an educational institution told a group of music professors that they shouldn’t complain about being paid less than professors in other disciplines, or about being required by the institution to work longer hours and more days than most other professors, because they “knew what they were getting into.” Beyond the arrogance of the statement, I found the sheer ignorance behind those words even more disturbing.

First off, when the vast majority of students on the collegiate or graduate level begin their academic preparation for their careers – whatever those careers may be – they have only the vaguest understanding of the scope of that career or of the demands it will make on them. Those only become truly apparent AFTER students graduate and move into the professional fields. That’s one of the reasons why something like 50% of all teachers drop out of teaching within five years. It’s why professionals change careers or leave them behind totally.

Second, this kind of attitude is typical of those who regard education with a “business” mindset. It contributes to the push to enroll as many students as possible, regardless of whether those students are ready or suited for college, and to the huge push to “steer” students toward “STEM” education and careers, as if students were organic robots that could simply be programmed toward the most lucrative careers, or at least toward those that will allow them to repay their often-massive student loans. As both the parent of a number of children who have been successful in various fields and careers and a former faculty member on the collegiate level, I find the idea that students can be successfully “programmed” for specific careers, or even for careers in a general field, totally ludicrous. People have different levels of ability in different fields and different mind-sets.

For someone to have suggested that I might have a career in music because pop music stars make lots of money would have been both criminal and deceptive, given that I can’t carry a tune and have no sense of rhythm. In turn, to suggest that a good music student who can barely pass basic chemistry or physics, and for whom calculus is akin to magic, would be better served by going into a science or technology career would also be criminal and deceptive.

Third, the emphasis on college as vocational training, particularly on the undergraduate level, ignores reality. Even today, most college-educated individuals change jobs and often entire career paths seven to ten times in their professional lives. Those who make those transitions most successfully are those who have learned how to keep learning. Even those who remain in the same field have found that the requirements of their positions continue to change as technology advances.

Fourth, available jobs and job requirements are constantly changing as the result of shifting economic factors and technological advancement, and “guiding” students to the current “jobs du jour” may serve those not strongly motivated to enter that field poorly indeed.

Fifth, while employment “supply and demand” does in fact determine compensation levels, those levels have less and less to do with the skills needed by society. At least at present, scarce skills, even those that aren’t all that necessary to the functioning of society, are more highly valued than many necessary occupations and services. No matter what the financial types say, we need very few hedge fund managers for a successful civilization. We need far more of the practical and mundane skills, from electricians and plumbers to good classroom teachers and more doctors in general practice – but fewer and fewer doctors want to go into internal medicine or general practice, because those fields usually pay half what specialized medical fields do and require longer hours, making it far harder to pay off medical school loans.

Finally, what drives personal success in any field is the love of what one is doing combined with the education and capability to do the job at hand. “Training” a student for a theoretically more remunerative field that disregards the student’s abilities and interests serves neither the society’s interests nor the student’s. It’s a sad commentary on higher education when a university president suggests that because economics lowers the comparative compensation of professionals in certain disciplines and because the university takes advantage of that to the point of requiring more of those individuals, it’s all the fault of those professionals because they “chose” to pursue the field in which their talents lie.

This administrative mindset is also why more and more universities hire fewer and fewer expert and dedicated full-time professionals and more and more underpaid part-time adjuncts: the quality of instruction has become less and less important than the push to lower “people” costs – or at least the people costs associated with actual learning, as opposed to those associated with collegiate athletics.

Decline of Fictional Uniqueness?

As some of my readers know, these days I binge-read fiction on business trips or other travels, and, for the most part, I make an effort to search out books and authors I haven’t read, as well as books that deal with what I’d call interesting subjects or more familiar subjects addressed in a unique fashion.

The problem is, at least for me, that, beneath the veneer of “new and different” claimed by publishers and authors, I’m finding that there really isn’t all that much truly new and different. Oh, there are definitely books that deal with “new and different,” but not nearly so many as the publishing hype might suggest. Perhaps that’s always been the case, and perhaps when an author gets older, and has read as many books in the field as I have, it’s just harder to find something that’s truly different.

But I’m not so certain about that. Tolkien re-invented heroic fantasy with The Lord of the Rings, and I can’t even count the number of follow-ons and knock-offs. As far as I can determine, Fred Saberhagen re-invented the vampire genre with The Dracula Tape in 1975 [Anne Rice’s Interview with the Vampire wasn’t published until May of 1976], although one could also claim that Richard Matheson’s I Am Legend [1954] was the first of the true twentieth-century vampire “re-births,” but Matheson’s blood-suckers were more “generic.” Saberhagen also pioneered the whole idea of malevolent, non-gendered cyber beings with his berserker stories, something that tends to get overlooked in all the hoopla about Ann Leckie’s Ancillary Justice and its sequel.

Certainly James Tiptree, Jr. [Alice Bradley Sheldon], Joanna Russ, Sheri Tepper, and Ursula K. Le Guin were questioning gender roles and societal norms some thirty years ago, and even in 1987 Melissa Scott wrote The Kindly Ones, a masterful work in which it is impossible to determine with any certainty the gender of the protagonist.

The Wheel of Time and Game of Thrones are essentially huge-scale epic fantasies, with a few twists, that, in my mind, at least, fall into the post-Tolkien follow-on school.

Now, as I’ve noted in some of my comments on what I’ve read, there are still books with unique twists on old themes and some few with new themes, and I’m still looking, but it just could be that, as I’m getting older, it’s just harder to surprise me.

What do you think… and what books have struck you as unique… and why?

Egocentric Facts and “Morality”

Donald Trump’s initial reaction to the questions raised by federal appellate judges about his Executive Order establishing a travel ban clearly establishes his viewpoint – again. Anything he believes is right is indeed right, and it doesn’t matter what judges, history, or the Constitution say, because he is right. Even after the Ninth Circuit Court of Appeals upheld the stay on the travel ban, Trump insisted that the Court was wrong and that the Supreme Court will see it his way.

Since the Ninth Circuit merely ruled on the issue of not allowing the ban to take effect until it is fully reviewed by the judicial system, it’s certainly possible that some version of the ban will be approved. In time, in fact, that’s very likely to occur, but it most likely won’t be the ban that Trump initially proposed.

The ban issue is also merely one facet of an unfortunately larger issue. The man who outsourced the production of all of the consumer products bearing his name (while verbally championing U.S. production) is “right.” The man who stiffed scores of contractors is “right.” The man who insisted for years that President Obama was not a U.S. citizen is “right.” The man who promised a clean sweep of corruption and business as usual in Washington, and who started his administration by appointing the wealthiest and most “business as usual” types as his cabinet picks, is “right.”

This is a man who refuses to accept proven and verifiable facts that contradict him and who attacks personally the people who cite such facts to oppose him.

I’m not sure which appalls me more, the fact that Trump is so arrogantly sure about what is clearly not so, while being blatantly hypocritical, or the fact that some 48% of U.S. citizens apparently believe him, and more than 55% approve of the travel ban.

We truly live in a polarized country, so polarized that what is accepted as fact depends more on ideological beliefs than concrete and provable evidence. Polls show fairly clearly that more and more people are rejecting provable facts that don’t agree with what they wish to believe, and Trump is not only playing to this weakness but doing so in a way that attempts to destroy the credibility of anyone and any institution that disagrees with him… and his supporters and 90% of Republicans are lapping it up, according to a recent poll by Emerson College.

This sort of attack on the media isn’t new. A then-little-known German politician started the same way in the late 1920s, with blistering attacks on those who opposed him, with deceptive statements and outright falsifications, and by the early 1930s had complete control of Germany.

In 1935, the novelist Sinclair Lewis wrote a novel entitled It Can’t Happen Here about a U.S. politician taking power in the same way. But it can happen here, particularly if Trump and his supporters are allowed to flout the laws and tell blatant falsehoods without being challenged. All it takes is 51% of the voters to vote for such behavior on a continuing basis.

Political disagreements are endemic and necessary in our system of government, but vicious personal attacks by the President and his staff, blatant lies and falsehoods, and, in particular, personal attacks on other branches of government that disagree with the President are neither necessary nor desirable. Nor are attacks on a free press anything but a disservice to us all.

Simplistic “Solutions”

President Trump has unleashed his pen and set forth something like twenty Executive Orders, in an apparent effort to carry out a number of his campaign promises. What is obvious about this rush of rash action is that neither Trump nor his advisors have thought through the implications and ramifications of those orders, nor the legal requirements under the Constitution.

One of the basic rights under the Constitution is the right to fair treatment under law, and a keystone of that is the right to due process of law. Certainly, the travel ban doesn’t seem to comply with the Fifth Amendment of the Constitution, which states that “No person shall … be deprived of life, liberty, or property, without due process of law…” Procedural due process requires that government officials follow fair procedures before depriving a person of life, liberty, or property, and those procedures minimally require the government to afford the person notice and an opportunity to be heard.

That apparent failure was one of the legal bases for the various lawsuits to stay or lift the travel ban.

Beyond the legal issues are the practical issues. Forty university presidents signed and sent a letter to Trump protesting the ban, noting that they had students, professors, and university employees scattered across the globe, that many were being summarily detained or denied a return to the United States, and that the travel ban would have an adverse effect on those universities and individuals. What seems to be overlooked is that the U.S. hosts over a million international students, a great number of whom come from countries where Islam is the prevailing religion.

In addition, businesses and non-profit organizations with international activities would also be affected in a similar fashion, and the “roll-out” imposed significant costs and disruptions upon the airlines as well – all without a significant impact on terrorism.

Like it or not, we live in a high-tech, complex global economy, and simplistic, or “simple,” solutions are seldom suited to resolving problems, especially when they’re thrust without notice or warning on unsuspecting travelers, businesses, and, especially, the government officials who are supposed to implement them.

Yes, we’ve had some terrorist acts in the United States, but we’ve likely had more deaths recently caused by driving or walking while texting than terrorist killings. We’ve certainly had more deaths caused by good U.S. citizens killing each other or themselves with firearms, or in vehicle accidents, and I don’t see any Executive Orders banning texting, drunken driving, or detaining anyone carrying a firearm. But our good President can certainly whip out an Executive Order banning anyone from seven countries from entering the United States on the grounds that a handful might be terrorists.

Yes, we likely do need careful vetting of immigrants, but that’s been going on all along. For the past several years, under present security procedures, the number and percentage of Islamist-inspired terrorist acts in the U.S. has been quite low, and some of those acts were carried out by people who were raised or born here and who would not have been stopped by the travel ban. We’ve also had some nasty native-born terrorists over the years, such as Timothy McVeigh and Ted Kaczynski, and the senseless killing of six-year-olds with a semiautomatic rifle at Sandy Hook elementary school – but those didn’t seem to require Executive Orders to address.

Equally important, a slap-dash ban will only increase the incentive for that minute fraction of Islamic believers who are terrorists to radicalize more people. That’s a far greater danger than the one posed by refugees and immigrants, and also an example of the damage hasty and ill-thought-out campaign promises can create when dashed off as Executive Orders.

The Right to Impose?

With Trump’s nomination of Neil Gorsuch to the Supreme Court, liberals are fuming, and conservatives are rejoicing. Both ought to be weeping.

The “battle” over the appointment of a justice to succeed the late Antonin Scalia hasn’t been a battle over law, or justice, but a fight over who can impose what on whom. And it’s a fight we shouldn’t be having, one that the Founding Fathers very much tried to avoid, both in the structure of our government and in the clause that was designed to separate church and state.

That clause was included in the Bill of Rights specifically because European history of the previous centuries had essentially been a series of “religious wars” fought to determine who could impose whose belief system on whom.

The conservative religious right in the United States very much wants secular law to embody their religious beliefs, and, where possible, they’ve attempted to accomplish just that. The ultra-liberal left tends to want to impose what one might call mandated equality of outcomes, as opposed to true equality of opportunity.

The right doesn’t really want true equality of opportunity because it would destroy the world they know by getting rid of legacy admissions to Ivy League universities, limiting preferential education and opportunity based on familial resources, removing female deference to men and acknowledging that women do not have to be brood mares, eliminating male gender superiority in virtually all economic and political structures, and by requiring an acceptance of all individuals based on character, ability, and accomplishment.

The left doesn’t really want true equality of opportunity because it would reveal that, regardless of anything else, individuals have different capabilities; that certain cultures and cultural practices are in fact toxic, that certain other cultures and cultural practices do in fact achieve better results, that effort without competence and ability is meaningless, and that all the government programs in the world cannot elevate those unwilling to make the effort…among other things.

And both sides tend to be resolute in their view that compromise is unacceptable, even while decrying the same sort of unyielding religious warfare that is taking place in the Middle East.

As I’ve written before, justice is an ideal, an ideal that can never be reached, but one that we should aspire to, nonetheless, while law is an imperfect tool, albeit one of the best we have, in an effort to achieve justice… but it is not the only tool. Without understanding, compassion, and compromise, law becomes a tyrant. And right now both sides want absolute control of that tool, rather than seeking a way to keep it from imposing a tyranny on the non-believers, i.e., the other side.

“Do You Hear What I Hear?”

During a discussion with a friend who spent a career on the production side of popular music, ending up as the head of a fairly well-known record label, the question came up of what actually constitutes a “standard” in popular music, that is, a song recorded and/or covered by a number of well-known performers. Our friend, the former music industry executive, immediately pointed out that each generation has its “standards,” to which I rejoined that not all “standards” are necessarily equal, because there can be great differences in the technical skill of the composers and performers behind one generation’s standards and another’s, and because there is also a great difference between popularity and artistic excellence.

That brings up several questions. First, how can a listener tell the difference? Second, does it really matter? And third, assuming that one can tell the difference, why does it matter?

To begin with, most listeners can’t really tell the difference, because telling the difference requires having listened to a broad range of music, and the majority of listeners stay within a narrow comfort range, both in the type of music and in the time and style in which it was played or recorded. In addition, for someone who doesn’t know something about the technical side of both instrumental and vocal music production, the distinctions rest merely on likeability or familiarity. That’s fine from a personal point of view, but it means that such a person really can’t see how music has changed.

Does being able to see the changes and what they indicate really matter? Again, on the level of whether one enjoys the current “standards,” it doesn’t. On a cultural and societal level, I’d submit that it does. When complex melodic patterns are replaced on a wide scale by short melodic repetitions, when repetitive rhythms and percussive effects overshadow melody and meaning, when lyrics become increasingly crude and simplistic in popular music, those all reflect a considerable societal change. But anyone who hasn’t listened to popular music spanning decades, or hasn’t studied it, won’t even see the change, much less consider the implications.

Popular music is symptomatic of culture, and the issue goes well beyond music. The same issues apply to popular fiction, what art is popular, what movies and television shows earn the most or have the highest audience ratings, and even what theatre is most popular – or what entertainment form is dominant.

The majority of those immersed in a society/culture really don’t see, let alone question, what such changes mean… and what they foretell. Part of that is that most members of any culture don’t understand their own history, let alone the broader path of past history.

In the early days of Rome, gladiatorial contests were rare, and semi-religious. Chariot racing was small-time. By the time of the empire, particularly after Augustus, both had become popular blood sports. A century ago, football in the United States was a collegiate sport, and limited to comparatively few colleges at that, and baseball was the national sport. There was auto racing, but it was the habit of a few, generally wealthy, individuals.

Now, football has become the national blood sport; basketball has gone from being a generally non-contact sport to a contact sport, and NASCAR is a multi-billion dollar business. And, oh, yes, the most popular music is incredibly simplistic and linguistically almost unintelligible (while sounding pretty much all alike), and a greater percentage of movies now incorporate more and more sex and violence.

Do you see what I see?

The Importance of Place

No, I’m not going to pontificate about where people of privilege live and how that location benefits them, true as it is. Rather, I’m going to point out how the patterns of how and where Americans live influence (some might say bias) the entire political system of the United States.

By now, most people who follow U.S. politics know that Hillary Clinton won the popular vote by almost three million votes, yet lost the Electoral College by a wide margin – termed “a landslide” by Trump. When the Founding Fathers created the Electoral College, the reason was very simple. They didn’t want Presidential elections decided by the votes in Virginia and Pennsylvania, at least not exclusively by those two states.

What people tend to overlook about the Electoral College is that it reflects a mash-up of the make-up of both the U.S. House of Representatives and the Senate, that is, the total number of votes represents the total number of representatives and senators. While states with greater populations have a greater number of representatives, each state has two senators. Thus, right off the bat, rural and sparsely populated states have an advantage.

The second problem is that, when the Founding Fathers set up the Electoral College, the United States was essentially ninety percent rural. This meant that, from the beginning, state legislatures were dominated by rural interests. While that influence has continually diminished, on the state level, in almost every state, rural lawmakers have an outsized vote. More important, state legislatures, in all but two states, as I recall, control the reapportionment of congressional districts every ten years, and those states with rural populations tend to redistrict with an eye to maintaining the dominance of rural interests.

What has happened in Utah provides a good example of this. When I was working some forty years ago, Utah had two representatives, and one was a Democrat and one a Republican. This wasn’t a one-time thing. It continued for at least a decade, except… when Utah got more people and another representative, the legislature made sure two of the three seats were Republican. At that time Utah periodically elected Democratic governors. For the past twenty-five years, there haven’t been any. That’s largely because of redistricting. And now, all the representatives and senators are Republicans, despite the fact that Salt Lake City has a lesbian Democratic mayor. This just might have something to do with the 2011 redistricting that split up Salt Lake City so that pieces of that Democratic bastion were included in districts where Democratic voters were outnumbered by Republicans.

Under current law, this is perfectly legal, but that “legality” overlooks two facts, one demographic and the other political.

The demographic factor is that poorer voters, for the most part, tend to end up in high population density areas out of economic necessity. This makes shenanigans like re-districting them to minimize their impact much easier, and once that happens, their political power is reduced.

The political factor is that it’s not only expensive to run for political office, but it also requires name recognition, and our current President is a very good example of this. The only practical way for a non-wealthy candidate to gain political office is to work his or her way up the ladder, from city council to state representative to state senator, then U.S. Representative. If you’re in the minority, current redistricting practices make this difficult, and, as in the case of Utah at present, pretty near a practical impossibility. Add to this the fact that people working near the minimum wage level, who tend to lean Democratic, usually have fewer financial resources and less time to devote to politics.

California is an example of more successful Democratic redistricting, but I’d submit that it only worked there because of the growing wealth of the “newer” entertainment industry, which tends to be more liberal. Without that wealth, the state would likely have remained as it was in the time of Ronald Reagan, and the Democrats in most states can’t muster that kind of financial support.

So…in a different way… place matters more than is usually considered.

The March on Washington

Last Saturday, in Washington, D.C., well over 300,000 women, possibly as many as half a million, demonstrated in support of women’s rights and against the Trump administration’s positions on those rights, as did hundreds of thousands in scores of cities and towns across the nation – and even in a number of foreign cities. High profile women, primarily in politics and entertainment, spoke and exhorted women and men alike to press to retain and expand women’s rights.

Meanwhile, in thousands of suburbs and small towns across the country, life continued apace, with almost no mention or recognition of the march, except through national media outlets and media sources based primarily in large urban centers. Trump and his administration largely ignored the protestors, except to complain that they were snarling traffic and making it difficult for him and his staff to get to a scheduled meeting at CIA headquarters.

What all those demonstrators may not realize is that their millions still comprise only a few percent of the American people, and it takes more than that to effect political change, or even to keep Trump from rolling back past changes.

The bottom line? Demonstrations, whether by women, blacks, or other minorities, mean nothing to this president and this administration. By themselves they will change nothing. Trump respects no one and no thing except his own ego and propositions, and nothing will change his mind or his actions except some form of power, whether that power be electoral, legal, regulatory, economic, or military.

This is going to be the reality of the political system so long as Trump is president. Those who disagree will either be ignored or attacked. Civility will avail an opponent of Trump nothing but contempt, and Trump will attempt to meet power with power and crush it. Oh, there will be times of civility and charm, but only when it suits Trump.

I honestly don’t think most current politicians, whether they’re Republicans or Democrats, have come in contact with anyone like Trump, and they’re in for more of a shock than they realize.

As for the Democrats, for the past decade and a half, if not longer, they’ve relied far too heavily on media to make their case and push their policies. They’ve neglected building grass-roots and local and state-level political structures in far too many parts of the country, and they’ve thought that demonstrations would pave the way for change – and, for the most part, that hasn’t happened… and it won’t. Right now, such demonstrations will either be ignored or create a backlash of greater polarization. And, if the Democrats don’t get back to basics and hard political, legal, social, and economic grunt-work, they’re going to continue to get steam-rollered.

How Could You Possibly Miss…?

… or what about a little perspective?

The other day I was reading a post that listed books in which weather magic was central to the plot – and, yes, The Towers of the Sunset was mentioned. When I came to the comments, all of which either seconded a book on the list or suggested another, one particular comment struck me, because over the past ten years, I’ve seen a form of this comment time and time again. The comment poster wrote, “How the list could be drawn up without [XXXXX] defies belief.” I’ve left out the name of the book because the title is irrelevant to what follows.

Once upon a time, when I was a struggling poet and had a day job, and even before that, I was a voracious reader, largely of science fiction, because that was in the days before much fantasy was published (and part of that time was even before The Lord of the Rings). I read a lot, sometimes close to three hundred books a year, and I’d accumulated a paperback library of some 3,000 F&SF books before I moved to New Hampshire in late 1989 and had to downsize my library – and life – a great deal. As my late editor David Hartwell pointed out, back then it was barely possible to read every new F&SF title that came out in a year, and I came close some years.

That was then. This is now, as the saying goes. Right now, the major publishers and the genre F&SF presses publish around 1,800 new titles a year, and reprint another 1,600 – and that doesn’t count mainstream, romance, or mystery books that cross over or self-published titles. So there shouldn’t be a question as to why a science fiction or fantasy reviewer or columnist perhaps doesn’t include all the books that fit a given category, such as weather magic. It’s because, despite best efforts, no one can read them all, except perhaps a speed reader who has nothing else to do.

But then, shouldn’t such reviewers or columnists at least read the “best” books? That becomes a question of exactly what are the best books. Locus magazine, which bills itself as “the magazine of the science fiction and fantasy field,” last year recommended something like 120 titles. That listing doesn’t include titles recognized by the Nebula and Hugo awards, or those recommended by other sources, such as Kirkus, Library Journal, or Publishers Weekly. I’d venture that every year more than three hundred titles make some authority’s “best” list. That’s not surprising; even “experts” in the field have strong disagreements about what constitutes a good book, and I’ve definitely disagreed with some of those “best” recommendations or felt that other books that didn’t get a recommendation deserved such. And, to no one’s surprise, least of all mine, I’ve had more than a few books receive starred or rave reviews from one expert and be totally panned by another.

Unfortunately, there seems to be a growing sense of outrage, at least among some readers, when reviewers or “experts” disagree with their opinion or fail to mention a work they feel is important or that shouldn’t have been overlooked. There’s no doubt that some works probably shouldn’t be overlooked, for better or worse, because of their enormous impact. In this light, certainly Lord of the Rings comes to mind, as well as other works that have shown their impact by remaining in print and being widely read for several decades.

But the bottom line is simply that it’s difficult, if not impossible, even for the reviewers and “experts,” to read every book recommended as “best,” let alone every book that every reader feels is important, much less agree on the significance or contribution (or lack thereof) of such books.

Another Legislative Misstep?

Last year, in an effort to curb the sale of fraudulently signed sports memorabilia and other memorabilia, the state of California passed a law that also affects the sale of books signed by the author. Although the sponsor of the bill claims that it was not meant to apply to bookstores and booksellers, it appears that such an exclusion isn’t actually in the law itself, although eBay did get itself an exclusion, as did pawnbrokers.

Under California AB 1570, when a California consumer sells an autographed item worth $5 or more, the consumer’s name and address must be included on a Certificate of Authenticity (COA). This requirement also applies to anyone reselling the item as authentic, be it a bookseller, auction house, comic book dealer, antiques dealer, autograph dealer, art dealer, an estate sales company, or even a charity. Copies of the certificate must be kept for seven years. Equally significant is the requirement for sellers to disclose the name and address of the person from whom they acquired the signed book – which is a violation of their right to privacy (a right which is also protected by law in California).

The COA must (1) describe the collectible and specify the name of the personality who autographed it; (2) either specify the purchase price and date of sale or be accompanied by a separate invoice setting forth that information; (3) contain an express warranty, which shall be conclusively presumed to be part of the bargain, of the authenticity of the collectible…; (4) specify whether the collectible is offered as one of a limited edition…; (5) indicate whether the dealer is surety bonded…; (6) indicate the last four digits of the dealer’s resale certificate number…; (7) indicate whether the item was autographed in the presence of the dealer and specify the date and location of, and the name of a witness to, the autograph signing; (8) indicate whether the item was obtained or purchased from a third party, and if so, indicate the name and address of this third party; and (9) include an identifying serial number that corresponds to an identifying number printed on the collectible item, if any….

That means, among other things, that the law applies to anyone engaged in the online sale of signed items. So, if a bookstore holds an author signing for the author’s latest book and then offers the signed books on its website, it is engaged in the online business of selling signed items. Easton Press, which has a business of selling autographed new books, now refuses to sell such books in California presumably because of the paperwork requirements. So do at least three other national collectible book dealers.

I know that both Borderlands Books in San Francisco and Mysterious Galaxy in San Diego have shipped signed copies of my books in the past, and, of course, Subterranean Press offers signed limited editions of the books of scores of authors, all of which would seem to be subject to the law – which also has penalties, as stated in the law itself:

“Any consumer injured by the failure of a dealer to provide a certificate of authenticity containing the information required by this section, or by a dealer’s furnishing of a certificate of authenticity that is false, shall be entitled to recover, in addition to actual damages, a civil penalty in an amount equal to 10 times actual damages, plus court costs, reasonable attorney’s fees, interest, and expert witness fees…”

All that strikes me as pretty onerous for a signed book.