Archive for April, 2010

Reality or Perception?

The growth of high technology, particularly in the areas of electronics, entertainment, and communications, is giving a new meaning to the question of what is “real.”  As part of that question, there’s also the issue of how on-line/perceptual requirements are both influencing and simultaneously diverging from physical-world requirements.

One of the most obvious impacts of the instant communications capabilities embodied in cell-phones, netbooks, laptops, and desktops is the proliferation of emails and text messages.  As I’ve noted before, there’s a significant downside to this in terms of real-world productivity because, more and more, workers at all levels are being required to provide status reports and replies on an almost continual basis.  This constant diversion encourages so-called “multitasking,” which studies show actually takes more time and creates more errors than handling tasks sequentially – as if anyone in today’s information society is ever allowed to handle tasks sequentially and efficiently.

In addition, anyone who has the nerve or the foolhardiness to point this out, or to refrain from texting and on-line social networking, is considered out of touch, anti-technology, and clearly non-productive because of his or her refusal to “use the latest technology,” even if his or her physical productivity far exceeds that of the “well-connected.”  No matter that the individual has a cellphone and laptop with full internet interconnectivity and can use them to obtain real physical results, often faster than those who are immersed in social connectivity, such individuals are “dinosaurs.”

In addition, the temptations of the electronic world are such, and have created enough concern, that some companies have tried to take steps to limit what on-line activities are possible on corporate nets.

The real physical dangers of this interconnectivity are minimized, if not overlooked.  There have been a number of fatalities, even here in Utah, when individuals locked into various forms of electronic reality, from iPods to cellphones, have stepped in front of traffic and trains, totally unaware of their physical surroundings.  Given the growing intensity of the “electronic world,” I can’t help but believe these will increase.

Yet, in another sense, the electronic world is also entering the physical world.  For example, thousands and thousands of young Asian men and women labor at various on-line games to amass virtual goods that they can effectively trade for physical-world currency and goods.  And it works the other way.  There have already been murders over what happened in “virtual reality” communities.

The allure of electronic worlds and connections is so strong that hundreds of thousands, if not millions, of students and other young people walk past those with whom they take classes and even work, ignoring their physical presence, for an electronic linkage that might have seemed ephemeral to an earlier generation, but whose allure is far stronger than physical reality…

Does this divergence between the physical reality and requirements of society and the perceptual “reality” and perceived requirements of society herald a “new age,” or the singularity, as some have called it, or is it the beginning of the erosion of culture and society?

Important Beyond the Words

Despite all the “emphasis” on improving education and upon assessment testing in primary and secondary schools, education is anything but improving in the United States… and there’s a very good reason why.  Politicians, educators, and everyday parents have forgotten one of the most special attributes that makes us human and that lies behind our success as a species – language, in particular, written language.

An ever-increasing percentage of younger Americans, well over a majority of those under twenty, cannot write a coherent paragraph, nor can they synthesize complex written information, either verbally or in writing, despite all the testing, all the supposed emphasis on “education.”  So far, this has not proved to be an obvious detriment to U.S. science, business, and culture, but that is because society, any society, has always been controlled by a minority.  The past strength of U.S. society has been that it allowed a far greater percentage of “have-nots” to rise into that minority, and that rise was enabled by an educational system that emphasized reading, writing, and arithmetic – the three “Rs.”

While mastery of more than those three basics is necessary for success in a higher-technology society, ignoring absolute mastery in those subjects for the sake of knowledge in others is a formula for societal collapse, because those who can succeed will be limited to those whose parents can obtain an education for their children that does require mastery of those fundamental basics, particularly of writing.  And because in each generation, there are those who will not or cannot truly master such basics, either through lack of ability or lack of dedication, the number of those able to control society will become ever more limited, and a greater and greater percentage of society’s assets will become controlled by fewer and fewer, who, as their numbers dwindle, find their abilities also diminish.

In time, if such a trend is not changed, social unrest builds and usually results in revolution.  We’re already seeing this in the United States, particularly in dramatically increased income inequality, but everyone seems to focus on the symptoms rather than the cause.

Why writing, you might ask.  Is that just because I’m a writer, and I think that mastery of my specialty is paramount, just as those in other occupations might feel the same about their area of expertise?  No… it’s because writing is the very foundation upon which complex technological societies rest.

The most important aspect of written language is not that it records what has been spoken, or what has occurred, or that it documents how to build devices, but that it requires a logical construct to be intelligible, let alone useful.  Good writing requires logic, whether in structuring a sentence, a paragraph, or a book.  It requires the ability to synthesize and to create from other information.  In essence, mastering writing requires organizing one’s thoughts and mind.  All the scattered facts and bits of information required by short-answer educational testing are useless unless they can be understood as part of a coherent whole.  That is why the best educational institutions have always required long essay tests, usually under pressure.  In effect, such tests both develop and measure the ability to think.

Yet the societal response to the lack of writing, and thus thinking, ability has been to institute “remedial” writing courses at the college entry level.  This is worse than useless, and a waste of time and resources.  Basic linguistics and writing ability, as I have noted before, are determined roughly by puberty.  If someone cannot write and organize his or her thoughts by then, effectively they will always be limited.  If we as a society want to reverse the trend of social and economic polarization, as well as improve the abilities of the younger generations, effective writing skills have to be developed on the primary and early secondary school levels.  Later than that is just too late.  Just as you can’t learn to be a concert violinist or pianist beginning at age eighteen, or a professional athlete, the same is true for developing writing and logic skills.

And because, in a very real sense, a civilization is its written language, our inability to address this issue effectively may assure the fall of our culture.

The Failure to Judge… Wisely

In last Sunday’s education supplement to The New York Times, there was a table showing a sampling of U.S. colleges and universities and the distribution of grades “earned” by students, as well as the change from ten years earlier – and in a number of cases, the change from twenty or forty or fifty years ago.  Not surprisingly to me, at virtually every university over 35% of all grades granted were As.  Most were over 40%, and at a number, over half of all grades were As.  This represents a roughly 10% increase over the past ten years, but, even more important, it represents a more than doubling, and in some cases a tripling, of the percentage of As being given 40-50 years ago.  Are the teachers 2-3 times better?  Are the students?  Let us just say that I have my doubts.

But before anyone goes off and blames the more benighted university professors, let’s look at society as a whole.  Almost a year ago, or perhaps longer, Alex Ross, the music critic for The New Yorker, pointed out that almost every Broadway show now gets a standing ovation, when a standing ovation was relatively rare some fifty years ago.  When I was a grade-schooler, there were exactly four college football bowl games, on New Year’s Eve or New Year’s Day, while today there are something like thirty spread over almost four weeks.  Until about half a century ago, there weren’t any “divisions” in baseball.  The regular-season champion of the American League played the regular-season champion of the National League.  It’s almost as though we, as a society, can’t accept the judgment of continual success over time.

And have you noticed that every competition for children has almost as many prizes as competitors – or so it seems?  Likewise, there’s tremendous pressure to do away with grades and/or test scores in determining who gets into what college.  And once students are in college, they get to judge their professors on how well they’re being taught – as if any 18-to-21-year-old truly has a good and full understanding of what they need to learn [admittedly, some professors don’t, but the students aren’t the ones who should be determining this].  Then we have the global warming debate, where politicians and people with absolutely no knowledge and understanding of the mechanics and physics of climate insist that their views are equal to those of scientists who’ve spent a lifetime studying climate.  And, of course, there are the intelligent design believers and creationists who are using politics to dictate science curricula in schools, based on their beliefs, rather than on what can be proven.

And there’s the economy and business and education, where decisions are made essentially on the basis of short-term profit figures, rather than on the longer-term… and as a result, as we have seen, the economy, business, and education have all suffered greatly.

I could list page after page of similar examples and instances, but these all point out an inherent flaw in current societies, particularly in western European societies, and especially in U.S. society.  As a society, we’re unwilling or unable, or both, to make intelligent decisions based on facts and experience.

Whether it’s because of political pressure, the threat of litigation, the fear of being declared discriminatory, or the honest but misguided belief that fostering self-esteem before establishing ability creates better students, the fact is that we don’t honestly evaluate our students.  We don’t judge them accurately.  Forty or fifty percent do not deserve As, not when less than thirty percent of college graduates can write a complex paragraph in correct English and follow the logic [or lack of it] in a newspaper editorial.

We clearly don’t judge and hold our economic leaders, or our financial industry leaders, to effective standards, not when we pay them tens, if not hundreds, of millions of dollars to implement financial instruments that nearly destroyed our economy.  We judge those running for political office equally poorly, electing them on their professed beliefs rather than on either their willingness to solve problems for the good of the entire country or their willingness to compromise to resolve problems – despite the fact that no political system can survive for long without compromise.

Nor are we, again as a society, particularly accurate in assessing and rewarding artistic accomplishments, or lack of them, when rap music, American Idol and “reality” shows draw far more in financial reward and audiences than do old-fashioned theatre, musical theatre [where you had to be able to compose and sing real melodies], opera, and classical music, and where hyped-up graphic novels are the fastest-growing form of  “print” fiction.   It’s one thing to enjoy entertainment that’s less than excellent in terms of quality;  it’s another to proclaim it excellent, but the ability to differentiate between popularity and technical and professional excellence is, again, a matter of good judgment.

In fact, “judgment” is becoming the new “discrimination.”  Once, to discriminate meant to choose wisely;  now it means to be horribly biased.  The latest evolution in our current “newspeak” appears to be that to judge wisely on the basis of facts is a form of bias and oppression.  It’s fine to surrender judgment to the marketplace, where dollars alone decide, or to politics, where those who are most successful in pandering for votes decide… but to decide based on solid accomplishment – or the lack thereof, as in the case of students who can’t read or write or think or in the case of financiers who lose trillions of dollars – that’s somehow old-fashioned, biased, or unfair.

Whatever happened to judging wisely?


Throughout recorded history runs a thread in which an older and often distinguished figure rants about the failures of the young, how they fail to learn the lessons of their forebears, and how this will lead to the downfall of society.  While many cite Plato and his words about the coming failure of Greek youth because they fail to learn music and poetry and thus cannot distinguish among the values of the ancient levels of wisdom ascribed to gold, silver, and bronze, such warnings precede the Greeks and follow them through Cicero and others.  They also occur in cultures other than western-European-descended societies.

At the time of such warnings, as in the case of Alcibiades and Socrates, there are generally two reactions, one usually from the young and one usually from the older members of society.  One is: “We’re still here; what’s the problem?  You don’t understand that we’re different.”  The other is: “The young never understand until it’s too late.”

I’ve heard my share of speeches and talks that debunk the words of warning, and generally, these “debunkers” point out that Socrates and Cicero and all the others warned everyone, but today we live at the peak of human civilization and technology.  And we do… but that’s not the point.

Within a generation of the time of Plato’s reports of Socrates’ warnings, Greece was spiraling down into internecine warfare from which it, as a civilization, never fully recovered.  The same was true of Cicero, but the process was far more prolonged in the case of the Roman Empire, although the Roman Republic, which laid the foundation of the empire, was essentially dead at the time of Cicero’s execution/murder.

The patterns of rise and fall, rise and fall, of cultures and civilizations permeate human history, and so far, no civilization has escaped such a fate, although some have lasted far longer than others.

There’s an American saying that was popular a generation or so ago – “From shirt-sleeves to shirt-sleeves in four generations.”  What it meant was that a man [because it was a society even more male-dominated then] worked hard to build up the foundation for his children, and then the next generation turned that foundation into wealth and success, and the third generation spent the wealth, and those of the fourth generation were impoverished and back in shirt-sleeves.

To build anything requires effort, and concentrated effort requires dedication and expertise in something, which requires concentration and knowledge.  Building also requires saving in some form or another, and that means forgoing consumption and immediate satisfaction.  In societal terms, that requires the “old virtues.”  When consumption and pleasure outweigh those virtues, a society declines, either gradually or precipitously.  Now… some societies, such as that of Great Britain, for years pulled themselves back from the total loss of “virtues.”

But, in the end, the lure of pleasure and consumption has felled, directly or indirectly, every civilization.  The only question appears to be not whether this will happen, but when.

So… don’t be cavalier about those doddering old fogies who predict that the excess of pleasure-seeking and self-interest will doom society.  They’ll be right… sooner or later.

The Continued Postal Service Sell-Out

Once, many, many years ago, I was the legislative director for a U.S. Congressman who served on the Appropriations subcommittee overseeing the U.S. Postal Service.  Trying to make sense out of the Postal Service budget – and their twisted economic rationalizations for their pricing structure – led to two long and frustrating years, and the only reason I didn’t lose my hair earlier than I eventually did was that the USPS comprised only part of my legislative duties.

The latest cry for cuts and service reductions may look economically reasonable, but it’s not, because the USPS has been employing the wrong costing model for over forty years. The model is based on structuring costs, first and primarily, on first class mail, and then treating bulk mail and publications as marginal costs, and setting the rates, especially for bulk mail, based on the marginal costs.

Why is this the wrong model?

First, because it wasn’t what the founding fathers had in mind, and second, because it’s lousy economics.

Let’s go back to the beginning.  Article I, section 8, clause 7 of the U.S. Constitution specifically grants Congress the power “to establish Post Offices and Post roads.”  The idea behind the original Post Office was to further communications and the dissemination of ideas.  There was a debate over whether the Post Office should be allowed to carry newspapers, and a number of later Supreme Court decisions dealt with the limits on postal power, especially with regard to free expression, with the Court declaring, in effect, that the First Amendment trumped the Post Office power to restrict what could be mailed.  During the entire first century after the establishment of the Post Office and even for decades after, the idea behind the Post Office was open communications, particularly of ideas.

The idea of bulk mail wasn’t even something the founding fathers considered and could be said to have begun with the Montgomery Ward’s catalogue in the 1870s, although the Post Office didn’t establish lower bulk mail rates until 1928.  As a result, effectively until after World War II, massive direct bulk mailings were comparatively limited, and the majority of Post Office revenues came from first class mail. Today, that is no longer true.  Bulk mail makes up the vast majority of the U.S. Postal Service’s deliveries, and yet it’s largely still charged as if it were a marginal cost – and thus, the government and first class mail users are, in effect, subsidizing advertising mail sent to all Americans.  Yet, rather than charging advertisers what it truly costs to ship their products, the USPS is proposing cutting mail deliveries – and the reason why they’re talking about cutting Saturday delivery is because – guess what? – it’s the lightest delivery day for bulk mail.
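The costing argument here can be made concrete with a toy calculation – all figures below are hypothetical, chosen only to show the mechanics, and are not actual USPS numbers.  When the fixed costs of the delivery network are loaded onto first-class mail and bulk mail is priced at its marginal handling cost alone, the first-class rate balloons even though bulk mail is the majority of the volume:

```python
# Toy illustration of marginal-cost vs. fully allocated pricing.
# All figures are hypothetical, not actual USPS data.

FIXED_NETWORK_COST = 100_000      # routes, carriers, facilities (must be paid regardless)
MARGINAL_COST_PER_PIECE = 0.02    # extra cost of carrying one more piece

first_class_volume = 20_000       # the minority of pieces
bulk_volume = 80_000              # the majority of pieces

# Model 1 (the criticized approach): first class bears all fixed costs;
# bulk mail pays only its marginal cost.
first_class_rate_m1 = FIXED_NETWORK_COST / first_class_volume + MARGINAL_COST_PER_PIECE
bulk_rate_m1 = MARGINAL_COST_PER_PIECE

# Model 2 (fully allocated): fixed costs are spread across every piece carried.
all_volume = first_class_volume + bulk_volume
shared_fixed = FIXED_NETWORK_COST / all_volume
first_class_rate_m2 = shared_fixed + MARGINAL_COST_PER_PIECE
bulk_rate_m2 = shared_fixed + MARGINAL_COST_PER_PIECE

print(f"Marginal-cost model: first class ${first_class_rate_m1:.2f}, bulk ${bulk_rate_m1:.2f}")
print(f"Fully allocated:     first class ${first_class_rate_m2:.2f}, bulk ${bulk_rate_m2:.2f}")
```

Under the fully allocated model, every piece shares the cost of maintaining the network, and the implied bulk rate rises sharply – which is precisely the repricing being argued for.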

I don’t know about any of you, but every day we get catalogues from companies we’ve never patronized and never will.  We must throw away close to twenty pounds of unwanted bulk mail paper every week – and we’re paying higher postage costs and sending tax dollars to the USPS to subsidize even more of what we don’t want.

Wouldn’t it just be better to charge the advertisers what it really costs to maintain an establishment that’s to their benefit?  Or has the direct mail industry so captured the Postal Service and the Congress that the rest of us will suffer rather than let this industry pay the true costs of the bulk mail designed to increase their bottom line at our expense?

Being A Realist

Every so often, I come head-to-head with an unsettling fact – being a “realistic” novelist hurts my sales and sometimes even upsets my editors.  What do I mean?  Well… after nearly twenty years as an economist, analyst, administrator, and political appointee in Washington, I know that all too many novelistic twists and events, such as those portrayed by Dan Brown, are not only absurd, but often physically and/or politically impossible.  That’s one of the reasons why I don’t write political “thrillers,” my one attempt at such proving dramatically that the vast majority of readers definitely don’t want their realism close to home.

Unfortunately, a greater number don’t want realism to get in the way, or not too much in the way, in science fiction or fantasy, and my editors are most sensitive to this.  This can lead to “discussions” in which they want more direct action, while I’m trying to find a way to make the more realistic indirect events more action-oriented without compromising totally what I have learned about human nature, institutions, and human political motivations.  For example, there are reasons why high-tech societies tend to be lower-violence societies, but the principal one is very simple.  High-tech weaponry is very destructive, and societies where it is used widely almost invariably don’t stay high-tech.  In addition, violence is expensive, and successful societies find ways to satisfy majority requirements without extensive violence [selective and targeted violence is another question].

Another factor is that people seeking power and fortune wish to be able to enjoy both after they obtain them – and you can’t enjoy either for long if you’ve destroyed the society in order to be in control. This does not apply to fanatics, no matter what such people claim, but the vast majority of fanatics don’t wish to preserve society, but to destroy – or “simplify” – it because it represents values antithetical to theirs.

Now… this sort of understanding means that there’s a lot less “action” and destruction in my books than in most other books dealing with roughly similar situations and societies, and that people actually consider factors like costs and how to pay for things.  There are also more meals and meetings – as I’m often reminded, and not always in a positive manner – but meals and meetings are where most policies and actions are decided in human society.  But, I’m reminded by my editors, they slow things down.

Yes… and no…

In my work, there’s almost always plenty of action at the end, and some have even claimed that there’s too much at the end, and not enough along the way.  But… that’s life.  World War II, in all its combat phases, lasted slightly less than six years.  The economics, politics, meetings, meals, treaties, elections, usurpations of elections, and all the factors leading up to the conflict lasted more than twenty years, and the days of actual fighting, for any given soldier, were far less than that. My flight instructors offered a simple observation on being a naval aviator:  “Flying is 99 percent boredom, and one percent sheer terror.”  Or maybe it was 98% boredom, one percent exhilaration, and one percent terror.

On a smaller and political scale, the final version of Obama’s health care bill was passed in days – after a year of ongoing politicking, meetings, non-meetings, posturing, special elections, etc.  The same is true in athletics – the amount of time spent in training, pre-season, practices, and the like dwarfs the time of the actual contest.  In football, for example, where a theoretical hour of playing time takes closer to three hours, there are fewer than fifteen minutes in which players are in contact or potential contact.

Obviously, fiction is to entertain, not to replicate reality directly, because few read to get what amounts to a rehash of what is now a very stressful life for many, but the question every writer faces is how close he or she hews to the underlying realities of how human beings interact with others and with their societies.  For better or worse, I like my books to present at least somewhat plausible situations facing the characters, given, of course, the underlying technical or magical assumptions.

Often my editors press for less realism, or at least a greater minimization of the presentation of that realism.  I press back.  Sometimes, it’s not pretty. So far, at least, we’re still talking to each other.

So far…

Pondering Some “Universals”

When a science fiction writer starts pondering the basics of science, especially outside the confines of a story or novel, the results can be ugly.  But…there’s this question, and a lot of them that arise from it, or cluster around it… or something.

Does light really maintain a constant speed in a vacuum and away from massive gravitational forces?

Most people, I’m afraid, would respond by asking, “Does it matter?”  or “Who cares?”

Physicists generally insist that it does, and most physics discussions deal with the issue by saying that photons behave as if they have zero rest mass [and if I’m oversimplifying grossly, I’m certain some physicist will correct me], which allows photons to travel universally and generally invariably [again in a vacuum, etc.] at the speed of light – which is a tautology, if one thinks about it.  Of course, this is also theoretical, because so far as I can determine, no one has ever been able to observe a photon “at rest.”

BUT… here’s the rub, as far as I’m concerned.  Photons are/carry energy.  There’s no doubt about that.  The earth is warmed by the photonic flow we call sunlight.  Lasers produce coherent photonic flow strong enough to cut metal or perform delicate eye surgery.

Second, if current evidence is being interpreted correctly, black holes are massive enough to stop the flow of light.  Now… if photons have no mass, how could that happen?  The current interpretation is that the massive gravitational force stops the emission of light, which suggests that photons do have mass, if only an infinitesimal and currently unmeasurable one.
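For what it’s worth, the radius inside which light cannot escape can be recovered from a purely Newtonian escape-velocity argument, treating light as a projectile moving at speed c.  General relativity arrives at the same radius without assigning photons any mass, because gravity in that theory curves spacetime itself rather than pulling on mass:

```latex
% Newtonian heuristic: set the escape velocity equal to the speed of light
v_{\mathrm{esc}} = \sqrt{\frac{2GM}{r}} = c
\quad\Longrightarrow\quad
r_s = \frac{2GM}{c^2}
```

Any mass M compressed within this radius traps light – which is one standard resolution of the puzzle: no photon mass is required, only curved spacetime.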

These lead to another disturbing [at least for me] question.  Why isn’t the universe “running down”?  Don’t jump on me yet.  A great number of noted astronomers have asserted that such is indeed happening – but they’re talking about that on the macro level, that is, the entropy of energy and matter that will eventually lead to a universe where matter and energy are all at the same level everywhere, without all those nice gradients that make up comparative vacuum and stars and planets and hot and cold.  I’m thinking about winding down on the level of quarks and leptons, so to speak.

Current quantum mechanics seems to indicate that what we think of as “matter” is really a form of structured energy, and those various structures determine the physical and chemical properties of elements and more complex forms of matter.  And that leads to my problem.  Every form of energy that human beings use and operate “runs down” unless it is replenished with more energy from an outside source.

Yet the universe has been in existence for something like fifteen billion years, and current scientific theory is tacitly assuming that all these quarks and leptons – and photons – have the same innate internal energy levels today as they did fifteen billion years ago.

The scientific quest for a “theory of everything” tacitly assumes, as several noted scientists have already observed, unchanging universal scientific principles, such as an unvarying weak force on the leptonic level and a constant speed of light over time.  On a practical basis, I have to question that.  Nothing seems to stay exactly the same in the small part of the universe which I inhabit, but am I merely generalizing on the basis of my observations and anecdotal experience?

All that leads to the last question.  If those internal energies of quarks and leptons and photons are all declining at the same rate, how would we even know?  Could it be that those “incredible speeds” at which distant galaxies appear to be moving are more an artifact of changes in the speed of light?  Or in the infinitesimal decline of the very energy levels of all quarks, etc., in our universe?

Could our universe be running down from the inside out without our even knowing it?

The Absolute Need for Mastery of the Boring

A few weeks or so ago, I watched two college teams play for the right to go to the NCAA tournament.  One team, down twenty points at halftime, rallied behind the sensational play of a single star and pulled out the victory by one point in the last seconds.  That was the way television commentators and the print media reported it.  I saw it very differently.  One of the starting guards for the losing team missed seven of twelve free throws, two of them in the last fifteen seconds.  This wasn’t a fluke, a bad day for that player – he had a year-long free-throw percentage of 40%.  And just how many games in the NCAA tournament have been lost by “bad” free throw shooting?  Or won by good free throw shooting?  More than just a handful.

Good free-throw shooting is clearly important to basketball success.  Just look at the NBA.  While the free-throw shooting average for NCAA players is 69%, this year’s NBA average is 77%, and 98% of NBA starters have free throw percentages above 60%, with 75% of those starters making more than three-quarters of their free throws.

To my mind, this is a good example of what lies behind excellence – the ability to master even the most boring aspect of one’s profession. Another point associated with this is that simply knowing what a free throw is and when it is employed isn’t the same as being able to do it.  It requires practice – lots of practice. Shooting free throws day after day and improving technique is not exciting; it’s boring.  But the fact that there are very, very few poor free-throw shooters in the NBA is a good indication that mastery of the boring pays off.

The same is true in writing.  Learning grammar and even spelling [because spell-checkers don’t catch everything, by any means] is also boring and time-consuming, and there are some writers who are, shall I say, slightly grammatically challenged, but most writers know their grammar.  They have to, because editors usually don’t have the time or the interest to clean up bad writing.  It also gets boring to proofread page after page of what you’ve written – the original manuscript, the copy-edited manuscript, the hardcover galleys, the paperback galleys, and so on… but it’s necessary.

Learning how to fly, which most people believe is exciting, consists of a great deal of boredom, from learning to follow checklists to the absolute letter, to practicing and practicing landings, take-offs, and emergency procedures hour after hour, day after day until they’re second nature.  All that practice is tedious… and absolutely necessary.

My opera-director wife is having greater difficulty each year getting students to memorize their lines and music – because it’s boring – but you can’t sing opera or musical theatre if you don’t know your music and lines.

I could go on and on, detailing the necessary “boring” requirements of occupation after occupation, but the point behind all this is that our media, our educational system, and all too many parents have instilled a message that learning needs to be interesting and fun, and that there’s something wrong with the learning climate if the students lose interest.  Students have always lost interest.  We’re genetically primed to react to the “new” because it was once a survival requirement.  But the problem today is that the skills required to succeed in any even moderately complex society require mastery of the basics, i.e., boring skills, or sub-skills, before one can get into the really interesting aspects of work.  Again, merely being able to look something up isn’t the same as knowing it, understanding what it means, and being able to do it, time after time without thinking about it and without having to look it up repeatedly.

And the emphasis on fun and making it interesting is obscuring the need for fundamental mastery of skills, and shortchanging all too many young people.


Last week, in my semi-masochistic reading of reviews, I came across a review of The Magic of Recluce that really jarred me.  It wasn’t that the review was bad, or even a rave.  The reviewer noted the strengths of the book and some areas she thought weak, or at least that felt rough to her.  What jarred me were the words and the references which compared it to books that had been published years afterward, as if The Magic of Recluce happened to be copying books that actually appeared after it.  Now, this may have been as much my impression as what the reviewer meant, but it struck a chord – off-key – in my mind because I’ve seen more than a few reviews, especially in recent years, that note that The Magic of Recluce was good, decent… or whatever, but not as original as [fill in the blank].

Now… I’m not about to get into comparative “quality” — not in this blog, at least, but I have to admit that the “not so original” comment, when comparing Recluce to books published later, concerns me.  At the time the book was published, almost all the quotes and reviews noted its originality.  That it seems less “original” now to newer and often younger readers is not because it is less original, but because there are far more books out with differing magic systems.  Brandon Sanderson, for example, has developed more than a few such systems, but all of them came well after my systems in Recluce, Erde, and Corus, and Brandon has publicly noted that he read my work well before he was a published author.

The word “original” derives from “origin,” i.e., the beginning, with the secondary definition that it is not a copy or a duplicate of other work.  In that sense, Tolkien’s work and mine are both original, because our systems and the sources from which we drew are substantially different.  Tolkien drew from linguistics and the greater and lesser Eddas, and, probably through his Inkling connections with C.S. Lewis, slightly from Wagner.  I developed my magic system from a basis of physics.  Those are the “origins.”

The other sense of “original” signifies that which precedes what follows, and in that sense, my work is less original than that of Tolkien, but more “original” than that of Sanderson or others who published later, for two reasons.  First, I wrote it earlier than did those who followed me, and second, I developed magic systems unlike any others [although the Spellsong Cycle magic has similarities to Alan Dean Foster’s Spellsinger, it rests on a fundamentally different technical concept].

There’s also a difference between “original” and “unique.”  While it is quite possible for an original work not to be unique, a truly unique work must be original, although it can be derivative.

In any case, my concerns are nothing compared to those raised by the reader review I once read that said that Tolkien’s work was “sort of neat,” even if he did rip off a lot from Terry Brooks.