The Next Indentured Generation?

The other day I received a blog comment that chilled me all the way through.  No, it wasn’t a threat.  The commenter just questioned why state and federal government should be supporting higher education at all.

On the surface, very much on the surface, it’s a perfectly logical question. At a time of financial difficulty, when almost all states have severe budget constraints, if not enormous deficits, and when the federal deficit is huge, why should the federal government and states be supporting higher education?

The question, I fear, arises out of the current preoccupation with the here and now, and plays into Santayana’s statement about those who fail to learn the lessons of history being doomed to repeat them. So… for those who have mislaid or forgotten a small piece of history, I’d like to point out that, until roughly 1800, there were literally only a few handfuls of colleges and universities in the United States – fewer than 30 for a population of five million people. Most colleges produced far, far fewer graduates annually than the smallest of colleges in the USA do today.  Harvard, for example, averaged fewer than 40 graduates a year.  William & Mary, the second-oldest college in the United States, averaged 20 graduates a year prior to 1800.  Although aggregated statistics are unavailable, estimates based on existing figures suggest that less than one half of one percent of the adult population, all male, possessed a college education in 1800, and the vast majority of those graduates came from privileged backgrounds.  Essentially, higher education was reserved for the elites. Although more than a hundred more colleges appeared in the years following 1800, many of those created in the South did not survive the Civil War.

In 1862, Congress created the first land-grant universities, and eventually more than 70 were founded, based on federal land grants, primarily to teach agricultural and other “productive” disciplines, but not to the exclusion of the classics. By 1900, U.S. colleges and universities were producing 25,000 graduates annually, out of a population of 76 million people, meaning that only about one percent of the population, still privileged, received college degrees, a great percentage of them from land-grant universities supported by federal land grants and state funding.  These universities offered college educations with tuition and fees far lower than those charged by most private institutions, and thus afforded the education necessary for those not of the most privileged status.  Even so, by 1940, only five percent of the U.S. population had a college degree.  This changed markedly after World War II, with the passage of the GI Bill, which granted veterans benefits for higher education. Under the conditions that existed from the end of WWII until roughly the early 1970s, talented students could obtain a college degree without incurring excessive debt, and sometimes no debt at all.

As we all know, for various reasons, that has changed dramatically, particularly since state support of state colleges and universities has declined from something close to 60% of costs forty years ago to less than 25% today, and less than 15% in some states.  To cover costs, the tuition and fees at state universities have skyrocketed.  The result? More students are working part-time and even full-time jobs, as well as taking out student loans.  Because many cannot work and study full-time, it takes students longer to graduate, and that increases the total cost of their education. In 2010, 67% of all graduating college seniors carried student loan debts, with an average of more than $25,000 per student.  The average student debt incurred by a doctor just for medical school is almost $160,000, according to the American Medical Association.

Yet every study available indicates that college graduates make far more over their lifetimes than those without college degrees, and those with graduate degrees generally fare even better.  So… students incur massive debts.  In effect, they’ll become higher-paid, part-time indentured servants of the financial sector for at least 20 years of their lives.

The amounts incurred are far from inconsequential.  Student debt now exceeds national credit card debt [and some of that credit card debt also represents student debt]. The majority of that debt reflects what happened as states cut their support of higher education, and those figures don’t reflect default rates on student loans, which are approaching ten percent.

As a result, college graduates and graduates of professional degree programs are falling into two categories – the privileged, who have no debt and can choose a career path without primarily considering the financial implications, and those who must consider how to repay massive debt loads.  And as state support for higher education continues to dwindle, the U.S. risks a higher-tech version of social stratification based on who owes student loans and who doesn’t.

So… should the federal and state governments continue to cut support of higher education? Are such cuts a necessity for the future of the United States?  Really?  Tell that to the students who face the Hobson’s choice of low-paying jobs for life or student loan payments for life.  Or should fewer students attend college?  But… if that’s the case, won’t that just restrict education to those who can afford it, one way or another?

The Tax Question

These days an overwhelming number of political figures, especially conservatives and Republicans, continue to protest taxes and to insist that taxes should be lowered and that federal income taxes, at the very least, should be left at the lower levels set during the administration of the second President Bush. Although many conservatives protest that taxes are being used for “liberal” social engineering, the fact is that there are so many “special provisions” embodied in the tax code that such “engineering” encompasses provisions purported to help groups ranging from the very poorest to the very wealthiest.  In addition, much of the complexity of the tax code arises from generations of efforts to make it “fairer.”

For all that rhetoric, the basic purpose of taxes is to pay for those functions of government that the elected representatives of past and present voters have deemed necessary through the passage of federal laws and subsequent appropriations.  Or, as put by the late and distinguished Supreme Court Justice Oliver Wendell Holmes, Jr., “Taxes are what we pay for a civilized society.”

Grumbling about taxation has been an American preoccupation since at least the 1700s, when the American colonists protested the British Stamp Tax and later the tax on imported British tea.  In the case of the tea tax, the colonists were paying more for smuggled tea than for fully taxed British tea, which has always made me wonder about the economic rationality of the Boston Tea Party, and who really was behind it… and for what reason, since it certainly wasn’t about the price of British tea.

Likewise, my suspicion is that the current furor about taxes, and federal income taxes in particular, may not really be primarily about taxes themselves, but about a host of factors associated with taxes, most of which may well be rooted in the proven “loss aversion” traits of human beings.  Put simply, most of us react far more strongly to events or acts which threaten to take things from us than to those which offer opportunities, and in a time when most people see few chances for economic improvement, loss aversion behavior, naturally, becomes stronger.  And most people see higher taxes, deferred Social Security retirement ages, and higher Medicare premiums as definite losses, which they are.

What’s most interesting about this today is that the leaders of the conservative movements and the Republican party are generally from that segment of society which has benefited the most in the past twenty years from the comparative redistribution of wealth to the uppermost segment of American society, and yet they are appealing to those members of society who feel they have lost the most through this redistribution – the once more highly paid blue-collar workers in the old automotive industries and other heavy-manufacturing areas of the U.S. economy.  The problem with this appeal is not that it will not work – it definitely will, especially if economic conditions do not improve – but that the policies espoused by the “keep taxes low/cut taxes” conservatives won’t do anything positive to benefit the vast majority of those to whom these conservatives are appealing.  Those policies will, of course, greatly benefit the wealthy, but the comparative lack of federal/state revenues is already hurting education, despite the fact that conservatives and liberals both agree that improved education is vital for today’s and tomorrow’s students if they are to prosper both economically and occupationally.  The lack of money for transportation infrastructure will only hamper future economic growth, as will the lack of funding to rebuild and modernize our outdated air traffic control system and a number of other aging and/or outdated infrastructure systems.

The larger problem is, of course, that the conservatives don’t want government to spend money on anything, and especially not anything new, while the liberals have yet to come up with a plan for anything workably positive… and, under those circumstances, it’s very possible that “loss aversion” politics, and the anti-taxation mood, will dominate the political debates of the next six months… which, in the end, likely won’t benefit anyone.

 

Cleverness?

Over the years, every so often, I’ve gotten a letter or review about one of my books that essentially complains about the ruthless nature of a protagonist who is supposed to be a good person.  Such letters often question why he or she couldn’t have done something less drastic or resolved the situation in a more clever fashion.  The other day, after seeing a review of Imager’s Intrigue and then receiving an email from another writer who was disappointed that Quaeryt couldn’t be more “clever” in his resolution of matters and less reliant upon force, I realized exactly what my grandmother had meant by one of her favorite expressions.  She was always saying that some businessman or politician was “too clever by half.”

So, I believe, are some writers.  I try not to be excessively clever, because excessive cleverness is highly unrealistic in the real world, but it’s difficult when there’s an unspoken but very clear pressure on authors to be “clever.”  My problem is that I’m moderately experienced in how the “real world” operates, and seldom is a “clever” solution to anything of major import a truly workable one. As I and numerous historians have pointed out, in WWII, with a few exceptions, the Germans had far more “clever” and advanced technology.  They lost to the massive application of adequate technology.  In Vietnam, the high-tech and clever United States was stalemated by the combination of wide-scale guerrilla warfare and political opposition within the USA.  Despite the application of some of the most sophisticated and effective military technology ever deployed, the U.S. will be fortunate to “break even” in its recent military operations in the Middle East… and given the costs already incurred and the loss of lives for what so far appear to be negligible gains, it could be argued that we’ve lost.  I could cite all too many examples in the business world where “clever” and “best” lost out to cheaper and inferior products backed by massive advertising.  The same sorts of situations are even more prevalent in politics.

“Clever,” in fact, is generally highly unrealistic as a solution to most large scale real-world problems.  But why?

Because most problems are, at their base, people problems, it takes massive resources to change the course of human inertia/perceived self-interest. That’s why both political parties in the United States mobilize billions of dollars in campaign funds… because that’s what it takes, since most people have become more and more skeptical of any cleverness that doesn’t fit their preconceptions…  partly because they’re also skeptical of the “clever” solutions proposed by politicians.  It’s why most advertising campaigns have become low-level, not very clever, saturation efforts.  Military campaigns that involve national belief structures and not just limited and clearly defined tactical goals also require massive commitments of resources – and clever just gets squashed if it stands in the way of such effectively deployed resources.

That’s why, for example, in Imager’s Intrigue, Rhenn’s solutions are “clever” only in the sense that they apply massive power/political pressure to key political/military/social vulnerabilities of his opponents.  Nothing less will do the job.

I’m not saying that “clever” doesn’t work in some situations, because it does, but those situations are almost always those where the objectives are limited and the stakes are not nearly so high.  That makes “clever” far more suited to mysteries, spy stories, and some thrillers than to military situations where real or perceived national interests or survival are at stake.

 

The Ratings-Mad Society

The other day, at WalMart, where I do my grocery shopping, since, like it or not, it’s the best grocery store within 60 miles, the check-out clerk informed me that, if I went to the site listed on my receipt and rated my latest visit to WalMart, I’d be eligible for a drawing for a $5,000 WalMart gift card.  The next day, at Home Depot, I had a similar experience. That doesn’t include the endless ratings on Amazon, B&N, and scores of other retailers, not to mention YouTube, Rate My Professors, and the student evaluations required every semester at virtually every college or university. Nor does it include the plethora of reality television shows based on various combinations of “ratings.”

It’s getting so that everything is being rated, either on a numerical scale from one to five or on one from one to ten.  Have we gone mad?  Or is it just me?

Ratings are based on opinions.  Opinions are, for the overwhelming majority of people, based on their personal likes and dislikes… but ratings are presented for the most part as a measurement of excellence.

Yet different people value different things. My books are an example. I write for people who think and like depth in their fiction… and most readers who like non-stop action aren’t going to read many of my books, and probably won’t like them… and those are the ones who give my books one star with words like “boring”… or “terminally slow.”  By the same token, readers who like deep or thoughtful books may well rate some of the fast-action books as “shallow” [which they are by the nature of their structure] or “improbably constructed” [which is also true, because any extended fast-action sequence just doesn’t happen often, if ever, in real life, and that includes war].

Certainly, some of the rationale behind using ratings is based on the so-called wisdom of crowds, the idea that a consensus opinion about something is more accurate than a handful of expert opinions.  This has proven true… but with two caveats – the “crowd” sampled has to have general knowledge of the subject and the subject has to be one that can be objectively quantified.

The problem with rating so many of the things now being rated is that for some – such as music, literature, cinema, etc. – technical excellence has little bearing on popularity, and often what “the crowd” rates on are aspects having nothing to do with the core subject, such as appearance, apparel, and appeal in the case of music, or special effects in the case of cinema.

Thus, broad-scale ratings conceal as much as they reveal… if not more.  Yet everyone with a product is out there trying to get some sort of rating. Obviously, those with a product want a high rating to enhance the salability of their product or service.  But why do people/consumers rely so much on ratings?  Is it because people can’t think?  Or because they’re so inundated with trivia that they can’t find the information or the time they need to make a decision?  Or because the opinion of others means more than their own feelings?

Whatever the reason, it seems to me that, in the quest for high ratings, the Dr. Jekyll idea of applying the wisdom of the crowd has been transformed into the Mr. Hyde insanity of the madness of the mob.

 

The Hullabaloo Over College Majors

Now that it’s the season for college graduation, once more articles and commentaries are popping up everywhere – and most of them either tout certain undergraduate majors as “good” because employment in that field is up or dismiss others as “bad” because immediate job prospects aren’t as good.  What’s even worse is that politicians are getting into the act, some of them going so far as to suggest that students shouldn’t major in fields that don’t pay as well or where employment prospects aren’t so good, with hints that government and universities shouldn’t offer aid to students interested in such fields.

There are enormous problems with the whole idea of over-emphasizing undergraduate collegiate majors, the first of which is that many students entering college don’t have the faintest idea what their true talents are or whether their interests match their abilities. This problem has worsened in the past several generations as the general academic rigor of high schools has declined and as more students enter colleges and universities without ever having been truly tested to the limits of their abilities.

The second problem is that the emphasis on a “profitable” major is also a growing emphasis on turning college into what amounts to a white-collar vocational school, rather than an institution devoted to teaching students how to think and to learn on a life-long basis. Colleges themselves are buying into this by pushing departments toward “accountability” and insisting that departments determine how many graduates are successful and employed in the field years after graduating.  But does that really measure success?

In addition, the emphasis on selecting a major based on future projected employability neglects two incredibly important factors.  The first is the student’s aptitudes.  A student who is weak in mathematics is highly unlikely to be particularly successful in fields that require that ability, no matter how many jobs exist.  Second, most students take four years or more to finish college.  Projecting what occupations will be hiring the most in four years is chancy.

As for the subjects students choose for their major, the “employability” measurements used are generally employment in the first year after graduation, and the differences among various fields aren’t that significant.  For example, in a recent Georgetown University study, there was only about a 10% difference in employment between the “worst” and “best” undergraduate majors. Such measurements strongly suggest that a student who likes a field and works hard to excel is more likely to land a job, even in a field where employment is not as robust, than a student who tries to game the employment market and picks a major based on projected employment and earnings rather than on a field suited to his or her abilities. In short, it’s far better for students to be at the top of a field they like than at the bottom of one they don’t.

More than a few studies have shown and projected that today’s educated workers will have changed fields of work three to four times between entering the workforce and retiring – and that today’s students will face even more need to change their field of work.  Such changes place a premium on the ability to think and to learn throughout life, not on a single set of skills tailored to one field or profession.  Yes, there are some fields where dedicated and directed learning is required from the beginning of college, but those fields are a minority and call for initial dedication.  They seldom attract students who are unsure of what they want to do in life or students with wide interests.

In the end, all the political and media concern about “appropriate” majors, despite protests to the contrary, ignores much of what is best about college and for the students by emphasizing short-term economic goals that cannot possibly benefit the majority of students.