Older and Depressed?

The other day one of my readers asked, “Is there anything positive you can talk about or have you slid too far down the slope of elder grouchiness and discontent?”  That’s a good question in one respect, because I do believe that there is a definite tendency, if one is intelligent and perceptive, to become more cynical as one gains experience.

Psychological studies have shown, however, that people who suffer depression are far more accurate in their assessments of situations than are optimists, and that may be why optimism evolved – because it would be too damned hard to operate and get things done if we weighed things realistically.  For example, studies also show that entrepreneurs and people who start their own businesses invariably overestimate their chances of success and vastly underestimate their chances of failure.  This, of course, makes sense, because why would anyone open a business they thought would fail?

There’s also another factor in play. I spent nearly twenty years in Washington, D.C., as part of the national political scene, and after less than ten years I could clearly see certain patterns repeat themselves time after time: newly elected politicians and their staffs made the same mistakes that their predecessors did, and, over the longer term, each political party gained power in response to the abuses of its predecessor, abused that power in turn, and tried to hold on by any means possible, only to fail… and then the party newly in power immediately began to abuse its power, and so on. It’s a bit difficult not to express a certain amount of “grouchiness and discontent,” especially when you offer advice based on experience and have it disregarded because the newcomers “know better”… and then watch them make the same kind of mistakes as others did before them.  My wife has seen the same patterns in academia, with new faculty and new provosts re-inventing what amounts to a square wheel time after time.

It’s been said that human knowledge is as old as written records, but human wisdom is no older than the oldest living human being, and, from what I’ve seen, while a comparative handful of humans can learn from others, most can’t or won’t.  And, if I’m being honest, I have to admit that for the early part of my life I had to make mistakes to learn, and I made plenty. I still make them, but I’d like to think I make fewer, and the ones I make are in areas where I don’t have the experience of others to guide or warn me.

The other aspect of “senior grouchiness,” if you will, is understanding that success in almost all fields is not created by doing something positively spectacular, but by building on the past and avoiding as many mistakes as possible. Even the most world-changing innovations, after the initial spark or idea, require that same process of building on what came before and avoiding mistakes.

I’m still an optimist at heart, and in personal actions, and in my writing, but, frankly, I do get tired of people who won’t think, won’t learn, and fall back on the simplistic in a culture that has become fantastically complex, both in terms of levels and classes of personal interactions and in terms of its technological and financial systems. At the same time, the kind of simplicity that such individuals fall back on is the “bad” and dogmatic kind, such as fanatically fundamentalist religious beliefs and “do it my way or else,” as opposed to open and simple precepts such as “be kind” or “always try to do the right thing.”  I’m not so certain that a great portion of the world’s evils can’t be traced to one group or another trying to force their way – the “right way,” of course – upon others.  The distinction between using government to prohibit truly evil behavior, such as murder, abuse of any individual, theft, embezzlement, fraud, assault, and the like, and using it to force adherence to what amounts to theological belief was a hard-fought battle that took centuries to work itself out, first in English law, and later in the U.S. Constitution and legal system.  So when I see “reformers” – and they exist on the left and the right – trying to undermine the distinction represented by the idea of separation of church and state [although it goes far beyond that], I do tend to get grouchy and offer what may seem to be depressing comments.

This, too, has historical precedents.  Socrates complained about the youth and their turning away from Athenian values… but within a century or so Athens was prostrate, and the Athenians never did recover a preeminent position in the world. Cicero and others made the same sort of comments about the Roman Republic, and within decades the republic was gone, replaced by an even more autocratic empire.

So… try not to get too upset over my observations. After all, if more people avoided the mistakes I and others who have learned from experience point out, we’d all have more reasons to be optimistic.

 

The Republican Party

Has the Republican Party in the United States lost its collective “mind,” or is it a totally new political party clinging to a traditional name – whose traditions and the policies of its past leaders it has continually and consistently repudiated over the past four years?

Why do I ask this question?

Consider first the policies and positions of the Republican leaders of the past.  Theodore Roosevelt pushed anti-trust actions against monopolistic corporations, believed in conservation, and greatly expanded the national parks. Dwight D. Eisenhower, General of the Army and president, warned against the excessive influence of the military-industrial complex and created the federal interstate highway system.  Barry Goldwater, Mr. Conservative of the 1970s, was pro-choice and felt women should decide their own reproductive future.  Richard Nixon, certainly no bastion of liberalism, espoused universal health insurance, tried to get it considered by Congress, and founded the Environmental Protection Agency.  Ronald Reagan, cited time and time again by conservatives, believed in collective bargaining, was actually a union president, and raised taxes more times than he cut them.  The first President Bush promised not to raise taxes, but had the courage to take back his words when he realized taxes needed to be increased.

Yet every single one of these acts and positions has now been declared anathema to Republicans running for President and for the U.S. House of Representatives and the Senate.  In effect, none of these past Republican leaders would “qualify” as true card-carrying Republicans according to those who now compose or lead the Republican Party.  A few days ago, former Florida governor and Republican Jeb Bush made a statement to the effect that even his father, the first President Bush, wouldn’t be able to get anything passed by the present Congress.

President Obama is being attacked viciously by Republicans for his health care legislation, legislation similar to that signed and implemented by Mitt Romney as governor of Massachusetts and similar in principle to that proposed by Richard Nixon.

Now… I understand that people change their views and beliefs over time, but it’s clear that what the Republican Party has become is an organization endorsing what amounts to almost an American version of fascism, appealing to theocratic fundamentalism, and backed by a corporatist coalition, claiming to free people from excessive government by underfunding or dismantling all the institutions of government that were designed to protect people from the abuses of those with position and power.  Destroy unions so that corporations and governments can pay people less.  Hamstring environmental protection in the name of preserving jobs so that corporations don’t have to spend as much on environmental emissions controls. Keep taxes low on those making the most.  Allow those with wealth to spend unlimited amounts on electioneering, albeit in the name of “issues education,” while keeping the names of contributors hidden or semi-hidden.  Restrict women’s reproductive freedoms in the name of free exercise of religion. Keep health care insurance tied to employment, thus restricting the ability of employees to change jobs.  Allow consumers who bought too much housing to walk away from their liabilities through bankruptcy or short sales (including the honorable junior Senator from Utah), but make sure that every last penny of private student loan debt is collected – even if the students are deceased.

The United States is a representative democratic republic, and if those calling themselves Republicans wish to follow the beliefs and practices now being spouted, that’s their choice… and it’s also the choice of those who choose to vote for them.

But for all their appeal to “Republican traditions,” what they espouse and propose is neither Republican nor traditional in the historic sense.  But then, for all their talk of courage and doing the hard jobs to be done, they haven’t done the first of those jobs, and that’s to be honest and point out that they really aren’t Republicans, and they certainly aren’t traditional conservatives, no matter what they claim.

The Derivative Society?

Once upon a time, banking and investment banking were far less complex than they are today, especially recently.  In ancient times, i.e., when I took basic economics more than fifty years ago, banks used the deposits of their customers to lend to other customers, paying less to their depositors than what they charged those to whom they made loans.  Their loans were limited by their deposits, and banks were required to retain a certain percentage of their assets in, if you will, real dollars.  Even investment banks had some fairly fixed rules, and in both cases what was classed as an asset had to be just that, generally either real property, something close to blue chip securities, municipal, state, or federal notes or bonds, or cash. With the creeping deregulatory legislation that reached its apex in the 1990s, almost anything could, with appropriate laundering, otherwise known as derivative creation, be classed as someone’s asset.
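To make that old constraint concrete, here is a minimal sketch (in Python, purely for illustration), assuming a hypothetical 10% reserve requirement and round numbers that are my own rather than anything from the essay: when lending is capped by deposits less reserves, the arithmetic is simple and the “assets” are easy to audit.

    # Illustrative only: a toy model of old-style deposit banking,
    # assuming a hypothetical 10% reserve requirement.
    RESERVE_RATIO = 0.10

    def max_new_loans(deposits: float, existing_loans: float) -> float:
        # Lending capacity when loans must be funded from deposits,
        # less the fraction held back as reserves.
        lendable = deposits * (1 - RESERVE_RATIO)
        return max(0.0, lendable - existing_loans)

    # A bank with $1,000,000 in deposits and $700,000 already lent out
    # could extend at most another $200,000.
    print(max_new_loans(1_000_000, 700_000))  # -> 200000.0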

And we all know where that led.

And for all the furor about derivatives, and the finger-pointing, something else, it seems to me, has gone largely unnoticed.  The fact is that our entire society, especially in the United States, has become obsessed with derivatives in so many ways.

What are McDonald’s, Wendy’s, Burger King, Applebee’s, Olive Garden, Red Lobster, Chili’s, and endless other restaurant chains, fast-food and otherwise, but derivatives?  What happened to unique local restaurants?  The ones with good, inexpensive food often became chains, deriving their success from the original.  The others, except for a few handfuls, failed.  Every year, it seems, another big-name chef starts a restaurant franchise, hoping to derive success and profit from a hopefully original concept [which is becoming less and less the case].

Department stores used to be unique to each city.  I grew up in Denver, and we had Daniels & Fisher, with its special clock tower, the Denver Dry Goods [“The Denver”], and Neustaeder’s.  Then the May Company took over D&F, and before long all the department stores were generic. In Louisville, where my wife was raised, there were Bacon’s, Kaufmann’s, Byck’s, Selman’s, and Stewart’s. Not a single name remains.

Even Broadway, especially in musical theatre, has gone big for remakes and derivatives. Most of the new musicals appear to be remakes of movies, certainly derivative, or re-dos of older musicals. Every time there is a new twist on TV programming the derivatives proliferate.  How many different “Law and Order” versions are there?  Or CSI?  How many spin-offs from the “American Idol” concept?  How many “Reality TV” shows are there?  Derivative after derivative… and that proliferation seems to be increasing. Even “Snow White” has become a derivative property now.

In the field of fantasy and science fiction writing, the derivatives were a bit slower in taking off, although there were more than a few early attempts at derivatives based on Tolkien.  But then… somewhere after Fred Saberhagen came up with an original derivative of the Dracula mythos, vampires hit the big-time, followed by werewolves, and more vampires, and then zombies.  Along the way, we’ve had steampunk, a derivative of a time that never was, fantasy derivatives based on Jane Austen, and more names than I could possibly list, and now, after the “Twilight” derivatives, we have a raft of others.

Now… I understand, possibly more than most, that all writing and literature derives from its predecessors, but there’s a huge difference between, say, a work like Mary Robinette Kowal’s Shades of Milk and Honey, which uses the ambiance of a Regency-type culture and setting in introducing a new kind of fantasy [which Kowal does], and a derivative rip-off such as Pride and Prejudice and Zombies or Emma and the Werewolves.  When Roger Zelazny wrote Creatures of Light and Darkness or Lord of Light, he derived something new from the old myths.  In a sense, T.S. Eliot did the same in The Waste Land, as did Yeats in “No Second Troy.”  On the other hand, I don’t see that in John Scalzi’s Redshirts, which appears to me as a derivative capitalization on Star Trek nostalgia.

How about a bit more originality and a lot fewer “literary” derivatives?  Or have too many writers succumbed to the lure of fast bucks from cheap derivatives? Or have too many readers become too lazy to sort out the difference between rip-off, whole-cloth-robbery derivatives and thoughtful new treatments of eternal human themes?

 

Coincidences?

We’ve all been there, I think: on the telephone, discussing something important to us or with someone important to us… and no one else is home, when the doorbell rings, or another call comes through from someone equally important, or both at once.  Now, it doesn’t matter that no one has called or rung the doorbell for the previous two hours and no one will for another hour or two.  What is it about the universe that ensures that, in so many cases, too many things occur at the same time?

I’m not talking about the calls that aren’t random but can be predicted, like the political calls that occur from five or six in the evening until eight o’clock, or the charitable solicitations that are timed in the same way [both conveniently excepted from the do-not-call listing]. I’m talking about calls and callers and events that should be random, but clearly aren’t.  Sometimes, it’s merely amusing, as when daughters located on different coasts call at the same time.  Sometimes, it’s not, as when you’re trying to explain why you need the heating fixed now, and your editor calls wanting an immediate answer on something… or you’re discussing scheduling long-distance with your wife and you ignore the 800 call that you later find out was an automated call, without ID, informing you that your six A.M. flight the next morning has been cancelled… and you don’t find out until three A.M., when you check your email before leaving for the airport… and end up driving an extra 60 miles to the other airport. There’s also the fact that, no matter what time of the afternoon it is, there’s a 10-20% chance that, whenever I’m talking to my editor, either FedEx, UPS, or DHL will appear at the door [upstairs from my office] needing a signature… and we don’t get that many packages [except from my publisher] and I spend less than a half hour a week on the phone with my editor.

I know I’m not alone in this.  Too many people have recounted similar stories, but the logical types explain it all away by saying that we only remember the times these things happen, but not the times that they don’t.  Maybe… but my caller I.D. gives the times for every incoming call, and when I say that there haven’t been any calls for two or three hours, and then I get three in three minutes… it doesn’t lie – not unless there’s a far grander conspiracy out there than I even wish to consider.  And why is it that I almost always get calls in the ten minutes or so a day when I’m using the “facilities”?  No calls at all in the half hour before or after, of course.

This can extend into other areas – like supermarket checkout lines. The most improbable events occur in all too many cases in whatever line I pick.  The juice packet of the shopper in front of me explodes all over the conveyor belt.  The checker I have is the only one not legally able to ring up beer, and the manager is dealing with an irate customer in another line.  The register tape jams.  The credit/debit card machine freezes on the previous customer, just after I’ve put everything on the belt.

Now… to be fair, it sometimes works the other way. There was no possible way I ever could have met my wife.  None [and I won’t go into the details because they’d take twice the words of my longest blog], but it happened, and she’s still, at least occasionally, pointing out that it had to be destiny… or fate.  Well… given how that has turned out, I wouldn’t mind a few more “improbable” favorable coincidences, but… they’re pretty rare.  Then again, if all the small unfavorable improbabilities are the price for her… I’ll put up with them all.

 

The Next Indentured Generation?

The other day I received a blog comment that chilled me all the way through.  No, it wasn’t a threat.  The commenter just questioned why state and federal government should be supporting higher education at all.

On the surface, very much on the surface, it’s a perfectly logical question. At a time of financial difficulty, when almost all states have severe budget constraints, if not enormous deficits, and when the federal deficit is huge, why should the federal government and states be supporting higher education?

The question, I fear, arises out of the current preoccupation with the here and now, and plays into Santayana’s statement about those who fail to learn the lessons of history being doomed to repeat them. So… for those who have mislaid or forgotten a small piece of history, I’d like to point out that, until roughly 1800, there were literally only a few handfuls of colleges and universities in the United States – fewer than 30 for a population of five million people. Most colleges produced far, far fewer graduates annually than the smallest colleges in the USA do today.  Harvard, for example, averaged fewer than 40 graduates a year.  William & Mary, the second oldest college in the United States, averaged 20 graduates a year prior to 1800.  Although aggregated statistics are unavailable, estimates based on existing figures suggest that less than one half of one percent of the adult population, all of them male, possessed a college education in 1800, and the vast majority of those graduates came from privileged backgrounds.  Essentially, higher education was reserved for the elites. Although more than a hundred more colleges appeared in the years following 1800, many of those created in the South did not survive the Civil War.

In 1862, Congress created the first land-grant universities, and eventually more than 70 were founded, based on federal land grants, primarily to teach agricultural and other “productive” disciplines, but not to exclude the classics. By 1900, U.S. colleges and universities were producing 25,000 graduates annually, out of a population of 76 million people, and still only about one percent of the population, largely privileged, held college degrees, a great percentage of them from land-grant universities supported by federal land grants and state funding.  These universities offered college educations with tuition and fees far lower than those charged by most private institutions, and thus afforded the education necessary for those not of the most privileged status.  Even so, by 1940, only five percent of the U.S. population had a college degree.  This changed markedly after World War II, with the passage of the GI Bill, which granted veterans benefits for higher education. Under the conditions which existed after WWII until roughly the early 1970s, talented students could obtain a college degree without incurring excessive debt, and sometimes no debt at all.

As we all know, for various reasons, that has changed dramatically, particularly since state support of state colleges and universities has declined from something close to 60% of costs forty years ago to less than 25% today, and less than 15% in some states.  To cover costs, the tuition and fees at state universities have skyrocketed.  The result? More students are working part-time and even full-time jobs, as well as taking out student loans.  Because many cannot work and study full-time, students take longer to graduate, and that increases the total cost of their education. In 2010, 67% of all graduating college seniors carried student loan debt, with an average of more than $25,000 per student.  The average student debt incurred by a doctor just for medical school is almost $160,000, according to the American Medical Association.
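The arithmetic behind that “skyrocketing” is worth spelling out. As a rough sketch (again in Python, with a purely illustrative, constant cost of $20,000 per student per year that is my assumption, not a sourced figure), a drop in the state’s share from 60% to 25% pushes tuition and fees from $8,000 to $15,000 even before any growth in the underlying cost:

    # Illustrative only: how a falling state share raises tuition
    # even if the underlying cost per student never changes.
    COST_PER_STUDENT = 20_000  # hypothetical annual cost, not a sourced figure

    def tuition_needed(state_share_percent: int, cost: int = COST_PER_STUDENT) -> int:
        # Tuition and fees must cover whatever share the state does not.
        return cost * (100 - state_share_percent) // 100

    print(tuition_needed(60))  # -> 8000   (state pays 60%)
    print(tuition_needed(25))  # -> 15000  (state pays 25%)
    print(tuition_needed(15))  # -> 17000  (state pays 15%)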

Yet every study available indicates that college graduates make far more over their lifetimes than those without college degrees, and those with graduate degrees generally fare even better.  So… students incur massive debts.  In effect, they’ll become higher-paid, part-time indentured servants of the financial sector for at least 20 years of their lives.
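To put a rough number on that twenty-year indenture: as a back-of-the-envelope sketch, assuming the average $25,000 balance cited above and a hypothetical 6.8% fixed interest rate (my assumption, not a figure from the essay), the standard amortization formula works out to roughly $190 a month for 240 months, or about $45,800 repaid in all.

    # Illustrative only: monthly payment on the average $25,000 balance,
    # assuming a hypothetical 6.8% fixed rate repaid over 20 years.
    def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
        # Standard amortization: P * r / (1 - (1 + r)**-n), where r is the
        # monthly rate and n the number of monthly payments.
        r = annual_rate / 12
        n = years * 12
        return principal * r / (1 - (1 + r) ** -n)

    pay = monthly_payment(25_000, 0.068, 20)
    print(round(pay, 2))     # roughly 190.8 per month
    print(round(pay * 240))  # roughly 45,800 repaid over twenty years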

The amounts incurred are far from inconsequential.  Student debt now exceeds national credit card debt [and some of that credit card debt represents student debt as well]. The majority of this debt reflects what has happened as states cut their support of higher education, and those figures don’t reflect default rates on student loans, which are approaching ten percent.

As a result, college graduates and graduates from professional degree programs are falling into two categories – the privileged, who have no debt and can choose a career path without primarily considering the financial implications, and those who must consider how to repay massive debt loads.  And as state support for higher education continues to dwindle, the U.S. risks a higher-tech version of social stratification based on who owes student loans and who doesn’t.

So… should the federal and state governments continue to cut support of higher education? Are such cuts a necessity for the future of the United States?  Really?  Tell that to the students who face the Hobson’s choice of low-paying jobs for life or student loan payments for life.  Or should fewer students attend college?  But… if that’s the case, won’t that just restrict education to those who can afford it, one way or another?