Archive for April, 2019

The Muzzling of Science

The latest edition of Scientific American printed a table from the Silencing Science Tracker, created and maintained by Columbia Law School, which listed almost 200 actions by the federal government to cut off scientific research and muzzle the reporting of scientific findings.

The federal agencies most involved in muzzling science were the U.S. EPA, the Department of the Interior, the Department of Energy, HHS, and the Department of Agriculture. The forms those actions took included distorting or destroying data, barring government scientists from attending conferences, shutting down science advisory boards, and removing or replacing the scientists who serve on them.

The vast majority of these actions appear to have been taken to suppress scientific information or expertise contrary to the political agenda of the White House, such as data on air pollution health effects, occupational health exposures, climate change, and pesticide exposure. Apparently, when the White House cannot find studies or data to support its political agenda, it simply restricts the dissemination of data, muzzles government scientists, or replaces scientists with industry lobbyists… and then ignores any scientist, or anyone else, who suggests that the administration’s actions were motivated by business or political ends.

Those who agree with the administration’s actions seem to believe that scientific findings can be changed by wishing otherwise, or ignored without consequence. Worse, almost no one outside the science community seems to care about the need for impartial scientific findings, or that science is being muzzled or distorted… and that public health, education, and the environment are being sacrificed to political expediency on the largest scale in U.S. history.

Knowing… and Knowing

There are many ways to classify knowledge, but, in the end, what each of us knows is based on one of two methods… and sometimes a combination of both.

The first way of knowing is through observation, including experimentation and the evidence-based, peer-reviewed knowledge gathered by others.

The second is belief based on faith.

Now, admittedly, if I accept the conclusions or findings of a historian or a scientist, there is an element of faith in the individual and the field, and sometimes a great deal of faith is required, especially if it’s a field with which I have little experience. In science, however, virtually all theories and findings are scrutinized by a great number of other scientists, as are the facts employed, the various kinds of supporting evidence, and the methodologies.

So… when someone says, “I don’t believe in global warming,” or “in human-caused global warming,” or “I think vaccines are more dangerous than the disease,” that statement isn’t based on knowledge, but on belief… and on the conviction that they are right against the weight of evidence and expertise. Now, as science has advanced, some older theories have been modified, and a few even disproved, but in the last century very few established theories have been disproved outright, although a number have been modified or extended as technology has made it possible to test them more rigorously.

I’ll also make the point that, at least so far, in areas where science and religion have conflicted in the material world [I’m not discussing the metaphysical world], religious faith has an abysmal record of accuracy.

As an interesting aspect of the fact/belief conflict, a recent article in New Scientist pointed out that, after massive river floods in Europe over the past millennium, people went back to building houses and structures in areas that were bound to flood again – generation after generation. The study concluded that once the people who had suffered the devastation died off, it took only another generation or two before their descendants disregarded the historical accounts. The study authors also suggested that the same lack of first-hand experience might explain the growing number of anti-vaxxers in the U.S. – because these individuals have no personal experience of the ravages of “childhood diseases.”

I’m old enough to remember those diseases. I had contemporaries who suffered through polio and ended up wearing leg braces or confined to wheelchairs. A young woman I knew was born without a lower arm because her mother had measles while pregnant. Someone in my family suffered severe vision damage from measles. My uncle died of long-term complications from strep. And even now, over 1,100 children have died from measles so far this year in Madagascar.

Yet we have people in the U.S. who absolutely know that measles is a comparatively harmless childhood disease.

So… why do some people insist on holding to beliefs that have been proven inaccurate? Is it because those beliefs are so central to their religious faith or their self-image that they cannot accept facts and evidence that contradict them? Or because they cannot abandon those beliefs unless or until they personally experience something drastic? Or because changing or adjusting their beliefs would alienate them from their faith or “tribal affiliation”? Or because they just have to be “right”?

Guarantee?

As I mentioned in an earlier post, the local university is transitioning to a trimester system so that students can theoretically obtain their bachelor’s degree in three years. That promise is somewhat misleading, because there are certain degrees where a three-year timetable is likely to be impossible, given the technical content and other requirements, and the university is soft-pedaling those limitations for the moment.

One of the other aspects of this “degree speed-up” that bothers me, and more than a few others in the higher education community, is that the push to get degrees faster represents the commodification of higher education and an ever-greater emphasis on the degree as a credential. In turn, these forces reflect a growing mindset that having that degree is a guarantee of a better job and higher income.

While statistics show that this correlation held in the past, there’s an old saying that correlation is not causation.

The fact is that higher education represents an opportunity for economic and personal improvement, but even in the past it was not an absolute guarantee of either. Today, and in the years to come, with the growing glut of college degrees among the younger generation, blanket economic opportunity for degree holders is certainly not assured. Some studies indicate that there are now twice as many college graduates each year as there are jobs that require a college degree. After WW II, by comparison, only about ten percent of high school graduates went on to earn a college degree; now seventy percent of all high school graduates go to college, and four in ten Americans hold at least an undergraduate degree.

The one guarantee that does exist is that Americans without either higher education or specialized technical or trade training will be largely frozen out of decent-paying occupations. At the same time, with the growing number of college graduates and the increasing computerization and automation of many formerly white-collar jobs, the number of higher-paying jobs is not growing nearly as fast as the number of graduates seeking them.

And that means that a college education isn’t nearly the guarantee of economic and professional success it once was, and far less of a guarantee than people now believe. It’s more like a high-priced gamble where the odds are only slightly in favor of the degree holder.

If This Keeps Up…

Trump will win re-election. What do I mean by “this”? It’s not any one thing, but a combination of factors.

First, Trump is solidifying his base into an immovable monolith. Admittedly, that “monolith” amounts to “only” 36-46% of the electorate, depending on how it’s measured, but for voting purposes, I’d submit it represents 45-48% of likely voters. A large percentage of this group votes consistently, and that’s more than enough to win with a “fragmented” electorate.

Second, the Democrats are splintering and arguing over which liberal “theme” is most important and which will galvanize their supporters.

Third, anything that’s radical enough to electrify or galvanize a particular group of supporters won’t be of paramount interest to the rest of Democratic voters and will alienate some, while galvanizing Trump supporters in opposition.

Fourth, with something like fourteen different candidates with considerably different priorities, the media will have a field day targeting or trumpeting those differences, which will in turn create an impression of incompetence and disorganization among Democrats and reduce support and interest among independent voters.

Fifth, the proliferation of specific interests will make it difficult, if not impossible, for Democrats to come up with a single, meaningful, and unifying theme and to unite enthusiastically behind a single candidate.

Sixth, unless the Democrats in the House can come up with hard, indisputable evidence that Trump, or someone in the Trump family, did something criminal and major – and soon – any continued pursuit of Trump and his family will eventually be regarded as the “witch hunt” Trump already claims it is and will discredit the Democrats.

Seventh, because of the Democratic takeover of the House of Representatives, Democrats believe Trump is so vulnerable that any Democratic candidate can win. What they don’t want to acknowledge is that they won because most of their candidates were tailored to win in those specific districts. While that is a viable and successful Congressional strategy, it doesn’t translate into a national presidential strategy, especially since too many of those running assume they have, or can obtain, a national mandate – which they don’t have and most likely won’t obtain.

Eighth, none of the Democratic candidates seem to realize any of the above, and while Nancy Pelosi does, her influence over who runs and what they say is limited to advice and counsel, which none of the young hotshots are likely to heed.

So brace yourselves for a second Trump term, unless his health fails suddenly… and that might not even keep him from winning, so long as he’s alive and able to tweet.

Not Listening, Not Being Taught… or Not Caring?

The 1960s, and especially 1968, were a tumultuous time in U.S. culture and history. In the middle of the Vietnam War, there were continual protests, flag burnings, and draft card burnings across the country. Students attacked nearly 200 ROTC buildings on college campuses, and there were violent protests against the war at more than 250 colleges. There were protests everywhere, especially in Washington, D.C. At one protest, at Kent State in 1970, National Guardsmen shot student protesters, killing four and wounding nine. Between 30,000 and 40,000 young men fled to Canada rather than be drafted into the army and fight in Vietnam.

There were more political killings and assassination attempts than at any other time in U.S. history. President Kennedy was killed; Texas Governor John Connally was wounded in the same attack; Senator Robert Kennedy was killed while running for the Presidency. George Lincoln Rockwell, head of the American Nazi Party, was assassinated. Numerous black leaders were killed: Martin Luther King, Medgar Evers, Malcolm X, and James Chaney are the most notable, but the Civil Rights Memorial in Montgomery, Alabama, lists 41 civil rights workers who were killed because of their efforts to obtain civil rights.

After Martin Luther King’s assassination in 1968, riots erupted in more than 100 cities across the nation, including Washington, D.C., Chicago, Baltimore, Kansas City, Detroit, Louisville, New York, Pittsburgh, and Cincinnati. More than 40 people died, over 1,500 were injured, and more than 15,000 were arrested.

The arrest of an African American motorist by a white highway patrolman in August 1965, and the altercation that followed, set off riots in the Watts area of LA that lasted six days, with 34 deaths, over a thousand serious injuries, more than 3,400 arrests, and property damage in excess of $40 million [roughly $300 million in today’s dollars].

When my wife the professor brings this up to her students, most of them look blank. When she pointed out to her female students that little more than a century ago women were essentially property, that women couldn’t vote in most of the U.S. until 1920, or that women couldn’t get credit cards without the approval of a husband or father until the 1970s, they didn’t believe her initially. Then they just shrugged.

The other day, I got an email from a young woman, an educated young woman in her late twenties, who asked me why I’d said the 1960s were a more turbulent time than the present. So I started asking other educators I know about this, wondering if what we’d seen was just an outlying oddity. It might still be, but the half-dozen other educators I talked to had similar stories.

From what I’ve seen, it’s almost as if the younger generation doesn’t know recent U.S. history, and, to me at least, this seems to be a recent phenomenon. I was taught about World War I, the Great Depression, and other historic events that occurred in the generations before I was born. What bothers me about this is that there seems to be an assumption on the part of the younger generation that progress is a given. A study of history shows it’s not, but those who don’t know history won’t see what can happen and has happened. Rome did fall. So did the Eastern Roman Empire and the Ottoman Empire, not to mention the British Empire (although it didn’t fall so much as it was relinquished because of economic and political pressures), and a lot of others. Germany went from being a comparatively open and free nation to Nazism. For centuries, Europe was racked by wars and uncounted deaths because religion dominated politics.

In some ways, there’s nothing new under the sun, that is, if you know what came before. If not, you’ll get what you deserve. The problem is that so will those of us who saw what could happen and were ignored because the majority believed progress would continue without work and without an understanding of the past.

So You Want to Be A College Professor?

Once upon a time, being a college professor was thought to be an existence of intellectual pursuits and the imparting of knowledge to students who truly wanted to learn. Like all fairy tales, or nostalgia for the past, it has never been that ideal or idyllic, but the impact of the current world on collegiate teaching has been significant… and often brutal.

A generation ago, and certainly two generations back, if you were financially and intellectually able to obtain a doctorate, your odds of obtaining a full-time, tenure-track position were far, far better than now, given that in the immediate post-World War II period close to eighty percent of teaching faculty were in full-time positions. Today, 73% of all college instructors and professors are part-time adjuncts without benefits, a high percentage of whom have doctorates and are unable to find a full-time position with benefits. Part of the reason for this is that more and more students have gone on to earn terminal degrees – far more than there are full-time academic positions. At the same time, the surging demand for college degrees has coincided with a growing reluctance of state governments to support higher education. Two very predictable results have been the massive hiring of cheaper adjunct instructors and the burgeoning amounts of student debt.

Then there’s the problem of how students have changed. Undergraduate degrees are now regarded as “credentials,” particularly by politicians, parents, and even students. The combination of skyrocketing tuition, the consumerism fostered by student evaluations, and the demand for credentials has taken a huge toll on academic rigor. For their money, most students expect to receive a grade of A, and they’re disappointed, if not angry, when they don’t get it, and they’ll take that anger out in their evaluations of any professor who denies them the grade they think they deserve. All too many of them are also ultra-sensitive, and any professor who uses sarcasm, particularly in written form, risks disciplinary action at many universities. And in this age of educational consumerism, colleges and universities are factoring student evaluations into decisions on faculty raises, tenure, and promotion. The predictable result is less academic rigor and a gradual dumbing down of course content.

Recent studies have also shown that students now entering college have a social and emotional maturity some 3-5 years behind that of students a generation ago, which is why teaching courses taken in the first two years of college is often more like teaching high school used to be – especially at state universities and colleges. In addition, because of the proliferation of electronic devices, especially iPhones, most college students today have difficulty concentrating and maintaining focus on anything – except electronics – for more than a few minutes. This combination, along with increased student fragility and sensitivity, is another reason why university after university has had to hire more and more counselors and psychologists. Too many of these students literally do not know how to learn on their own or how to handle the smallest adversity, and they’re overwhelmed.

To cope with all of this, administrators and politicians keep looking for the Holy Grail of education, trying new methods and new means of teaching – reinventing the wheel, so to speak – even before they can determine whether the last wheel they tried really worked.

One local university here announced just last week that it is going to a new “trimester” system, starting next January, so that students can graduate in three years. This will shorten each semester from 15 academic weeks to 12, which will likely result in more dumbing down of course content, because teaching is not like automation that can simply be run at a higher speed. Cutting out roughly twenty percent of teaching time will mean less will be taught, and less will be learned. The university faculty is aghast at the timetable, because none of this was discussed with the faculty. Higher-level courses aren’t developed in cookie-cutter fashion. It takes time to develop an effective way to present material, and there isn’t time to carefully redo every course in a few months, especially while teaching a full load at the same time. The impact will be even worse for adjunct faculty, because they don’t get paid for course development, and most are barely making ends meet anyway.

The result will likely be a disaster, and will take several years to straighten out, if it even can be, but the university president is clearly responding to parental and political pressure to make education quicker and more affordable so that students can get that “credential” sooner and cheaper. No one is talking about whether they’ll learn as much.

Now… do you really want to be a university professor?

Priority By Budget

In early March, President Trump released his budget proposal for the 2020 fiscal year, a proposal that would set federal research spending at $151 billion, roughly 3% of total federal spending – a cut of 11%, or almost $17 billion. Now, that’s only his proposal, and the final say on federal spending lies with Congress, but proposals do indicate the President’s priorities. Under Trump’s priorities, the National Institutes of Health, the National Science Foundation, and the Department of Energy’s Office of Science would all face cuts of more than 12%, while science funding at the Environmental Protection Agency would drop by 40%.

After World War II, the U.S. accounted for almost 70% of research and development funding worldwide. Today, that figure is 28%, and while that shift can be partly explained by the rest of the world’s growing ability to fund research, the fact is that the U.S. is being badly outspent, particularly in the area of basic research.

At present, total U.S. spending on basic research comprises less than 17% of all U.S. R&D spending. About three-quarters of U.S. basic research is funded by the federal government (44%), state governments (3%), institutions of higher education (13%), and other non-profits (13%).

To make matters worse, the majority of R&D spending by U.S. businesses goes toward product development, with only about six percent of business R&D funds going to basic research. Over the last four decades, the contribution of U.S. corporations to new basic research has dropped from 30% of published research to less than 10%. This isn’t surprising, because basic research is unpredictable and often expensive, but without basic research, in time, product development will slow dramatically, if not come to a virtual halt. That’s why federal support of basic research is absolutely necessary if U.S. industry is to continue to compete in a global market.

Then add to that the fact that climate change and its environmental effects are a real and persistent future problem… and Trump wants to cut environmental research by 40%?

All that suggests that the President’s priorities are anything but for the future.