Wednesday, February 15, 2023

What is Truth?

We can disagree about truth in economics, the facts of legal disputes, experimental science, or even in the arts, but the premise, stated or implied by critical theory, is that there is no truth. Everything is reduced to politics or propaganda.  

It is easy to be cynical when so much is determined not by what is true but by who decides policy in government, acceptable discourse in academe, or even in science, where money influences research to such an extent that Marcia Angell, MD, physician and long-time editor of The New England Journal of Medicine, has said on the record:

“It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine.” 

History is a record of struggle among power mongers.  Justice is the idea that the struggle should be moderated by truth, moral or, at least, factual.  Philosophical inquiry and liberal agitation in the eighteenth century, culminating in the American and French Revolutions, marked the era termed the Enlightenment.  In this ideology, truth could be ascertained rationally, and it should supersede authority in the determination of human rights and policy. A quote attributed to the Parisian Denis Diderot is of the essence in its challenge to authority vested in kings and priests:

“Man will never be free until the last king is strangled with the entrails of the last priest.”

It’s interesting to note that liberal, in this era, meant constraining governmental control and authority vested in the church, whereas it now means increasing government control over the economy and anything else that is controversial.  At any rate, Enlightenment theory revered reason and individualism against authority.  The late Modern era that followed began around 1800, with the American and European revolutions, and involved a transition from a world dominated by imperial and colonial powers to one of nation states.  After World War I, World War II, and the Cold War, increasing skepticism about truth and justice led to the Postmodern, post-millennial era, “the end of history.” Truth and justice have been relegated to a dubious realm considered beyond existential experience.

Friedrich Nietzsche, ahead of his time, put it bluntly:

"God is dead."

"There are no facts, only interpretations."

"All things are subject to interpretation; whichever interpretation prevails at a given time is a function of power and not truth."

"In the end, one only experiences oneself."


And Martin Heidegger:

"Making itself intelligible is suicide for philosophy."


Nietzsche died in 1900.  In May of 1933 Heidegger joined the Nazi Party.  Adolf Hitler’s Mein Kampf is implicitly or explicitly Postmodern and Darwinian: “He who would live must fight. He who doesn’t wish to fight in this world, where permanent struggle is the law of life, has not the right to exist.”

Existentialists like Nietzsche and Heidegger are deadly serious thinkers from whom we get the pop-culture idiom that we “create our own reality”.  Or as an old friend, a poet and librarian, used to say, “All we know is that we have a life on our hands”.


Skepticism about truth has not prevented various people and parties from speaking about justice.  Ordinary people revolt against unfair business practices, hypocrisy by political figures, or oppression of the powerless.  Our literary canon is full of moral object lessons that dramatize the conflict of good and evil.  That so much literature ends in tragedy only intensifies human indignation about injustice.  In this state of mind, we are vulnerable to propaganda and persuasion.  Manipulators use their skeptical premise to attack any kind of success, on the presumption that it must be based on privilege or exploitation. 

If there is no truth and we create our own reality, people who are doing well must have gained their advantage by manipulation of power.  If their power is not visible, they must have gained it by conspiratorial means.  Jews have done well in the world despite persecutions; ergo, they must be networking some financial or familial influence to get rich.  Jews have also earned more Nobel Prizes than any other ethnic group, yet that they are good at almost anything they undertake has to be explained away as some kind of cheating.  The state of Israel is democratic, productive, and occupies a sliver of land surrounded by authoritarian Arab states.  Yet it is the object of attacks both literal and rhetorical.  Asians are doing well in the United States to a degree that their success is disproportionate to their numbers in the population.  The Supreme Court is deliberating whether the affirmative action policies that penalize Asian applicants for admission to Harvard University and the University of North Carolina are, in fact, illegal discrimination.  Black students are not doing well in mathematics in Seattle schools.  Administrators have concluded that it must be because there is structural racism in math instruction that favors white people.  It is politically incorrect to say that hip-hop culture does not encourage black children to excel in school. Life isn’t fair, so we must compensate to bring about, if not justice, equity.

The success of entire civilizations can be explained away on these nihilistic premises.  Jared Diamond’s Pulitzer Prize-winning book Guns, Germs, and Steel: The Fates of Human Societies is based on materialistic corollaries of Darwinism and skepticism about the actual merit of Western Civilization.  Deconstruction of the documents and polity of the United States has become an industry in academe. The success and productivity of American capitalism are undeniable, yet the 1619 Project, by Nikole Hannah-Jones, published by the New York Times as a long-form journalism endeavor, reframes American history as a consequence of slavery.

The State of Israel is productive out of proportion to much of the rest of the world.  Numerous technical discoveries have come from Israel’s research labs. Its government is democratic in a region dominated by authoritarian states.  How has it been able to thrive and repel warfare and terrorism against it since the Balfour Declaration?  If no culture is, in truth, superior to any other, this success must be based on exploitation and oppression of indigenous people.  That Israelis are themselves indigenous is ignored in the continual rhetoric attacking the state of Israel.


What about the old religious ideals and morality that have gone out of fashion?  The doctrine that man is created in the image of God is the kind of metanarrative that is rejected by Modern and Postmodern thought.  A local Presbyterian minister describes his wife’s experience in graduate school at the University of Washington.  She began a thesis about the cultural legacy of the Biblical doctrine that man is created in the image of God, the Imago Dei.  Her research was deemed inadmissible, and an advisor refused to discuss it or let the ideas circulate in the department.  Even if it is assumed that the Bible is mythology, we can sustain investigation of the usefulness of certain ideals in historic documents: “All men are created equal and endowed by their Creator with certain unalienable rights…”  These kinds of principles were considered by medieval philosophy to be metaphysically real, in the Platonic or Aristotelian sense, and ascertainable through deductive reasoning from theological premises, often dogmatically posited. Philosophy following William of Ockham (c. 1287–1347) reframed metaphysical categories as concepts in the human mind, and thus as names only: nominal.  On that view, we can theorize that distinct biological species evolved without any metaphysical formal order.


Since the beginning of the COVID pandemic, the CDC and FDA have endorsed vaccine mandates and, as is increasingly evident, suppressed dissent and alternative treatments.  The vaccines had limited testing, and now, after more than two years, reported COVID deaths are higher among the vaccinated.  Even Dr. Fauci has conceded that the vaccines do not prevent transmission of the disease.  The mandates and the censure of dissident medical opinions bring to mind the comments of the former editor of The New England Journal of Medicine about conflicts of interest in medical practice.  The pharmaceutical industry has made billions on the vaccines. If hyperbole about the vaccines is driven by financial interests and enforced by political influence, we don’t need religious moral imperatives to judge industry executives, compliant politicians, and medical professionals.  People’s lives are at risk.  Where there is uncertainty about health hazards, the moral options dictate caution and openness to debate.  We can disagree about ethics, but people who, for profit, ignore hazards to millions of people have ceased being ethical.


Another perspective on this controversy comes from an atheist friend, a graduate of Stanford University, who recalls the locker-room comments of medical researchers with whom he used to play recreational basketball.  They were unapologetic about their dependence on money from the pharmaceutical industry: “Give them what they want, and they’ll keep funding the research.”  Academic careers can fail on scruple or succeed by rationalizing away ethical constraints.  It seems more than a coincidence that, in the humanities, academe has become a fifth column engaged in deconstruction of the historic literature and political ideals by which Western culture has thrived.


History provides many examples of scientific consensus that has been not only wrong but intransigently committed to flawed theories and conclusions. Thomas Kuhn’s book The Structure of Scientific Revolutions explains that new discoveries create a crisis during which change is resisted by those invested in the prevailing ideas.  For fifty years, activists have claimed a scientific consensus about population and climate change, and for fifty years that consensus has produced spectacularly wrong doomsday predictions.

Here’s a brief history, from “Fantastically Wrong Climate Change Predictions,” published April 24, 2017.

Wisconsin Senator Gaylord Nelson, the father of Earth Day, said before the first Earth Day in 1970 that “the secretary of the Smithsonian Institute believes that in 25 years, somewhere between 75 and 80 percent of all the species of living animals will be extinct.”

Life Magazine reported that same year that “Scientists have solid experimental and theoretical evidence to support… the following predictions: In a decade, urban dwellers will have to wear gas masks to survive air pollution… by 1985 air pollution will have reduced the amount of sunlight reaching earth by one half.”

Stanford biologist Paul Ehrlich, the author of The Population Bomb, wrote in 1971 that “by the year 2000 the United Kingdom will be simply a small group of impoverished islands, inhabited by some 70 million hungry people… If I were a gambler, I would take even money that England will not exist in the year 2000.”

In 1975 Newsweek ran a now-infamous article entitled “The Cooling World,” which cited several climate scientists in concluding that “the central fact is that… the earth’s climate seems to be cooling down… If the climate change is as profound as some of the pessimists fear, the resulting famines could be catastrophic.”

Global famine was a popular prediction in the ’70s.  North Texas State professor Pete Gunter summed up the prevailing sentiment when he wrote in “The Living Wilderness” that “by the year 2000… the entire world, with the exception of Western Europe, North America, and Australia, will be in famine.”

In 1986, NASA scientist James Hansen testified before Congress that “global temperatures should be nearly 2 degrees higher in 20 years, ‘which is about the warmest the earth has been in the last 100,000 years.’”

Two years later, Dr. Hansen told an interviewer that in 20 years, the area below his New York City office would be completely changed, most notably that “the West Side Highway [which runs along the Hudson River] will be under water.”

Carl Sagan predicted in 1990 that “the planet could face an ‘ecological and agricultural catastrophe’ by the next decade if global warming trends continue.”

That same year, Dr. Michael Oppenheimer with The Environmental Defense Fund wrote:

By 1995, the greenhouse effect would be desolating the heartlands of North America and Eurasia with horrific drought, causing crop failures and food riots… [By 1996] The Platte River of Nebraska would be dry, while a continent-wide black blizzard of prairie topsoil will stop traffic on interstates, strip paint from houses and shut down computers… The Mexican police will round up illegal American migrants surging into Mexico seeking work as field hands.

Within the last decade, both Dr. Hansen and Peter Wadhams, head of the Polar Ocean Physics Group at the University of Cambridge, predicted “that the Arctic is likely to become ice-free… as early as 2015.”

That’s actually two years later than Al Gore predicted in 2007, 2008, and 2009, when he cited what he called a scientific consensus to claim that the North Pole would be “ice free by 2013.” 

Pentagon scientists warned in their 2003 report “An Abrupt Climate Change Scenario and Its Implications for United States National Security” that within 10 years “it is not implausible” that parts of California would be flooded, parts of the Netherlands would be uninhabitable, and an unprecedented rise in hurricanes, tsunamis, and tornadoes would spark wars across the globe as people fought for increasingly scarce resources.

Scientists with the United Nations Environment Program warned in 2005 that man-made global warming would so decimate coastal areas as well as the Caribbean and Pacific islands that there would be “upwards of 50 million climate refugees by 2010.”

Of course, none of these scientific predictions since Earth Day have been right, but more predictions keep coming with more admonitions that the science is settled. 

Recent legislation by the U.S. Congress has been described as the most significant climate legislation in U.S. history.  It takes immense credulity to believe that the billions of dollars in research funding contingent on the “Inflation Reduction Act” have no bearing on its endorsement by the reigning scientific authorities.


Fear-mongering is useful not only to financially dependent researchers but to media empires seeking viewers for their broadcasts.  It’s not surprising that the hype never ends.  As the targets of these pressure-sales tactics, are we to give up on knowing the facts?  Is truth subjective, or, worse, for sale to the highest bidder?  Anybody who has read a murder mystery, or even a little of the Western literary canon, knows that resignation to deconstructionist theories is unacceptable.  Literature is not part of a systemic oppression of the powerless by dominant factions of our culture.  Human outrage about lies dispels any doubt that there is truth, regardless of what anybody thinks about it.  Elite factions of our culture are pushing deconstruction of historic literature and culture; this seems a projection of their own motives. In the absence of truth and justice, elite factions have control.  The “scientific consensus” about vaccines or climate change routes billions of dollars to health care, environmental research, and academic institutions.

Conspiracy theorists of various political stripes have built empires on the view that there is an oligarchy, financial, racial, or Communist, that writes the scripts shaping acceptable ideas for right-thinking people.  So much of what is widely believed is absurd that it’s difficult to believe our thinking is designed by a powerful elite.  It’s enough to recognize that people are powerfully disposed to think like others with whom they identify.  It doesn’t take an edict from Klaus Schwab and the World Economic Forum to motivate people to act in irrational conformity. Transgender ideology is in fashion. Like the fads that used to dictate hair styles, young girls, with their parents’ blessing, are having mastectomies and taking testosterone.  Boys are taking drugs that are used to chemically castrate sex offenders.

René Girard has written volumes about mimetic desire and the envy it breeds.  The behavior of social creatures is based on imitation to a much greater degree than is generally supposed. Children learn language and behavior by imitation. Stocks rise and fall because investors are trend followers.  During a lecture at Stanford, Girard said, “We don’t know what our desire is. We ask other people to tell us our desires.  We would like our desires to come from our deepest selves, our personal depths, but desire is always for something we feel we lack.”  Envy and resentment are the inevitable consequences of this drive toward mimesis. And envy fuels conflict whenever two or more “mimetic rivals” want something that can go to only one. It might be a woman, a presidency, or a research grant. Many religious prohibitions are meant to regulate and control such conflict.


Ideas have consequences, and the consequences are evidently extreme for a population that has accepted an ideology in which there is no truth, and human behavior and cultures are impervious to evaluation by merit.  Here’s the Merriam-Webster definition of the term gaslighting: “Psychological manipulation of a person usually over an extended period of time that causes the victim to question the validity of their own thoughts, perception of reality, or memories.  Typically, gaslighting leads to confusion, loss of confidence and self-esteem, uncertainty of one’s emotional or mental stability, and a dependency on the perpetrator.”  The dictionary publisher has deemed gaslighting its 2022 “word of the year.”

Interestingly enough, the Merriam-Webster publication about gaslighting includes the following example of usage of the term: 

“My committee’s investigation leaves no doubt that, in the words of one company official, Big Oil is ‘gaslighting’ the public. These companies claim they are part of the solution to climate change, but internal documents reveal that they are continuing with business as usual.” –Representative Carolyn B. Maloney, Chairwoman of the Committee on Oversight and Reform, House of Representatives, September 2022. 

The honorable Representative of New York’s 12th district has unintentionally provided an example of how the public has been gaslit for so long about petroleum products and climate change that she assumes oil companies doing business “as usual” verge on criminality.  The world economy is so dependent on petroleum products that rhetoric about a transition to renewable resources is mostly hype.  Even if you drive an electric car, or if you don’t drive at all, you are consuming food that requires nitrogen fertilizers synthesized from fossil fuels, pesticides, agricultural machinery that runs on gasoline or diesel, and fuel for transportation from farms to market.  Products made from or containing petroleum include wax, ink, drug capsules, denture adhesives, upholstery, plastics, paints, guitar strings, carpets, hair coloring, deodorant, lipstick, heart valves, anesthetics, cortisone, aspirin, detergents, dyes, pharmaceuticals, and cosmetics.  Curtailing drilling for oil in the United States and Europe has led to increasing imports from Russia in the midst of economic sanctions that purportedly constrain Putin’s aggression in Ukraine.


Fortunately, for the near term anyway, the world economy and people’s jobs function on a more rational basis than the Utopian illusions of politicians and media talking heads.  Truck drivers, plumbers, electricians, and accountants ignore most of the propaganda and go to work.  Unfortunately, counselors, educational advisors, and surgeons working in the sexual-reassignment industry also go to work, and many of them are involved in suppression of the evidence that gender dysphoria isn’t relieved even by surgery.  Gender is not socially constructed or surgically reassignable.  A recent Harvard CAPS/Harris Poll reported that most people still align with the common-sense idea that people are born with their gender. BBC News reports that the UK is closing its only dedicated gender identity clinic for children and young people, after a report warned that the current model of care was leaving young people “at considerable risk” of poor mental health and distress, and that having one clinic was “not a safe or viable long-term option.”


Here’s an interesting discussion titled “An Existential Threat to Doing Good Science,” by Luana Maroja, November 7, 2022:

“One of the most fundamental rules of biology from plants to humans is that the sexes are defined by the size of their gametes–that is, their reproductive cells. Large gametes occur in females, small gametes in males. In humans, an egg is 10 million times bigger than a sperm. There is zero overlap. It is a full binary.  But in some biology 101 classes, teachers are telling students that sexes (not gender, sex) are on a continuum. At least one college I know teaches with the “gender unicorn” and informs students that it is bigoted to think that humans come in two distinct and discrete sexes.”

And this: “…it has become taboo in the classroom to note any disparities between groups that are not explained as the result of systemic bias.”

And: “…discussing with students how the great variation in human culture affects our behavior and outcomes is now untouchable.”


Common sense is becoming increasingly uncommon.  Suppose we listen to the overture to Mozart’s opera The Marriage of Figaro and then compare it to that of any opera by the contemporary composer Philip Glass.  Classical music developed over hundreds of years in a tradition that respected formal order in nature.  Tonal music is based on a scale built on the harmonics of sound.  The octave and the fifth, the tonic and dominant relations, are primary.  The music has direction, development, and resolution around tonal centers.  After World War I, disillusioned composers jettisoned tonal music for music of their own invention.  They couldn’t sell their works to an audience beyond the academic subculture in which they were created.  A German friend can remember when such music was broadcast from radio studios subsidized by the government.  Nobody listened.  Atonal works seldom survive their premiere performances.  Creating a new musical “reality” in every generation may satisfy composers with Nietzsche’s will to power, but it sounds like an egotistical nightmare.
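
To make the harmonics point concrete, here is a minimal Python sketch, not part of the original essay; the 110 Hz fundamental is an arbitrary choice for illustration.  It lists the first few overtones of a vibrating string and shows the whole-number ratios behind the octave (2:1) and the perfect fifth (3:2), the tonic and dominant relations mentioned above.

# Illustrative sketch: the overtone series and the simple ratios
# behind the octave and the perfect fifth.
FUNDAMENTAL_HZ = 110.0  # arbitrary fundamental chosen for the example

def harmonic(n, fundamental=FUNDAMENTAL_HZ):
    """Frequency of the nth harmonic; n = 1 is the fundamental itself."""
    return n * fundamental

for n in range(1, 7):
    print(f"harmonic {n}: {harmonic(n):6.1f} Hz")

# The 2nd harmonic sounds an octave above the fundamental (ratio 2:1).
print("octave ratio:", harmonic(2) / harmonic(1))         # 2.0

# The 3rd harmonic, heard against the 2nd, gives the perfect fifth (ratio 3:2),
# the dominant above the tonic.
print("perfect fifth ratio:", harmonic(3) / harmonic(2))  # 1.5

Equal temperament nudges these ratios slightly so that all keys are usable, but the tonal hierarchy described above rests on these simple proportions.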