
Tradition, philosophy, and religion long ago succumbed to science as the sole arbiter of truth, but now science too has been dethroned. What rules today is a different form of ideology—ideas and images replace concrete experience. Can we escape the Screen with its hollow images and counterfeit emotions and experience the human way of life?…

At the core of science is an unquestionable, sacred idea succinctly articulated by biologist H. Allen Orr: “The universe, including our own existence, can be explained by the interactions of little bits of matter,”[1] a concrete representation of the philosophy of materialism, which holds that every object as well as every act in the universe is matter, an aspect of matter, or produced by matter.

Surprisingly, physicist Erwin Schrödinger, in 1958, showed that the sacred idea of science is false, since the activity of matter cannot even explain how we see: “If you ask a physicist what is his idea of yellow light, he will tell you that it is transversal electromagnetic waves of wavelength in the neighborhood of 590 millimicrons. If you ask him: But where does yellow come in? He will say: In my picture not at all.”[2]

Although Schrödinger employs technical language, his insight into the nature of perception is based on straightforward observations. Let me fill in the details of his argument. Suppose sunlight is reflected from yellow sunflowers into the eye of a landscape painter. The sunlight passes through the lens of the eye and strikes the retina, a sheet of closely packed receptors—4.5 million cones and 90 million rods. Activated by the incoming sunlight, the rods and cones undergo chemical changes, which are then translated into electrical impulses that travel along the optic nerve to the brain. Further electrical and chemical changes take place in the brain. In terms of the physiology of seeing, this description is complete; however, the sensation yellow has not appeared in this scientific account of perception. The landscape painter experiences the yellow of the sunflowers, not the myriad chemical and electrical changes that are necessary for seeing.

What is true for seeing is true of the other senses: physical-chemical changes in the brain are insufficient to explain any sensory perception. The sensible qualities a person perceives never appear in the brain as such. The brain itself is shrouded in complete silence, even while a person hears the deafening roar of a jet aircraft engine. Likewise, the brain, encased in the skull, is covered with darkness even while a person perceives the brilliance of the sun’s glare. Our brains do not become colder when we touch snow or harder when we touch iron. Not a single sugar molecule passes from the chocolate candy in the mouth to the gustatory region of the cerebral cortex—and yet we perceive sweetness notwithstanding. The brain tissue itself takes on none of the sourness of a tasted lemon or the acrid odor of the skunk’s spray that we smell.

Charles Sherrington, the founder of modern neuroscience, points out that physics and chemistry “bring us to the threshold of the act of perceiving, and there to bid us ‘goodbye.’”[3] In the scientific picture of the world, colors, odors, sounds, flavors, and textures are absent. Schrödinger thus concludes that scientific “theories are easily thought to account for sensual qualities; which, of course, they never do.”[4]

The toolbox of physical science is limited to air pressure, chemical changes, electrical impulses in nerves, brain cell activity, and other measurable properties of matter. Mechanical, chemical, and electrical changes themselves are not hearing, seeing, smelling, tasting, and touching. Science is mute about perceiving, as well as feeling, thinking, and willing. The reason is obvious: The interior life of a person is nonmaterial—perceptions, emotions, and thoughts cannot be touched, smelled, tasted, heard, or seen.[5]

The Blind Hope

Schrödinger’s argument is universally acknowledged by neuroscientists, yet the hope lingers that physics, chemistry, biology, and neural networks will someday explain perception. Neuroscientist Antonio Damasio, for instance, believes the “simple sensory qualities to be found in the blueness of the sky or the tone of sound produced by a cello… will be eventually explained neurobiologically, although at the moment the neurobiological account is incomplete and there is an explanatory gap.”[6]

Neuroscientist Vilayanur Ramachandran, too, acknowledges that explaining perception in terms of brain function alone is an impasse for neuroscience: “No matter how detailed and accurate [the] outside-objective description of color cognition might be, it has a gaping hole at its center because it leaves out” the experience of redness.[7] He laments that the impasse results from a limit of present-day science: “Perhaps, science will eventually stumble on some unexpected method or framework for dealing with qualia—the immediate experiential perception of sensation, such as the redness of red or the pungency of curry—empirically and rationally, but such advances could easily be as remote from our present-day grasp as molecular genetics was to those living in the Middle Ages.”[8]

H. Allen Orr agrees that “how such mere objects [as a brain and its neurons] can give rise to the eerily different phenomenon of subjective experience seems utterly incomprehensible.”[9] Despite this, he evades Schrödinger’s argument and its implication that perceiving, thinking, and willing are nonmaterial realities, not the result of neurons interacting with each other. Dr. Orr offers an explanation of why qualia and the other aspects of the interior life pose an insurmountable obstacle to present-day science: We cannot see how certain material structures produce subjective experience because our evolved, finite brains cannot “fathom the answer to every question that we can ask.” Dr. Orr believes matter gives rise to the interior life but that scientists will never know how, a belief that violates the central tenet of science—a theory must in some way be open to experimental refutation. In this way, Dr. Orr exposes that his and most scientists’ unshakeable belief that “the universe, including our own existence, can be explained by the interactions of little bits of matter” is an ideology[*], not open to refutation by experiment or rational thought.

Ideology and the Interpretation of Quantum Mechanics

The ideology of science is the driving force behind the passionate, and at times acrimonious, disputes over the interpretation of quantum mechanics. In the Copenhagen interpretation of quantum mechanics, the primary emphasis is on undivided wholeness, in which the observing instrument is not separate from what is observed. And, of course, the observing instrument includes the experimenter who set up the equipment. Thus, the experimenter is a participant in nature as well as an observer, an understanding expressed by Niels Bohr: “In the great drama of existence, we ourselves are both actors and spectators.”[10] Physicists have stated this fundamental aspect of the Copenhagen interpretation in various ways. Eugene Wigner: “It was not possible to formulate the laws of quantum mechanics in a fully consistent way without reference to the consciousness.”[11] Max Born: “No description of any natural phenomenon in the atomic domain is possible without referring to the observer, not only to his velocity as in relativity, but to all his activities in performing the observation, setting up instruments, and so on.”[12] Freeman Dyson: “The laws of subatomic physics cannot even be formulated without some reference to the observer. The laws leave a place for mind in the description of every molecule.”[13] Hence, according to the Copenhagen interpretation, mind and matter are two realities, neither reducible to the other.

Physicist Steven Weinberg, a vehement opponent of the Copenhagen interpretation, objects to dragging humans into the fundamental laws of nature and thus abandoning the idea that the “world [is] governed by impersonal physical laws that control human behavior along with everything else.”[14] David Albert, also a physicist, reveals the real issue behind the passionate disputes over the interpretation of quantum mechanics: “The original, unbounded, omnivorous, terrifying aspiration [of science is] to reduce the entire world, and ourselves, and all our doings, to a vast concatenation of simple mechanical pushings and pullings. That’s what can never entirely be taken in. That’s what discomposes every pretense to wisdom.”[15] In an endnote to his “Quantum’s Leaping Lizards,” Dr. Albert indicates that by human wisdom he means the teachings of “Confucius and Buddha, Jesus and Pericles, Erasmus and Lincoln.”[16]

In a recent poll of 33 attendees of a conference on the foundations of quantum mechanics,[17] fewer than two out of five physicists adhered to the Copenhagen interpretation. The most extreme position of practitioners of quantum mechanics follows the advice given to some graduate physics students—“Shut up and calculate!”—that is, be mindless; do not fret over the meaning of quantum mechanics, which is philosophy, mere words.[18] A theoretical physicist friend of mine told me that he just “turns the crank to get answers” and never thinks deeply about what he is doing. From my professional experience as a physicist, I suspect that my friend is representative of most physicists, who scorn philosophy and are true believers that the universe, including our own existence, can be explained by the interactions of little bits of matter.

Dr. Weinberg’s claim that “physical laws… control human behavior” and Dr. Albert’s assertion that “all our doings” result from “a vast concatenation of simple mechanical pushings and pullings” exhibit the incoherence of the ideology of science. Let me state the logical contradiction of the sacred idea of science in the simplest possible terms. Suppose my interior life is determined wholly by the motions of atoms in my brain. One day, the atoms in my brain jostle around and what arises is the ancient Irish belief that leprechauns store their gold in a pot at the end of the rainbow. Such a belief clearly pertains to the atoms in my brain, not to an objective world. But this is true for all my beliefs, and hence I have no reason to suppose that any of them are true, including the belief that the motions of atoms in my brain cause my interior life, or that physical laws control my behavior, or that all my doings result from mechanical pushings and pullings in my brain and body.[19]

Here is the above argument given in technical terms. If scientists are, as their ideology proclaims, nothing but a pack of neurons, and their joys, sorrows, memories, and sense of free will are in fact no more than the behavior of a vast assembly of nerve cells,[20] then intellectual insight is an illusion. Mouthpieces of their genes and American culture, Dr. Weinberg and Dr. Albert said no to the Copenhagen interpretation of quantum mechanics, while Bohr and Heisenberg shouted yes to it because of differing genes and cultures. If the thoughts of scientists are determined by neurophysiology, action potentials, and the endocrinology of neurotransmitters, then physics, biology, and psychology are meaningless. If every decision in science is a thoroughly mechanical process determined by the results of prior mechanical processes, then truth is an illusion.

From this contradiction, we can safely conclude that mind cannot be a mere byproduct of matter and that the philosophy of materialism is false.

That the unquestionable, sacred idea of science is false does not mean that all the findings of science are false. Because materialism is a metaphysical belief—an intellectual creed—added on to science, its rejection would not alter any scientifically established result: Atoms would still exist; the Earth would still be 4.5 billion years old; and the universe would still have begun with a Big Bang.

The Rise and Fall of Science as the Arbiter of Truth

True believers in the ideology of science are not open to a rational examination of their sacred idea, and hence they will never abandon their belief that matter is the only reality. That would be of no consequence if science had not become the sole path to truth.

From the eighteenth century on, no person in the Western World failed to see that scientists commanded nature. Even the most poorly educated peasant understood that scientific knowledge was greatly advancing, while the most sophisticated intellectual knew that philosophy remained a domain of “contentious and barking disputation,”[21] where nothing is ever settled. The smallpox vaccine, the steam train, and the electrical telegraph demonstrated that science was the path to truth, and that was bad news for theologians, philosophers, and poets, for they could not command nature. Their prestige began a steady, irreversible decline. Today, no one speaks of poetic knowledge, philosophical insights into nature and human affairs, or killer arguments for the existence of God.

Ironically, the mechanical technology that promoted science to the King of the Castle, the chief arbiter of truth, has transformed into digital technology that is in the process of slaying the King. The assault on truth began with the invention of the camera.

Images and words are two totally different ways of communicating. A writer can fashion factual statements and established principles into a rational argument that persuades her readers of a certain truth. Syllogisms are not in the toolkit of the photo editor, who orders images by association. For example, in the 2016 Democratic Presidential Primary, The New York Times supported Hillary Clinton and ran a left-profile photo of her, gazing slightly upward, with warm light evenly illuminating her face, a photo worthy of Leni Riefenstahl. The most frequent photo of Bernie Sanders showed him descending the stairs from an airplane, stooped-shouldered, carefully watching his step, and securely grasping the side rails. The choice was stark: a beautiful woman who sees the truth or an old man who is unsure of his footing.

Photo and video editors are masters of using the emotions that photographs can evoke to sell political candidates as well as products. In the 2016 Coca-Cola Emoticons commercial, four wholesome-looking teenagers, two boys and two girls, are strangers in front of a Coke machine. Out of the machine, one of the boys takes a Coke with the emoticon Kiss on the label; the teenagers laugh over different bottles with the emoticons LOL, Sexy, Naughty, and Hello. The teenagers get to know each other, and at the end of the 45-second commercial, one couple walks off together with the voiceover “take the feeling, share the feeling,” a happy feeling with Coke that brings young people together.[22]

The association of Coke with happy teenagers is an emotional judgment that reason finds ridiculous: Drinking Coke does not produce friends or happiness, and the juxtaposition of images is not cause and effect. Nevertheless, such images can change a viewer’s beliefs and actions. The Image-World falls outside the realm of reason; persuasion through images is based not on logic, intellectual analysis, and assessment of data, but on feelings.

Digital technology administered the coup de grâce to truth. The Internet is the first medium in history, besides the agora of ancient Athens and the town meetings in New England villages, to create many-to-many communication; the phone is one-to-one, and books, radio, and television are one-to-many. Clay Shirky, a prominent thinker on the social and economic effects of the Internet, points out that “every time a new consumer joins this media landscape a new producer joins as well, because the same equipment—phones, computers—let you consume and produce. It’s as if, when you bought a book, they threw in the printing press for free.”[23] Digital technology encourages everyone to be a reporter, a broadcaster, and a commentator.

Social media allow hundreds of millions of users, with the click of a mouse, to disseminate their opinions without the scrutiny of grammarians, fact-checkers, and editors, the guardians of print culture. Anyone, anywhere, can now instantly post an opinion on anything. The Internet has become an ocean of opinion, one comment washing over another, quickly submerging whatever truth tries to surface.

In the post-truth society that is upon us, the postings and comments on the Internet are governed not by objective facts and established principles but by emotion and personal opinions. In this new digital world, your facts are not my facts; personal opinions determine facts.

On the Web, the declarations about the meaning of human life by such eminent scientists as Stephen Hawking, Richard Dawkins, and Steven Weinberg carry no more weight than the opinions of an adolescent in Oshkosh, Wisconsin, or the ideas of a conspiracy theorist in rural Vermont, or the feelings of a disgruntled poet in the Big Apple. Such pronouncements as “the human race is just a chemical scum on a moderate-sized planet;”[24] Homo sapiens is “a more-or-less farcical outcome of a chain of accidents;”[25] “human beings are lumbering robots manipulated by genes;”[26] and brains are “meat machines,”[27] once taken seriously, now evoke laughter, for the phrases “chemical scum,” “lumbering robots,” and “meat machines” are seen as high comedy worthy of a modern-day Aristophanes updating Cloud Cuckoo Land. In a post-truth society, people still look to scientists and technologists for a cure for cancer and for new digital devices, but not for deep insights into human living.

With tradition, philosophy, and religion having long ago succumbed to science as the sole path to truth, and with science now dethroned as the arbiter of truth, ruling principles, immovable intellectual centers, and fixed structures of thought appear to be gone forever. What rules is a different form of ideology—ideas and images replace concrete experience.

The Loss of the Real

In the twenty-first century, few of us have much personal experience of anything; most of our “personal” experience of the world is filtered through TV, movies, and the Internet. TV-watchers invariably believe that television is an electronic window that brings the world into their living rooms. An opera, Iraq War II, or the Amazon rain forest seen through television is taken for experience of the real event or place. But the hollow, depleted images of television bear little resemblance to reality, and not only because they are edited, rearranged, and altered. The viewer of a TV-opera never feels the excitement that runs through an opera house just before the performance begins; the viewer of a TV-war is never subjected to the confusion of battle or the incredible physical vibrations that accompany an artillery barrage; the viewer of the TV-Amazon never smells the rotting organic matter of the jungle or hears the silence that pervades the deep recesses of the rain forest. An artificial, manufactured experience is taken for a genuine human experience.

The “loss of the real”[28] is more complete on Facebook. Jaron Lanier, an early developer of virtual reality and once an uncritical technology enthusiast, argues that the most effective users of Facebook are the “ones who write successful online fictions about themselves.”[29] People communicate online differently than they do in person, because voice inflection and body language are absent in computer chat. Consequently, an online profile can be a carefully crafted fiction meant to impress other Facebook users, something that would be laughable face to face. So abstract and unreal are Facebook profiles that “friends” find it easy to love, hate, laugh at, support, or attack “friends.” With the click of a mouse, a digital relationship between two persons can be deleted and the contrived emotion that resulted from constant updates on Facebook vanishes, forever.

Social networks, just like television, create the illusion that isolated individuals are connected to each other. Identical images are fed into everyone, regardless of age, gender, education, or ethnic background. Individual “experience” is reproduced again and again, not unlike the identical parts stamped out by an assembly line. In this way, the Internet and television destroy personal experience, but isolated individuals believe they are bound together by the same “experience.”

As we live more and more in the Web, the interior life becomes more and more superficial, just like the photographic images skimmed from the real world; the end result of the digital revolution may be that the interior life collapses to hollow images and counterfeit emotions.

With the loss of the real, anything is possible; consequently, conspiracy theories abound on the Internet: Barack Obama was not born in the United States; water condensation trails (contrails) from aircraft consist of harmful chemicals; environmentalists caused the fatal oil-rig industrial accident in the Gulf of Mexico in 2010; the Coca-Cola Company intentionally changed to an inferior formula with New Coke to drive up demand for the original product; the 2012 fatal mass shooting at Sandy Hook Elementary School in Newtown, Connecticut, was a staged event to promote gun control; the United States government covered up the fact that alien spacecraft crashed in Roswell, New Mexico, in 1947; the pharmaceutical industry mounted a scheme to conceal the evidence that vaccines cause autism.

What To Do

In the post-truth society, with its absence of agreed-upon principles and immovable cultural centers, a few of us are tempted to search for the right idea to restore stability to intellectual life. If found, the right idea would most likely quickly become another ideology, where words and ideas replace actions and truth.

Digital technology has an extraordinary grip on most of us; we do not wish to give up what has made us King of the Castle, “lords of our own tiny skull-sized kingdoms,”[30] the sole judge of what is true, good, and beautiful.

Some of us grasp that every screen, whether of a smartphone, a laptop, an HD TV, a digital camera, or a multiplex cinema, presents a manufactured experience, an artificial world of depleted images. We desire real human experience, the kind musician Barry Green reports: Soloists, orchestral players, young students, and seasoned session players alike know that “unique suspended moment when you actually become the emotional or sensory quality of the music—the colors, the water, the love.”[31] An expert rock climber fills out Green’s reportage: “You are so involved in what you are doing [that] you aren’t thinking of yourself as separate from the immediate activity…. You don’t see yourself as separate from what you are doing.”[32] A dancer says at times she becomes the dance: “Your concentration is very complete. Your mind isn’t wandering, you are not thinking of something else; you are totally involved in what you are doing.”[33] The best moments of human life occur when we are completely engaged in an activity; then, the self disappears, personal problems vanish, the fear of failure disappears, mental clarity results, and joy ensues.[34]

Performers have solved one of the maladies of modern life, the substitution of a fuzzy verbal world for actual, concrete experience. They immediately grasp the truth of psychologist Abraham Maslow’s observation that genuinely creative thinkers “live far more in the real world of nature than in the verbalized world of concepts, abstractions, expectations, beliefs, and stereotypes that most people confuse with the real world.”[35]

To escape the Screen with its hollow images and counterfeit emotions, to experience the human way of life, go outside at night and observe the stars, or in summer pick up a dandelion and behold it, or gaze into the eyes of the next person you see. Connect to the immediate physical world, to the pungency of Stilton cheese, to the softness of cashmere, to the dance of cherry blossoms, to the smell of the ocean salt air, to the wonder and mystery of nature, and yes to the poetry, drama, and music that touch the transcendent.


Endnotes:

[*] Here we take ideology to mean any epistemological, political, or economic system of thought founded on sacred ideas closed to critical philosophical examination or any contrary experience.

[1] H. Allen Orr, “Awaiting a New Darwin,” The New York Review of Books, 60, No. 2 (February 7, 2013).

[2] Erwin Schrödinger, What is Life? with Mind and Matter and Autobiographical Sketches (Cambridge: Cambridge University Press, 1992), p. 153.

[3] Sir Charles Sherrington, Man on His Nature (Cambridge: Cambridge University Press, 1963), p. 238.

[4] Schrödinger, What is Life?, p. 164.

[5] Here is a short, amusing example of why our interior life does not result from brain function alone. During the day, adenosine builds up in the brain to register the amount of time that has elapsed since a person awoke. When the adenosine concentration peaks, a person feels the irresistible urge to sleep. The concentration of adenosine and the feeling of sleepiness are incommensurable: No matter how much a neuroscientist probes the brain with scans and chemical assays, she will never find sleepiness.

[6] Antonio Damasio, The Feeling of What Happens: Body and Emotions in the Making of Consciousness (New York: Harcourt Brace, 1999), p. 9.

[7] V.S. Ramachandran, The Tell-Tale Brain: A Neuroscientist’s Quest for What Makes Us Human (New York: Norton, 2011), p. 248.

[8] Ibid., p. 249. Dr. Ramachandran’s definition of qualia is on page 248 and is incorporated in this quotation.

[9] Orr, “Awaiting a New Darwin.” Italics in the original.

[10] Niels Bohr, Essays 1958–1962 on Atomic Physics and Human Knowledge (New York: Wiley, 1963), p. 15.

[11] Eugene P. Wigner, Symmetries and Reflections: Scientific Essays of Eugene P. Wigner (Woodbridge, Conn.: Ox Bow Press, 1979), p. 172.

[12] Max Born, Physics in My Generation (London & New York: Pergamon, 1956), p. 48.

[13] Freeman Dyson, Disturbing the Universe (New York: Harper & Row, 1979), p. 249.

[14] Steven Weinberg, “The Trouble with Quantum Mechanics,” The New York Review of Books (January 19, 2017).

[15] David Z. Albert, “Quantum’s Leaping Lizards,” The New York Review of Books (April 19, 2018).

[16] Ibid.

[17] Maximilian Schlosshauer, Johannes Kofler, and Anton Zeilinger, “A Snapshot of Foundational Attitudes Toward Quantum Mechanics,” PDF.

[18] See Weinberg, “The Trouble with Quantum Mechanics.”

[19] A similar argument was given by J.B.S. Haldane, a British geneticist and a committed Marxist. See J.B.S. Haldane, Possible Worlds: And Other Essays (London: Chatto and Windus, 1927; reprint, London: Transaction Publishers, 2002), p. 209. Page reference is to the reprint edition.

[20] Francis Crick, The Astonishing Hypothesis (New York: Scribner’s, 1994), p. 3.

[21] Francis Bacon, The New Organon and Related Writings (Indianapolis, IN: Bobbs-Merrill, 1960 [1620]), p. 8.

[22] Coca-Cola Emoticons, YouTube.

[23] Clay Shirky, “How Social Media Can Make History,” Ted Talk, 2009.

[24] Stephen Hawking, quoted by David Deutsch, The Fabric of Reality (New York: Viking, 1997), pp. 177-178.

[25] Steven Weinberg, The First Three Minutes: A Modern View of the Origin of the Universe (New York: Basic Books, 1977), p. 154.

[26] See Richard Dawkins, The Selfish Gene (New York: Oxford University Press, 1976), p. 21.

[27] The quotation “the brain happens to be a meat machine” is widely attributed to Marvin Minsky, although I could not find the phrase “meat machine” in any article or book written by him.

[28] Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon & Schuster, 1997), p. 235.

[29] Jaron Lanier, You Are Not a Gadget: A Manifesto (New York: Vintage, 2011), p. 71.

[30] David Foster Wallace, Kenyon College Commencement Address, 2005. Available from The Wall Street Journal.

[31] Barry Green, The Inner Game of Music (New York: Doubleday, 1986), p. 14.

[32] Quoted by Mihaly Csikszentmihalyi, Beyond Boredom and Anxiety (San Francisco: Jossey-Bass, 1975), p. 39.

[33] Ibid.

[34] For a detailed discussion, see George Stanciu, “The Best Moments of Human Life,” The Imaginative Conservative.

[35] Abraham Maslow, “Creativity in Self Actualizing People,” in Creativity and Its Cultivation, ed. Harold Anderson (New York: Harper & Row, 1959), p. 85.
