In the post-truth society, your facts are not my facts, and lies by political figures are greeted with indifference. Judged by past standards, citizens of a post-truth society have no real experience and no capacity for critical thinking.

We Americans have virtually no interest in history; for us, the past pales in comparison with the imagined future. When we think about history, we see individuals, such as George Washington, Thomas Jefferson, and James Madison, who shaped events to their liking, or so we believe, and we ignore large cultural forces, such as the demise of aristocracy and the rise of capitalism and democracy. In the 2016 presidential election, most of us saw two individuals with contrary characters and desires pitted against one another. The victory of Donald Trump meant the new president would reverse the achievements (or disastrous policies) of Barack Obama, confirming the notion that history is an ebb and flow of events.

That history has a direction that transcends individual political figures was clearly revealed by the Oxford Dictionaries Word of the Year 2016—“post-truth,” an adjective denoting circumstances in which objective facts are less influential in shaping opinion than appeals to emotion and personal belief—circumstances that, in 2016, determined the outcome of the presidential election.[1] Post-truth, however, has a wider importance than the particular assertions made in the presidential campaigns of Hillary Clinton and Donald Trump, for the word describes a “general characteristic of our age.”[2] The word “post-truth” seems to have been coined by the late Serbian-American playwright Steve Tesich in his 1992 essay “A Government of Lies,” published in The Nation. He lamented, perhaps naively, that United States presidents and ambassadors had lied about the Iran-Contra affair and the Persian Gulf War.

For the Nation-State, “truth” is an instrument to further power or to extend economic influence. President George W. Bush, May 29, 2003: “We found the weapons of mass destruction [in Iraq]. We found biological laboratories.” Saddam Hussein’s smoking nuclear gun, with which National Security Adviser Condoleezza Rice had frightened the American public, was never found. President Richard Nixon, January 4, 1971: “For us to have intervened [in Chile]—intervened in a free election and to have turned it around—I think would have had repercussions all over Latin America.” The United States carried out covert operations in Chile and provided funding through the CIA to overthrow the democratically elected Marxist President Salvador Allende. President Lyndon Johnson, October 1964: “We are not about to send American boys nine or ten thousand miles away from home to do what Asian boys ought to be doing for themselves.” Between 1964 and 1975, some 3,403,000 “American boys” were deployed to Southeast Asia.

Such lies by Presidents Bush, Nixon, and Johnson are commonplace among high government officials, for they are part of the daily operation of the Nation-State, although, when revealed, they shock those Americans who believe their country promotes truth, justice, and peace. But in the present post-truth society, lies by political figures are greeted with indifference. Consider two undisputed falsehoods that surfaced in the 2016 presidential campaigns. In 2008, Hillary Clinton claimed that she had landed in Bosnia in 1996 “under sniper fire” and “ran with our heads down” from the plane, although a video shows that Mrs. Clinton was greeted not by gunshots but by a friendly crowd that included an eight-year-old Bosnian girl. Donald Trump boasted that he and Vladimir Putin were buddies—“I spoke directly and indirectly with President Putin”—only to acknowledge later that they had never met or spoken. Objective facts showing that Mrs. Clinton and Mr. Trump had repeatedly lied were brushed aside by emotion and personal belief. “Those are your facts, not mine” became a mantra on both sides of the political spectrum, and after the inauguration of President Trump, his counselor Kellyanne Conway introduced the phrase “alternative facts.”

The post-truth society, an embodiment of nihilism, did not suddenly appear in the presidential campaigns of Hillary Clinton and Donald Trump. The buried stream of nihilism, running for a century or more beneath Modernity, surfaced in the twentieth century and is now a swift river, carrying many away into moral and intellectual chaos. Traditionally, for some people, philosophy furnished universal truths about the human being, but those happy days disappeared years ago. Martin Heidegger, probably the most influential philosopher of the twentieth century, declared in a posthumously published interview with Der Spiegel, “Philosophy is at an end.”[3] Friedrich Nietzsche, of course, was there first. In an unpublished 1873 essay, “On Truth and Lie in an Extra-Moral Sense,” he asked, “What is truth?” and answered, “A mobile army of metaphors, metonyms, and anthropomorphisms.”[4] More than a century later, in an essay published by the American Council of Learned Societies, six eminent professors of literature asserted, “All thought inevitably derives from particular standpoints, perspectives, and interests.”[5]

In his book Consequences of Pragmatism, philosopher Richard Rorty envisages a post-philosophical culture, where the search for ruling principles, immovable centers, and fixed structures is abandoned. According to Rorty, historians, literary theorists, and philosophers in the future will accept that no escape from culture exists and that all intellectuals can do is ride the “literary-historical-anthropological-political merry-go-round,” chasing one intellectual fad after another. In a world without maps, in a culture without fixed reckoning points, “there is nothing deep down inside us except what we have put there ourselves.”[6]

How sweet and simple it would be if history were the successive embodiment of Great Ideas; then we could blame Nietzsche for the post-truth society and return to the Old Way merely by placing a wreath on the misguided philosopher’s grave in Röcken, Germany. No proponent of the notion that Great Ideas move history that I know of can explain how Nietzsche’s “On Truth and Lie in an Extra-Moral Sense” cascades down to a fourteen-year-old boy before a juvenile court in Hardwick, Vermont, who says, “Judge, that is your opinion, not mine. Hey, man, don’t you know, it’s different strokes for different folks.” When pressed, this semi-literate juvenile would probably tell us that music, poetry, philosophy, religion, and ethics are mere expressions of personal opinions, individual perceptions, and particular cultural viewpoints, though not in such fancy language. From a philosophical perspective, the juvenile is a nihilist, but from his viewpoint “different strokes for different folks” means truth varies from individual to individual.

The clue to the source of modern habits of thinking is given by Alexis de Tocqueville in “Concerning the Philosophical Approach of the Americans,” an absolutely brilliant chapter of Democracy in America. He observes that in a modern democratic society the links between generations are broken, and consequently in such a society men and women cannot base their beliefs on tradition or class. Since an American always begins with the self, each citizen forms the intellectual habit of looking to the part, not to the whole, and as a result is a Cartesian reductionist: “Of all the countries in the world, America is the one in which the precepts of Descartes are least studied and best followed.”[7] Following Tocqueville, if we wish to understand the origins of the post-truth society, we must see how the three legs of Modernity—science and technology, capitalism, and democracy—instill habits of thinking.

Technology has made us inhabitants of the Image-World. Before the invention of the camera, manmade images were exceedingly rare; ordinary people saw statues, stained-glass windows, and paintings only in church, while the privileged few owned a small number of paintings, usually commissioned by them, their ancestors, or one of their wealthy brethren. Nowadays, through magazines, newspapers, movies, television, and the Internet, we are saturated with manmade images. According to a 2016 Nielsen report, the total amount of time American adults spend on average in the Image-World, namely watching TV, surfing the Web on a computer, and using an app/Web on a smartphone, is eight hours and forty-seven minutes per day.[8] The average person spends more time in the Image-World than at work or in bed.

Images and words are two totally different ways of communicating. A writer can fashion factual statements and established principles into a rational argument that persuades her readers of a certain truth. Syllogisms are not in the toolkit of the photo editor, who orders images by association. For example, in the 2016 Democratic presidential primary, The New York Times supported Hillary Clinton and ran a left-profile photo of her, gazing slightly upward, with warm light evenly illuminating her face—a photo worthy of Leni Riefenstahl. The most frequent photo of Bernie Sanders showed him descending the stairs from an airplane, stoop-shouldered, carefully watching his step, and securely grasping the side rails. The choice was stark: a beautiful woman who sees the truth or an old man who is unsure of his footing.

Photo and video editors are masters of using the emotions that photographs can evoke to sell political candidates as well as products. In the 2016 Coca-Cola Emoticons commercial, four wholesome-looking teenagers, two boys and two girls, meet as strangers in front of a Coke machine. One of the boys takes from the machine a Coke with the emoticon Kiss on the label; the teenagers laugh over different bottles with the emoticons LOL, Sexy, Naughty, and Hello. The teenagers get to know each other, and at the end of the forty-five-second commercial, one couple walks off together with the voiceover “take the feeling, share the feeling”—a happy feeling with Coke that brings young people together.[9]

We sophisticated adults maintain that ads do not affect us, but I am not so sure. Recently, a former student of mine, a young man approaching thirty, complained to me that he could not think straight about purchasing an off-road vehicle because of a Jeep commercial aired during the 2016 Super Bowl.[10] He could not shake the image of himself as once again a young, wild dude, racing over the desert in a Renegade, heading for a bevy of beautiful girls. My former student drew a finger from the top of his forehead to the tip of his nose and said, “Half of me knows that the Jeep ad is a crock, and the other half of me wants that youthful joy again—buy the Renegade!” Not once did he mention the overall quality, reliability, and resale value of the Jeep as assessed by Consumer Reports or J.D. Power.

The association of Coke with happy teenagers, or of a Jeep Renegade with being a young, wild dude again, is an emotional judgment impervious to reason: drinking Coke does not produce friends or happiness; owning a Jeep does not make a new person; the juxtaposition of images is not cause and effect. The Image-World falls outside the realm of reason; persuasion through images is based not on logic, intellectual analysis, and the assessment of data, but on feelings. This is not to deny that consummate artists can use images to present a profound truth. Charlie Chaplin in his movie Modern Times conveyed the dehumanizing aspects of industrialism by showing how the Tramp’s work on an assembly line reduced his life to what serves the machine.

Americans inhabit the Image-World for nearly nine hours a day, and as a result they acquire the habit of judging people and products by feelings; likes and dislikes determine whether a song, a movie, a sitcom, a pop celebrity, or a political candidate is good or bad. In combination, capitalism and technology have replaced reasoned argument with emotion.

Unlike old-time newspapers and radio, today’s media is driven by social feedback loops. Any story that shows a sign of life on Facebook or Twitter is copied endlessly. In this way, social media amplifies a story until it captures the public mind. The best stories evoke outrage, which prompts more coverage, which encourages more talk, and on and on, until a new astounding story emerges to dislodge the now boring previous big story.

Spend the better part of a day reading online comments, watching CNN, Fox News, and MSNBC, and listening to Alex Jones or Glenn Beck on talk radio, and you will be convinced that, as a general rule, strong feeling about an issue rarely indicates deep understanding. Thomas Jefferson wrote in 1789 that the adoption of the Constitution of the United States gave him a “new and consolatory proof that, whenever the people are well-informed, they can be trusted with their own government.”[11] Without a widespread understanding of the basics of science, economics, and history by both the electorate and the candidates, reasoned political discourse is impossible, and democracy is doomed.

Democracy, the third leg of Modernity, ousted truth in favor of personal opinion. We Americans have been emancipated from the creeds, doctrines, and dogmas of the Old World. We detest the confines, real or imaginary, of a hierarchical authority, a half-forgotten tradition, and an oppressive community, and we refuse to be told what to believe or how to live. Because of our unshakeable belief in equality, we have a strong “distaste for accepting any man’s word as proof of anything”[12] and intensely dislike being under the authority of any person. Who are you to tell me what to do?

Since American culture tells us that all individuals are equal and that we can recognize the truth just as well as the next person, we think that we have no need to seek guidance from others, even so-called masters. Indeed, we believe that if we follow another person’s judgment, we give ourselves over to that individual, and thereby become enslaved and violate what is most precious to us, our personal freedom. If anyone holds up someone as a master to follow, we intentionally ignore or dismiss that person, since such deference screams inequality. Consequently, in American life no masters are recognized, and, in effect, the three great teachers of humankind—the Buddha, Socrates, and Jesus—are seen as just three voices among many.

If the above discussion seems abstract and philosophical, consider this. Last week at lunch, a physicist friend of mine, a former staff member of Los Alamos National Laboratory and the owner of a consulting company in Santa Fe, told me over glasses of Johnnie Walker Black that there are “two kinds of truth, one scientific, the other personal.” When I pressed him—I hope tactfully, but now I am not so sure—he said that scientific truths are universal, while personal truths are subjective, unique to each individual, but necessary to live. I pointed out that “personal truth” is a modern cliché, a misuse of the word “truth,” a euphemism for “personal opinion”; he just shrugged his shoulders and said, “That is what I believe.”

Technology administered the coup de grâce to truth. The Internet is the first medium in history—if we set aside the agora of ancient Athens and the town meetings of New England villages—to create many-to-many communication; the phone is one-to-one, and books, radio, and television are one-to-many. Clay Shirky, a prominent thinker on the social and economic effects of the Internet, points out that “every time a new consumer joins this media landscape a new producer joins as well, because the same equipment—phones, computers—lets you consume and produce. It’s as if, when you bought a book, they threw in the printing press for free.”[13] Smartphones, tablets, and laptops are cheap, and consequently ubiquitous. In 2016, Facebook had 1.86 billion monthly active users worldwide[14]; the corresponding number for Twitter was 319 million.[15] Digital technology encourages everyone to be a reporter, a broadcaster, and a commentator.

In a world where digital technology is everywhere, professional journalists and public intellectuals no longer tell gullible citizens how to vote, what to believe about politicians, and what to think about social issues. Today, the talking heads are entertainers, necessarily voicing extreme and at times outrageous positions in order to get above the clutter, but they are not respected the way Edward R. Murrow and Walter Cronkite were in the era of one-to-many broadcasting.

Social media allow hundreds of millions of Americans, with the click of a mouse, to disseminate their opinions without the scrutiny of grammarians, fact-checkers, and editors, the guardians of print culture. Anyone, anywhere, at any time can now instantly post an opinion on anything. The Internet has become an ocean of opinion, one comment washing over another, quickly submerging whatever truth tries to surface.

A majority of Americans—sixty-two percent—get news via social media.[16] Half the public turned to Facebook, Reddit, Twitter, and other social media sites to be informed about the 2016 presidential election.[17] Short Facebook posts and 140-character tweets exclude complex, nuanced thoughts. The “news” on social media was often rumor, innuendo, and speculation. Here is one example of how a legitimate news item morphed into craziness that went viral. On October 28, 2016, F.B.I. Director James B. Comey informed Congress that he was reopening the investigation of Hillary Clinton’s use of a private email server when she was secretary of state. The F.B.I. had found new emails on a computer belonging to disgraced former New York congressman Anthony Weiner, the estranged husband of Huma Abedin, a top Clinton aide. Two days later, someone under the handle @DavidGoldbergNY tweeted that the new emails “point to a pedophilia ring and @HillaryClinton is at the center.” The innuendo was retweeted more than 6,000 times. On November 4, talk-show host Alex Jones said in a YouTube video, “Yeah, you heard me right. Hillary Clinton has personally murdered children. I just can’t hold back the truth anymore.”[18] The video was viewed more than 427,000 times. Mr. Jones eventually tried to back away from his comments about Mrs. Clinton, claiming his remarks were about U.S. policy in Syria.[19]

Truth was now hopelessly submerged by hatred of the Clintons and by personal opinions about the unreliability of the mainstream media: Hillary Clinton’s campaign chairman, John Podesta, indulged in satanic rituals; he frequently ate pizza at Comet Ping Pong, the center of a child sex ring; in tunnels beneath the restaurant, children were held captive.

Such “news” had real consequences. Late Sunday morning on December 4, 2016, Edgar Welch walked into the Comet pizzeria armed with a Colt AR-15 assault rifle, a .38-caliber Colt revolver, and a folding knife, fired his gun two or three times, and searched the building for hidden tunnels. Fortunately, no one was injured. After his arrest, Welch said he had only recently installed Internet service at his house and that his “intel on this wasn’t 100 percent.”[20]

In this century, conspiracy theories fueled by the Internet have replaced the political ideologies of the previous century, and not just for oddballs, loonies, and crackpots. Ten days before the inauguration of President-elect Donald Trump, BuzzFeed published a dossier compiled by Christopher Steele, a former British intelligence official who was then working for a private intelligence firm.[21] His work was initially funded by opponents of Donald Trump in the Republican Party. After Mr. Trump became the Republican Party’s nominee, Mr. Steele’s work was financially supported by persons linked to the Democratic Party.[22]

The dossier alleged that the Russian government had been “cultivating, supporting, and assisting TRUMP for at least five years.” More damaging, the dossier claimed that the F.S.B., Russia’s secret service, had used hidden cameras to record “TRUMP’s (perverted) conduct in [the] Moscow” Ritz-Carlton Hotel with prostitutes.

BuzzFeed did acknowledge that the allegations in the dossier were “unverified and potentially unverifiable.” Yet the editors of BuzzFeed published the “full document so that Americans can make up their own minds about allegations about the president-elect that have circulated at the highest levels of the US government.”

If the U.S. security agencies, with a budget of $53.5 billion[23] and numerous clandestine operatives, could not determine whether the allegations in the dossier were true or false because the sources were unnamed, then how could ordinary citizens make up their own minds? A quick glance at the comments on the BuzzFeed Web page showed that readers of the article had no difficulty determining the “truth.” Here are three representative comments: 1) “Nothing wrong with exposing the details about what a lot of us had already suspected for quite some time;” 2) “If anyone was wondering why Trump consistently defends dictator Putin and criminal Assange over our 17 intelligence agencies, now you know! Russian blackmail could lead to a huge security breach that would seriously harm us all; red, blue and rainbow. I suspect more to come;” and 3) “Trump is gross and most Americans already knew that. Some say there are pics with Russian underage girls. No doubt all will come out at his impeachment hearings.”

These comments reveal that personal opinion converts the unverifiable to “truth,” a principle of the post-truth society that is present even in the august New York Times. Paul Krugman, a Nobel laureate in economics, asserted in his semi-weekly column of December 12, 2016, that the presidential election was “illegitimate” and “tainted” by both Vladimir Putin and F.B.I. Director James Comey; the former ordered his agents to hack the Democratic National Committee’s servers, and the latter, ten days before the election, made public the possibility of damaging evidence in newly discovered emails of Hillary Clinton. Mr. Krugman claimed that Mr. Trump “won the Electoral College only thanks to foreign intervention and grotesquely inappropriate, partisan behavior on the part of domestic law enforcement.” He asked rhetorically, “Did the combination of Russian and F.B.I. intervention swing the election?” and answered, “Yes.”[24]

Three weeks later, Charles M. Blow, in one of his semi-weekly columns for The New York Times, concurred with Mr. Krugman in a near-hysterical tone: “Donald Trump is as much Russia’s appointment as our elected executive.” Mr. Blow claimed that only blathering pundits disagreed with this truth, for no one could deny, “A hostile foreign power [Russia] stole confidential correspondence from American citizens… and funneled that stolen material to a willing conspirator, Julian Assange. The foreign power then had its desired result achieved on our Election Day. This was an act of war and our presidency was the spoil.”[25] Mr. Blow’s emotional rhetoric masks the absence of reason: how the publication of confidential Democratic emails (the cause) resulted in the election of Donald Trump (the effect) is not even suggested.

That Mr. Putin and Mr. Comey swung the presidential election is unverifiable; neither Mr. Krugman nor Mr. Blow cites evidence, such as political polls, for none exists. Both pundits invoked personal opinion and neglected the fact that for decades the American political elite presided over a massive and increasing public debt, failed to prevent 9/11, initiated a disastrous war in Iraq, allowed financial markets nearly to destroy the global economy, allied itself with Wall Street, and was blind to a declining middle class. Forty-six percent of the electorate—the supporters of Mr. Trump whom Hillary Clinton called the “deplorables”—did not take kindly to the political elite.

In the post-truth society, your facts are not my facts; hence, the second principle of the post-truth society—facts are determined by unshakable personal opinions.

Given that the average American spends eight hours and forty-seven minutes in the Image-World each day, direct experience of people is, in the main, replaced by tweets, text messages, and Facebook posts; encounters with nature are superseded by National Geographic photos and Nova programs. Everything that was once directly experienced is receding into images. Literary critic Sven Birkerts likens digital cameras, cable TV, and the World Wide Web to a “soft and pliable mesh woven from invisible thread” that covers everything. “The so-called natural world,” he writes, “the place we used to live, which served us so long as the yardstick for all measurements can now only be perceived through scrim. Nature was then; this is now.”[26]

Carol Kaesuk Yoon, in her review of the movie Avatar, the 3-D, five-hundred-million-dollar technological extravaganza, rejoices over the replacement of actual experience with the manufactured. Avatar “has recreated what is at the heart of biology: the naked, heart-stopping wonder of really seeing the living world.”[27] The trek through the jungle at the local multiplex does not include sore feet, a parched throat, or insect bites, much less a pause in silence to hear the baritone squawks of ravens or to see the play of intense sunlight on the bark of a juniper tree.

Andy Warhol was arguably the first painter to portray the new reality most of us inhabit: images of products and movie idols—silk-screened Campbell’s soup cans and portraits of Marilyn Monroe—rock stars, sports figures, and celebrity politicians… images of totally free individuals, who are subservient to no one and who live full, amazing lives.

Most of us are modern, more sophisticated versions of Myrtle Wilson in The Great Gatsby, who fashions her life after images in moving-picture magazines, stories in Town Tattle, and tabloid accounts of celebrities in the scandal sheets of Broadway.

The “loss of the real”[28] is more complete on Facebook. Jaron Lanier, an early developer of virtual reality and once an uncritical technology enthusiast, argues that the most effective users of Facebook are the “ones who write successful online fictions about themselves.”[29] People communicate online differently than they do in person, because voice inflection and body language are absent in computer chat. Consequently, an online profile can be a carefully crafted fiction meant to impress other Facebook users, something that would be laughable face to face. So abstract and unreal are Facebook profiles that “friends” find it easy to love, hate, laugh at, support, or attack “friends.” With the click of a mouse, a digital relationship between two persons can be deleted and the contrived emotion that resulted from constant updates on Facebook vanishes, forever.

Social networks, just like television, create the illusion that isolated individuals are connected to each other. Identical images are fed into everyone, regardless of age, gender, education, or ethnic background. Individual “experience” is reproduced again and again, not unlike the identical parts stamped out by an assembly line. In this way, the Internet and television destroy personal experience, but isolated individuals believe they are bound together by the same “experience.”

Genuine human experience is always personal. Recently, I went with a group of friends to a concert of American choral music based on black spirituals. At the intermission, my friends and I spoke excitedly about what we had experienced. The sole musician amongst us praised the balance of the ensemble and the conductor’s energy. One woman noticed how nervous the lead soprano was before her solo. Another woman observed that the second tenor, a young man of obvious Mediterranean background, had curled, hairy fingers not unlike an ape’s, yet he sang like an angel. Her observation set us to musing about human nature. For each of us, the concert was a personal experience that we saw with our own eyes and heard with our own ears.

Judged by past standards, citizens of a post-truth society have no real experience and no capacity for critical thinking. According to political theorist Hannah Arendt, a people “for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist” is the “ideal subject of totalitarian rule.”[30] For Arendt, the main threats to political liberty were communism and fascism. Today, however, totalitarianism seems benign; the peril on the left is the Nanny State, whose “power is absolute, thoughtful of detail, orderly, provident and gentle,” a state that keeps its citizens in “perpetual childhood,”[31] and the danger on the right is the Corporate State, which uses the powers of government to enrich the one percent at the expense of the citizenry.

Whether America slides into one of these totalitarian regimes is anyone’s guess. But a post-truth society probably cannot rationally solve the enormous problems generated by technology and capitalism: climate change, a declining middle class, racial injustice, extreme materialism, and military adventurism. If a people cannot distinguish fact from fiction, then political disaster awaits them.


Notes:

[1] Oxford Dictionaries Word of the Year 2016.

[2] Neil Midgley, ibid.

[3] Martin Heidegger, “Only a God Can Save Us,” Der Spiegel interview (conducted 1966; published posthumously, 1976).

[4] Friedrich Nietzsche, “On Truth and Lie in an Extra-Moral Sense,” in The Portable Nietzsche, trans. Walter Kaufmann (New York: Vintage, 1979), p. 46.

[5] George Levine et al., “Speaking for the Humanities,” American Council of Learned Societies Occasional Paper, no. 7 (1989): 9.

[6] Richard Rorty, Consequences of Pragmatism: Essays, 1972-1980 (Minneapolis: University of Minnesota Press, 1982), Introduction.

[7] Alexis de Tocqueville, Democracy in America, trans. George Lawrence (New York: Harper & Row, 1966 [1835, 1840]), p. 429.

[8] The Nielsen Total Audience Report, Q1 2016.

[9] Coca-Cola Emoticons.

[10] Official 2016 Jeep Super Bowl Commercial.

[11] Thomas Jefferson, letter to Richard Price, Paris, January 8, 1789.

[12] Tocqueville, p. 430.

[13] Clay Shirky, “How Social Media Can Make History,” TED Talk, 2009.

[14] Statista, “Number of monthly active Facebook users worldwide as of 4th quarter 2016 (in millions).”

[15] Statista, “Number of monthly active international Twitter users from 2nd quarter 2010 to 4th quarter 2016 (in millions).”

[16] Shannon Greenwood, Andrew Perrin, and Maeve Duggan, “Social Media Update 2016.”

[17] Jeffrey Gottfried and Elisa Shearer, “News Use Across Social Media Platforms 2016.”

[18] See Alex Jones, “Pizzagate,” Nov. 27, 2016, Media Matters for America.

[19] Marc Fisher, John Woodrow Cox, and Peter Hermann, “Pizzagate: From rumor, to hashtag, to gunfire in D.C.,” The Washington Post, Dec. 6, 2016.

[20] Edgar Welch, quoted in Adam Goldman, “The Comet Ping Pong Gunman Answers Our Reporter’s Questions,” The New York Times, Dec. 7, 2016.

[21] Ken Bensinger, Miriam Elder, and Mark Schoofs, “These Reports Allege Trump Has Deep Ties To Russia,” BuzzFeed, Jan. 10, 2017.

[22] “The Trump-Russia dossier: what we know so far,” Financial Times.

[23] Federation of American Scientists, “U.S. Intelligence Budget Data.”

[24] Paul Krugman, “The Tainted Election,” The New York Times, Dec. 12, 2016.

[25] Charles M. Blow, “Donald Trump and the Tainted Presidency,” The New York Times, Jan. 9, 2017.

[26] Sven Birkerts, The Gutenberg Elegies: The Fate of Reading in an Electronic Age (Boston: Faber and Faber, 1994), p. 120.

[27] Carol Kaesuk Yoon, “Luminous 3-D Jungle Is a Biologist’s Dream,” The New York Times, Jan. 18, 2010, D1.

[28] Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon & Schuster, 1997), p. 235.

[29] Jaron Lanier, You Are Not a Gadget: A Manifesto (New York: Vintage, 2011), p. 71.

[30] Hannah Arendt, The Origins of Totalitarianism (Cleveland, OH: World Publishing, 1958), p. 474.

[31] Tocqueville, p. 692.

