Alexis de Tocqueville, traveling through the dense woods of Michigan in 1831, came across a pioneer and his family making the “first step toward civilization in the wilds.”[1] He noted in his travel diary that “from time to time along the road one comes to new clearings. As all these settlements are exactly like one another, whether they are in the depths of Michigan or just close to New York, I will try and describe them here once and for all.”[2] The settler’s log house showed “every sign of recent and hasty work.” The walls and roof were fashioned from rough tree trunks; moss and earth had been rammed between the logs to keep the cold and rain out of the house. The settler exhibited little curiosity about his French visitor, and in showing hospitality to the stranger, he “seemed to be submitting to a tiresome necessity of his lot and in it saw a duty imposed by his position, and not a pleasure.” The pioneer and his family formed a “little world” of their own, an “ark of civilization lost in a sea of leaves. A hundred paces away the everlasting forest spread its shade, and solitude began again.”

One hundred and twenty-five years after Tocqueville visited the settler in the wilds of Michigan, I, an adolescent living forty miles west of Motown, inherited through American culture the pioneer’s individualism. I thought that to be free meant to be an isolated, autonomous individual and believed that I owned myself and my abilities, that all the relations I had with other persons I voluntarily chose, and that I owed nothing to other persons except what I of my own free will incurred. The pioneer, living in an “ark of civilization lost in a sea of leaves,” was more extreme than I was. He had left behind in Old Europe parents, grandparents, siblings, aunts, uncles, and cousins, everyone who might impede his freedom and independence.

In America, individualism was not an idea found in philosophical treatises and then put into practice but a lived experience, even though the philosopher John Locke developed the logic of the modern democratic state that the Founding Fathers later appealed to. When arguing for the adoption of the proposed United States Constitution in 1787, James Madison invoked John Locke’s conjectures about how political societies formed.[3] In a state of nature, each individual, alone, “free, equal, and independent,” is “constantly exposed to the invasions of others”; property is “very unsafe, very insecure”; and existence is “full of fears and continual dangers.”[4] For the protection of his goods and life, no individual can rely upon the goodwill of others. Consequently, individuals contract with each other to hand over to the State their natural power to protect themselves and their property. For Locke and Madison, self-interest, weakness, and natural enmity caused isolated individuals to form political societies.

Locke also laid the theoretical foundation of capitalism. He believed that God had given the earth and its fruits to mankind in common: “The earth He has given to the children of men.”[5] How, then, could any man have a right to private property? Locke begins to answer this question by arguing that in a state of nature each individual has the natural right to life and thus to the fruits of the earth to sustain his life. A man is the “absolute lord of his own person and possessions;”[6] he has a “property in his own person” that “no one has a right to but himself.”[7] When a person mixes his labor with anything in the commons, he thereby makes it his property, for his annexed labor makes it part of himself. When a man killed and skinned a deer or picked a basket of blueberries, he mixed his labor with the game or the berries, and they became his.

Notwithstanding the vastness and abundance of the earth—“God has given us all things richly”[8]—nature limits what a man can rightfully take from the commons. A man may appropriate through his labor as much as he can use before it spoils. No rational man would kill ten deer only to have the meat putrefy, or pick two hundred pounds of blueberries only to have the fruit spoil in a week. “Nothing was made by God for man to spoil or destroy.”[9]

The second stage in the development of private property occurred with the introduction of money, “some lasting thing that men may keep without spoiling.”[10] Gold, diamonds, or whatever men by consent deemed valuable allowed a person to accumulate legitimately as “much of these durable things as he pleased”[11] without violating the limits nature imposes on possessions, since money does not spoil.

With money, the industrious enlarged their possessions, and because man by nature has an unlimited desire for material goods, the commons soon vanished, replaced by vast estates. “Since gold and silver, being little useful to the life of man, in proportion to food, raiment, and carriage, has its value only from the consent of men…it is plain that men have agreed to a disproportionate and unequal possession of the earth.”[12] The introduction of money destroyed the freedom and equality of all individuals in the first stage of the state of nature, and governments had to be instituted to safeguard unequal property.

Men without land were forced to sell their labor for wages; the buyer owned their labor, and thus what they produced belonged to him. The buyer and seller of labor are linked by the exchange of money, not in any permanent way by custom or obligation. According to Locke, unlimited private property and the selling and buying of human labor, the two essential components of capitalism, are rooted in the nature of man. The interference with free markets and class structure by a society or a government goes against God and nature.

No one learns the rules of capitalism by reading John Locke’s Second Treatise of Government or a pamphlet published by the U.S. Chamber of Commerce. Like the majority of Americans, I acquired the individualism that capitalism rests upon through public schooling; my fellow students and I learned to compete in the classroom and on the sports field, an ethic that prepared us for the workplace.

The two founding events of Modernity are the Scientific Revolution and the Protestant Reformation. The first promulgated Descartes’ methodological rule to begin “with the simplest and most easily known objects in order to ascend little by little, step by step, to knowledge of the most complex.”[13] In modern terms, Cartesian reductionism states that every whole is completely understandable in terms of its smallest parts and how they interact. The Reformation destroyed medieval communal life and thereby launched individualism. Protestantism substituted the individual for the community; the new man of God was to achieve “salvation through unassisted faith and unmediated personal effort.”[14] In America, the Puritans put an indelible stamp of individualism upon the New World. The Puritan was “one entire person, who must do everything of himself, who [was] not to be cosseted or carried through life, who in the final analysis [had] no other responsibility but his own welfare.”[15]

Tocqueville captured in one word the essence of Modernity. He was the first person to use the word “individualism” and reported that “that word ‘individualism,’ which we coined for our own requirements, was unknown to our ancestors, for the good reason that in their days every individual necessarily belonged to a group and no one could regard himself as an isolated unit.”[16] The Latin word “indīviduum,” the root of the English word “individual,” means an indivisible whole existing as a separate entity.

Stated in its most general form, the defining principle of Modernity is that every whole—a political community, a horse, or a carbon atom—is a sum of its isolated parts. We will call this principle individualism, a straightforward extension of Tocqueville’s original meaning and of Descartes’ rule to begin with the parts. Modernity rests upon three legs: science and technology, democracy, and capitalism; all three legs are bound together by individualism. The overarching principle of Modernity, then, is that things exist in isolation, as separate entities.

In Old Europe, as in every premodern culture, the group was considered prior to the individual in origin and authority. Jacob Burckhardt, the great scholar of the Italian Renaissance, explains that in Medieval Europe a “man was conscious of himself only as a member of a race, people, party, family, or corporation.”[17] When asked “Who are you?”, a person might have replied, “A Vignola from Padua, a stone carver, and a good Christian.”

In Medieval France, the basic unit of society was the peasant family, the domus; the Latin word meant both family and house, for the two were inextricably bound together. The inhabitants of small villages almost never used the word familia; for peasants the “family of flesh and blood and the house of wood, stone, or daub were the one and the same thing,” according to historian Emmanuel Le Roy Ladurie.[18] The central elements of the house were the kitchen fire, goods and lands, children, and conjugal alliances with other domūs. The domus usually went beyond two parents and their children to include servants, boarders, and illegitimate children, if any.

Sociologist Robert Nisbet agrees that in Medieval Europe “the group was primary; it was the irreducible unit of the social system at large. The family, patriarchal and corporate in essence, was more than a set of interpersonal relations.”[19] Taxes and fines were levied upon the medieval family, not the individual. Honors of achievement were bestowed upon the family, rather than the individual. Property belonged to the family, not the individual, and could not easily be separated from the family. The legal rights of the family over its members were inviolable. The family made almost all decisions affecting a person’s occupation, marriage, and the rearing of his or her children.

Since the whole is seen as prior to and greater than any of its parts, the overarching principle in Medieval Europe, as in all premodern cultures, is that things exist only in relationship. In Buddhism, a flower or a lion is said to be empty, meaning that the flower or lion has no independent existence separable from everything else. Aristotle takes as obvious that “man is by nature an animal intended to live in a polis…[as shown] by the faculty of speech.”[20] Without a polis, a man is “either a beast or a god.”[21]

The ancient and modern ways of understanding humans, nature, and the transcendent differ radically. In Modernity, the cosmos is “opaque, inert, mute”;[22] in the ancient outlook, as developed by Aristotle and Aquinas, Homo sapiens is an integral part of nature, sharing a life with plants, animals, and the Prime Mover (God), all of which form a hierarchy ordered by degrees of nonmateriality. While the self is not ignored by these two ancient thinkers, they emphasize the soul, that which is universal about each person. In Modernity, of course, the soul is replaced by the isolated, autonomous self.

We have arrived at the Great Chasm that separates modern and ancient cultures, a chasm that may be bridgeable intellectually, but not experientially. We moderns live in a totally different cosmos than our medieval ancestors or our Greek forebears. Aristotle believed that the stars traverse circles about the Earth because of their desire to emulate the Prime Mover, an eternal being beyond the sphere of fixed stars that moves as an object of love, yet itself is unchanging.[23] Aristotle inhabited a tiny, comforting cosmos, one made strange to us by the truly glorious scientific revolution initiated by Copernicus. We cannot go home again to the cozy, ancient cosmos, where the night sky displayed the transcendent and Mother Earth manifested harmony and fecundity. Nor can we undo scientific knowledge; we live on a tiny planet, orbiting an ordinary star, near the edge of an ordinary galaxy that contains at least two hundred billion stars, in a universe with more than a hundred billion galaxies.

Aquinas believed the Garden of Eden existed in the East and that the location of Paradise was “shut off from the habitable world by mountains, or seas, or some torrid region, which cannot be crossed; and so people who have written about topography make no mention of it.”[24] Unlike the theologians, saints, and peasants of Medieval Europe, we are not anchored to a narrow tradition ignorant of Buddhism, Hinduism, and Taoism. For us, living in a world of many differing cultures, the spiritual life must include the deepest insights of all the wisdom traditions.

Unfortunately, we moderns are on the wrong side of the chasm, for the principle that things exist in isolation is false. The ancient principle that things exist only in relationship, however, is very much present in everyday modern life, contrary to our cultural myth.

In the nineteenth century, physicists hoped that someday they could isolate the atom from the cosmos, for they believed that knowing the properties of isolated atoms was the key to understanding the material world. Later, however, physicists discovered that the more an atom is isolated, the less actual it is. Atoms and elementary particles do not exist in the same way that billiard balls and cue sticks do. Atomic entities exist as potentialities or possibilities rather than as definite concrete objects. In the twentieth century, quantum physicists were forced by nature to renounce the cultural dogma that the world is made up of autonomous parts, each with a separate, independent existence. Physicist David Bohm sums up the essential feature of quantum physics: “The primary emphasis is now on undivided wholeness, in which the observing instrument is not separated from what is observed.”[25]

In quantum physics, nothing has an independent existence separable from everything else. Things exist only in relationship—and this has always been true, even in Newtonian mechanics, although physicists for over two centuries unwittingly promoted the fiction that the world is made up of separate, independent parts.

In the daily work of science, physicists, mathematicians, and astronomers apply Newton’s three laws to idealized objects that exist by themselves in an imaginary universe. Often professor and student alike take what is constructed for mathematical convenience as reality. Such idealizations often fail to capture the interconnectedness of nature. Physicist Richard Feynman, for example, demonstrated that “even simple and idealized things, like the ratchet and pawl, work…in only one direction because it has some ultimate contact with the rest of the universe.”[26] He showed that if a mechanical watch were in a box, isolated from the universe, the heat buildup from friction would eventually cause the watch to keep time in a chaotic fashion. For a watch, an automobile, or an electric motor to keep running in one direction, it must dump the heat it generates into its surroundings, and at some point this requires that the heat generated on Earth be radiated into empty space. The Earth can cool off only because the universe is expanding and cooling down. Thus, a watch can keep time because the Big Bang started the universe in a one-way direction. For a physicist to understand completely why a machine can run in only one direction, she must understand the Big Bang.

We must not be misled into thinking that the principle that things exist only in relationship applies exclusively to the exotic realms of quantum physics and cosmology. If we were raised from infancy as isolated individuals, we literally could not understand what we see. For human vision to be meaningful, a person must be a participant in the world. This surprising property of vision was demonstrated in a series of classic experiments by Theodor Erismann.[27] He fitted persons with vision-distorting goggles that made straight lines appear curved, right angles seem acute or obtuse, and distances seem expanded or shortened. Amazingly, after a few days, a subject’s vision was no longer distorted; he saw normally and functioned normally, even skiing and riding a motorcycle!

The key to vision returning to normal was that the subjects were allowed to move about and act freely, enabling the strange new visual data to be integrated with their experience of self-movement and self-sensation through touch. Subjects who were not allowed to move on their own, but instead were pushed through the environment in gondolas, never experienced normal vision while wearing the distorting goggles.[28] To see the world, we must be participants, not mere spectators.

The senses are meant to be engaged with the outside world, and the mind with something other than its own thoughts. In isolation, the senses and the mind create phantoms. Experiments on human subjects in isolation tanks demonstrated that extreme sensory deprivation induces such psychic disorders as mental confusion, hallucinations, and panic.

A human being exists only in relationship. Perceiving, feeling, imagining, thinking, and willing are impossible in isolation. A person in isolation from a larger whole, say nature or community, is a meaningless abstraction, an idealization that can only occur in philosophy and political theory. The isolated, autonomous self is a cultural myth, whose realization would reduce a person to nothingness.

Social psychologists Hazel Rose Markus and Shinobu Kitayama confirm that human beings exist only in relationship: “Persons are only parts that when separated from the larger social whole cannot be fully understood. Such a holistic view is in opposition to the Cartesian, dualistic tradition that characterizes Western thinking and in which the self is separated from the object and from the natural world.”[29] In their convoluted social-science prose, Markus and Kitayama agree that the Cartesian mantra “begin with the parts” should be replaced by “begin with the whole.”

If we could sever all our ties to nature, family, and community, then we would cease to be. The DNA that each of us bears in every cell of our bodies came from our parents, half from our mother and half from our father. If we tried to remove every trace of parents from our lives, we literally would not exist.

The self exists only when connected to others. Members of a family share the same hopes, the same joys, the same sorrows, and the same experiences; each family member lives a common life, each a part of the others. Divorce severs certain legal obligations, not the ties between spouses and their children, which are inseparable. For better or for worse, a common life yokes persons of the same family together forever. We do not live separate, parallel lives; we are not separate, isolated selves; each member of a family is a part of the others.

Sometimes family life can be so extraordinarily painful and damaging that we wish to be rid of our family forever. A friend of mine in graduate school, John Sullivan, an Irish Catholic from South Boston, hated his family and wanted nothing to do with them, for reasons unknown to me. John escaped to Ann Arbor, cut himself off from his family, and even refused to answer telephone calls from either his parents or siblings. Every weekend, he would drink and curse fate for giving him a family of drunks, nitwits, and general, all-around perverts. Somehow, his parents got my telephone number and communicated important messages to their son through me. One day, John left a message on my answering machine, telling me that he could not stand his family any longer and that he was moving to Australia, so his family would be out of his life forever. Three years later, I received a letter from John. He was in Australia; yet, every morning he woke up cursing his family. He had not learned that he could move to Mars and his family would still be inside of him.

By the second decade of the twenty-first century, the ill effects of a culture founded on the principle that things exist in isolation had become widespread. Not unexpectedly, individualism transformed the family. Several of my young adult students liken their homes to boarding houses in which three generations are weakly tied together. One young man claims that what is called the nuclear family often seems more akin to a collection of astronauts in spacesuits, adrift in a vacuum, tethered to a visible, untouchable mother ship, each person alone, shut up in the solitude of his or her own heart.

The 2010 U.S. census found that about one out of every four households consists of only one person;[30] roughly 30 million individuals “feel sufficiently isolated for it to be a major source of unhappiness in their lives,” according to John Cacioppo, a research psychologist at the University of Chicago, and William Patrick, the editor of the Journal of Life Sciences.[31] Over the two decades from 1985 to 2004, social isolation in America increased dramatically. In surveys conducted by sociologists Miller McPherson and Lynn Smith-Lovin, the most frequent response to the question “How many persons do you confide in about important, personal matters?” was three in 1985 and zero in 2004, with almost half of the respondents reporting they had either no confidants or only one.[32]

While advanced medical technology is making substantial progress in treating the physical diseases of Western civilization—cancer, atherosclerosis, and diabetes—the diseases of the interior life are taking over: alcohol and drug abuse, sex addiction, binge eating, and depression. The more a person fulfills the cultural dictate of being an isolated, autonomous individual, the more lonely, bored, and depressed he or she becomes. From interviews with 39,000 persons, the authors of a paper published in the Journal of the American Medical Association conclude that in the industrialized world the rates of severe, often incapacitating depression have increased in each succeeding generation since 1915.[33] A 2011 study by the Centers for Disease Control and Prevention found that between 1988 and 2008 the rate of antidepressant use among Americans of all ages increased nearly 400 percent; eleven percent of Americans aged twelve years and over now take antidepressant medication.[34] Since the 1930s, anxiety and depression among young people in America have steadily increased.[35] The World Health Organization predicts that by 2020 depression will be the second most prevalent medical condition in the world.[36] These data support the general conclusion that modern life is bad for mental health, with one exception.

Émile Durkheim, the father of sociology, was the first social scientist to notice the positive effects of war on mental health in Modernity; he found that when European countries went to war in the nineteenth century, the suicide rates dropped. He explained this unexpected result by arguing that war causes a “stronger integration of society,” so an “individual thinks less of himself and more of the common cause.”[37] Since Durkheim, numerous sociologists and psychologists have observed that in wartime rates of emotional depression decline, psychiatric wards empty, and homicide and other violent crimes go down.

Sociologist Charles Fritz, using data compiled by a team of twenty-five researchers, concluded that war and large-scale natural disasters produce “mentally healthy conditions,” because such catastrophes establish “transcendental goals” that turn a collection of isolated, autonomous individuals into a “community of sufferers.”[38]

During World War II, virtually every American participated in a moral crusade of a cosmic order against an evil enemy, sacrificed for the common good, and thereby experienced the happiness that results from altruism and simplified living. The poor mental health induced by Modernity dropped away when citizens embodied the principle “I exist only in relationship.”

Years ago, I thought that a culture based on principles contrary to nature would eventually collapse, but now I am not so sure. Individualism allowed for the rapid settlement of America, as seen in the pioneer whom Tocqueville described. A landed gentry, an established church, and a class structure based on birth were left behind in Old Europe. Individual freedom in the New World and the desire for material gain and economic independence unleashed the great potential hidden within every person. America “opened a thousand new roads to fortune and gave any obscure adventurer the chance of wealth and power.”[39]

But Nature cannot be denied forever; surprisingly, the ill effects of individualism were seen by a few of the early settlers of America. “Thousands of Europeans are Indians, and we have no examples of even one of those Aborigines having from choice become Europeans!” Hector de Crèvecoeur, a French émigré, lamented in 1782. He thought, “There must be in their social bond something singularly captivating and far superior to anything to be boasted among us.”[40]

Eerily, Tocqueville’s worst fear about the future of American life seems fulfilled: “an innumerable multitude of men, all equal and alike, constantly circling around in pursuit of the petty and banal pleasures with which they glut their souls. Each of them, withdrawn into himself, is almost unaware of the fate of the rest. Mankind, for him, consists in his children and his personal friends. As for the rest of his fellow citizens, they are near enough, but he does not notice them. He touches them but feels nothing. He exists in himself and for himself.”[41] In the absence of genuine community, the central government is becoming a power that is “absolute, thoughtful of detail, orderly, provident, and gentle…[that] tries to keep [its citizens] in perpetual childhood.”[42]

Even if American democracy, once “more perfect than any of which antiquity dared,”[43] becomes a benevolent despotism, we Americans will call our debased form of government a democracy, an exceedingly depressing result only if one is wedded to the political state.

Cultural upheaval, political turmoil, and religious decline provide the perfect soil for a radical re-examination of human life; no longer can we rest contented in ignorance of who we truly are. History is forcing each one of us to see, perhaps for the first time, how Western culture instills habits of thinking and feeling contrary to nature and directs us to goals that on the whole are unsatisfying. Armed with such knowledge, we become genuinely free to choose a life founded upon human nature, instead of mindlessly living the life given to us by culture. Each of us can lead a new life, once we are fully aware that we exist only in relationship.

Notes:

[1] Alexis de Tocqueville, Democracy in America, trans. George Lawrence (New York: Harper & Row, 1966 [1835, 1840]), Appendix U, pp. 731-733. For narrative consistency, several verb tenses in the text have been changed to the past.

[2] Alexis de Tocqueville, Journey to America, ed. J.P. Mayer, trans. George Lawrence (New York: Anchor, 1971), p. 360.

[3] James Madison, “The Structure of the Government Must Furnish the Proper Checks and Balances between the Different Departments,” Federalist No. 51.

[4] John Locke, Second Treatise of Government, ed. C. B. Macpherson (New York: Hafner, 1980 [1690]), pp. 52, 66.

[5] Psalm 115:16. All Biblical quotations are from the RSV.

[6] Locke, p. 65.

[7] Ibid., p. 19.

[8] 1 Tim. 6:17.

[9] Locke, p. 21.

[10] Ibid., p. 28.

[11] Ibid.

[12] Ibid., p. 29.

[13] René Descartes, Discourse on Method (1637), in The Philosophical Writings of Descartes Volume I, trans. John Cottingham, Robert Stoothoff, and Dugald Murdoch (Cambridge: Cambridge University Press, 1985), Part II, p. 120.

[14] Robert A. Nisbet, The Quest for Community (New York: Oxford University Press, 1953), p. 90.

[15] Perry Miller, “Individualism and the New England Tradition,” in The Responsibility of Mind in a Civilization of Machines: Essays by Perry Miller, ed. John Crowell and Stanford J. Searl, Jr. (Amherst, MA: The University of Massachusetts Press, 1979), pp. 5, 6.

[16] Alexis de Tocqueville, The Old Régime and the French Revolution, trans. Stuart Gilbert (New York: Doubleday, 1955 [1856]), p. 96.

[17] Jacob Burckhardt, The Civilization of the Renaissance in Italy, trans. S.G.C. Middlemore (New York: Modern Library, 1954), p. 100.

[18] Emmanuel Le Roy Ladurie, Montaillou: The Promised Land of Error, trans. Barbara Bray (New York: Vintage, 1979), p. 24.

[19] Nisbet, p. 81.

[20] Aristotle, Politics, 1253a.

[21] Ibid.

[22] Mircea Eliade, The Sacred and the Profane: The Nature of Religion, trans. Willard R. Trask (Orlando, FL: Harcourt, 1959), p. 178.

[23] For a detailed discussion of the failure of reductionism, see George Stanciu, “Reductionism: A Reasonable Goal or an Idiotic Quest?”.

[24] Thomas Aquinas, Summa Theologica, I, q. 102, art. 1.

[25] David Bohm, Wholeness and the Implicate Order (London: Ark Paperbacks, 1983), p. 134. Italics in the original.

[26] Richard P. Feynman, The Feynman Lectures on Physics, Vol. I (Reading, Mass.: Addison-Wesley, 1963), Ch. 46, p. 9.

[27] See Ivo Kohler, The Formation and Transformation of the Perceptual World, trans. Harry Fiss (New York: International Universities Press, 1964).

[28] See Richard Held, “Plasticity in Sensory-Motor Systems,” Scientific American 213 (November 1965): 84-94.

[29] Hazel Rose Markus and Shinobu Kitayama, “Culture and the Self: Implications for Cognition, Emotion, and Motivation,” Psychological Review 98 (1991): 227.

[30] 2010 Census Briefs, Households and Families: 2010, issued April 2012.

[31] John Cacioppo and William Patrick, Loneliness: Human Nature and the Need for Social Connection (New York: Norton, 2009), p. 5.

[32] Miller McPherson and Lynn Smith-Lovin, “Social Isolation in America: Changes in Core Discussion Networks over Two Decades,” American Sociological Review 71 (June 2006): 353-375.

[33] Cross-National Collaborative Group, “The Changing Rate of Major Depression,” Journal of the American Medical Association 268, no. 21 (December 2, 1992): 3098-3105.

[34] NCHS Data Brief, “Antidepressant Use in Persons Aged 12 and Over: United States, 2005–2008.”

[35] Jean M. Twenge, Brittany Gentile, Nathan DeWall, Debbie Ma, Katharine Lacefield, and David R. Schurtz, “Birth Cohort Increases in Psychopathology among Young Americans, 1938–2007: A Cross-Temporal Meta-Analysis of the MMPI,” Clinical Psychology Review 30 (2010): 145-154.

[36] World Health Organization, Mental Health: A Call for Action by World Health Ministers.

[37] Émile Durkheim, Suicide (Glencoe, IL: The Free Press, 1951), p. 208.

[38] Charles E. Fritz, “Disasters and Mental Health: Therapeutic Principles Drawn from Disaster Studies,” 1996.

[39] Tocqueville, Democracy in America, p. 11.

[40] J. Hector St. John de Crèvecoeur, Letters from an American Farmer, ed. Susan Manning (New York: Oxford University Press, 1997), p. 126.

[41] Tocqueville, Democracy in America, pp. 691-692.

[42] Ibid., p. 692.

[43] Ibid., p. 39.
