"The decline of a civilization has long been linked, anecdotally, to less moral censure and a decline in manners (manners being self, usually-moral censorship)." (Brock Adams, commenting on an earlier question).
My question is: what is the historical pedigree of this notion? Brock Adams has suggested "Cato, Mark Twain, Heinlein, Bismarck (I think), and Churchill," but I am not sure about Twain. Cato, for certain. Gibbon also naturally comes to mind.
(To clarify my meaning: J.B. Bury wrote two books on the history of ideas, "A History of Freedom of Thought" (1914) and "The Idea of Progress" (1920); I am looking for a similar history of the idea of moral decline.)
Edward Gibbon, in "The Decline and Fall of the Roman Empire," lists the following reasons (among others):
The five marks of Rome's decaying culture:
- Concern with displaying affluence instead of building wealth;
- Obsession with sex and perversions of sex;
- Art becomes freakish and sensationalistic instead of creative and original;
- Widening disparity between very rich and very poor;
- Increased demand to live off the state.
Unfortunately, many of these symptoms are found in America today.
Socialism describes any political or economic theory that says the community, rather than individuals, should own and manage property and natural resources.
The term “socialism” has been applied to very different economic and political systems throughout history, including utopianism, anarchism, Soviet communism and social democracy. These systems vary widely in structure, but they share an opposition to an unrestricted market economy, and the belief that public ownership of the means of production (and making money) will lead to better distribution of wealth and a more egalitarian society.
What Factors Caused the Rise and Fall of Ghana?
The African trade in gold and salt caused the Ghana Empire to rise to prominence, and the disruption of that trade led to its decline. During its time, Ghana was one of the richest polities in Africa.
Though Ghana was not rich in natural resources itself, it was located along an important trade route between gold- and ivory-producing areas in the south and salt miners in the Sahara desert to the north. As a result of this strategically important location, Ghana became a wealthy entrepot.
Though the exact origins of Ghana are clothed in mystery, tradition places the empire's origins in the fourth century AD. By the ninth century, the area had become affluent according to accounts by Muslim traders who began to visit the area. These traders from the north continued to develop the trade, linking its gold resources with the vital markets in the Mediterranean region, and the empire grew larger by incorporating its neighbors.
The decline of the empire started in the 11th century, when the Almoravids, a militant confederation of Muslims, began to attack the empire and even conquered it for a time. Though their grip on power did not last long, the chaos they brought to the region destabilized trade, hurting the empire's sources of income. Decline ensued. The remnants of Ghana were incorporated into the Mali Empire in 1240.
One question has haunted me while researching this essay: Are we living through a pivot or a decline? During past moral convulsions, Americans rose to the challenge. They built new cultures and institutions, initiated new reforms—and a renewed nation went on to its next stage of greatness. I’ve spent my career rebutting the idea that America is in decline, but the events of these past six years, and especially of 2020, have made clear that we live in a broken nation. The cancer of distrust has spread to every vital organ.
Renewal is hard to imagine. Destruction is everywhere, and construction difficult to see. The problem goes beyond Donald Trump. The stench of national decline is in the air. A political, social, and moral order is dissolving. America will only remain whole if we can build a new order in its place.
The Age of Disappointment
The story begins, at least for me, in August 1991, in Moscow, where I was reporting for The Wall Street Journal. In a last desperate bid to preserve their regime, a group of hard-liners attempted a coup against the president of the Soviet Union, Mikhail Gorbachev. As Soviet troops and tanks rolled into Moscow, democratic activists gathered outside the Russian parliament building to oppose them. Boris Yeltsin, the president of Russia, mounted a tank and faced down the coup.
In that square, I met a 94-year-old woman who was passing out sandwiches to support the democratic protesters. Her name was Valentina Kosieva. She came to embody for me the 20th century, and all the suffering and savagery we were leaving behind as we marched—giddily, in those days—into the Information Age. She was born in 1898 in Samara. In 1905, she said, the Cossacks launched pogroms in her town and shot her uncle and her cousin. She was nearly killed after the Russian Revolution of 1917. She had innocently given shelter to some anti-Communist soldiers for “humanitarian reasons.” When the Reds came the next day, they decided to execute her. Only her mother’s pleadings saved her life.
In 1937, the Soviet secret police raided her apartment based on false suspicions, arrested her husband, and told her family they had 20 minutes to vacate. Her husband was sent to Siberia, where he died from either disease or execution—she never found out which. During World War II, she became a refugee, exchanging all her possessions for food. Her son was captured by the Nazis and beaten to death at the age of 17. After the Germans retreated, the Soviets ripped her people, the Kalmyks, from their homes and sent them into internal exile. For decades, she led a hidden life, trying to cover the fact that she was the widow of a supposed Enemy of the People.
Every trauma of Soviet history had happened to this woman. Amid the tumult of what we thought was the birth of a new, democratic Russia, she told me her story without bitterness or rancor. “If you get a letter completely free from self-pity,” Aleksandr Solzhenitsyn once wrote, it can only be from a victim of Soviet terror. “They are used to the worst the world can do, and nothing can depress them.” Kosieva had lived to see the death of this hated regime and the birth of a new world.
Those were the days of triumphant globalization. Communism was falling. Apartheid was ending. The Arab-Israeli dispute was calming down. Europe was unifying. China was prospering. In the United States, a moderate Republican president, George H. W. Bush, gave way to the first Baby Boomer president, a moderate Democrat, Bill Clinton. The American economy grew nicely. The racial wealth gap narrowed. All the great systems of society seemed to be working: capitalism, democracy, pluralism, diversity, globalization. It seemed, as Francis Fukuyama wrote in his famous “The End of History?” essay for The National Interest, “an unabashed victory for economic and political liberalism.”
We think of the 1960s as the classic Boomer decade, but the false summer of the 1990s was the high-water mark of that ethos. The first great theme of that era was convergence. Walls were coming down. Everybody was coming together. The second theme was the triumph of classical liberalism. Liberalism was not just a philosophy—it was a spirit and a zeitgeist, a faith that individual freedom would blossom in a loosely networked democratic capitalist world. Enterprise and creativity would be unleashed. America was the great embodiment and champion of this liberation. The third theme was individualism. Society flourished when individuals were liberated from the shackles of society and the state, when they had the freedom to be true to themselves.
For his 2001 book, Moral Freedom, the political scientist Alan Wolfe interviewed a wide array of Americans. The moral culture he described was no longer based on mainline Protestantism, as it had been for generations. Instead, Americans, from urban bobos to suburban evangelicals, were living in a state of what he called moral freedom: the belief that life is best when each individual finds his or her own morality—inevitable in a society that insists on individual freedom.
When you look back on it from the vantage of 2020, moral freedom, like the other dominant values of the time, contained within it a core assumption: If everybody does their own thing, then everything will work out for everybody. If everybody pursues their own economic self-interest, then the economy will thrive for all. If everybody chooses their own family style, then children will prosper. If each individual chooses his or her own moral code, then people will still feel solidarity with one another and be decent to one another. This was an ideology of maximum freedom and minimum sacrifice.
It all looks naive now. We were naive about what the globalized economy would do to the working class, naive to think the internet would bring us together, naive to think the global mixing of people would breed harmony, naive to think the privileged wouldn’t pull up the ladders of opportunity behind them. We didn’t predict that oligarchs would steal entire nations, or that demagogues from Turkey to the U.S. would ignite ethnic hatreds. We didn’t see that a hyper-competitive global meritocracy would effectively turn all of childhood into elite travel sports where a few privileged performers get to play and everyone else gets left behind.
Over the 20 years after I sat with Kosieva, it all began to unravel. The global financial crisis hit; the Middle East was ripped apart by fanatics. On May 15, 2011, street revolts broke out in Spain, led by the self-declared Indignados—“the outraged.” “They don’t represent us!” they chanted at the Spanish establishment. It would turn out to be the cry of a decade.
We are living in the age of that disappointment. Millennials and members of Gen Z have grown up in the age of that disappointment, knowing nothing else. In the U.S. and elsewhere, this has produced a crisis of faith, across society but especially among the young. It has produced a crisis of trust.
The Trust Fall
Social trust is the confidence that other people will do what they ought to do most of the time. In a restaurant I trust you to serve untainted fish and you trust me not to skip out on the bill. Social trust is a generalized faith in the people of your community. It consists of smaller faiths. It begins with the assumption that we are interdependent, our destinies linked. It continues with the assumption that we share the same moral values. We share a sense of what is the right thing to do in different situations. As Kevin Vallier of Bowling Green State University argues in his forthcoming book, Trust in a Polarized Age, social trust also depends on a sense that we share the same norms. If two lanes of traffic are merging into one, the drivers in each lane are supposed to take turns. If you butt in line, I’ll honk indignantly. I’ll be angry, and I’ll want to enforce the small fairness rules that make our society function smoothly.
High-trust societies have what Fukuyama calls spontaneous sociability. People are able to organize more quickly, initiate action, and sacrifice for the common good. When you look at research on social trust, you find all sorts of virtuous feedback loops. Trust produces good outcomes, which then produce more trust. In high-trust societies, corruption is lower and entrepreneurship is catalyzed. Higher-trust nations have lower economic inequality, because people feel connected to each other and are willing to support a more generous welfare state. People in high-trust societies are more civically engaged. Nations that score high in social trust—like the Netherlands, Sweden, China, and Australia—have rapidly growing or developed economies. Nations with low social trust—like Brazil, Morocco, and Zimbabwe—have struggling economies. As the ethicist Sissela Bok once put it, “Whatever matters to human beings, trust is the atmosphere in which it thrives.”
During most of the 20th century, through depression and wars, Americans expressed high faith in their institutions. In 1964, for example, 77 percent of Americans said they trusted the federal government to do the right thing most or all of the time. Then came the last two moral convulsions. In the late 1960s and ’70s, amid Vietnam and Watergate, trust in institutions collapsed. By 1994, only one in five Americans said they trusted government to do the right thing. Then came the Iraq War and the financial crisis and the election of Donald Trump. Institutional trust levels remained pathetically low. What changed was the rise of a large group of people who were actively and poisonously alienated—who were not only distrustful but explosively distrustful. Explosive distrust is not just an absence of trust or a sense of detached alienation—it is an aggressive animosity and an urge to destroy. Explosive distrust is the belief that those who disagree with you are not just wrong but illegitimate. In 1997, 64 percent of Americans had a great or good deal of trust in the political competence of their fellow citizens; today only a third of Americans feel that way.
Falling trust in institutions is bad enough; it’s when people lose faith in each other that societies really begin to fall apart. In most societies, interpersonal trust is stable over the decades. But for some—like Denmark, where about 75 percent say the people around them are trustworthy, and the Netherlands, where two-thirds say so—the numbers have actually risen.
In America, interpersonal trust is in catastrophic decline. In 2014, according to the General Social Survey conducted by NORC at the University of Chicago, only 30.3 percent of Americans agreed that “most people can be trusted,” the lowest number the survey has recorded since it started asking the question in 1972. Today, a majority of Americans say they don’t trust other people when they first meet them.
Is mistrust based on distorted perception or is it a reflection of reality? Are people increasingly mistrustful because they are watching a lot of negative media and get a falsely dark view of the world? Or are they mistrustful because the world is less trustworthy, because people lie, cheat, and betray each other more than they used to?
There’s evidence to suggest that marital infidelity, academic cheating, and animal cruelty are all on the rise in America, but it’s hard to directly measure the overall moral condition of society—how honest people are, and how faithful. The evidence suggests that trust is an imprint left by experience, not a distorted perception. Trust is the ratio between the number of people who betray you and the number of people who remain faithful to you. It’s not clear that there is more betrayal in America than there used to be—but there are certainly fewer faithful supports around people than there used to be. Hundreds of books and studies on declining social capital and collapsing family structure demonstrate this. In the age of disappointment, people are less likely to be surrounded by faithful networks of people they can trust.
Thus the Harvard political scientist Robert Putnam argues that it’s a great mistake to separate the attitude (trust) from the behavior (morally right action). People become trusting when the world around them is trustworthy. When they are surrounded by people who live up to their commitments. When they experience their country as a fair place. As Vallier puts it, trust levels are a reflection of the moral condition of a nation at any given time. I’d add that high national trust is a collective moral achievement. High national distrust is a sign that people have earned the right to be suspicious. Trust isn’t a virtue—it’s a measure of other people’s virtue.
Unsurprisingly, the groups with the lowest social trust in America are among the most marginalized. Trust, like much else, is unequally distributed across American society, and the inequality is getting worse. Each of these marginalized groups has seen an additional and catastrophic decline in trust over the past few years.
Black Americans have been one of the most ill-treated groups in American history; their distrust is earned distrust. In 2018, 37.3 percent of white Americans felt that most people can be trusted, according to the General Social Survey, but only 15.3 percent of Black Americans felt the same. This is not general misanthropy. Black Americans have high trust in other Black Americans; it’s the wider society they don’t trust, for good and obvious reasons. And Black perceptions of America’s fairness have tumbled further in the age of disappointment. In 2002, 43 percent of Black Americans were very or somewhat satisfied with the way Black people are treated in the U.S. By 2018, only 18 percent felt that way, according to Gallup.
The second disenfranchised low-trust group includes the lower-middle class and the working poor. According to Tim Dixon, an economist and the co-author of a 2018 study that examined polarization in America, this group makes up about 40 percent of the country. “They are driven by the insecurity of their place in society and in the economy,” he says. They are distrustful of technology and are much more likely to buy into conspiracy theories. “They’re often convinced by stories that someone is trying to trick them, that the world is against them,” he says. Distrust motivated many in this group to vote for Donald Trump, to stick a thumb in the eye of the elites who had betrayed them.
This brings us to the third marginalized group that scores extremely high on social distrust: young adults. These are people who grew up in the age of disappointment. It’s the only world they know.
In 2012, 40 percent of Baby Boomers believed that most people can be trusted, as did 31 percent of members of Generation X. In contrast, only 19 percent of Millennials said most people can be trusted. Seventy-three percent of adults under 30 believe that “most of the time, people just look out for themselves,” according to a Pew survey from 2018. Seventy-one percent of those young adults say that most people “would try to take advantage of you if they got a chance.”
Many young people look out at a world they believe is screwed up and untrustworthy in fundamental ways. A mere 10 percent of Gen Zers trust politicians to do the right thing. Millennials are twice as likely as their grandparents to say that families should be able to opt out of vaccines. Only 35 percent of young people, versus 67 percent of old people, believe that Americans respect the rights of people who are not like them. Fewer than a third of Millennials say America is the greatest country in the world, compared to 64 percent of members of the Silent Generation.
Human beings need a basic sense of security in order to thrive; as the political scientist Ronald F. Inglehart puts it, their “values and behavior are shaped by the degree to which survival is secure.” In the age of disappointment, our sense of safety went away. Some of this is physical insecurity: school shootings, terrorist attacks, police brutality, and overprotective parenting at home that leaves young people incapable of handling real-world stress. But the true insecurity is financial, social, and emotional.
First, financial insecurity: By the time the Baby Boomers hit a median age of 35, their generation owned 21 percent of the nation’s wealth. As of last year, Millennials—who will hit an average age of 35 in three years—owned just 3.2 percent of the nation’s wealth.
Next, emotional insecurity: Americans today experience more instability than at any period in recent memory—fewer children growing up in married two-parent households, more single-parent households, more depression, and higher suicide rates.
Then, identity insecurity: People today live in what the late sociologist Zygmunt Bauman called liquid modernity. All the traits that were once assigned to you by your community, you must now determine on your own: your identity, your morality, your gender, your vocation, your purpose, and the place of your belonging. Self-creation becomes a major anxiety-inducing act of young adulthood.
Finally, social insecurity: In the age of social media our “sociometers”—the antennae we use to measure how other people are seeing us—are up and on high alert all the time. Am I liked? Am I affirmed? Why do I feel invisible? We see ourselves in how we think others see us. Their snarkiness turns into my self-doubt, their criticism into my shame, their obliviousness into my humiliation. Danger is ever present. “For many people, it is impossible to think without simultaneously thinking about what other people would think about what you’re thinking,” the educator Fredrik deBoer has written. “This is exhausting and deeply unsatisfying. As long as your self-conception is tied up in your perception of other people’s conception of you, you will never be free to occupy a personality with confidence; you’re always at the mercy of the next person’s dim opinion of you and your whole deal.”
In this world, nothing seems safe; everything feels like chaos.
The Distrust Mindset
Distrust sows distrust. It produces the spiritual state that Emile Durkheim called anomie, a feeling of being disconnected from society, a feeling that the whole game is illegitimate, that you are invisible and not valued, a feeling that the only person you can really trust is yourself.
Distrustful people try to make themselves invulnerable, armor themselves up in a sour attempt to feel safe. Distrust and spiritual isolation lead people to flee intimacy and try to replace it with stimulation. Distrust, anxiety, and anomie are at the root of the 73 percent increase in depression among Americans aged 18 to 25 from 2007 to 2018, and of the shocking rise in suicide. “When we have no one to trust, our brains can self-destruct,” Ulrich Boser writes in his book on the science of trust, The Leap.
People plagued by distrust can start to see threats that aren’t there; they become risk averse. Americans take fewer risks and are much less entrepreneurial than they used to be. In 2014, the rate of business start-ups hit a nearly 40-year low. Since the early 1970s, the rate at which people move across state lines each year has dropped by 56 percent. People lose faith in experts. They lose faith in truth, in the flow of information that is the basis of modern society. “A world of truth is a world of trust, and vice versa,” Rabbi Jonathan Sacks writes in his book Morality.
In periods of distrust, you get surges of populism; populism is the ideology of those who feel betrayed. Contempt for “insiders” rises, as does suspicion toward anybody who holds authority. People are drawn to leaders who use the language of menace and threat, who tell group-versus-group power narratives. You also get a lot more political extremism. People seek closed, rigid ideological systems that give them a sense of security. As Hannah Arendt once observed, fanaticism is a response to existential anxiety. When people feel naked and alone, they revert to tribe. Their radius of trust shrinks, and they only trust their own kind. Donald Trump is the great emblem of an age of distrust—a man unable to love, unable to trust. When many Americans see Trump’s distrust, they see a man who looks at the world as they do.
By February 2020, America was a land mired in distrust. Then the plague arrived.
The Failure of Institutions
From the start, the pandemic has hit the American mind with sledgehammer force. Anxiety and depression have spiked. In April, Gallup recorded a record drop in self-reported well-being, as the share of Americans who said they were thriving fell to the same low point as during the Great Recession. These kinds of drops tend to produce social upheavals. A similar drop was seen in Tunisian well-being just before the street protests that led to the Arab Spring.
The emotional crisis seems to have hit low-trust groups the hardest. Pew found that “low trusters” were more nervous during the early months of the pandemic, more likely to have trouble sleeping, more likely to feel depressed, less likely to say the public authorities were responding well to the pandemic. Eighty-one percent of Americans under 30 reported feeling anxious, depressed, lonely, or hopeless at least one day in the previous week, compared to 48 percent of adults 60 and over.
Americans looked to their governing institutions to keep them safe. And nearly every one of their institutions betrayed them. The president downplayed the crisis, and his administration was a daily disaster area. The Centers for Disease Control and Prevention produced faulty tests, failed to provide up-to-date data on infections and deaths, and didn’t provide a trustworthy voice for a scared public. The Food and Drug Administration wouldn’t allow private labs to produce their own tests without a lengthy approval process.
The sense of betrayal was magnified when people looked abroad. In nations that ranked high on the World Values Survey measure of interpersonal trust—like China, Australia, and most of the Nordic states—leaders were able to mobilize quickly, come up with a plan, and count on citizens to comply with the new rules. In low-trust nations—like Mexico, Spain, and Brazil—there was less planning, less compliance, less collective action, and more death. Countries that fell somewhere in the middle—including the U.S., Germany, and Japan—had a mixed record depending on the quality of their leadership. South Korea, where more than 65 percent of people say they trust government when it comes to health care, was able to build a successful test-and-trace regime. In America, where only 31 percent of Republicans and 44 percent of Democrats say the government should be able to use cellphone data to track compliance with experts’ coronavirus social-contact guidelines, such a system was never really implemented.
For decades, researchers have been warning about institutional decay. Institutions get caught up in one of those negative feedback loops that are so common in a world of mistrust. They become ineffective and lose legitimacy. People who lose faith in them tend not to fund them. Talented people don’t go to work for them. They become more ineffective still. In 1969, Daniel Patrick Moynihan made this core point in a memo to his boss-to-be, President-elect Richard Nixon: “In one form or another all of the major domestic problems facing you derive from the erosion of the authority of the institutions of American society. This is a mysterious process of which the most that can be said is that once it starts it tends not to stop.”
On the right, this anti-institutional bias has manifested itself as hatred of government; an unwillingness to defer to expertise, authority, and basic science; and a reluctance to fund the civic infrastructure of society, such as a decent public health system. In state after state, Republican governors sat inert, unwilling to organize or to exercise authority, believing that individuals should be free to take care of themselves.
On the left, distrust of institutional authority has manifested as a series of checks on power that have given many small actors the power to stop common plans, producing what Fukuyama calls a vetocracy. Power to the people has meant no power to do anything, and the result is a national NIMBYism that blocks social innovation in case after case.
In 2020, American institutions groaned and sputtered. Academics wrote up plan after plan and lobbed them onto the internet. Few of them went anywhere. America had lost the ability to build new civic structures to respond to ongoing crises like climate change, opioid addiction, and pandemics, or to reform existing ones.
In high-trust eras, according to Yuval Levin, who is an American Enterprise Institute scholar and the author of A Time to Build: From Family and Community to Congress and the Campus, How Recommitting to Our Institutions Can Revive the American Dream, people have more of a “first-person-plural” instinct to ask, “What can we do?” In a lower-trust era like today, Levin told me, “there is a greater instinct to say, ‘They’re failing us.’ We see ourselves as outsiders to the systems—an outsider mentality that’s hard to get out of.”
Americans haven’t just lost faith in institutions; they’ve come to loathe them, even to think that they are evil. A Democracy Fund + UCLA Nationscape survey found that 55 percent of Americans believe that the coronavirus that causes COVID-19 was created in a lab, and 59 percent believe that the U.S. government is concealing the true number of deaths. Half of all Fox News viewers believe that Bill Gates is plotting a mass-vaccination campaign so he can track people. This spring, nearly a third of Americans were convinced that it was probably or definitely true that a vaccine existed but was being withheld by the government. When Trump was hospitalized for COVID-19 on October 2, many people conspiratorially concluded that the administration was lying about his positive diagnosis for political gain. When government officials briefed the nation about how sick he was, many people assumed they were obfuscating, which in fact they were.
The failure of and withdrawal from institutions decimated America’s pandemic response, but the damage goes beyond that. That’s because institutions like the law, the government, the police, and even the family don’t merely serve social functions, Levin said; they form the individuals who work and live within them. The institutions provide rules to live by, standards of excellence to live up to, social roles to fulfill.
By 2020, people had stopped seeing institutions as places they entered to be morally formed, Levin argued. Instead, they see institutions as stages on which they can perform, can display their splendid selves. People run for Congress not so they can legislate, but so they can get on TV. People work in companies so they can build their personal brand. The result is a world in which institutions not only fail to serve their social function and keep us safe, they also fail to form trustworthy people. The rot in our structures spreads to a rot in ourselves.
The Failure of Society
The coronavirus has confronted America with a social dilemma. A social dilemma, the University of Pennsylvania scholar Cristina Bicchieri notes, is “a situation in which each group member gets a higher outcome if she pursues her individual self-interest, but everyone in the group is better off if all group members further the common interest.” Social distancing is a social dilemma. Many low-risk individuals have been asked to endure some large pain (unemployment, bankruptcy) and some small inconvenience (mask wearing) for the sake of the common good. If they could make and keep this moral commitment to each other in the short term, the curve would be crushed, and in the long run we’d all be better off. It is the ultimate test of American trustworthiness.
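Bicchieri's definition has the precise structure of a public-goods game. The sketch below illustrates it with made-up numbers (the `payoff` function, the cooperation cost of 1, and the shared benefit of 2 are illustrative assumptions, not anything from the essay): each member does better by defecting no matter what others do, yet a group of all cooperators ends up better off than a group of all defectors.

```python
def payoff(my_move: str, n_cooperators: int, group_size: int) -> float:
    """Toy payoff for one group member: cooperating costs 1, and each
    cooperator adds a benefit of 2 that is split evenly across the group.
    (All numbers are illustrative; only the payoff structure matters.)"""
    shared_benefit = 2.0 * n_cooperators / group_size
    my_cost = 1.0 if my_move == "cooperate" else 0.0
    return shared_benefit - my_cost

GROUP = 10

# If the other nine cooperate, defecting pays more than cooperating...
cooperate = payoff("cooperate", n_cooperators=GROUP, group_size=GROUP)
defect = payoff("defect", n_cooperators=GROUP - 1, group_size=GROUP)
assert defect > cooperate  # individual self-interest says: defect

# ...yet universal cooperation beats universal defection for everyone.
all_cooperate = payoff("cooperate", n_cooperators=GROUP, group_size=GROUP)
all_defect = payoff("defect", n_cooperators=0, group_size=GROUP)
assert all_cooperate > all_defect  # the common interest says: cooperate
```

In this toy model, mask wearing or distancing is the "cooperate" move: any one person gains by skipping it, but a society in which everyone skips it is worse off than one in which everyone complies.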
In March and April, vast majorities of Americans said they supported social distancing, and society seemed to be coming together. It didn’t last. Americans locked down a bit in early March, but never as much as people in some other countries. By mid-April, they told themselves—and pollsters—that they were still socially distancing, but that was increasingly a self-deception. While pretending to be rigorous, people relaxed and started going out. It was like watching somebody gradually give up on a diet. There wasn’t a big moment of capitulation, just an extra chocolate bar here, a bagel there, a scoop of ice cream before bed. By May, most people had become less strict about quarantining. Many states officially opened up in June when infection rates were still much higher than in countries that had successfully contained the disease. On June 20, 500,000 people went to reopened bars and nightspots in Los Angeles County alone.
You can blame Trump or governors or whomever you like, but in reality this was a mass moral failure of Republicans and Democrats and independents alike. This was a failure of social solidarity, a failure to look out for each other.
Alexis de Tocqueville discussed a concept called the social body. Americans were clearly individualistic, he observed, but they shared common ideas and common values, and could, when needed, produce common action. They could form a social body. Over time, those common values eroded, and were replaced by a value system that put personal freedom above every other value. When Americans were confronted with the extremely hard task of locking down for months without any of the collective resources that would have made it easier—habits of deference to group needs; a dense network of community bonds to help hold each other accountable; a history of trust that if you do the right thing, others will too; preexisting patterns of cooperation; a sense of shame if you deviate from the group—they couldn’t do it. America failed.
By August, most Americans understood the failure. Seventy-two percent of Danes said they felt more united after the COVID-19 outbreak. Only 18 percent of Americans felt the same.
In the spring and summer of 2020, six years of moral convulsion came to a climax. This wasn’t just a political and social crisis; it was also an emotional trauma. The week before George Floyd was killed, the National Center for Health Statistics released data showing that a third of all Americans were showing signs of clinical anxiety or depression. By early June, after Floyd’s death, the percentage of Black Americans showing clinical signs of depression and anxiety disorders had jumped from 36 to 41 percent. Depression and anxiety rates were three times those of the year before. At the end of June, one-quarter of young adults aged 18 to 24 said they had contemplated suicide during the previous 30 days.
In the immediate aftermath of his death, Floyd became the emblematic American—the symbol of a society in which no one, especially Black Americans, was safe. The protests, which took place in every state, were diverse. The young white people at those marches weren’t only marching as allies of Black people. They were marching for themselves, as people who grew up in a society they couldn’t fully trust. Two low-trust sectors of American society formed an alliance to demand change.
By late June, American national pride was lower than at any time since Gallup started measuring, in 2001. American happiness rates were at their lowest level in nearly 50 years. In another poll, 71 percent of Americans said they were angry about the state of the country, and just 17 percent said they were proud. According to an NBC News/Wall Street Journal poll, 80 percent of American voters believe that “things in the country are out of control.” Gun sales in June were 145 percent higher than in the previous year. By late June, it was clear that America was enduring a full-bore crisis of legitimacy, an epidemic of alienation, and a loss of faith in the existing order.
Years of distrust burst into a torrent of rage. There were times when the entire social fabric seemed to be disintegrating. Violence rocked places like Portland, Kenosha, and beyond. The murder rates soared in city after city. The most alienated, anarchic actors in society—antifa, the Proud Boys, QAnon—seemed to be driving events. The distrust doom loop was now at hand.
The Age of Precarity
Cultures are collective responses to common problems. But when reality changes, culture takes a few years, and a moral convulsion, to completely shake off the old norms and values.
The culture that is emerging, and which will dominate American life over the next decades, is a response to a prevailing sense of threat. This new culture values security over liberation, equality over freedom, the collective over the individual. We’re seeing a few key shifts.
From risk to security. As Albena Azmanova, a political theorist at the University of Kent, has argued, we’ve entered an age of precarity in which every political or social movement has an opportunity pole and a risk pole. In the opportunity mentality, risk is embraced because of the upside possibilities. In the risk mindset, security is embraced because people need protection from downside dangers. In this period of convulsion, almost every party and movement has moved from its opportunity pole to its risk pole. Republicans have gone from Reaganesque free trade and open markets to Trumpesque closed borders. Democrats have gone from the neoliberalism of Kennedy and Clinton to security-based policies like a universal basic income and the protections offered by a vastly expanded welfare state. Campus culture has gone from soft moral relativism to strict moralism. Evangelicalism has gone from the open evangelism of Billy Graham to the siege mentality of Franklin Graham.
From achievement to equality. The culture that emerged from the 1960s upheavals put heavy emphasis on personal development and personal growth. The Boomers emerged from, and then purified, a competitive meritocracy that put career achievement at the center of life and boosted those who succeeded into ever more exclusive lifestyle enclaves.
In the new culture we are entering, that meritocratic system looks more and more like a ruthless sorting system that excludes the vast majority of people, rendering their life precarious and second class, while pushing the “winners” into a relentless go-go lifestyle that leaves them exhausted and unhappy. In the emerging value system, “privilege” becomes a shameful sin. The status rules flip. The people who have won the game are suspect precisely because they’ve won. Too-brazen signs of “success” are scrutinized and shamed. Equality becomes the great social and political goal. Any disparity—racial, economic, meritocratic—comes to seem hateful.
From self to society. If we’ve lived through an age of the isolated self, people in the emerging culture see embedded selves. Socialists see individuals embedded in their class group. Right-wing populists see individuals as embedded pieces of a national identity group. Left-wing critical theorists see individuals embedded in their racial, ethnic, gender, or sexual-orientation identity group. Each person speaks from the shared group consciousness. (“Speaking as a progressive gay BIPOC man …”) In an individualistic culture, status goes to those who stand out; in collectivist moments, status goes to those who fit in. The cultural mantra shifts from “Don’t label me!” to “My label is who I am.”
From global to local. A community is a collection of people who trust each other. Government follows the rivers of trust. When there is massive distrust of central institutions, people shift power to local institutions, where trust is higher. Power flows away from Washington to cities and states.
From liberalism to activism. Baby Boomer political activism began with a free-speech movement. This was a generation embedded in Enlightenment liberalism, which was a long effort to reduce the role of passions in politics and increase the role of reason. Politics was seen as a competition between partial truths.
Liberalism is ill-suited for an age of precarity. It demands that we live with a lot of ambiguity, which is hard when the atmosphere already feels unsafe. Furthermore, it is thin. It offers an open-ended process of discovery when what people hunger for is justice and moral certainty. Moreover, liberalism’s niceties come to seem like a cover that oppressors use to mask and maintain their systems of oppression. Public life isn’t an exchange of ideas; it’s a conflict of groups engaged in a vicious death struggle. Civility becomes a “code for capitulation to those who want to destroy us,” as the journalist Dahlia Lithwick puts it.
The cultural shifts we are witnessing offer more safety to the individual at the cost of clannishness within society. People are embedded more in communities and groups, but in an age of distrust, groups look at each other warily, angrily, viciously. The shift toward a more communal viewpoint is potentially a wonderful thing, but it leads to cold civil war unless there is a renaissance of trust. There’s no avoiding the core problem. Unless we can find a way to rebuild trust, the nation does not function.
How to Rebuild Trust
When you ask political scientists or psychologists how a culture can rebuild social trust, they aren’t much help. There just haven’t been that many recent cases they can study and analyze. Historians have more to offer, because they can cite examples of nations that have gone from pervasive social decay to relative social health. The two most germane to our situation are Great Britain between 1830 and 1848 and the United States between 1895 and 1914.
People in these eras lived through experiences parallel to ours today. They saw the massive economic transitions caused by the Industrial Revolution. They experienced great waves of migration, both within the nation and from abroad. They lived with horrific political corruption and state dysfunction. And they experienced all the emotions associated with moral convulsions—the sort of indignation, shame, guilt, and disgust we’re experiencing today. In both periods, a highly individualistic and amoral culture was replaced by a more communal and moralistic one.
But there was a crucial difference between those eras and our own, at least so far. In both cases, moral convulsion led to frenetic action. As Richard Hofstadter put it in The Age of Reform, the feeling of indignation sparked a fervent and widespread desire to assume responsibility, to organize, to build. During these eras, people built organizations at a dazzling pace. In the 1830s, the Clapham Sect, a religious revival movement, campaigned for the abolition of slavery and promoted what we now think of as Victorian values. The Chartists, a labor movement, gathered the working class and motivated them to march and strike. The Anti-Corn Law League worked to reduce the power of the landed gentry and make food cheaper for the workers. These movements agitated from both the bottom up and the top down.
As Robert Putnam and Shaylyn Romney Garrett note in their forthcoming book, The Upswing, the American civic revival that began in the 1870s produced a stunning array of new organizations: the United Way, the NAACP, the Boy Scouts, the Forest Service, the Federal Reserve System, 4-H clubs, the Sierra Club, the settlement-house movement, the compulsory-education movement, the American Bar Association, the American Legion, the ACLU, and on and on. These were missional organizations, with clearly defined crusading purposes. They put tremendous emphasis on cultivating moral character and social duty—on honesty, reliability, vulnerability, and cooperativeness, and on shared values, rituals, and norms. They tended to place responsibility on people who had not been granted power before. “Few things help an individual more than to place responsibility upon him, and to let him know that you trust him,” Booker T. Washington wrote in his 1901 autobiography.
After the civic revivals, both nations witnessed frenetic political reform. During the 1830s, Britain passed the Reform Act, which widened the franchise; the Factory Act, which regulated workplaces; and the Municipal Corporations Act, which reformed local government. The Progressive Era in America saw an avalanche of reform: civil-service reform; food and drug regulation; the Sherman Act, which battled the trusts; the secret ballot; and so on. Civic life became profoundly moralistic, but political life became profoundly pragmatic and anti-ideological. Pragmatism and social-science expertise were valued.
Can America in the 2020s turn itself around the way the America of the 1890s, or the Britain of the 1830s, did? Can we create a civic renaissance and a legislative revolution? I’m not so sure. If you think we’re going back to the America that used to be—with a single cohesive mainstream culture; with an agile, trusted central government; with a few mainstream media voices that police a coherent national conversation; with an interconnected, respected leadership class; with a set of dominant moral values based on mainline Protestantism or some other single ethic—then you’re not being realistic. I see no scenario in which we return to being the nation we were in 1965, with a cohesive national ethos, a clear national establishment, trusted central institutions, and a pop-culture landscape in which people overwhelmingly watched the same shows and talked about the same things. We’re too beaten up for that. The age of distrust has smashed the converging America and the converging globe—that great dream of the 1990s—and has left us with the reality that our only plausible future is decentralized pluralism.
A model for that can be found in, of all places, Houston, Texas, one of the most diverse cities in America. At least 145 languages are spoken in the metro area. It has no real central downtown district, but, rather, a wide diversity of scattered downtowns and scattered economic and cultural hubs. As you drive across town you feel like you’re successively in Lagos, Hanoi, Mumbai, White Plains, Beverly Hills, Des Moines, and Mexico City. In each of these cultural zones, these islands of trust, there is a sense of vibrant activity and experimentation—and across the whole city there is an atmosphere of openness, and goodwill, and the American tendency to act and organize that Hofstadter discussed in The Age of Reform.
Not every place can or would want to be Houston—its cityscape is ugly, and I’m not a fan of its too-libertarian zoning policies—but in that rambling, scattershot city I see an image of how a hyper-diverse, and more trusting, American future might work.
The key to making decentralized pluralism work still comes down to one question: Do we have the energy to build new organizations that address our problems, the way the Brits did in the 1830s and Americans did in the 1890s? Personal trust can exist informally between two friends who rely on each other, but social trust is built within organizations in which people are bound together to do joint work, in which they struggle together long enough for trust to gradually develop, in which they develop shared understandings of what is expected of each other, in which they are enmeshed in rules and standards of behavior that keep them trustworthy when their commitments might otherwise falter. Social trust is built within the nitty-gritty work of organizational life: going to meetings, driving people places, planning events, sitting with the ailing, rejoicing with the joyous, showing up for the unfortunate. Over the past 60 years, we have given up on the Rotary Club and the American Legion and other civic organizations and replaced them with Twitter and Instagram. Ultimately, our ability to rebuild trust depends on our ability to join and stick to organizations.
The period between the deaths of Eric Garner and Michael Brown in the summer of 2014 and the election of November 2020 represents the latest in a series of great transitional moments in American history. Whether we emerge from this transition stronger depends on our ability, from the bottom up and the top down, to build organizations targeted at our many problems. If history is any guide, this will be the work not of months, but of one or two decades.
For centuries, America was the greatest success story on earth, a nation of steady progress, dazzling achievement, and growing international power. That story threatens to end on our watch, crushed by the collapse of our institutions and the implosion of social trust. But trust can be rebuilt through the accumulation of small heroic acts—by the outrageous gesture of extending vulnerability in a world that is mean, by proffering faith in other people when that faith may not be returned. Sometimes trust blooms when somebody holds you against all logic, when you expected to be dropped. It ripples across society as multiplying moments of beauty in a storm.
Some commodity-rich economies have resolved these tensions. Australia, for example, shared many of the traits of early 20th-century Argentina: lots of commodities, a history of immigration and remoteness from big industrial centres. Yet it managed to develop a broader-based economy than Argentina and grew faster. Between 1929 and 1975 Australian income per person increased at an average annual rate of 0.96%, compared with 0.67% in Argentina.
Australia had some big advantages: the price of minerals does not affect domestic consumers in the same way as the price of food, for instance. But it also had the institutions to balance competing interests: a democracy in which the working class was represented; an apprenticeship system; an independent Tariff Board to advise the government on trade. Argentina had not evolved this political apparatus, despite an early move to universal male suffrage in 1912. The third theory for Argentine decline points to the lack of institutions to develop long-term state policies—what Argentines call política de Estado.
The constant interruptions to democracy are not the only manifestation of this institutional weakness. The Supreme Court has been overhauled several times since Perón first changed its membership in 1946. Presidents have a habit of tinkering with the constitution to allow them to serve more terms: Ms Fernández was heading this way before poor mid-term election results last year weakened her position.
Property rights are insecure: ask Repsol, the Spanish firm whose stake in YPF, an Argentine oil company, was nationalised in 2012. Statistics cannot be trusted: Argentina was due this week to unveil new inflation data in a bid to avoid censure from the IMF for its wildly undercooked previous estimates. Budgets can be changed at will by the executive. Roberto Lavagna, a former economy minister, would like to see a requirement for parliamentary approval of budget amendments.
Throughout this lengthy history, social changes seem to offer a fertile substrate for the evolution of complex, innovative systems of interpreting reality, of attributing causes to and controlling events, and of experiencing emotions. A critical study of the historical development and the interpretations of mental diseases may contribute to providing an explanation for the means of psychopathological expression. Moreover, it may provoke a re-discussion of the threshold and vulnerability concept in cases where it could be hypothesized that the new cognitive systems, although adaptive to new social requirements, might represent a factor of vulnerability (“culturally specific”) to specific mental disorders.
We have seen that both the symptomatic expression of women’s malaise and the culturally specific interpretation of that malaise bear witness to the changing role of women: from incomprehensible Being (and therefore an instrument of Evil), to frail creature who nevertheless tries to manipulate the environment to her own ends (in Freud’s view), to arbiter of her own fate (in the modern transformation from hysteria to melancholia), where the woman seems to have traded power for loneliness and guilt.
What's behind precipitous decline in America's morality?
The U.S. isn't only headed for bankruptcy when it comes to our finances. It looks like we could be going morally bankrupt, too.
A new Gallup poll paints a depressing picture of the state of our moral values in the U.S.
Forty-five percent of those surveyed describe morality in this country as "poor"; only 15 percent, fewer than one in five, say "excellent" or "good."
These numbers rank among the worst in this poll over the last decade.
The survey also shows 76 percent of Americans say moral values in the U.S. are getting worse; only 14 percent say they're getting better.
Poll respondents give many examples of how moral values are getting worse: from the disrespect of others to parents not teaching their children good values, from dishonesty among government and business leaders to rising crime, the loss of religion, the breakdown of the family structure and people not being accountable for their own behavior. No one's responsible for anything anymore. Everybody's a victim.
And believe it or not, this is one thing all political parties agree on: neither Republicans, Democrats nor independents give positive ratings of moral values.
It's a crying shame that in this great, free country where anything is possible, we as a society have such a negative view of how we behave and how we treat each other. It's hard to imagine how we can come together to solve our problems if we have such a poor opinion of the next guy.
Here’s my question to you: What's behind a precipitous decline in America's morality?
Interested to know which ones made it on air?
The rampant narcissism, sense of entitlement and lack of consideration for your fellow human being. People feel like they should be able to do whatever they want, whenever they want, and screw you if you don't like it. Part of it is parenting, but the other part is society and celebrities condoning that attitude. "My family thinks I'm awesome. Why don't you?"
David in Oregon writes:
What's behind the decline in morality? Simple: media. Television, internet, video games, music, and cell phones all play a role in the demoralization of this country. Cry freedom of speech all you like but when nothing is censored then everything eventually leads to the lowest common denominator.
So much of the problem with morals in our society can be traced to the breakdown in the family. Children are not given the guidance that leads to self-respect and if you don't respect yourself, how can you have any respect for your fellow human beings? And if the children try to look up for direction, all they see is the lure of the almighty buck, from the corporations to the government.
I don't think there is any decline in morality these days. Consider that 150 years ago we kept human beings as slaves, 100 years ago American workers worked in terrible conditions for low wages with no safety net, and 50 years ago African Americans were still being lynched. I think we've come a long way and I'd much rather live into today's society than the world of the past.
Jack, This is what happens when people stop going to church. Until people hit "rock bottom" in their carefree lives, religion is not cool. But when they do melt down, I'll bet few turn to Hugh Hefner for answers.
Morality started declining when we stopped wearing hats and gloves and suits and ties out to dinner and a movie.
Lies, damned lies and the truth about Joe Biden
Nancy Pelosi dismissed Tara Reade’s accusations of sexual assault against Joe Biden. “I know him,” said the House Speaker authoritatively, and that was that.
Does Biden’s record warrant such confidence? Not really. In fact, Biden has a long history of lying — about himself, about his past and about events that never took place.
Democrats want the 2020 campaign to be a referendum on President Trump. Fine, but if this is to be a contest of characters, it is only appropriate that Joe Biden’s history of fabrication and deceit – often intended to bolster his intellectual credentials – also be fair game.
Over the past year, Biden thundered that the Obama administration “didn’t lock people up in cages.” He also claimed that, “Immediately, the moment [the Iraq War] started, I came out against it.” And… “I was always labeled one of the most liberal members of Congress.” Politico’s rating of all three assertions? False.
No one should be surprised. Lest we forget…
A video is making the rounds in which Biden boasts at a 1987 rally, "I went to law school on a full academic scholarship…[and] ended up in the top half of my class."
Biden also maintained that he "graduated with three degrees from undergraduate school" and was the “outstanding student in the political science department.”
Not one of those claims was true, as newscasters at the time affirmed. In fact, Biden graduated 76th of 85 students in his law school class, had only a partial scholarship and did not win top honors in his undergraduate discipline.
Biden explained in his 2007 autobiography “Promises to Keep” that he had been angry at that rally since “it sounded to me that one of my own supporters doubted my intelligence." According to a 1987 Newsweek piece, a supporter had “politely” asked Biden what law school he attended and how well he had done.
Biden bristled, saying “I think I have a much higher IQ than you do,” reeled off his fabricated accomplishments and concluded “I’d be delighted to sit down and compare my IQ to yours if you’d like, Frank.”
The episode reminds us of Biden recently snapping “You’re full of sh*t” at an auto worker who dared to challenge Biden’s stance on guns or calling an Iowa voter a “damn liar” for insinuating that Biden had helped his son gain access in Ukraine.
The Newsweek reporter wrote that Biden appears “hyper, glib and intellectually insecure,” and says the 1987 encounter was critical to understanding why Biden’s first run at higher office flopped. “The clip…reflects a view of Biden's character widely shared in the community. Reporters and political consultants long ago concluded that Biden's chief character flaw was his tendency to wing it. He seems to lack a crucial synapse between brain and tongue, the one that makes the do-I-really-want-to-say-this decision.”
That commentary holds up well, as today more than ever Biden blunders into conversational crevasses, with no way out. (Think: "If they believe Tara Reade, they probably shouldn't vote for me.” A new Harvard-Harris poll shows 55 percent of the country believes Tara Reade. Game. Set. Match.)
Biden’s 1987 campaign foundered also because he was caught lifting passages of a speech given by Neil Kinnock. Biden echoed (falsely) the British Labor leader’s history that he was the first "in a thousand generations" to graduate from college and repeated virtually verbatim the same story about his wife, just as Kinnock had.
More shocking, Biden claimed: “My ancestors…worked in the coal mines of Northeast Pennsylvania and would come up after 12 hours and play football for four hours,’’ even though no one in Biden’s family tree ever worked underground. That was Kinnock’s family.
It wasn’t the first time Biden had been caught plagiarizing. During law school, he “borrowed” an entire five pages from a published law review article without attribution and had to beg not to be expelled.
Interestingly, just last summer complaints arose about Biden “borrowing” the work of others, in putting together his climate plan. As Vox reported, Biden’s plan “contains a number of passages that seem to have been copied and pasted, at times with very superficial changes” from a variety of sources.
Biden supporters will dismiss these episodes as being in the distant past. But Biden’s tendency to mislead did not expire in 1988. More recently, the former vice president has told audiences that after his stint in the White House, “I became a teacher. I became a professor.” While it is true that he took a lofty salary to make a handful of speeches for the University of Pennsylvania, Biden has never taught students.
Then there was the inspiring tale of visiting Afghanistan to honor a heroic naval officer. Biden described the officer’s actions in detail, adding, “This is God’s truth, my word as a Biden.” But according to a review in the Washington Post, no such incident occurred. Biden was lucky not to be hit by lightning.
There were also Biden’s claims of having been arrested in the 1970s because he tried to visit Nelson Mandela in prison. Nope, didn’t happen. He has also cast himself as a civil rights activist and co-sponsor of the Endangered Species Act; those things aren’t true either.
Character does not change. Biden’s winning smile and genial nature have granted him license to mislead. But as Biden denies alleged misdeeds related to General Flynn, to his son Hunter’s involvement in Ukraine or to Tara Reade, his history of bending the truth is informative.
Democrats will counter that President Trump frequently exaggerates and embellishes, which is true. But we know Trump; he has been on the griddle for nearly four years, continually stripped and flayed by a hostile press.
Many of us are just getting to know Joe Biden.
Liz Peek is a former partner of major bracket Wall Street firm Wertheim & Company. Follow her on Twitter @lizpeek.
Kemmu Restoration: 1333-1336 CE
The Kemmu Restoration was a turbulent transition period between the Kamakura and Ashikaga Periods. The emperor at the time, Go-Daigo (r. 1318-1339), tried to take advantage of the discontent caused by the strain of remaining war-ready after the attempted Mongol invasions in order to reclaim real power from the shogunate.
He was exiled after two attempts, but returned from exile in 1333 and enlisted the help of warlords who were disaffected with the Kamakura Shogunate. With the help of Ashikaga Takauji and another warlord, Go-Daigo toppled the Kamakura Shogunate in 1333.
However, Ashikaga wanted the title of shogun and Go-Daigo refused to grant it, so the emperor was driven out again and Ashikaga installed a more compliant emperor, establishing himself as shogun and beginning the Ashikaga Period.
History, Critical and Patriotic
Park Ranger Jim Hollister leads a school group at Minute Man National Historical Park in Massachusetts.
When my mother passed away at the ripe old age of 90, several years ago, my brothers and I had the bittersweet task of emptying out the home she and my father had lived in for well over half a century, and where we grew up. We took various keepsakes and mementoes. I made a beeline for the books and magazines. While leafing through, I realized how much my picture of America had been formed by them and the tempered but patriotic history they conveyed. They reflected the middlebrow culture of mid-twentieth century America, which carried many of my generation through the turmoil of social change, war, and political crisis. And they reminded me of the need for robust history and civic education today.
The first collection of books I recovered was from when I was quite young. It was the Landmark series of histories for young people, conceived by Bennett Cerf of Random House and launched in 1948 with books by topnotch novelists like Dorothy Canfield Fisher, C. S. Forester, and Robert Penn Warren, and war correspondents like William L. Shirer, Quentin Reynolds, and Richard Tregaskis. It eventually ran to some 180 volumes and covered not just American history but everything from the pharaohs of ancient Egypt to the United Nations in war and peace. Although mainly out of print, they retain some appeal to homeschooling parents and are easy to find in used bookstores.
Next I found my old copy of Kenneth Roberts’ historical novel about the American retreat from Canada in 1776 and the Saratoga campaign of 1777, Rabble in Arms. In it, Roberts turned my 12-year-old historical consciousness upside down by making Benedict Arnold out to be a hero, by showing how Arnold’s military skill accounted for the deferral of one British invasion of the northern United States and the defeat of another. Roberts described in terms more vivid than all but the best historians what it was like to fight a lake battle in upstate New York in late autumn, be inoculated against smallpox, and deal with the stupidities of legislative politics. Like his contemporaries Walter Edmonds (Drums Along the Mohawk) and Esther Forbes (Johnny Tremain), he made the colonial and revolutionary past live.
The Landmark series of histories for young people, launched in 1948 by Bennett Cerf of Random House, has since gone out of print.
And then I discovered old copies of American Heritage magazine going back to the early 1960s. Once a minor publication of the American Association for State and Local History, it was relaunched in 1954 as a handsome, 120-page hardcover magazine. The October 1961 issue was fairly typical. At the top of the masthead stood editorial director Joseph J. Thorndike, who after a stint at Time had been recruited to be the managing editor of Life. The senior editor was Bruce Catton, the prolific popular historian of the Civil War; the managing editor was Eric Larrabee, who later wrote one of the most thorough and accessible studies of Franklin Roosevelt as commander in chief. Assistant and associate editors included Richard Ketchum and Stephen Sears, excellent historians of the American Revolution and the Civil War. Authors in that issue included Hugh MacLennan, a prize-winning professor of English at McGill University, writing about Canadian voyageurs; Mark Schorer, a University of California, Berkeley professor and biographer of Sinclair Lewis, on the writing of Main Street; and John Lukacs, one of the most original historians of twentieth-century Europe, writing about George Bancroft, one of the fathers of American history. It wasn’t fluff.
There was a progression here for a young person fascinated by the past and able to engage it at a number of levels, one which unquestionably played a role in shaping my attitudes, and not only mine, to politics. These were works of patriotic history, celebrating the American past and American heroes. They did not, nor did they need to, gloss over the stains and horrors. The heroes could be southern senators standing up to the Ku Klux Klan in the 1920s or Chief Joseph leading his small tribe in a fight against the United States Army in the 1870s. And the tales could include accounts of political corruption, ambiguous loyalties, and mayhem—patriotic history does not have to conceal any of that, nor need it ignore the ambiguities of the past. But the key was that this was my history, to own and to celebrate, even though my grandparents were immigrants.
Chief Joseph led his small tribe in a fight against the United States Army in the 1870s.
A shared story
Particularly for Americans, patriotic history is a kind of glue for an extraordinarily diverse republic. Lincoln used a patriotic version of the nation’s revolutionary past and founding generation to hold the Union together and provide meaning and redemptive hope after the slaughter of hundreds of thousands during the Civil War. The Gettysburg Address, after all, begins by recalling the Declaration of Independence and defining the meaning of the Revolution. And Lincoln in turn became a figure to inspire succeeding generations.
Yet patriotic history is more suspect these days than it was when I was its young student, 50 years ago. In 2014, Kenneth Pomeranz, completing his term as president of the American Historical Association, chose as the topic of his presidential address, “Histories for a Less National Age.” While grudgingly conceding that nations or states remain important because they have armies, and acknowledging that historians might do some limited good by teaching about the United States, he generally welcomed the shift to spatially and temporally broader history, sweeping across continents and centuries. It is striking that just as he gave that address the forces of nationalism—in Russia, China, western Europe, and most definitely the United States—gave a set of roars that indicated that they were very far from dead. It was an instructive error for a historian to make.
Lincoln used a patriotic version of the nation’s revolutionary past and founding generation to try to hold the Union together and provide meaning in the Gettysburg Address.
George Orwell famously observed in 1945 that nationalism is “the habit of assuming that human beings can be classified like insects,” whereas patriotism is “devotion to a particular place and a particular way of life, which one believes to be the best in the world.” In practice, however, modern academic historians, who are wary of nationalism for reasons good and bad, often conflate it with patriotism. And this is where some of the great divide between contemporary academic history and patriotic history has opened up. When the academy questions the very utility of national history, by necessity it undermines the possibility of patriotic history as well.
Civic education requires that students engage with their history—not only to know whence conventions, principles, and laws have come, but also to develop an attachment to them. And civic education is also inextricably interwoven with patriotism, without which commitment to the values that make free government possible will not exist. Civic education depends not only on an understanding of fundamental processes and institutions (for example, why there is a Supreme Court, or why only Congress gets to raise taxes or declare war) but on a commitment to those processes and institutions, and on some kind of admiration for the country that created them and the men and women who have shaped and lived within them. In a crisis, it is not enough to know how the walls were constructed and the plumbing laid out in the house that Madison, Washington, and Lincoln built. One has to think that the architects did remarkable work, that as their legatees we need to preserve the building even if we modernize it, and that it is a precious edifice like none other.
The triangular relationship among civic education, historical knowledge, and patriotism seems in our day to be broken. Survey after survey delivers dismal verdicts about what Americans know about the government under which they live. For example, in a recent survey by the Annenberg Public Policy Center, just two out of five respondents could identify the three branches of government, while one out of five could not identify any branch at all. Nearly half thought that illegal immigrants have no rights under the Constitution. Another survey indicated that only one third of Americans would pass a U.S. citizenship test.
The issue appears not to be a lack of civics courses per se, which are required in the vast majority of states. Rather, the issue seems to be the unmooring of civics from history, and in particular history in the curriculum at colleges and universities where the high school teachers of tomorrow are trained.
In a blistering article in the national security-oriented online publication War on the Rocks, my colleagues at the Johns Hopkins School of Advanced International Studies Francis Gavin and Hal Brands declare that the historical profession is “committing slow-motion suicide.” Able historians themselves, they point to studies showing a decline of 30 percent in history majors at U.S. institutions of higher education in the last 10 years alone—the steepest enrollment slide of any of the humanities. The brunt of their critique is that the discipline of history has walked away from some of the subfields that matter most to the shaping of engaged citizens—politics, statecraft, and war. Meanwhile, fellow historians Fredrik Logevall and Kenneth Osgood have found similar patterns in hiring in the profession: looking at H-Net, the leading website for academic jobs in history, they found a grand total of 15 advertisements in 10 years for tenure-track junior historians specializing in American political history.
Members of the historical profession might, with reason, push back on this bleak picture, noting the robust health of organizations like the Society for Military History. But the truth remains that traditional forms of history—political, diplomatic, and military—have been increasingly pushed to the margins of the field; that departments of history have shrunk rapidly because students vote with their feet; and that churning out fewer history majors (who in turn are likely to be the future history teachers in middle and high schools) bodes poorly for the future of civic education. If, moreover, those fewer students who remain are themselves only barely familiar with the kinds of history that appeal to young people and can form them as citizens, the cycle becomes a vicious one. If the nuts and bolts of American political and military history are not taught in universities, the chances that they will be passed on to a younger generation diminish yet further.
Paul Giamatti as John Adams in HBO’s adaptation of David McCullough’s biography.
Beyond the academy
It is not the case that Americans in general have fallen out of love with their own past. Large numbers visit battlefields and museums—a million a year to Gettysburg, more than that to Mount Vernon, almost three quarters of a million to the National World War II Museum, and six-digit numbers even to more remote sites. Popular historians do well—David McCullough and Ron Chernow have repeatedly written best-selling historical biographies. On the whole, historical television series may not quite draw the 14 million viewers that Ken Burns’s 1990 series The Civil War did, but they have done respectably enough. John Adams, for example, attracted something like 2.5 million viewers.
The problem lies not in lack of interest, but in a tension between the academic historical community and both the reading public and popular writers. It is not enough to have best-selling books or television series about the American past, though those are welcome: there is a need for a general awareness of that past that has to be spread indirectly through college and university education and thence to middle and high schools. And while the history of the academy has to be somewhat different than the history of Netflix or the airport bookstand, they cannot be too far apart.
That gap has not always existed. It was possible, for example, for Allan Nevins, an enormously prolific writer about the Civil War and a biographer of John C. Frémont, John D. Rockefeller, and Henry Ford, to be a tenured professor at Columbia and president of the American Historical Association—without a doctorate in history. That would be unthinkable today. Yet a contemporary of Nevins who did have a doctorate, Harvard University’s Samuel Eliot Morison, was similarly popular, similarly prolific, and similarly influential.
The Morisons and Nevins of the previous century believed that they had a duty to illuminate the American past for their fellow citizens. They could be nuanced and critical while respecting the patriotic uses of history.
The Civil War documentary series by Ken Burns that aired in 1990 drew about 14 million viewers, a sign that Americans have an appetite for history.
The weight of opinion in the academic historical profession has for some time been hostile to that view and to anything that smacked of such an approach, making the case against such storytelling with a purpose. In a critical review of David McCullough’s biography of John Adams, historian Sean Wilentz of Princeton University lashed not only the author but what he described as the American Heritage style, “brilliant in its detail, evocative in its storytelling, but crushingly sentimental and vacuous,” which he believed had infected Ken Burns’ Civil War docu-series as well. Wilentz celebrated as an alternative Bernard DeVoto, a once well-known popular historian whose work painted a critical, fuller picture of the past and remains well worth reading.
These wars have continued. When in 2011 Harvard historian Jill Lepore published a book on the original Tea Party and its resonance today, she was taken to task by the dean of early American historians, Gordon S. Wood, who deplored her disposition: “Americans seem to have a special need for these authentic historical figures in the here and now. It is very easy for academic historians to mock this special need, and Harvard historian Jill Lepore, as a staff writer for The New Yorker, is an expert at mocking.”
After criticizing Lepore for her contemptuous tone toward a political movement that she despised (the Tea Party), Wood argued that societies need memory and a useful and a purposeful past—in other words, heritage. Modern critical historical writing, he said, seeks simply to establish what happened. It is “all head and no heart,” Wood wrote, and citing his own teacher, Bernard Bailyn, argued that it was important to understand that such history could not meet a society’s needs, and something else is required.
This is the nub of the matter. Even if the academy generated more historians (like Wood, Wilentz, and Lepore, for example) who can write compellingly and lucidly for lay audiences, and even if they turned their attention to politics of the kind that citizens need and average readers find interesting, there is bound to be a tension between the outlook of the modern analytic historian and that of the patriotic historian.
Harvard historian Jill Lepore, once deplored by Gordon Wood as “an expert at mocking,” became a patriotic historian, perhaps without even entirely recognizing it herself.
Searching for inspiration
Patriotic history involves, for example, heroes. Most academic historians who write biography (not the most popular genre in universities) specialize in the study of clay feet. Hence David Herbert Donald’s biography of Lincoln depicts a president stumbling from decision to decision and yet somehow presiding over a triumphant Union. Doris Kearns Goodwin—a popular historian—gives a much more sympathetic account in Team of Rivals. Perhaps because she had had closer connections to the world of actual politics, her book is the more popular, and more admiring, one. One may even think it is in some ways the more essentially accurate portrait.
Americans need history that educates and informs, but also one that inspires. If, for example, one gives equal weight to John F. Kennedy’s sordid sexual behavior and the soaring rhetoric of his inaugural speech; if one concentrates as much on the personal peccadilloes, inconsistencies, and mixed motives of the Founders as on the marvel that is the Constitution that they created; if the shameful relocation of American citizens of Japanese ancestry to concentration camps gets more play than the D-Day landings or the Battle of Midway, then history cannot serve that inspirational function. And then, in a crisis, you are stuck, because you have no great figures to remember, no memory of great challenges overcome, no examples of persistence and struggle to embrace.
Doris Kearns Goodwin—a popular historian—gives a much more sympathetic and heroic account of Lincoln than does David Herbert Donald.
A notable recent work of scholarship, Richard White’s account of Reconstruction and the Gilded Age, The Republic for Which It Stands, is something of a warning. It is a volume in the excellent series produced by the Oxford History of the United States, which also includes James McPherson’s Battle Cry of Freedom (on the Civil War) and Wood’s Empire of Liberty (on the early republic). Like the other volumes, it is lucid and masterly in its scholarship. But its relentless depiction of an irredeemably sordid past, blotted by the oppression of the African American population of the South, massacre of Indians, despoiling of the environment, horrors of tenement life, and political cupidity, leaves the reader thinking that perhaps the only good thing to be said about the United States during this period is that, by contrast, it makes today’s America look good. One could write a history that acknowledges all those things—yet somehow also celebrates the great works of literature and engineering from Mark Twain to the Brooklyn Bridge, or the extraordinary political achievement of the reunification of a country that had experienced four years of unremitting bloodshed, or the heroism (quiet in one case, noisy in the other) of Booker T. Washington and a young Theodore Roosevelt.
Wood recognized in his review of Lepore’s book about the Tea Party that the two forms of history—critical and patriotic—can coexist but rarely, if ever, coincide. Some particularly gifted historians can pull it off, such as David Hackett Fischer in his magnificent books Paul Revere’s Ride and Washington’s Crossing. But for the most part, the two forms of history have different purposes and tap different skills and sensibilities. The challenge is the management of their coexistence, and in particular the recognition by scholars that both are necessary.
Washington’s crossing of the Delaware is the focus of a book by David Hackett Fischer that melds critical and patriotic history.
Popular and patriotic historians may grumble at reviews of their work by their academic colleagues but, in truth, pay them little heed. Among academic historians, however, the sentiments can be more acidic. Jonathan Zimmerman, a professor at the University of Pennsylvania, put it sharply in a guarded defense of Ken Burns: “It’s called sour grapes. Put simply, Burns has managed to engage a huge public audience. And that makes him suspect among members of our guild, who write almost entirely for each other. We pretend we don’t envy his fame and fortune, but of course we do. We’re like high-school kids who don’t get asked to the prom, then say they never wanted to go in the first place.”
Zimmerman had begun his career as a high-school social studies teacher, closer to the real needs of the American public for historical education. He noted that writing for lay audiences often counts against a young historian and deplored the guild mentality of a history profession that too often looks down on public engagement. In so doing, he made a point that cannot be put too forcefully. Unless history departments, and university administrators behind them, begin to weight public engagement as a useful academic function, they are likely to pull their discipline further into bitter irrelevance.
A reversal of this trend is not inconceivable, particularly among faculty members who have tenure but must also deal with tight-fisted college administrations in an era when higher education itself is being turned upside down, and when it is becoming harder to sustain departments that do not pay their way with student seats in classrooms. History departments’ disdain over the last few decades for both popular history and the historians who engage the American public may not survive provosts unwilling to hire more expensive professors teaching fewer courses to fewer students.
Moreover, the educational establishment itself has, on occasion, changed its approach to history. After a series of critiques, the College Board revised its course framework for Advanced Placement History. “AP United States History,” in its 2017 version, is both sophisticated and sober, but offers plenty of opportunities to explore learning objectives like “explain how ideas about democracy, freedom, and individualism found expression in the development of cultural values, political institutions, and American identity.”
And then there is politics itself. In 2016, the political tide turned. Instead of a desperately unhappy conservative opposition to a liberal president turning to history for inspiration and consolation and meeting the scorn of liberal history professors, it was the liberals who found themselves looking for a usable past. They saw a president they believed to be a potential tyrant, and a Republican party that seemed to be mastering the legislative and judicial branches of government. They now needed the heroes and the inspiring moments from the past to convince themselves that the country could get through difficult times.
Interestingly enough, it was Jill Lepore who found herself doing in a different way what she had disparaged the Tea Party movement for doing. In 2018 she published an ambitious and engrossing one-volume history of the United States, These Truths. It is filled with patriotic sentiment. “The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching,” she writes. The book concludes with the old metaphors of the ship of state in a storm, with Americans called upon to fell majestic pines and “hew timbers of cedar and oak into planks” to rebuild the ship. Depending on one’s literary tastes, the language is either florid or evocative, but it was clear that in the profound crisis Lepore saw in the Trump presidency, history had to come to the rescue. Possibly without recognizing it, she too had become a patriotic historian.
Worth celebrating: Booker T. Washington (right) and young Theodore Roosevelt.
History’s road ahead
We can begin by recognizing that although America’s renewed focus on STEM (Science, Technology, Engineering, and Mathematics) education for K-12 has had some beneficial effects, it is vital to pay heed to supposedly softer subjects—history foremost among them. Evidence suggests that recent focus on STEM as well as on standardized tests in reading and math (and therefore preparing for tests) has come at the expense of civics, social studies, and history. Educational reformers should realize that the time may have come to rein in the obsession with formal testing and to restore some balance to curricula.
While little can be done in the short run about what has happened to history as a discipline, or how history teachers are trained in universities, there is a lot that can be done in summer workshops or through creative forms of part-time education, particularly online. If many conventional universities do not offer adequate instruction in history for teachers, entrepreneurially minded competitors can do so, and with national reach by virtue of online education. All of these are opportunities for creative grant giving and philanthropy.
The federal government’s role is one to be approached with care. Part of the strength of the traditional American educational system has been that it has been decentralized and competitive, and one can argue that attempts to create standardized tests and standards do as much damage as good. Moreover, particularly in the field of history, the temptations for ideological fiddling are too great to make conservatives, in particular, feel comfortable. But there are two areas in which there is good to be done.
The first is through the National Endowment for the Humanities, which has sponsored historical work including workshops for teachers as well as original productions of videos and the like. The second, and even more important, is the role of the federal government in properly funding and sustaining national historic sites, including battlefields, monuments, and historical homes, but also the Library of Congress and National Archives with their magnificent collections of historical documents. These offer many opportunities for the millions of Americans who are interested in engaging their past to do so.
A few of the books in Thomas Jefferson’s library, as displayed at the Library of Congress. The Constitution is at the National Archives.
There is also a role for entrepreneurship and philanthropy to play. For example, organizations can support bringing back some of the older material discussed at the beginning of this paper and creating new sources of such work. Further, they might expand opportunities for students to learn history through experiences outside of the classroom. While patriotic history may be imbibed inside a school, it can also be found by singing along to the Hamilton score (see “Hamilton Goes to High School,” features, Summer 2017), while camping on the Lewis and Clark trail, watching Ken Burns’ The Civil War, or even by finding ways to get into the hands of a curious 12-year-old a novel that she or he will never forget. Any good teacher, at any level, knows that the key to success lies in multiple ways into a young person’s consciousness. “Material things, things that move, living things, human actions and account of human action, will win the attention better than anything that is more abstract,” William James wrote in Talks to Teachers.
There is no more natural subject of fascination than history, particularly the history of one’s own country, and particularly if that country is the United States. The decline of patriotic history is a severe problem for civic education—but fortunately, there are many ways of mitigating and even reversing it.
Patriotic history is a sensitive topic. It can take false and even dangerous forms. The Lost Cause narrative of the Civil War, for example, masked the reality of slavery as the central cause of the bloodiest conflict in American history. But if done well, as many historians, museum designers, and custodians of national parks and of public, semi-public, and private institutions have shown, it can both educate and inspire. And it is, in any case, inescapable. Without civics, our political institutions are reduced to valueless mechanisms. Without history, there is no civic education, and without civic education there are no citizens. Without citizens, there is no free republic. The stakes, in other words, could not be higher.
Eliot Cohen is the Dean of Johns Hopkins University’s School of Advanced International Studies. This essay is adapted from the forthcoming book, How to Educate an American, published by Templeton Press.