This essay is adapted and expanded from The Stakes: America at the Point of No Return, to be published by Regnery on September 1.

A Tyranny Perpetual and Universal?

Is the leftist dream now within reach? If President Trump loses, we will find out.

After “Is 2020 another ‘Flight 93 election’?” the question I most often hear is “What happens if Trump loses?”

The answer to the first question, unfortunately, is yes, but more so.

The tl;dr summary of the answer to the second is: much more of the same. More of all the trends, policies, and practices that revolutionized American life in the 1960s, that enrich the ruling class and its foot soldiers at middle America’s expense, erode our natural and constitutionally guaranteed rights and liberties, degrade our culture and its people, and dishonor our heritage and history. The war on those who self-identify as Americans, and only as Americans, who love their country despite its flaws—who are certain in their bones that its strengths and glories vastly outweigh its historic and present shortcomings—waged by those who hate America and Americans, who want to destroy the former and crush the latter, will go on.

Two important questions are whether that war will intensify or abate and whether it might abate overtly but intensify covertly. Those questions will be explored in what follows.

First, though, a necessary caveat. A tiresome, sophistic, bad-faith, and inevitable rejoinder to my argument will go something like this: “Trump is the president; therefore, you guys are in charge; this ‘ruling class’ of whom you speak includes him, and you. So you’re lying and contradicting yourself when you criticize an alleged ‘ruling class’ running the country in ways you don’t like.”

No. The only accurate statement in the above summary is “Trump is the president.” And thank God for that; we’d be much worse off if he weren’t.

But the experience of Trump’s first term reveals how weak the presidency really is—not just constitutionally and historically, but, above all, currently. We know the enumerated powers the president is supposed to have, and also those the other branches of government are supposed to have, and not have. The Constitution and other fundamental charters of our liberties—the “parchment”—spell all that out. We also know what the “org chart” of the federal government looks like on paper: a “unitary executive” with an alphabet soup of agencies reporting to the president and therefore, in theory, responsive to his directives. 

But the reality, by now, should be obvious to everyone. Our government in no way functions according to the elevated words on the parchment, and President Trump does not control the executive branch. I say this not to disparage the president but only to state a plain fact. No doubt, he has done his best. I doubt that anyone else could have done better. But while facing a near-universal rebellion from every power center in our society, emphatically including the agencies he was elected to lead, naturally he has found it very difficult to make the federal bureaucracy do what he tells it to do.

That difficulty has astonished even me. I worked in the federal bureaucracy for the first four years of the first George W. Bush Administration. I saw from the inside how the permanent government or administrative state or “deep state” or whatever you want to call it undermined a president with whom they mostly agreed. I knew in advance that, were Donald Trump to win the 2016 election, the effort to undercut him from within would dwarf what happened to Bush. For unlike the 43rd president, who merely held a few opinions unpopular with the deep state, the 45th ran on a program of almost complete repudiation of ruling class dogma and practice.

And yet I vastly underestimated how bad the “resistance” would be. Never in my wildest dreams did I imagine that federal intelligence and law enforcement agencies would try to frame the president with a phony “crime,” launch a pointless two-year investigation over a fraud, then impeach him over the timing of foreign aid payments, all the while lying daily to the American public.

I also saw, again, the beast from the inside during my brief tenure in President Trump’s White House. Given classification and nondisclosure requirements, I can’t say much about that. But I can say this: if anything changed from my time in the Bush Administration, it is that the deep state is vastly more powerful today than it was then, and vastly more willing to use its power—overtly—to flout, undermine, circumvent, and disobey presidential orders. Even, in many cases, to do the precise opposite of what they’ve been ordered to do.

So we should not gripe about the things not done in President Trump’s first term. We should rather be grateful for all the things he got done—and hope he can do more in a second term.

Neoliberalism Forever? 

But this essay is about what might happen should he lose.

The most plausible outcome would be a return to the “neoliberal” consensus and trajectory circa 2015. A more precise name might be “managerial leftist-libertarianism,” for this governing ideology is top-down, bureaucratic, and anti-democratic, committed to social engineering and grievance politics, while undermining virtue and promoting vice. But that’s something of a mouthful, and “neoliberal,” for better or worse, has gained widespread acceptance.

Neoliberalism elevates as a matter of “principle” the international over the national; it rejects the latter as narrow, particular, cramped, even bigoted, and celebrates the former as cosmopolitan and enlightened. Neoliberalism is (for now) forced to tolerate nations and borders as unfortunate and unhelpful obstacles, but it looks forward to a time when such nuisances finally are behind mankind forever.

Until that time, neoliberalism works to warp state power into an instrument whose primary mission is not to secure the well-being or interests of individual peoples or nations but to enforce the international neoliberal order—in particular the movement of capital, goods, and labor across borders in ways that benefit the transnational neoliberal ruling class. In practice, this amounts to widespread, close-knit cooperation between business and government—or what neoliberals euphemistically refer to as “public-private partnership.”

This benign-sounding phrase—who could object to “cooperation,” to government and business “solving problems” together?—masks a darker reality. What it really describes is the use of state power to serve private ends, at private direction. Its proponents always leave out the little detail that big business is the senior partner. 

Hence, for instance, without Trump, foreign policy—that quintessentially public function, to “provide for the common defense”—will be further reoriented around securing trade, tax, and labor (“migration”) regimes that benefit finance and big business. American conservatives, still fighting “government regulation” as if America were stuck in Groundhog Day 1981, have yet to grasp that the majority of this country’s policies are already oriented that way.

The real power in the neoliberal order resides not with elected (or appointed) officials and “world leaders”; they—or most of them—are a servant class. True power resides with their donors: the bankers, CEOs, financiers, and tech oligarchs—some of whom occasionally run for and win office, but most of whom, most of the time, are content to buy off those who do. The end result is the same either way: economic globalism and financialization, consolidation of power in an ostensibly “meritocratic” but actually semi-hereditary class, livened up by social libertinism.

This consensus and the people who profit from it are still very much in charge of America today. They control everything: corporations, banks, tech firms, media (legacy and social), universities, primary and secondary schools, foundations, mainline religious organizations, and of course the entire federal bureaucracy. They also control, in all but the very reddest counties and municipalities, local governments and agencies. Is it any wonder, then, that it’s so hard for the president to govern against the neoliberal consensus?

The only top-tier power center the ruling class currently doesn’t have is the White House itself. If (or when) they get it back, the basic contours of the back-to-normal regime will look much as they did at the height of the Obama Administration—or, in hindsight, the Bush-Clinton-Bush-Obama Imperium: a high-low coalition against the middle in service of big tech, high finance, and woke capital. The sanctification of immigration, the glorification of “free” trade, jingoistic celebration of constabulary use of force in parts of the world most Americans can’t even name: expect lots more of all that. 

“Getting back to normal” will also require the ruling class’ propaganda apparatus to go into overdrive on all the alleged “failures” of the Trump interregnum. Getting out of the Trans-Pacific Partnership (which even Hillary Clinton had disavowed by 2016)? Disaster! Played right into the hands of China and alienated our allies! Blaming China for the coronavirus? Another disaster! Racist and xenophobic and alienated a key trading partner!

Surface consistency is not a strong suit of America’s contemporary propagandists. There is, however, an underlying consistency: any statement that serves the interests of the ruling class and hurts Trump and his supporters is good. Period.

Hence expect to hear endless denunciations of Trump’s renegotiated trade deals as catastrophic—and inconsequential. Similarly, on immigration, the narrative will be: Trump’s xenophobia was a racist overreaction to a nonproblem—one that crippled our economy by depriving it of desperately needed workers . . . when the unemployment rate was nearly 15 percent. On foreign policy, the line already is: Trump’s recklessness risked calamitous war—while he recklessly pulled American troops out of combat zones in Syria and Afghanistan and tried to negotiate a peace deal with North Korea.

Only More So

All of the “post-Cold War era” trends that Trump ran against and has opposed or sought to moderate will be intensified. The ruling class will get right back to elevating the international over the national while tolerating the national only insofar as state power is used to bolster the international neoliberal order and enforce its edicts to facilitate the movement of capital, goods, and labor across borders in ways that benefit themselves.

The economy will become even more artificial and jury-rigged. We shall test supposed iron laws of economic gravity—for instance, whether it’s possible to maintain a fiat currency indefinitely with endless money printing and whether the dollar can long maintain its global reserve status. The longer the rigging goes on, the more rigging will be required.

Overall, the economy will become more techified, more financialized, more concentrated at the coasts, and more unequal. Expect the rich to get a lot richer and the middle class to disappear. Wages will fall.

COVID-19 has been a godsend to the oligarchs, who are licking their chops as one small business after another fails, leaving Americans with no choice but to spend whatever money they have with corporate behemoths. 

Since small businesses are one of the last redoubts of the middle class (owing to the disappearance, via outsourcing and immigration, of middle-income American jobs at big companies), expect what’s left of the middle class to shrink further. If it seems incredible—as it should—that financial markets are at or near all-time highs when GDP has plunged, unemployment reached levels not seen since the Great Depression, and our cities and towns have been repeatedly sacked, looted, and burned for three straight months, the biggest reason is the consolidation of corporate control over the economy.

Don’t expect big firms’ vastly increased power and wealth to trickle down to the little guy. Corporate America loves the so-called “gig economy,” a euphemism for “We don’t have to pay benefits!” Employer-provided healthcare will disappear for all but the most senior executives, a trend that, in turn, will make some form of socialized medicine inevitable. Quality of care will fall for all but the people at the very top who can buy out of the government system. Eventually, however, even their care will decline, since there will no longer be enough money in the system to keep medical innovation going.

On Trump’s big three—immigration, trade, and war—America will be back to the status quo ante, and then well beyond. 

Biden has already promised to amnesty every illegal immigrant currently in the country. According to a 2018 Yale study, that’s at least 22 million people—all of whom, under America’s idiotic immigration laws, immediately would be eligible to bring over relatives in the name of “family reunification.” If each newly minted American brings over just one relative, that’s another 22 million new immigrants in Biden’s first term alone. And nothing in the law would stop people from bringing over more than one. Most sponsor several.

Such an amnesty, once the Democratic machine got everyone registered to vote, would tip many purple states permanently blue. That’s the whole point. After that, the electoral map would become impossible for Republicans ever to win the presidency again. Which is also the point.

Beyond amnesty, there would be no pretense of enforcing any of our immigration laws. We’ve already seen entire communities become demographically overwhelmed in the space of a decade or two. That will keep happening, but on a much wider scale.

It will also become much more difficult and more expensive to wall oneself off from the consequences, which means that the number, or at least the share, of “regime winners” who can afford “good” suburbs or private schools will shrink while the share of “losers” increases. As a result, native birthrates are likely to drop further, while pathologies such as addiction will increase and life expectancy will fall.

On trade, the government will revert to its customary practice of enacting policy to further enrich the rich, no matter the consequences for the middle and working classes. On war, the particulars are harder to foresee since it’s never been clear (at least not to me) what the ruling class gets out of endless, pointless, winless conflict. But they certainly have an affinity for it, which means we should expect more, with all the attendant negative consequences: more death, more of the nation’s wealth sunk in wasteful adventures, the continued erosion of the military, and the further squandering of our national pride, international prestige, and many of our best young men.

Government collusion with big business, especially tech and finance, and the ceding to corporations of vast swaths of territory that the state used to occupy exclusively will intensify and expand. The “unpersoning” of dissenters will mimic what the government of China does through its “social credit system”: ranking people based on their opinions—and wokerati opinion of them—and then granting or limiting access to basic freedoms and services. This will be, and already is being, justified because it is done primarily by the private sector, whether by for-profit businesses that lock people out of entire sectors or “nonprofits” such as the odious Southern Poverty Law Center that identify targets.

If you think we have mass surveillance already, just wait. The government and the tech companies already work hand-in-glove, the latter helping the former in exchange for favorable tax, regulatory, and immigration treatment. More of that is coming, and on a bigger scale.

When the last checks on such collusion from the Trump Administration are gone, expect this joint censorship and oppression of dissent to increase by orders of magnitude. The Left finally has found a way around the First Amendment: consolidate all “speech” and public expression onto a handful of private-sector platforms run by oligarchs and staffed by wokerati; let them do whatever they want and when anyone complains, reply that “these are private companies that can run their businesses however they want; you don’t have to use their platforms and if you don’t like it, start your own.” The Left knows it can count on the moronic, friendly-fire-spraying libertarians to sing that tune the loudest. Free speech as we have known it—as our Founders insisted was the natural bedrock of political rights, without which self-government is impossible—will not survive coming leftist rule.

The playbook is already being expanded to banking and credit. To be on the wrong side of elite-woke opinion increasingly is to find yourself locked out of the financial system: no bank account, no credit card, no ability to get a loan or pay a mortgage. Pay cash? The move to a “cashless society”—purely to prevent drug lords and Russian spies from laundering money, you understand—will obviate that option right quick.

There’s no reason to assume the oligarchs will limit these types of actions to speech and money. Why would they? Especially when the woke vanguard consistently will clamor for more action and insist that any company that does business with “racists” is complicit in evil—“racist” being defined as anyone who hasn’t bent the knee. China already restricts travel for the disfavored. Why wouldn’t U.S. airlines? Car rental companies? Dealerships are independent, but they also depend on the big automakers for their stock. And, anyway, who can possibly buy a car if he can’t get a job or a bank account?

Britain’s nationalized healthcare service now denies medical care to those deemed “racist, sexist, or homophobic.” What’s to stop the wokerati from pressuring America’s patchwork of public and private healthcare providers to do the same? And why stop there? Why should “racists” even be allowed to buy food? That is, assuming they can even earn the money to pay for it. But that problem can probably be taken care of by denying the bad guys credit or debit cards and phasing out cash. 

Isolation, loneliness, desperation, addiction, and suicide all will increase as ostracization condemns heretic after heretic to a sort of internal exile. The most vocally strident among the Left will call the resulting deaths just deserts; the rest will brush them off as perhaps sad, but the direct consequence of bad choices or bad natures. “That racist had it coming.” 

And every step of the way, the narrative’s reply to those who raise the alarm will be: That’s not happening, and it’s good that it is. You’re a paranoid lunatic for even suggesting that censorship, de-platforming, or un-personing are problems—and also a racist who deserves it.

Not long ago, I thought the point of all this—aside from being punitive to enemies for the sheer pleasure of it—was to find the sweet spot between too much overt oppression, which might provoke a backlash, and too little, which might allow opposition to gather strength. To expand firings, un-personings, bank lockouts, and the like too rapidly might raise alarms; kept at the creeping level, they serve to keep most of red America locked into the blue system and thus dependent. A bit of caution thus would seem to serve ruling-class interests.

Reining in the Mobs?

But signs of moderation, of magnanimity, of any recognition that “we won” and so can ease off are, to say the least, not common among the woke Left. And the extent to which the ruling class can control its foot soldiers has been very much called into question by the events of 2020.

Consider that New York City—the global neoliberal oligarchy’s unquestioned capital and home to, by far, the largest concentration of America’s elites—was sacked by mobs several times in the same week. The NYPD—the largest, best-equipped, and most competent police force in the country—stood by and did nothing. Granted, they likely were ordered to stand down by the mayor or had no confidence that if they took necessary action, the political leadership would back them up. But the result is the same.

The rescue, reordering, and rebuilding of New York City since the early 1990s arguably is the greatest political and public policy success story of the last generation. Beginning with the feckless John Lindsay, elected mayor in 1965, cynical and/or deluded elites decided to make the city a Petri dish for all their idiotic social experiments, including but not limited to: extreme leniency, reduced policing, neglect of basic services and public spaces, and steadfast refusal to do anything about quality of life. All while punishing taxpaying and order-supporting citizens and businesses. Crime soared, the city crumbled, decent people fled.

It took 30 years to raise New York back from that hell—and then the fools who run it, tacitly backed by the grandees who live in the world’s most expensive apartments, gave it all back to the forces of evil in a matter of months. As Machiavelli said of the Venetians after the battle of Vailà, they “lost in one day what they had acquired with such toil and trouble over eight hundred years.”

The only difference is that this time the city appears to be worse. As businesses have remained closed, crime, disorder, and filth have surged. People have left and not returned. And not just from New York but from other ruling class citadels such as Los Angeles, San Francisco, Seattle, and Washington. Chicago seems to be sacked every other week. Minneapolis was sacked yet again just this week. That’s before you even mention the dramatic surge in shootings and killings in all these cities. In all such cases—and many others—the political leadership eggs on the mob, refuses to enforce the law or even to call for calm, and immediately releases without bail the few people caught committing crimes.

What does the ruling class gain from destroying its own cities? I’ve asked myself this a thousand times. I can’t come up with an answer. Is it that they want all this to happen or that they lack the will or ability to stop it?

The former possibility seems preposterous. But the fact that it’s not out of the question is suggested by the following data point. Jeff Bezos, the world’s richest man, owns the Washington Post outright. He can, with a single phone call, make it publish, or not publish, anything he wants. The Post has always been a liberal rag. But it was never so brazenly anti-American, so shamelessly dishonest, so unreservedly dedicated to racial grievance-mongering. It used to be able to call a riot a riot and not resort to ridiculous euphemisms like “mostly peaceful” to describe rampant street violence.

Above all, the Post used to care about its home town. It doesn’t anymore. The District of Columbia, into which I must occasionally (and reluctantly) venture, is a shell of what it was just months ago. For the span of several weeks, there was a riot roughly every night. Businesses everywhere are now closed—and not just from COVID-19. Windows everywhere are boarded up. Graffiti, much of it revolutionary, covers seemingly every square inch of vertical space. The streets are deserted. To the extent that you see anyone, chances are above 50 percent it’s a vagrant. At the few restaurants that remain open, thanks to outdoor seating, diners are as likely as not to be accosted by bullhorn-wielding anarchists. The town has a post-apocalyptic feel. (As, I am reliably informed, do Manhattan and San Francisco.)

The newspaper founded in Washington in 1877 speaks of all this out of three sides of its mouth: none of it is happening; all the protests are peaceful and justified; it’s good that America is finally getting the thrashing she deserves. Bezos could instantly turn off this endless destructive propaganda if he wanted to. Does he not want to? Or is he not paying attention? Maybe not living in the District, he hasn’t noticed the extent of the Post’s recklessness and dishonesty. Has he also not noticed the similar destruction of his own home town of Seattle, and the role played in that destruction by the national media, of which he is one of the topmost moguls? Bezos did not become the world’s richest man by being stupid. Does he somehow believe that all this is good—for him?

Whatever the answer, the fact that all this mayhem is happening now, and the ruling class can’t or won’t control it, more than suggests that a lot more of it will happen if Trump loses. 

A Dark Age of White Noise

The ruling class has built a well-honed apparatus to inculcate docility in the people. Components include cheap, puerile mass entertainment, ubiquitous smartphones and social media, video games, porn, drugs, sportsball, and so on across the whole dreary panoply of lowest-common-denominator “culture” in the current year. 

We should expect all this to increase. The ruling class’ recent and ongoing enthusiasm for marijuana legalization and its total indifference to the opioid epidemic suggest that they’re seeking to drug as many non-elites as they can out of any potential resistance. Combine these factors with leftism’s top-down, total control of thought, and the picture becomes bleak indeed. The times are already quite vapid; very little (if anything) of lasting merit has been produced in literature, philosophy, music, film, or the other arts in several decades. The trend seems to be getting worse.

But at least we still have that older stuff to fall back on, right? Not necessarily. The cherished and iconic works of our past are also threatened, in two ways. First, the movement that originated on campus more than a generation ago to get rid of core curricula and reinterpret in light of leftist orthodoxy those bits allowed to remain has borne fruit. We’ve now “educated” generations of students—even (especially) elite students—1) to have no familiarity with the Western canon; 2) to despise it as inherently evil; 3) to see it only through leftist lenses that make it seem as if it merely confirms current orthodoxy; or 4) to believe it was all “stolen” from other cultures. That last one, of course, is a bald-faced lie, but one that at least implicitly concedes there’s something valuable in the tradition. Yet the point is never to spur anyone actually to read the books, only to validate in-group confidence: my people, and not yours, did that, hence we are great and you are not. The result is that the whole Western tradition is at risk of atrophy, and even death, simply from ignorance and neglect.

As if that were not enough, the Left is starting to get even more actively hostile to the tradition. Certain elite intellectuals, led by Mark Zuckerberg’s sister, Donna, have noticed that some young autodidacts have taken to reading the great books and listening to classical music. The elites see this as a threat. There are serious calls not merely to police how the canon is taught but to attack and even censor its “misuse” by “bad actors” who use it to challenge the narrative.

It may not be long before Amazon, which has virtual control of the entire book market, stops selling the classics altogether. Or perhaps a new industry will arise to bowdlerize them of all non-woke teachings. The worst-case scenario, which doesn’t yet feel imminent but which cannot be ruled out, is that eventually such books get banned.

Far more likely—and quite imminent in a world where Elizabeth Warren’s nine-year-old trans friend gets to pick the Secretary of Education—is a time in which all the institutions that teach the canon, and the scholars who write about it seriously, will be attacked over petty and invented infractions. The real purpose of those attacks will be to silence those scholars and eventually shutter their institutions.

The climate of acceptable opinion in this country—already very narrow—will constrict further still. The necessity for self-censorship will increase dramatically. The core function of the narrative will remain telling you what to think—and more important what not to think—but its message will get even more tendentious, hateful, omnipresent, and so, so much louder. Imagine TV screens playing CNN, volume cranked to 11, not just in airport waiting areas, but everywhere—forever.

Anarcho-Tyranny, Intensified 

In modern America, hypocrisy and double standards aren’t merely part of the business climate; they’re endemic to the whole society. Former Heritage Foundation scholar and Washington Times writer Sam Francis dubbed this system “anarcho-tyranny”: complete freedom—even exemption from the gravest laws—for the favored, maximum vindictive enforcement against the pettiest infractions by the disfavored.

Rarely has an analysis been so vindicated by events. Even before the 1619 Riots began in May, anarcho-tyranny was already the de facto law of the land. Can you remember the last time anyone in Antifa was punished for anything? I can’t. But I do remember community college adjunct philosophy professor and Antifa thug Eric Clanton walloping three people on the head with a five-pound iron bike lock—and the Alameda County, California district attorney letting him go with probation. 

I also remember, in pre-apocalypse New York City, Antifa goons getting into a fight—it’s hard to say who started it—with a group of men calling themselves the Proud Boys. Although no one was seriously injured, the NYPD expended significant time and resources tracking down the Proud Boys, but none whatsoever looking for any Antifa figures involved. Two Proud Boys were sentenced to four years in prison. No Antifa members were ever identified, much less charged with any crime, still less tried or convicted. At most, the incident was a mutually idiotic brawl for which only one side was punished. The real distinction here is that the Proud Boys are regime dissidents while Antifa thugs are ruling-class shock troops.

All that, though, was child’s play compared to the nightly horrors Antifa—and their BLM allies—have wreaked on America’s streets for three straight months with close to zero official attempt to rein them in, and often with officials cheering them on. Examples—from Joe Biden and Kamala Harris to Governors Gavin Newsom, Andrew Cuomo, and Tim Walz, to Mayors Bill de Blasio, Eric Garcetti, Lori Lightfoot, Jenny Durkan, Ted Wheeler, and Jacob Frey—are too numerous to catalog fully. The latest atrocity came from Wisconsin Governor Tony Evers, who, the instant a career criminal wanted on an active warrant was shot while resisting arrest and, it appears, reaching into his car for a knife, took to Twitter to fire up the mob. His state’s fourth-largest city has burned every night since. It took him days to make even a token appeal for calm.

Then consider the fates of those not destroying America in the name of “social justice.” This story is still “developing” as they say, but as of Friday, August 28, a young man who appears to have had a Molotov cocktail thrown at him, a loaded pistol pointed at his face, and his head bashed with a skateboard after being chased by three Antifa thugs is currently in jail on a charge of first-degree murder.

The fundamental right of self-defense—the bedrock foundation of all our other rights—increasingly is not honored if you’re a member of a disfavored group and your attacker is not.

Our officials—at least in all the Democratic Party-controlled state and local governments—operate in the precise opposite way that they are supposed to. Instead of enforcing the law and maintaining order, they encourage riots, refuse to enforce the law, and then leap into action only when a victim defends himself. This is not incompetence or misguided idealism; it is evil. Should it continue, it will lead either to the collapse of the country or to revolution.

To return to more prosaic matters, should Trump lose, the repowered thought police will greatly expand their “enemies list.” Those deemed “dangerous” by the wokerati will be dogged by authorities. Any suspected dissident not as scrupulous as Caesar’s wife in his every interaction with the state will get the book thrown at him for minor, technical infractions of some law, executive order, or administrative rule. As the poor sucker is hauled away in cuffs by a heavily armed team of feds in windbreakers, CNN and MSNBC reporters—tipped off in advance to get it all on camera—will intone that this “dangerous white supremacist” with “ties to neo-Nazi groups” was “planning attacks.” Months or years later, after being held without bail, the unfortunate target will be convicted of something like mail fraud and given the maximum sentence.

The Left will seek to use this same combination of maximum megaphone volume and maximum federal zeal to target lawful gun owners using the alleged threat of “domestic terrorism” as a pretext. “Red flag” programs operated by public-private cooperation will identify “potential terrorists” and the like who, to the extent that they don’t suffer the fate described above, will at the very least have their guns seized. The future of the Second Amendment in the coming leftist regime is hardly any brighter than that of the First. 

Meanwhile, other, much larger violations of the law will go unpunished, so long as the perpetrators are from “protected classes.” This is another of those assertions that many will wish to dismiss as paranoid. Yet even before the 2020 riots, we had already seen tremendous pressure from the Left for “criminal justice reform”—meaning leniency for the favored—and Soros-backed leftist “law enforcement officials” being elected all over the country. 

Need evidence that going soft on crime is high on the agenda? Consider how, early in the COVID-19 panic, leftist pols prioritized letting criminals out of jail, ostensibly because they were at risk of infection. The real reason, though, is explained by Rahm Emanuel’s famous exhortation to his comrades to “never let a serious crisis go to waste.” Governors, mayors, district attorneys, police chiefs, and sheriffs in blue zones across the country simply followed hizzoner’s advice and did what they always wanted to do anyway but hitherto could, or dared, not. Emboldened by the crisis, drunk with power, and half-convinced that a scared population wasn’t paying attention, they let the bad guys go. They got away with it, and they’ll make the policy permanent once they have lasting power.

It’s tempting to call this emerging America a “failed state,” but it isn’t really. The state is more than capable of acting on its own priorities, which emphatically include crushing known or suspected regime dissidents. Far from being incapable of enforcing the law, the state rather chooses which laws to enforce, and which not to enforce, and whom to exempt, in accordance with ruling class interests. That trend, too, will intensify.

The Wish List 

All of this is easy enough to predict because it is either what the Left is already doing where it has the power, or what it says it wants to do. 

The lessons of California and New York show that when leftists no longer face opposition, they do whatever they want—or try to. The problem (for them, for now) is that they still face opposition from the red elements still extant in the federal government, from red states, and from red communities in their own states. Once the whole country has gone blue, though, things will be . . . different.

With opposition eliminated or neutralized, the Left will gin up new enthusiasms with unprecedented zeal, which they will then impose nationwide with enforced pro forma approval, even to the point of requiring feigned enthusiasm. Gay marriage and transgenderism were just the beginning. We may speculate as to what exactly they will choose next, but they’ll certainly pick something and force the wider society to accept it. Here’s one guess: forcible removal of children from parents who resist their kid’s demand to get on puberty blockers. At first, silence will be acceptable—barely—but over time the Left will insist on affirmative demonstrations of approval for whatever it is they dreamed up yesterday and now insist is an eternal, inviolable principle.

Everyone will have to wear the ribbon. Humiliation is part of the appeal. While most leftists tend to believe in the urgency of whatever cause they happen to be worked up over at the moment, their deepest satisfaction arises less from seeing what they call justice done than from watching the retrograde being forced to submit. It lets the bad guys know who’s boss. “How many fingers am I holding up, Winston?” Forcing you to call a woman a man, or vice versa, is all the more satisfying when those holding the gun to your head know you don’t really believe it. That, and the constant invention of new hysterias, keeps deplorables off-balance and on the defensive.

Religious persecution necessarily will have to increase because much of what the Left is doing and wants to do directly contradicts the tenets of faith. We’ve already seen this with state orders forcing people to bake cakes for ceremonies that the bakers believe contradict their religion; we shall certainly see more of it, perhaps to the point where traditional Christianity will have to return to the catacombs.

There will be one major exception, however. A double standard will be ruthlessly enforced to allow Muslims (at least those who are regime allies) to do whatever they want in violation of leftist tenets. Leftism is incoherent in many ways, but it’s clear on its priorities, and on the intersectionality pillar, Muslims rank very high—perhaps not at the tippy-top, but high enough to be exempt from leftist religious persecution, which will be directed only at Christians and Orthodox Jews. This is a reason to suspect polygamy might be the next leftist enthusiasm.

Other items on the wish list include abolishing ICE, not just halting the construction of Trump’s border wall but tearing down sections already built before Trump’s inauguration, and extending Medicare to the entire population—including all current and prospective illegal immigrants. Then there’s the “Green New Deal,” which would ban air travel by 2030 (at least for those of us who can’t afford private jets or have access to government planes) and eliminate the parts of America’s energy sector that actually generate power.

And finally, the granddaddy of them all: “reparations.” Long a fringe idea, it was revived by Ta-Nehisi Coates and recently endorsed by the venerable Brookings Institution, the premier “establishment,” “respectable,” “moderate” center-left think tank in the country. Brookings scholars tend to be overrepresented in Democratic administrations and the ideas they get behind tend to become policy. So when Brookings backs reparations, you can be pretty sure that once the Democrats are back in office, reparations will happen. 

Goodbye, Constitution

Nothing gets a conservative’s patriotic blood up more than effusive praise of the United States Constitution. God knows, I love it too—at least as much as any of them do.

Which is why it pains me to write that its future is bleak. To do what the Left wants to do will require riding roughshod over our sacred parchment—even more than we’ve already been doing.

The Constitution has been under explicit attack since the beginning of the Progressive Era, nearly 125 years ago. Those attacks exponentially intensified with the advent of 1960s leftism. They retreated a bit in the face of the Reagan and Gingrich counterattacks but are now back with a vengeance. If conservatives were to tally the score, we might take some consolation from the fact that from time to time we’ve been able to put points on the board. But we would also be forced to concede that we’ve been massively outscored, and that our losses are mounting and accelerating.

I take the liberty of quoting myself, from the “Restatement on Flight 93” (originally published on September 13, 2016), because the words remain apt and I can’t think of a better way to make the point:

For now, let’s just ask ourselves two questions. First, how do the mechanics of government, as written in the Constitution, differ from current practice? Second, how well is the Bill of Rights observed? As to the first, we do still have those three branches of government mentioned. But we also have a fourth, hidden in plain sight within the executive, namely the bureaucracy or administrative state. It both usurps legislative power and uses executive power in an unaccountable way. Congress does not use its own powers but meekly defers to the executive and to the bureaucracy. The executive [at least when Democrats are in power] does whatever it wants. The judiciary also usurps legislative and, when it’s really feeling its oats, executive power through the use of consent decrees and the like. And that’s just the feds—before we even get to the relationship between the feds and the states. As to the second, can you think of a single amendment among the Bill of Rights that is not routinely violated—with the acquiescence and approval of the Left? I can’t.

This situation has gotten considerably worse since I wrote that. 

To cite only two examples: free speech is under attack as it never has been before. Right now, the battlefield is mostly social media sites, hence the attacks are publicly justified as legitimate acts by private businesses. “The First Amendment doesn’t cover the private sector; property rights mean they can do what they want!” Leave aside the extent to which the ruling class cares about property rights for property not their own (answer: not much); how meaningful is the distinction between public and private speech when the modes of public discourse increasingly are concentrated in the private hands of the ruling class? Answer: also not much. 

When it comes to freedom of association, the government arm of the ruling class is absolutely ruthless in declaring everything a “public accommodation” so that freedom effectively becomes nonexistent. But when half a dozen (or fewer) big tech companies take over the means of disseminating speech and ideas—oh, no! That’s not a public accommodation! Those are private firms and the rights of private firms are sacrosanct! As if this were not enough, take a look at how free speech polls these days: the younger the demographic, the less support one sees.

Consider also the incredible abuses of power from the Justice Department, the FBI, the intelligence community, and other agencies. Even with Trump in the White House, the administrative state still does whatever it wants while hampering the lawful directives of the elected chief executive. So long as their targets are in the disfavored party, they can spy on American citizens—up to and including presidential candidates—lie to and entrap public officials, extort plea deals from the innocent, and leak highly classified information for political purposes. (This is, needless to say, a very partial list.) They get away with all this scot-free: no punishment, no correction, no rebuke. They not only pay no price for shredding the Constitution and violating myriad statutes, but they are also lionized: the entire media and commentariat cheer them on. The fix is in, and has been for some time, but we still pretend we live in the age of Eliot Ness, the incorruptible G-man.

The fate of the Constitution is also inseparable from demographic change. Just as the least conservative and Republican areas of the country are the most foreign-born, so are such areas the places where the Constitution is least honored and operative. Lest someone object, “It’s not about race!” I agree: whites themselves are sharply divided about the merits of the Constitution. A plurality at least, and all of the elite, despise constitutional limits. The only people in America who en masse still care about the Constitution and how it’s supposed to work are conservatives, whose numbers—in absolute terms and as a share of the population—are dwindling. The bluer an area is, the less purchase constitutional principle holds. 

If present trends continue, the Constitution has no future. Neither its letter nor its spirit will be honored—either in ordinary circumstances or in the breach.  

Not only will none of the Constitution’s guarantees be upheld nor any of its limits respected, but the document itself will be increasingly denounced as a hateful tool of racist oppression, a relic of a benighted, evil past best left on the ash heap of history. That judgment is already the norm in academia and elite intellectual circles. And the history of the past 50 years shows that the Left is extremely effective at ensuring that every fringe, radical idea to emerge from academia becomes mainstream. How many times have we scoffed at some academic insanity, only to see it become federal law 10 or 20 years later? Ivy League law professors explicitly argue in the pages of the New York Times that the Constitution is evil and has to go. We already don’t govern ourselves according to its letter as a matter of practice. How long before what is today de facto becomes de jure? And even if it doesn’t, what difference would that make?

Elective Monarchy

To give new practices a veneer of continuity, in the manner of Augustus Caesar insisting he was just another senator, the more the ruling class departs from the letter and spirit of the Constitution, the more they will (at least for a while) pledge ever-greater fealty to both. Which in practice will mean only one thing: they will still hold elections every two, four, and six years, and the terms of office will remain the same length. These are, for the average American, still uncrossable lines and also impossible to fudge.

But politics—in the sense of reasoned deliberation about common ends—will cease. Instead, “politics” will be further divided along two tracks: one visible, the other not.

The real ruling will take place “administratively,” as described in detail in chapters two and three of my book (buy it!). The cogs and lickspittles in the bureaucracy, led by a small elite in corporations, above all in Big Tech and finance, will determine all important policies, foreign and domestic. Congress will be a bigger joke than it already is. Even the presidency will get weaker, as Democrats tacitly admit by nominating a man obviously incapable of fulfilling his constitutional oath of office. They know where the real power lies, and they know that all the power centers in the country are theirs.

But there is a certain cast of person who likes the trappings of office, and such persons will jockey for who gets which jobs when. There likely will still be general as well as primary elections, but only the latter will matter. The former will be mere formalities, as are gubernatorial contests in California and New York. It will probably take the Republican donor class a while to realize that their party is no longer viable at the national level, but eventually they will figure it out. After that, the party will become, for a few election cycles at least, what it is in New York and California: the plaything of billionaires who want to run for chief executive without the bother of a primary. All of them will lose. Then the party will die altogether.

The Democratic primaries will be the election. That is, to the extent that such contests actually are elections. It’s safer and more reasonable to assume they’ll increasingly be rigged, similar to the way the Democrats—twice—prevented Bernie Sanders from getting their party’s nomination. Insurgent, outsider candidacies may still be attempted for a few cycles, but they’ll get nowhere and pretty soon outsiders with anything on the ball will stop trying. We are, in a sense, headed back to the era of the “smoke-filled room”—though naturally there will be no smoke, unless it’s from pot.

A more precise regime category exists to help us understand what’s coming: the elective monarchy, in which the true electors are not “the people” but a handful of horse-trading elites. Historical examples include the Western Roman Empire (where hereditary succession was the exception rather than the rule), the Mamluk Sultanate, the Papacy, and the Communist regimes of the USSR and the People’s Republic of China. The grandees of the Democratic Party will get together every eight years (needless to say, no president will ever be denied reelection again) and decide who gets to “run.” That person, facing no or merely token opposition, gets the big chair.

The fundamental question of every Democratic presidential primary season will simply be: Whose turn is it? That question will be asked in two senses: 1) which particular luminary gets to sit in the Oval Office for the next eight years? and 2) which group gets to reap the honors for a while? 

The ideal—the plan—will be to keep the globalization gravy train rolling by sharing the spoils “more equitably,” “spoils” in this case being both offices and remuneration (and, given the way our system now works, the former is the surest path to the latter). The economy’s actual masters naturally will prefer to be more generous with offices than with money. 

As for those quadrennial November contests, we’ll still go through the formality of elections, but for show—like senate votes in imperial Rome. The less consequential elections become, the more our elites will insist on their sacrosanct significance. The mere fact of holding elections will become ipso facto proof that the regime is “democratic” and therefore legitimate.

This is another thing New America’s rulers will share in common with their Communist forebears: the yearning for a veneer of democratic legitimacy. Near the end of the Cold War, columnist Charles Krauthammer coined the term “Tirana Index”—after Tirana, Albania, where tyrant Enver Hoxha once “won” an “election” 1,627,959 to one—which holds that “the higher the score rolled up by the ruling party in elections, the more tyrannous the regime.” The wonder is not that Hoxha won, nor even his margin of victory, but that he felt obligated to stage the sham in the first place. 

I don’t expect our coming overlords to rig our elections that badly; they won’t need to.

The “State of Exception” 

With an (alleged) Biblical plague, the worst economic crash since the Great Depression, and a three-month-long nationwide rolling riot that shows no signs of slowing down, you might find it hard to choose the worst aspect of 2020. Yet it may turn out to be none of these things.

In 2005, Italian philosopher Giorgio Agamben published the book State of Exception. The title refers to an old idea, traceable at least to the Roman dictatorship, which holds (to coin a phrase) that extraordinary times require extraordinary measures.

Of course, sometimes extraordinary times do require extraordinary measures—e.g., the American Revolution. The problem, of which ancient thinkers and jurists were well aware, is that there are always people wishing to proclaim any and every time “extraordinary” so they can grant themselves extraordinary powers which they resist ever giving up. The Roman solution was to limit a dictator’s term to six months and to enforce a strong political-cultural norm that the sooner a dictator surrendered his office, the more honor he gained. Whatever the precise solution, for law and liberty to endure, some means has to be found to deal with extraordinary moments without permanent recourse to lawless power.

Agamben argues that few, if any, countries—and virtually none in the West—have any such means anymore. And all the elites like it that way. Hence the “state of exception” has everywhere replaced the rule of law and is, de facto, the rule.

Nothing has made this clearer than the COVID-19 lockdowns, mask mandates, and other executive directives by governors and mayors who make no pretense of even consulting legislative bodies, much less going to the trouble of passing actual laws. They just decree what they want, and that’s that. 

Americans initially were willing to go along because they feared that COVID-19 would turn out to be what the ruling class and the “experts” still lyingly insist that it is: a once-in-a-century plague primed to kill millions within months. By now it’s obvious that this virus is not that. But the “state of exception” remains.

Will we ever get our liberties back? And if so, how many and to what extent? Agamben is not sanguine. He notes that emergency decrees often are formally lifted, but only after all necessary precedents have been set and new norms sink in. Then, all too often, the measures that were supposed to be “temporary” are later quietly written into ordinary legislation or regulation. Even if they aren’t, an inured and cowed populace finds it hard to muster the will to fight back. Which is the point.

“The disproportionate reaction to something not too different from the normal influenzas that affect us every year is quite blatant,” Agamben wrote in February. “It is almost as if, with terrorism exhausted as a cause for exceptional measures, the invention of an epidemic offered the ideal pretext for scaling them up beyond any limitation.”

You may agree or disagree with Agamben’s assessment of COVID-19 as “not too different” from the flu. He’s not an epidemiologist—then again, likely neither are you. But one characteristic the COVID-19 panic shares with other “crises” cited to justify “extraordinary” measures is that doubt becomes disallowed, dissent censored and purged. It’s settled science! 

Maybe. Though there are plenty of actual epidemiologists, other scientists, and doctors who disagree. Try finding their reasoned arguments on Google, Facebook, or Twitter. You can’t.

Whatever COVID-19 is, its effects so far in no way justify the measures taken allegedly to stop it. One can be excused for wondering if the real purpose of those measures is to stop us. From doing what?

In any case, in the event of a Trump loss, expect more. Joe Biden has already, more than once, threatened to impose a national lockdown and other mandates—if the “experts” say they’re required. But the relationship between alleged “expertise” and ruling class power is so yin-yang incestuous it’s impossible to say where one ends and the other begins. Do the “experts” want to lock us down and rely on their allies in the halls of power to do the dirty work? Or do the powerful use the “experts” to justify what they want to do anyway? Is it both? Does it matter?

A Limited Shelf Life?

It’s safe to assume that little, if any, of the above is out of reach if or when the Left finally achieves total dominance. They’ve already substantially built the regime they want. Is the structure 70 percent complete? Eighty? At any rate, it’s well over 50.

The fact that they’ve gotten this far more than suggests that their ambitions are realizable—for a time. But how long?

In the middle of the last century, Leo Strauss warned:

We are now brought face to face with a tyranny which holds out the threat of becoming, thanks to the “conquest of nature” and in particular of human nature, what no earlier tyranny ever became: perpetual and universal. Confronted by the appalling alternative that man, or human thought, must be collectivized either by one stroke and without mercy or else by slow and gentle processes, we are forced to wonder how we could escape from this dilemma.

The specific regime to which Strauss referred lasted 72 years—a long time for a system so antithetical to human nature, but shorter than many historical tyrannies. The conquest of human nature to which Strauss referred appears to have evaded Communism’s grasp.

Will it be possible for our ruling class to exclude from all political power half or more of the population for even 27 years, much less 72?

Or is the leftist dream now within reach? Can the toxic brew of drugs and porn and tech and all the other tools the ruling class uses to pacify the citizenry and collectivize human thought achieve what the tyrannies of the last century could not?

If President Trump loses, we will find out.

Only Muscular Civic Nationalism Can Save America

America’s destiny can be to remain a leader and an example to the world, while caring for its own citizens in a way that doesn’t alienate the world, but inspires other nations to do the same.

America today faces challenges that cannot be overcome without national unity. Desperate economic hardship and existential international threats are beyond the living memory of most Americans, but they could be coming back. The Pax Americana, in effect since 1945, may be coming to an end. Since the end of the Cold War in 1991, America has been a hyperpower, dominating the world economically and militarily. All of that is now in question.

Every aspect of American power is threatened. America has a new peer competitor, China, controlled by a regime determined to attain superiority over the United States in all aspects of national power: technological, economic, and military. As Chinese power grows, America’s response is increasingly inadequate. American corporations are more than just reluctant to abandon Chinese markets; some of them, such as Google, appear to be more responsive to China’s security concerns than they are to America’s.

America’s culture of tolerance of individual rights and free enterprise has morphed into a dysfunctional encouragement of anti-American dissent that reaches well beyond appropriate grievances. In pursuit of worldwide profits and power, America’s corporate elites have abandoned the culture that nurtured them. In pursuit of utopian ideals, America’s colleges and universities have trained American students to despise America for its failure to be perfect. All the while, America’s politicians in both parties have pandered to America’s most vocal, embittered, and unrepresentative activist factions.

This is America as it enters the third decade of the 21st century. What ideology, what form of revitalized patriotism can heal America? What agenda will awaken a national spirit of unity sufficient to meet and navigate what may be a perilous future?

As it is, America’s corporate and political elites disparage nationalist sentiment, even branding it as inherently racist. Is this motivated by a benign desire to hasten America’s evolution as a people? Is the genuine intention to make America a better, more open society? Is that what is behind the popular condemnation of nationalism and the endorsements of globalism, a peaceful world without borders? And if that is truly the benevolent core of this consensus, which parties to this consensus may have hidden agendas?

A discussion of this topic need not dwell on the threats confronting America today. They’re real enough. The military threat is expertly described in the 2020 book The Kill Chain, a visceral recitation of China’s rise and America’s negligence. The economic threats are equally obvious; for global economic analyses we’ll never see on ABC’s nightly “news,” the forbidden fruit of Zero Hedge or the suppressed musings of Felix Rex are as good a source as any.

Perhaps the most palatable reason America’s corporate and political elites have decided to cater to violent mobs, and America’s cultural elite have tried to transmute all of it into some new version of radical chic, is that there isn’t a more attractive alternative. What unyielding and persuasive ideological counterpoint exists in America that feels safe enough for the establishment to embrace? In the discussion to follow, the thesis to be explored is that civic nationalism can be defined in a manner that goes well beyond its current, barely intellectualized, tepid iteration, bereft of passion, uncertain of purpose, and devoid of popular support.

The case must be made that civic nationalism, colorblind but uncompromising in its adherence to traditional American values, is the only hope to unify Americans, which in turn is the only way a revitalized American civilization can hope to counter the rise of China. 

To borrow an analogy from the 1930s, civic nationalism offers an American unity that could galvanize this nation with fireside chats instead of Nuremberg rallies. It is an inclusive version of American patriotism that rallies all Americans to meet the challenges of the future with pride and unity. It may very well be America’s only hope.

Absent civic nationalism, America’s ruling class is adrift. The wealthiest look east and see even more wealth to be had, rationalizing their anti-American greed with heaping helpings of outdated free-trade liberalism. American politicians look to the Left and see righteous passion, while on the Right there is only defensive muttering or divisive bellicosity. The smartest among America’s ruling class see racial tension and have no answer but to let it fester and grow. Perhaps their thinking goes something like this: We’ll sell America to China, and when the masses realize what’s happened, they’ll blame each other instead of us.

How else to explain what’s happening, as mass unrest continues across America? Paul Joseph Watson, one of those inconvenient YouTubers who is not quite impolitic enough to get banned, but provocative enough to deliver insights (along with insults) you may not find elsewhere, featured this quote from an unidentified guest in one of his recent videos:

People who support Black Lives Matter, people who support the Left; a lot of them think they are in possession of radical political opinions. But how radical is your opinion when the cops and the national guard are kneeling and doing the Macarena, dancing with protesters, and every major corporation has put out a message and donated money to this cause? To the people who are spray painting and burning cop cars and smashing windows, how radical are your opinions, really, when these actions are allowed to take place? Because this is a tactical decision. It’s not like the cops and the national guard couldn’t crack down on this if they wanted to. It’s that it’s being allowed to happen, and if you think otherwise you’re a fool.

This sums it up quite well. The mayhem erupting across the nation since George Floyd’s death on May 25 has been allowed to happen. It is orchestrated by well-funded organizations that are collecting millions from mega-donors and mega-corporations, and egged on by months if not years of propaganda. The unrelenting havoc in the wake of George Floyd’s death is not a precipitous spasm of unrest that will eventually pass. It is a deliberate escalation of an ongoing insurrection.

The primary goals of this insurrection supposedly are to protect black lives and to oppose fascism, with a strong LGBTQ contingent also represented. The ongoing rampage has impacted almost every large American city. Despite dozens of deaths, thousands of injuries, and probably billions in property damage, compounded by the COVID shutdown, this insurrection has plenty of support. Blithely ignoring the destruction, the “peaceful protesters” have received sympathetic treatment from Democrats, and their slogans have been turned into marketing campaigns by major corporations. The media’s coverage of the insurrection has been predictable.

“America is irredeemably racist” is the message spread, with rare exceptions, by every establishment media property, online and offline. So desperate is the media to stoke this message that when a young man who probably just had a few too many drinks uttered a few anti-Asian slurs at a family in a California restaurant—no context was ever provided, despite it being very unlikely that “people being Asian” was the only thing that made this man angry—it was a top story on every major television network in the country. Similarly, when a young woman and her dog felt threatened by a black birdwatcher in New York City’s Central Park, her alleged overreaction was national news.

These are unpleasant events. They are examples of bad judgment, a failure to communicate, a loss of civility. They are not national news. The very idea is ridiculous. But on and on and on this story goes, desperate for fodder. America is a horrible nation, filled with horrible white racists.

Why? Who benefits by making white people out to be so rotten? Who benefits by convincing nonwhites, especially blacks, that whatever challenges they face as individuals and as a community are solely the fault of white people?

The “racist” stigma has been deployed by politicians and activists to manipulate American public policy for decades, because regardless of justification, it worked. But the perception that America is rotten to the core, comprehensively and indelibly defined as racist, used to be a notion largely restricted to academia. No more. Now it’s everywhere. ABC’s David Muir, with his carefully fabricated gravitas, intones yet another variation on the theme in literally every newscast, often several times per newscast. The rest of the gang follow suit, from CNN’s Don Lemon to PBS’s Judy Woodruff.

But America is not an irredeemably racist nation. America in the 21st century is the least racist nation on earth. And yet this destructive lie is the currency of Democrats, the obsession of the media, and the marketing message of global corporations. The duration and weight of this lie, its steady growth despite a steadily vanishing basis for it, go well beyond its obvious goal of convincing Americans to vote President Trump out of office. What else is going on?

Finding a Scapegoat for Present and Future Problems

Behind the mere momentum and opportunism propelling the false and divisive narrative of endemic white racism, there is a formidable alignment of special interests. Foreign adversaries, China and Russia in particular, want the United States weakened by violent internal conflict, and fomenting racial polarization furthers that objective. Multinational corporations benefit by convincing Americans that fighting racism is a national imperative, because it makes it easier for them to stigmatize and silence as a racist anyone who objects to them exporting jobs and importing cheap labor.

Also propelling and profiting from the “America is racist” narrative, of course, are socialists, who have realized that as long as any group disparity exists between whites and blacks, they can argue that racism is the cause, and redistribution of wealth is the cure. But there is an even more insidious motive perpetuating the lie of endemic white racism—the need for a scapegoat.

Those familiar with the momentum of history must wonder how long the Federal Reserve can continue to electronically materialize trillions of dollars to finance federal spending deficits. They must wonder how long American society can continue to function with so many of the nation’s small businesses destroyed by the shutdown. They look with fear upon the millions of American youth disenfranchised by globalization, which robs them of the ability to make a decent living, and by environmentalism run amok, which robs them of the ability to afford a home.

If the last few months have demonstrated nothing else, it is that anything can happen. Who would have thought one year ago that a global pandemic would strip away our constitutional rights as if they never existed? Who would have thought six months ago that our nation would be convulsed with violent riots, and major cities would become virtually ungovernable? And the other shoe has yet to drop. America’s economy remains precariously intact. There are (at least) no new foreign wars. Statues have toppled, select urban streets are still on fire, but widespread, horrific chaos is not yet here. Will it come, and if so, when? It could happen fast.

This is the scenario that confronts America’s billionaires and the political and corporate strategists who serve them. What happens when Americans aren’t just upset and financially squeezed, but desperately hungry and financially broken? What happens when small business owners and their workers aren’t just on a pandemic hiatus, but permanently ruined with no hope? What happens when the only businesses left standing are multinational corporate franchises? What happens when the inner cities are unlivable, the suburbs are besieged, and supply chains for essential products are broken? What happens when the Chinese cold war goes ice cold, and Americans quit their addiction to China’s exports cold turkey?

When people’s lives and livelihoods are destroyed en masse, they look for someone or something to blame. That’s human nature. And perhaps unwittingly, perhaps as a precaution, but regardless of intention or conscious planning, America’s corporate and political elite are preparing the target and hedging their bets, with the full complicity of the establishment media. It goes like this: “When the shit hits the fan, and all hope is lost, don’t blame the people who got rich selling America to China; blame white people. It’s their fault.”

By doing this, the fury of a disenfranchised citizenry will be turned upon itself in a fratricide horrific to its participants but relatively harmless to the elites, and the ideology of the socialists will be co-opted and used by corporate and political elites to further centralize their power. This is already happening, in slow motion. And the more crises that hit America, the more the narrative will intensify. Whites are the problem. Whites are to blame.

Understanding the True Political Conflicts in America

Stoking racial hatred is a dangerous game. Encouraging identity politics is only a winning strategy if the identities being nurtured or disparaged continue to be the chosen targets. But it doesn’t take an expert in political jujitsu to redirect all this poisonous swill. Most Americans have already realized that neither of the mainstream political establishments represents them. A majority of Americans already understand they cannot trust the establishment media. Only two more realizations are needed to change the game: first, that this is not a battle between capitalists and communists but between nationalists and globalists; and second, that in this battle, whites and blacks are not enemies but allies.

The elites are making race and racial oppression the central topic in American politics, and for good reason. Because if you take race out of the equation, there is very little of substance separating the grassroots on the Left from the grassroots on the Right. Why? Because communists and corporations in 21st-century America are working together to advance big government globalism; they both support an authoritarian, collectivist, micromanaged society.

On most of the big issues of our time, including the rejection of traditional moral values, the centrality of “climate change” as a transformative economic and political agenda, and the need for affirmative action, racial redress, and open borders, the two camps share a surprisingly congruent agenda. Only on the issue of private property do they diverge, and even that divergence may be illusory, considering the realistic prospect of publicly held corporations with activist directorates owning virtually the entire economy.

So where are the actual divergences in American politics, if not the distinctions between Left and Right, Conservative and Liberal? The following chart attempts to depict the more relevant political dynamics in America today. The vertical axis represents the split between supporters of nationalism vs supporters of globalism, and the horizontal axis represents the split between supporters of ethnically homogeneous societies and supporters of multiethnic societies.

 On the above chart, for ease of explanation, the quadrants are numbered. In quadrant No. 4, which represents the multiethnic globalists, you find everyone supposedly at odds with each other in conventional political paradigms. The establishment Democrats and establishment GOP (indistinguishable from “Conservatism, Inc.”) are joined by America’s Social Democrats, along with multinational corporations and corporate media, academia, and foundations and think tanks on the “Left” and the “Right.” All of them envision a multiethnic, globalist future.

Diagonally opposite and diametrically opposed to the multiethnic globalists are the ethnic nationalists (quadrant No. 1). As in any of these quadrants, there exists a continuum of passion, from the most hardcore extremists to merely insouciant rebels and provocateurs. But at whatever level of extremism, here is where you find white nationalists, black nationalists, Chicano nationalists, and various other smaller cadres of ethnic nationalists.

Why Civic Nationalism Offers a New American Consensus

Civic nationalists, occupying quadrant No. 2, are beleaguered, inadequately defended or explained, and under attack from all sides. The ethnic nationalists consider civic nationalists to be naïve, incapable of recognizing that cultures are inextricably connected to race, or that some cultures are incompatible. Virtually all ethnic nationalists consider themselves to be oppressed, which tinges their animosity towards their counterparts among the civic nationalists with the additional insult of betrayal.

Multiethnic globalists, for their part, also view civic nationalists as naïve, but for the opposite reason. They consider civic nationalists, merely by virtue of their nationalism, to be pairing up with “white nationalists,” possibly unaware of their complicity, or possibly even deliberately camouflaging their own racist tribalism. After all, how can it even be possible to be a nationalist if you aren’t a racist?

But the whole point of civic nationalism is to reject racism while embracing patriotism. It expresses the quintessentially American ideal of the melting pot. It expresses—not nearly forcefully enough—America’s history of assimilating immigrants into the mainstream culture. Our tradition of assimilation offends ethnic nationalists, who are skeptical that it can still work, but it has also become problematic for multiethnic globalists who typically must defer to identity politics.

The exploration of what it means to be multiethnic but monocultural is one of the prevailing challenges for civic nationalists, and to date they have not been up to the task. They are so busy fending off charges of secretly harboring ethnic nationalist sympathies that they don’t have time to distinguish themselves from the multiethnic globalists. These are fatal distractions for civic nationalists if they keep the bigger questions from being answered.

What is American culture? What defines the American civilization, and how can it be defended? What is America’s tradition of assimilation, if not the preservation of a unique core culture that nonetheless constantly evolves and incorporates dazzling new ideas from around the world, while retaining the foundational values of individual freedom, free enterprise, and European Christian heritage?

It is a tragedy that America’s civic nationalists are a barely recognized and often stigmatized movement. For one thing, once you escape the corridors of the chattering classes or the cadres of extremists, small in number but vocal and politically connected, you find that “civic nationalist” describes the majority of Americans. To the extent that unrelenting globalist, anti-nationalist, anti-American, anti-white bombast from academia, media, entertainment, and politicians hasn’t corrupted their hearts, most Americans love America. They love it just the way it is, imperfect but always evolving and improving, offering opportunities to everyone willing to work hard, a big, sprawling nation with all kinds of different people who are united by the American dream.

That dream—individual freedom and economic prosperity—is threatened as never before, but instead of speaking up louder than ever, civic nationalists are hunkering down. Many of them are afraid to defend their biggest champion, President Trump, who epitomizes the American dream and would be far more popular if he weren’t demonized by the establishment at every turn. But Donald Trump personifies the nightmare of the globalists—an American president who embraces civic nationalism.

Now more than ever, civic nationalism is a movement that must find new adherents and persuasive advocates across American society because, in troubled times, it is America’s only hope for unity.

The Toxic Realignment That Must Not Happen

Where will Americans turn, if the social contract is broken by economic devastation, or an even more serious pandemic, or any other sort of seismic hiccough that inaugurates not weeks or months, but years of turmoil and suffering? What happens if America descends into a new depression, requiring a decade or more of mass hardship that dwarfs anything in living memory?

Here is where the riots and the racism narrative become even more useful to globalists. Here is why the BLM and Antifa militants, with their passionate denunciations of racist America, are being allowed to carry on. Here is why a full-spectrum campaign is being waged to push whites into either paroxysms of self-hatred and guilt, or reactionary anger, and here is why, at the same time, nonwhites are being pushed into blaming whites for literally anything in their lives that isn’t right. Just before the deluge, get them busy fighting each other.

What better way to prevent a populist rebellion against globalism, or, in a related and even more sinister twist, a realignment that embraces conspiracy theories? Referring to the previously discussed chart of political alignments in America, what constitutes the ethnic globalists (quadrant No. 3)? Is there such a thing? If you ask the black nationalists, or if you ask the white nationalists, you will get the same answer: Yes. There is.

The conventional establishment analysis of anti-Semitism in America focuses almost exclusively on its embrace by white nationalists, and the response has been to expose and deplatform any online content that includes criticism of the Israeli lobby in the United States, or assertions that Jewish individuals own a disproportionate share of America’s media, entertainment, and financial sectors.

The problem with this focus on possible anti-Semitism on the part of white nationalists is that it ignores far more pervasive anti-Semitism coming from so-called social justice warriors and Democratic Socialists. And that omission, that selective focus, exposes a deeper problem: in the radicalizing environment of a social and economic collapse, the American corporate establishment may be unable to stop an anti-Semitic narrative from spreading, because in its attempt to co-opt America’s Left, it fed an anti-Semitic beast that grew too big to control. It is funding Social Democrats who, in their obsessive hatred for “Zionism,” are a heartbeat away from publicly hating the disproportionate influence of Jews in American media, politics, and finance. Many of them already do.

Some of the Left’s highest-profile leaders, certainly including members of the “Squad” in the U.S. Congress, have openly spouted anti-Semitic rhetoric. Some members of the BLM movement and its sympathizers also have been openly anti-Semitic, and every time one of their voices is canceled, they become more susceptible to conspiracy theories. If America’s economic and political stability continues to deteriorate, the schizophrenic strategy of the corporate establishment—embracing anti-Semitic Leftist groups even as it crushes any expression of anti-Semitism—will fall apart.

The nightmare scenario the multiethnic globalists are flirting with is a toxic realignment in which American nationalism captures a majority no longer divided by race, because it is unified by hatred of global elites. In the worst case, the perception could spread that the crash was planned in advance, and that a specific tribe of individuals is to blame. If that happens, the populist momentum fueling it will come from Leftists. It will come from the same people who this year occupied a section of downtown Seattle, fought pitched battles with police for months on end in Portland, and spread violence and vandalism from coast to coast.

The conspiracy theories that give rise to toxic mobs don’t have to be anti-Semitic. That’s just one possibility that history has taught is impossible to ignore. But a populist rebellion against globalists can apocalyptically target any group perceived as exploiting the people or lying to them. Global elites. Bankers. Television news anchors. Tech barons. Stock traders. Anyone living in a gated community. Race or creed may have nothing to do with it. It may simply be the upper class, the one percent. That’s still a tribe. It still becomes us vs. them. Where were you when the dam finally broke? If you were a propertied landowner, living on high ground, perhaps you were in on it. And if so, now you deserve to lose everything.

Only Civic Nationalists Can Counter Conspiracy Theorists

To understand the potential of civic nationalism to peacefully unify Americans even in the face of great economic and geopolitical challenges, one must return to the shared agenda of Social Democrats and corporate globalists: the rejection of the traditional nuclear family, the climate change agenda, the rejection of meritocracy in favor of race and gender quotas, enforced equality of outcome instead of equality of opportunity, and mass immigration.

The common thread in all of these policies is that they will harm middle- and low-income Americans, regardless of race. Children need a father and a mother. Climate change policies that enrich corporations and empower leftist bureaucrats will impoverish everyone not wealthy enough to be indifferent to the crushing cost. Abandoning meritocracy in favor of quotas will destroy America’s ability to compete and innovate at the same time as it will breed cynicism and alienation. Mass immigration drives down wages and bankrupts social services.

Civic nationalists are the only ones who can explain that of course Democrats, establishment Republicans, and corporate globalists want to distract us by turning us all into racists and anti-racists who consume one another in endless conflict. Without this massive distraction, how would globalists get away with destroying America’s standard of living while enriching themselves? As we kneel before BLM activists, they take away our freedom. As our pregnant women form a skirmish line to protect Antifa militants, they take away our prosperity. It’s a good scam. Get everybody fighting. This devious, epic, diabolical fraud and hidden agenda must be exposed at every opportunity. But there is also a positive message, promoting hopeful solutions, that is desperately necessary in order to avoid radicalization.

A muscular civic nationalism sets a direct alternative against every one of these pillars of corporate globalism and promotes those alternatives without apology and without reservation. The traditional family is the backbone of society. Fossil fuel, hydroelectric power, and nuclear energy are absolutely necessary to grow a healthy and prosperous economy. Immutable colorblind standards are the only fair and legitimate way to allocate opportunities in all aspects of society. Immigration must be strictly regulated to protect the interests of American citizens, not global corporations.

With these principles forming an uncompromising foundation, civic nationalists have the credibility to reject racism and anti-Semitism. They have an appealing, prosperity-oriented narrative that will attract wavering adherents of ethnic nationalism as well as reluctant globalists. They offer common sense and hope. They offer calm unity. They can reject extremism of all types, whether it’s classic racism or teaching transgender ideology to prepubescent students in the public schools. And they love America.

Emphasizing these policies—pro-family, pro-conventional energy, and pro-meritocracy—has not been the common currency of civic nationalists. Instead, with good reason, they’ve been stereotyped as waffling on immigration, lukewarm on climate realism, AWOL on expressing the problems with race and gender quotas, and, if anything, antagonistic to pro-family sentiments. No wonder they are barely relevant. And no wonder Trump’s enemies get away with accusing him of catering to ethnic nationalists and conspiracy theorists. They claim nobody else is out there, and in one important respect, they’re right. The civic nationalist movement, despite its potential to become the center of gravity in American politics, lacks a critical mass of leaders with the voice and visibility to give it an undeniable presence.

America’s Civic Nationalism and Foreign Affairs

An important criticism of nationalism of any kind in America is that it allegedly ignores the interconnected community of nations and steers America towards isolationism. A related criticism is that America cannot abandon the multilateral agreements and security guarantees that have underwritten relative stability in the world for the last 70 years. Underlying these criticisms is an argument in favor of globalism, something with too much legitimacy to be merely dismissed. But a distinction must be made between globalization and globalism. Globalization is a phenomenon. Globalism is an ideology. Even more to the point, globalism can be understood in various ways, including ways that embrace nationalism.

The phenomenon of globalization is unrelenting, fueled by trade, migration, capital flows, technological innovations, and revolutions in transportation and communications. It can’t be stopped, but it can be managed. American nationalists are correct to point out that for Americans, in recent decades, the benefits of globalization have been largely illusory if not negative. While free movement of capital and people has made multinational corporations more profitable, it has hollowed out American industry and depressed American wages.

At the same time, the American consumer has paid far more than his foreign counterparts for drugs and medical treatments, effectively subsidizing the development of these cutting-edge therapies so that American manufacturers can sell them at competitive prices in the rest of the world. The American taxpayer has paid for a military establishment that guarantees open sea lanes and global security. The American worker has paid the price for job creation and economic growth overseas. The American household is overwhelmed with debt, borrowing against the bubble value of its home to pay for overpriced tuition and imports from foreign manufacturers. The American economy has been turned into a gigantic, overleveraged mass of collateral for foreign debt.

For the rest of the world, there’s been upside to all of this. And for dispassionate economists, if the overall economic growth of the world exceeds the negative impact all of this has on American economic growth and median household income, that’s a worthwhile exchange. But it’s also short-sighted. Liberal free-trade policies work until they don’t. How does it benefit global stability when the economy of the United States implodes after decades of unsustainable, debt-fueled growth, leaving nothing but a hollowed-out nation riven by social conflict?

Navigating this rebalancing, where the United States continues to provide global leadership in a community of nations, but no longer sells its national assets to fuel half-trillion-dollar annual trade deficits, is something that globalists have to come to terms with because it is inevitable, and nationalists have to come to terms with because complete isolation is impossible.

A Civic Nationalist Approach to Globalism 

Managing globalization requires choosing which globalist ideology to reject or embrace. The conventional globalist ideology holds that borders should be erased and that people and capital should move freely. In this manner, according to this version of globalist ideology, all people on earth, overall, will be better off. An alternative version of globalism holds that a community of sovereign nations is the only fair and realistic way to manage globalization, and that unrestricted movement of people and capital destabilizes nations, punishing the cultures that historically have been successful.

A civic nationalist doesn’t have to be an isolationist. Civic nationalism can promote the vision of a community of nations, competing and cooperating, with each managing globalization on its own terms. For Americans, civic nationalism can recognize that American leadership in the world remains essential not only to promote Western ideas of individual freedom and free enterprise but also as a purely pragmatic matter. When America is socially unified and economically and militarily strong, it deters war, sets an example for emerging nations, and generates the wealth necessary to invest in the developing world.

Much of what passes for foreign aid in the developing world is wasted. Without wholesale changes in priorities, calls to end foreign aid and foreign investment are justified. But here is where the primary foundations of civic nationalism in America can also find expression in foreign aid and foreign investment. If America needs upgraded infrastructure and more cheap, reliable, conventional energy, imagine how much greater the need in African nations. And yet today the preponderance of foreign aid and foreign investment goes to expensive and ineffective solar energy projects and other “sustainability” initiatives that arguably do more harm than good, while, in a gross and hypocritical inversion of logic, untapped hydropower and conventional power grids are not considered “sustainable” options.

Much of today’s globalist approach to foreign development has been not only misguided but fundamentally misanthropic. Investment in developing nations that emphasizes cost-effective conventional infrastructure and power generation will yield profitable returns, reducing if not eliminating the need for government involvement.

This, too, is an alternative view of globalism that America’s civic nationalists should embrace without reservation. It would pay financial dividends to American investors, manufacturers, and civil engineering firms, and it would offer developing nations a viable way out of poverty. It would even spare these nations further environmental degradation, as population growth eases through prosperity, and these economies move from burning wood, eating endangered game, and practicing inefficient subsistence agriculture to, for example, developing nuclear power and adopting modern agricultural techniques.

By exposing these misanthropic experiments in foreign aid and foreign investment and abandoning them in favor of initiatives that will deliver rapid and genuine prosperity, civic nationalists can present new ways to manage globalization that entice developing nations. As it is, the practical effects of globalist policies in the developing world are exploitative. Like their domestic equivalents, their only benefit is to subsidized investors, misguided (or wholly corrupt) nonprofits, and state bureaucrats. Civic nationalists can break this cycle, and offer a hopeful vision to aspiring communities at home and around the world.

The Constituency of Civic Nationalists

On the surface, the coalition that constitutes the multiethnic globalists in America wields seemingly insurmountable power. After all, they have the corporations, the entire establishment uniparty, the colleges and universities, the public schools, the media and entertainment complex, and most of the billionaires. But this coalition is not invulnerable, for reasons that have already been explained in part. As noted, the blatant anti-Semitism of the American Left threatens to short-circuit its marriage of convenience with the corporations and billionaires that currently indulge it with money and supportive marketing campaigns.

There is another constituency that currently enjoys a marriage of convenience with America’s corporations and billionaires, and that is the labor movement. Despite its reliance on rhetoric that attacks corporations and billionaires, for the most part America’s union leadership shares the same agenda as its supposed adversaries. Immigration is the prime example. To counter the systemic racism that supposedly defines white America, to partially redress the colonial theft of North America by whites, and to partially atone for supposedly causing catastrophic climate change that disproportionately harms people in the “global south,” America’s leftist-dominated unions clamor for mass immigration. But how is this in the interest of the American worker?

There’s a reason President Trump has enjoyed support from millions of union workers across America. This is another example of support that invalidates conventional political antagonisms. Trump is supposedly “right-wing” and labor unions are supposedly organized to fight exploitation by right-wing interests. But none of that is applicable. Trump is American, and is promoting America First policies that benefit the American worker. Trump is a nationalist, and nationalist policies resonate with workers, and ought to resonate with any union that cares about Americans.

The labor movement in America is destined to split. Those unions that truly support the American worker will embrace the foundational economic premises of civic nationalism—specifically, development of conventional energy, expansion of practical infrastructure, strictly regulated immigration, and America First trade policies. Those unions that cling to the corporate globalist narrative will become nakedly internationalist and anti-American and anti-white, and they will continue to pretend that economically unsustainable “renewable” energy and open borders will further the interests of the planet and humanity. Labor unions, if they are true to the interests of the American worker, can play a critical role in the ascendance of civic nationalism.

Another constituency of the civic nationalists is every parent in America. It is hard to imagine any special interest more deserving of indictment than the teachers’ unions. If you want to know what animates the thousands of rioters and their millions of sympathizers, look no further than the 12 years of anti-American, anti-white, anti-capitalist indoctrination they got in America’s public schools, thanks to the teachers’ unions. Along with what passes for a college education these days in America, the publicly funded, unionized education establishment from kindergarten through high school has brainwashed a generation.

Any parent or community leader who recognizes that children and young people need to acquire basic skills and a work ethic, and that those priorities have been abandoned thanks to the influence of teachers’ unions, is ready to embrace civic nationalism. Any parent who recognizes that race-baiting and identity politics are dead-end curricula, offering nothing but excuses for failure and rationalizations for government handouts, is ready to embrace civic nationalism. Breaking the ideological monopoly that America’s teachers’ unions have on its public schools is a goal that millions of Americans will share, and should be a top priority of civic nationalists.

Civic nationalism appeals to additional powerful constituencies. Corporations that still aspire to serve Americans first but have been marginalized by the Left are plentiful if you know where to look. The nuclear power industry is a prime example, ready to help expose the lies that have stopped its progress in America for decades. The American oil and gas industry is another one. It is one election away from being systematically shut down, with catastrophic consequences to the economy and national security. Civil engineering firms that want to rebuild America are ready to embrace civic nationalism.

The list of potential advocates of civic nationalism is bigger than people imagine. EcoModernists who recognize the environmental benefits that will result from investment in conventional infrastructure in America as well as in developing nations. Members of law enforcement who have had it with, for example, “catch-and-release” laws in progressive, criminal-friendly cities. Members of the military who realize that if more naval officers had been assigned to fire safety, and fewer officers had been preparing PowerPoint presentations on transgender sensitivity training, the USS Bonhomme Richard might be ready for redeployment instead of being an incinerated hulk.

A Common Fate, a Common Vision

The appeal of nationalism ought to be obvious. It is natural to yearn to be part of something bigger than oneself. This is why racial division is the most potent weapon that globalists have to divide Americans. Why should a black American revere the history of America, if the emphasis that is continuously thrust at him is the legacy of slavery? Why should a Chicano accept the sovereignty of America over Aztlan, if all he’s taught is how the land was stolen from Mexico? Why should Native Americans accept the white majority, if the entire North American continent was stolen from them? Why, indeed, should Asians move to America and assimilate, if assimilation has become a dirty word, whereas their cultures of origin remain proud and confident?

This is where a muscular civic nationalism offers the only viable hope of unifying America’s ethnicities into a coherent national culture. It’s a cliché, but nonetheless true, that America’s white majority historically has been slow to accept immigrants who were different. But to America’s exceptional credit, assimilation eventually happened, every time. Ethnic groups that at one time were not assimilated are today considered part of America’s white ethnic group: Eastern Europeans, Southern Europeans, the Irish. Moreover, millions of Americans are considered “white Hispanics,” and additional millions of Asians have blended into the American mainstream. The secret you’ll never hear from race-baiting propagandists like ABC’s David Muir is that 17 percent of American marriages are now “mixed race.” America’s ability to assimilate is ongoing, despite all attempts to divide us.

This all raises the question: What is “white”? Is it skin color, or cultural affinity? A civic nationalist has to confront this question squarely. Americans don’t have to be white. But they do have to be American. This means adopting America’s values and traditions, and feeling part of an American culture that has its roots in white European civilization. That’s just historical fact. Racists, whether they are those white nationalists who fit the overhyped stereotype, or the suddenly fashionable black nationalists, or the racially obsessed mainstream American globalist establishment, care very much about skin color. Civic nationalists do not care about skin color. They care about preserving American culture, and they welcome anyone who shares that goal.

So why shouldn’t black Americans embrace civic nationalism? Because it would mean they are selling out and becoming “white”? That stigma is apparently attached to studious, school-aged blacks. An element of some inner-city cultures holds that to carry your books home to study after school is to become “white,” and is somehow a betrayal of black identity. This is an absurd and destructive notion that should be opposed by any black person serious about the success of their community.

Equally important, for a white person to kneel to BLM activists is not only an act of cowardice and ignorance; it also reinforces two lies. The first lie is that white people have no right to suggest anything to black people. This is ridiculous, because blacks are the swing vote that will decide the fate and future of the nation. The second lie is that racism is the chief obstacle to black achievement. In truth, the primary barrier to black achievement in America is a thug culture that undermines if not terrorizes black communities, expressed in broken homes, substance abuse, gang violence, contempt for education, and rejection of law enforcement.

If white people truly care about black people, they will challenge these lies. And if they do, they will have plenty of help from black conservatives, who unfortunately are ignored by the establishment media, but whose messages are bubbling up through the internet and in churches and education reform organizations and elsewhere.

The notion of shared fate can be colorblind, and a civic nationalist has to emphasize this while at the same time not succumbing to the premises and the language of the Left. For example, “colorblind,” “assimilation,” and “meritocracy” are not code words for racism. They are noble concepts to live by, they are the inclusive premises of the American civilization, and they must be defended at all costs.

Americans can unify as a single, colorblind culture. There is no reason why any American citizen, of any color, cannot read the founding documents of America and be inspired by them. There is no reason why any American, regardless of his ethnic background, cannot appreciate America’s unique commitment to individual rights and free enterprise and private property, and understand its transcendental value. There is no reason why Americans of all races cannot view America’s history not as “deeply flawed,” but instead as an illustrious story of evolution from an inspiring beginning to what it is today, through perpetual refinement—a nation of unparalleled opportunities for everyone willing to work hard.

America’s destiny, according to civic nationalists, can be to remain a leader and an example to the world, while caring for its own citizens in a way that doesn’t alienate the world, but inspires other nations to do the same. America’s destiny can be to invest in practical, prosperity-oriented projects at home and abroad, to maintain technological and military preeminence, and to blaze a trail into the solar system. This is a vision that civic nationalists need to let every American know they can share.

What Does the Future Hold?

The multiethnic globalists are on a path that leads, ultimately, to the destruction of America as a sovereign nation. 

There is no guarantee they will succeed, or if they do, that it will be a smooth transition. The conflagration they’re inviting by fomenting racial conflict, especially in the event of economic collapse, may not be kind to the corporations, the billionaires, the bankers, or anyone else perceived as party to the chaos, including Malthusian environmentalists who oppose everything that might create sustainable prosperity. 

But then again, they might win.

Elite consolidation of power has taken many forms throughout history. Centuries ago, feudalism united the elites atop the peasantry. In the 20th century, with stupefying brutality, we saw Spanish and Italian Fascism, German Nazism, Soviet Communism, Japanese Militarism, and Chinese Maoism. Today the examples are plentiful, including President Xi’s fascist, racist, expansionist Chinese empire and Maduro’s pathetic, brutal oppression in Venezuela.

Here in America, perhaps the best vision of where we’re ultimately headed if the globalists get their way is a society that lies somewhere on a spectrum between Aldous Huxley’s Brave New World and George Orwell’s Nineteen Eighty-Four. Whatever individual freedom Americans once knew will no longer exist. We will become undifferentiated human matter, as much a commodity as the products we consume, Pavlovian in our political rectitude, under a watchful corporate Panopticon.

Americans have been betrayed by their elites. The globalist agenda of open borders, unfettered movement of capital, the rejection of traditional values, the rejection of meritocracy, the hysterical overreaction to “climate change,” and the heedless accumulation of debt to fund the development of foreign economies—including the Chinese military—has been accepted and promoted by virtually every major institution in America: unions, corporations, academia, K-12 public education, the media and entertainment business, the Democrats, and most of the Republicans. They lied about all of this, and in so doing raised the cost of living even as they deprived Americans of good jobs.

There is an irony here: for multiethnic globalism to succeed in the long run, America’s elites needed to treat American citizens better in the short run. They did not. Now they are hiding behind racial tension, stoking it, funding it, allowing it to happen, hoping to suppress the truth of their betrayal, hoping to deny us consciousness of our shared and colorblind fate.

Civic nationalism is the only alternative to a bleak future for Americans. It can offer compassion, inclusion, optimism, an agenda of hope, and economic revival. Absent the achievement of national unity through a civic nationalist realignment, Americans may either descend into racially motivated civil war or acquire a stability akin to that of Xi’s Xinjiang.

The choice is ours to make, and time is running out.

Weekend Long Read

This article originally appeared on MindingTheCampus.org and is republished with permission.

Minute 14 for Robin DiAngelo

It’s minute 14 for Robin DiAngelo, and the clock is ticking. Perhaps it’s time for her to bank those royalties, cash those speaker-fee checks, and fade out of the public consciousness.

What happens when, almost two decades ago, a non-psychologist sets up a small and shoddy human psychological experiment at a university—an experiment in which the eight subjects are repeatedly lied to, in which she brings in hand-picked collaborators to commit a deception scripted by critical racialist ideology, and in which she sits and watches the project careen out of control but does nothing to stop the debacle?

If you’re Robin DiAngelo, you scrape together the rubble of your failed experiment, write a sloppy dissertation rife with metaphysical jargon and riddled with spelling errors, and serve it up as the centerpiece of a Ph.D. that you variously describe as being in “education,” in “curriculum and instruction,” in “critical discourse analysis,” or in “whiteness studies,” depending on the audience.

And based on this, you announce the discovery of something you call “White Fragility,” so that you can collect millions of dollars from gullible folks who prefer their prejudice served up in a way not seen since wily medieval alchemists duped supposedly savvy aristocrats into believing they could transmute lead into gold.

Yes, that Robin DiAngelo, she of the bestselling racial-flagellant manual White Fragility, which has quickly become part of the diversity industry canon.

Today, DiAngelo is likely America’s best-known diversity demagogue, one of the many folks who travel the land armed with motley credentials and lots of hubris to “train” people in “anti-racism.”

Her particular shtick is quite possibly the sweetest of the current crop. Some have even called what DiAngelo does a “grift.”1 A “grift” is a petty or small-scale swindle perpetrated by a streetwise conman. In the 1973 film “The Sting,” the Robert Redford character Johnny Hooker was a grifter. In running his small-scale con, he stumbled into a world of big-time organized crime, and he scored his chance to run a Big Con.

Some consider DiAngelo a kind of Johnny Hooker of the diversity industry. Like Hooker, DiAngelo takes her shot at the big time, goosing some life into her 2018 social fantasy White Fragility.

As this is written, White Fragility sits atop the New York Times bestseller list and is being discussed on college campuses nationwide. She’s in demand, and for some reason administrators are more than willing to pay her $12,000 speaking fee.

DiAngelo often describes herself as a “sociologist,” and so you might assume that she is a researcher with a significant body of original research on which she grounds her newly popular notion of “White Fragility.”

This is not the case. Far from it, in fact.

DiAngelo has an undistinguished track record of publishing her opinions in what are sometimes called cargo cult journals. Without exception, hers are introspective journalistic pieces that reiterate what other opinion writers have said, that recite the catechism of critical racialism, and that share her personal experiences as a diversity jongleur on the front lines of “multicultural education” and “anti-racist training.”2

DiAngelo has no original research record to speak of, unless we count her substantively disastrous human subject psychology experiment, which is described in her 2004 dissertation: “Whiteness in Racial Dialogue: A Discourse Analysis.” This is where she first used the term “white fragility,” which she apparently lifted from a person by the name of David G. Allen, who served on her committee.

In point of fact, the entire foundation of DiAngelo’s “theory” of “White Fragility” is constructed from the fruits of this human subject experiment gone awry 17 years ago.

For Robin DiAngelo’s fans, it makes not one whit of difference that she could be one of P. T. Barnum’s “humbugs, delusions, impositions, quackeries, deceits, and deceivers,” and surely no better than a Melanesian weather doctor casting chicken bones in the dust and intoning about mana and the impending yam harvest. But for normal people, hers is a fascinating and cautionary tale of how a provincial striver can cobble together a dramatic career ascent out of academic fakery and pseudoscience and ride the madness of the crowd to riches and fame. In crafting her diversity narrative, she succeeds by combining the fakery of critical racialist ideology with the general tendency of soft minds to express guilt and to confess to most anything.

This piece provides the backstory to one of the greatest social fantasies of critical racialism perpetrated in the 21st century by one of the country’s least likely racialist antiheroes, and yet currently America’s hottest diversity jongleur.

When She Was Just Another Workshopper

Who is this “academic” and “educator” DiAngelo?

Before she attended the University of Washington to acquire the ultimate academic credential, DiAngelo was one of many small-time jongleurs who milked the diversity scene in the 1990s, sometimes contracting for herself and at other times partnering. She signed on early to the diversity enthusiasm and conducted the now ubiquitous “workshop” in places as diverse as the Seattle police department, Seattle city schools, and the National Coalition Building Institute.

On entry to graduate school at the University of Washington in 2000, she taught courses in the School of Social Work and the teachers college. You can guess the topics, which serve as markers for a narrowly educated and professionalized ideologue: “Multicultural Education,” “Cultural Diversity and Social Work,” “Intergroup Dialogue Facilitation.” That sort of thing.

At UW, she was also engaged in small-time human subject experimentation to reap an academic credential—a Ph.D. in, well . . . whatever she finds it convenient to call it today.

DiAngelo progressed to the point of conducting her own research, which she would later chronicle in a dissertation. This research was a human subject experiment. Its purported results would serve as the basis for the notoriety and riches that would come almost two decades after she first used the term “white fragility” in that dissertation.

In 2003, DiAngelo indeed piloted the techniques that have given her cachet in 2020—tautology and circular reasoning, intellectual parochialism, thought reform, and outright academic fakery. But in point of fact, this project she conducted 17 years ago likely should have barred her from receiving the credential that she bruits so prominently today.

DiAngelo was clearly in over her head as she tampered with the psyches of a group of eight unsuspecting “white people” through an experiment grounded in the well-worn psychological techniques of thought reform in a racialist workshop format. This type of cavalier treatment and occasional outright abuse of human subjects in a workshop format is a hallmark of those engaged in critical racialist ideology in its thought reform version, both on and off the university campus. It is this ideology that animates DiAngelo and many other jongleurs like her, and so it is worth a moment to learn something about it.

The “Dialogue” of Thought Reform

One of the first characteristics to learn about critical racialist ideology is that its propagation is largely performative and is publicly presented in theatrical productions called workshops, caucuses, dialogues, or conversations, particularly on university campuses as part of what is euphemistically called the “co-curriculum.”3

The “co-curriculum” is the vehicle by which superstition and pseudoscience gain access to university campuses outside the purview of faculty, who would vet and block these types of programs as clear-cut charlatanry. This aspect of the racialist “workshop” goes largely unremarked in much of the mainstream discourse about the academy, and yet it is a ubiquitous activity in the university, an important component of the “co-curriculum.” In her experiment-cum-workshop, DiAngelo used a critical racialist technique called “intergroup dialogue.”

The primary point to understand with the critical racialist “dialogue” technique is that it does not constitute a “dialogue” in the generally accepted sense of a conversation between two or more persons that may or may not include an exchange of ideas and opinions. Euphemistic tropes such as “difficult dialogue,” “intergroup dialogue,” or “courageous conversations” mean something significantly different in the critical racialist lexicon. These “dialogues” consist of discussions guided by “facilitators” trained in the tenets of racialist doctrine; their task is to ensure that participants understand their scripted roles and adopt and perform those roles as they journey to “critical consciousness,” which is shorthand for acceptance of the conspiracist worldview of critical racialism.

This is the only discourse permitted in the “dialogue,” and facilitators are trained to ensure that this happens. Indeed, facilitators are cadre-trained to the role of what psychologist Irving Janis called the “mindguard.”4 The self-regarding workshop facilitator is a familiar bit-player in both the university and, increasingly, the larger corporate world. While generalizations always admit of exceptions, here we emphasize the rank-and-file workshops to be found nationwide, which are similar enough in staffing, form, and content to permit us to draw a number of conclusions. Almost without exception, the personnel who run “diversity” workshops are either inadequately credentialed or not credentialed at all in the fields for which they claim expertise. In DiAngelo’s experiment, both facilitators were 23 years old and fresh out of college.

If such workshops actually encouraged universally appreciated values, such as equality under law, mutual tolerance for differences of opinion, and morally correct behavior toward one another, a majority of people would likely support them. And this is the impression, I wager, that most folks have of such activities. Who could be against “anti-racism training” or “diversity” or “learning about race” or having “courageous conversations”? But this is far from what occurs in the events that carry these euphemistic labels.

Undergirding and permeating all such “diversity” events—without exception—is the racialist ideology of critical race theory and critical pedagogy, both of which have been confected by second-rate academics and their fellow travelers on the workshop circuit.

These workshops offer material sourced from the critical racialist ideology that originates in critical discourse communities in the university. Too often, the material is brought in from sketchy outside non-academic operations.5 Investigation into the academic backgrounds of a random sample of workshop facilitators, education “counselors,” and quasi-academics reveals a cohort of substandard practitioners of racialist ideology, who use their narrow educational brief—often a master’s-level degree in “counseling”—to extend themselves into fields for which they are unqualified, primarily psychology.

This lack of credentials and the amateurish, coercive character of these workshops are telltale features of the workshop phenomenon, not exceptions; detailed after-action accounts are published in cargo cult journals, along with discussions of coercive tactics designed to increase the didactic effectiveness of critical racialist ideology. The most egregious aspect of these workshops is their content, which is informed by a particular brand of critical racialist ideology unsupported by vetted mainstream scholarship.

Workshops grounded in critical racialist ideology are invariably performative. In this sense, they are scripted affairs of psychological manipulation designed to identify villains and victims in a larger ongoing ahistorical drama, and to confront the villains in a well-practiced theatrical performance.

Critical racialist ideology leans heavily on the notion of “systemic racism” that confers something called “privilege” on favored racial group(s). Workshops informed by this ideology are constructed to confront people who have been identified as “privileged.” This confrontation consists of an accusation that the subject is complicit in “racism”—the accusation is framed as a “difficult dialogue” or “courageous conversation.” Workshop facilitators expect a range of responses to their charges of complicity in the “racist” system. In this case, when the facilitators confront “white people” with alleged complicity in a structurally racist system from which they benefit in the form of “unearned privileges,” the people challenged respond in particular and predictable ways.

Racialist workshop facilitators proceed on this assumption, attack certain participants, and then invariably observe the behavior that their theory “predicts.” In the workshops themselves, people do not respond well to this, just as most people do not respond well to perceived false accusations. Critical racialist ideology interprets these reactions as “resistance” to what is being “taught.” Oddly, because it is expected, this “resistance” also constitutes evidence for the very material presented in the workshop. In this way, a primitive circularity of argument provides faux “evidence” of the central contentions of the “diversity” event; by virtue of such “successful predictions,” the theory can never fail. Self-contained tautological systems such as Marxism, psychoanalysis, and astrology have always worked in this same pseudoscientific way. This fallacious process has become known in the vernacular as Kafka-trapping, after Franz Kafka’s classic work The Trial—required reading for anyone who would understand the thrust of critical racialism today.

It is likely that no better example of performative pseudoscience exists than the contemporary theatrics of the critical racialist workshop.

Performative Pseudoscience

Steeped in racialist ideology, facilitators in these workshops catalog the reactions of “white people” to their accusations and believe themselves to be engaged in a kind of psychological exercise of enlightenment. They are, however, engaged in an entirely different activity, one which is strikingly familiar to anyone acquainted with the behavior modification techniques of Maoist China.

Ideological attacks on captive audiences to achieve behavior modification are nothing new. This is called “thought reform,” a form of coercive psychological human experimentation in behavior modification used with varying levels of intensity across a range of situations, some of them called “educational.” Robert Lifton, the world’s expert on thought reform of this type, describes it this way: “There is the demand that one confess to crimes one has not committed, to sinfulness that is artificially induced, in the name of a cure that is arbitrarily imposed.”6 This is the undercurrent of every racialist “dialogue,” “workshop,” “caucus,” and “courageous conversation,” and it was clearly present in DiAngelo’s “dialogue” experiment.

DiAngelo set up her human experiment using an “intergroup dialogue” template, one she actually taught at UW. The idea of “intergroup dialogue” originated at the University of Michigan in Ann Arbor in the 1980s and emerged in the 2000s as a distinct program of thought reform that relies upon proven techniques of group therapy. As is the case with most racialist thought reform programs, this one carries a neutral, anodyne description, clarified only by its final two words, which communicate a powerful and seductive agenda.

Intergroup dialogue is a face-to-face, interactive, and facilitated learning experience that brings together twelve to eighteen students from two or more social identity groups over a sustained period to explore commonalities and difference, examine the nature and consequences of systems of power and privilege, and find ways to work together to promote social justice.7

Anything that facilitators offer from the menu of critical racialism, of course, becomes de facto a promotion of this “social justice.”

Intergroup Dialogue employs a “conveyer belt” approach in its application; the purpose is to inculcate in human subjects the worldview of critical racialism. This conveyer belt moves participants along to “critical consciousness,” a state of full acceptance of the ideology.

Theorists suggest that the process of understanding one’s social identities in relation to systems of oppression such as racism and sexism generally moves from unawareness to exploration to awareness of the impact of social group membership on the self and finally toward internalizing and integrating this awareness.8

The “conveyer belt” metaphor is used often within the critical racialist literature to describe the stage-by-stage process by which target subjects are moved along, none of which is revealed to the victims.9 It is a common metaphor in the thought reform literature as well.10

This workshop template informed DiAngelo’s experimental construct. Afterward, she employed something called critical discourse analysis (CDA) to evaluate these faux “dialogues” and to generate her conclusions. More on CDA in a moment.

Let’s review the specifics of this human subject experiment to gain an idea of DiAngelo’s mindset and method, the near-malfeasance that bears scrutiny, and the almost nonexistent basis for DiAngelo’s grand idea that is enriching her even now.

Human Subject Experimentation: The Setup

DiAngelo wanted to discover how “whiteness is manifested” among white preservice teachers. Or at least, this is the purpose she claims in her write-up. Says DiAngelo initially:

The purpose of this study was to describe and analyize [sic] the discourses used by White preservice teachers in a dialogue about race with people of color.

That’s no misprint—she misspells the word “analyze” just 10 words into her first major academic work. As if to say “yes, I really meant it,” she misspells it again the same way on page 22. For academics—even would-be academics—this is no small thing, and these types of errors riddle the piece, leaching away credibility vowel by consonant by vowel.

The experiment itself, however, seems simple enough. Even elegant.

It consisted of a series of four “conversations about race,” each lasting two hours. The discussants were eight white preservice teachers and five “persons of color.” The white subjects self-selected into the experiment in answer to an ad. All of the white subjects were apparently students in the UW teacher education program. What about the “persons of color”? They arrived via a different selection process.

The “persons of color” were selected by DiAngelo herself from other departments, such as the School of Social Work. DiAngelo also selected the facilitators: two fresh college graduates—both 23—whom DiAngelo herself had trained in “leading racial dialogues” using the techniques of “intergroup dialogue.” The sessions were filmed, and DiAngelo watched as an observer present in the room.

This seems reasonably simple and straightforward. But it was not.

What DiAngelo presented as “dialogues on race” among participants “from a range of racial backgrounds” was actually an ideological set-up of the racialist workshop variety. In the vernacular, it would be called an ambush.

The eight unsuspecting white participants self-selected into the study; the five “persons of color” were selected by a different method known only to DiAngelo; and the two facilitators were trained in the precepts of critical racialist ideology, a pillar of “intergroup dialogues.” Even non-psychologists can sense something amiss here. The participants certainly did, almost immediately.

From the very beginning of the experiment, the “purpose” of the sessions became an issue. Why? The white participants believed they would be participating in the study described in the advertisement and in the consent form, but what actually transpired in the meetings was something dramatically different. This is because DiAngelo and her collaborators were engaged in an entirely different enterprise from the one the subjects were told about.

DiAngelo reveals in her dissertation that she and her collaborators were, in fact, enacting a critical racialist thought reform script whereby her “trained facilitators” would lead these unsuspecting subjects into a thicket of ideology on “whiteness.” This led to contention and outright conflict in each of the sessions, as the white participants realized that something very different from what they expected was unfolding.

To their credit, some of the white participants repeatedly challenged the facilitators on the purpose of the experiment. In answer, they received only “policy readings”: repeated readings of the advertisement and consent form rather than answers to their specific queries. This retreat into policy readings, of course, is the face of bureaucracy, in which functionaries either cannot or will not engage with facts on the ground. It is also a red flag that something unsavory is afoot. This was not lost on the targets of the experiment.

Throughout all four of these sessions, the subjects rebelled against the apparent trickery. In the last of the four sessions, in fact, one angry participant walked out, even as facilitators badgered her to stay. This badgering alone was a violation of the participant consent form.

How did DiAngelo interpret all of this after the fact, once she had deployed her method of critical discourse analysis?

DiAngelo interprets the facts in ways that seem strangely disconnected from the reality of what actually transpired, and there is good reason for this. To compound what some might consider malfeasance, DiAngelo evaluated the “data” from her psychological experiment using a discredited method that was guaranteed to yield her “hypothesized” results: critical discourse analysis. It sounds sort of impressive until you poke around a bit, for DiAngelo is neither a psychologist nor a linguist.

Let’s look first at this critical discourse analysis and then at DiAngelo’s extrapolations.

The Progressive Scam of “Critical Discourse Analysis”

When we think of a social scientist utilizing a method to explore a question—or test a hypothesis—we think of a researcher trying to discover something new, to generate new knowledge: to substantiate or to disconfirm the question on the table, with the ultimate result in doubt. “Critical discourse analysis” (CDA) does something quite different.

CDA is one of a handful of “guarantor methodologies” that, as the name suggests, guarantee delivery of the results the researcher desires.11 Critical discourse analysis is an ideologically driven version of discourse analysis that is specifically crafted to yield desired ideological results. In this, it is not a real method of inquiry at all, but rather a common heuristic tool that sanctions what one wants to see. It constitutes codified confirmation bias. In other words, anyone who employs CDA knows the “results” beforehand, and these results always confirm progressive notions. It’s no secret that this is CDA’s purpose; its proponents freely acknowledge that the method is “unabashedly political and responsive to social injustices.”12

In this CDA enterprise, high spirits, sensitivity, and emotional investment carry the day. Northrop Frye described the style many decades ago, coining the term “kinetic emotion” to capture its fevered tenor.

The further we go in this direction, the more likely the author is to be, or to pretend to be, emotionally involved with his subject, so that what he exhorts us to embrace or avoid is in part a projection from his own emotional life. As this increases, a certain automatism comes into the writing: the verbal expression of infantile-centered hatreds, fears, loves, and objects of adoration. . . . Such writing is a familiar and easily recognized phenomenon: it is tantrum prose, the prose of so much Victorian criticism, of several acres of Carlyle and Ruskin, of clerical denunciations of heresies or secular amusements, of totalitarian propaganda, and in fact of nearly all rhetoric in which we feel that the author’s pen is running away from him, setting up a mechanical for an imaginative impetus. The metaphor of “intoxication” is often employed for the breakdown of rhetorical control.13

Linguistics academic H.G. Widdowson, sympathetic to CDA, nonetheless echoes Frye:

The commentary is effective to the extent that it has affective appeal, that it carries conviction, resonates persuasively with the attitudes, emotions, values of the reader. And since it is the avowed pretextual mission of CDA, as an approach, to induce sociopolitical awareness and inspire social action, this kind of commentary is very well suited to its purpose. Promoting the cause of social justice does not depend on being methodical in analysis, nor even on being coherent in argument. The case for CDA is subservient to its cause, and if the case carries conviction that is all that counts.14

And so, by using critical discourse analysis, DiAngelo knew exactly the results of her human experiment before she ever evaluated the transcripts of her workshop-experiment. In the vernacular, it’s a fake methodology used to generate fake scholarship, no better than the yam-harvest predictions of our esteemed Melanesian weather doctor, which are always correct, regardless of what happens.

But DiAngelo was not nearly done with her manipulations.

Universalizing the Provincial

She then universalized these contrived results, drawn from her tiny sample of eight self-selected persons from a college teacher-education program. From her findings, such as they were, she sought to generalize about “white people” in all of America. This included her social fantasy of “white fragility,” which she derived from this tiny, skewed sample of unwitting subjects.

A person who has facility with any scientific undertaking knows that a project with an n of 8 would not win entry into the typical middle-school science fair, and it surely is an unacceptable sample from which to draw any conclusions whatever, other than about the people involved. Bizarrely, DiAngelo herself acknowledges this in writing: “Less than 10 participants would not have provided a wide enough range of discourses.” Yet she inexplicably includes only eight white subjects; the additional five “persons of color” were simply shills that DiAngelo selected to perform to her script. Says DiAngelo: “The research project itself set up Malena [facilitator] and the participants of color as a platform for White performers.”

So with eight white teachers, DiAngelo generalizes her “white fragility” fantasy to the rest of the nation. And how is it “generalizable”? Well, it looks and sounds just like all of the other critical racialist journalism she’s seen. Let her speak for herself:

My primary measure of generalizability was my ability to tie the discourses documented in this study to the larger body of research in the Whiteness literature. The ways in which the discourses here fit within the literature of Whiteness indicates that this group was not idiosyncratic.

Is this surprising? That DiAngelo believes her fake results to be generalizable because folks like her all say the same things?

DiAngelo here reveals that her use of selection bias, confirmation bias, and the guarantor methodology in her experiment on eight teachers in Seattle 17 years ago yields results sufficient to generalize about millions of anonymous Americans today. What are we to make of this grand pronouncement, other than to conclude that a kind of parochial arrogance afflicts DiAngelo? You can draw your own conclusion about this.

But let’s turn back to that business of a non-psychologist conducting human subject experimentation.

“If It Was Good Enough for Victor Frankenstein . . .”

Again, Robin DiAngelo is not a psychologist, so it is troubling that her human subject experiment passed muster with the University of Washington’s Institutional Review Board (IRB), the federally mandated entity meant to prevent the abuse of human subjects by researchers.15 My view is that the UW IRB is probably top-notch, so the explanation lies elsewhere. To this day, it’s not clear whether the human subject experiment design that the UW IRB approved was the experiment that DiAngelo actually conducted.

This is in serious doubt because the subjects protested throughout the experiment that what was actually happening was not what they had consented to. Nor was the study described accurately in the consent form they had signed (a form that carries a code confirming IRB approval: HS#03-7679-E 01). The UW Human Subjects Division confirms that DiAngelo’s project was given IRB approval, but it is not clear that what transpired in the experiment is what DiAngelo proposed. Considerable discrepancy exists between what the subjects consented to and what they actually experienced—severe pressure, ridicule, emotional trauma, anger, and psychological stress, the possibility of which appeared nowhere in the study advertisement or the consent forms. DiAngelo also acknowledges serious shortcomings in the structure of her experiment, shortcomings she was aware of beforehand but did not correct: “[T]here were a few simple safeguards that I knew to put in place but didn’t.”

DiAngelo sat silent and watched it all crumble in front of her as she took meticulous notes to inform the subsequent dissertation. Her self-exculpatory discourse puts the best face on a failed experiment but does little to hide the disaster and, in fact, carries more than a whiff of “let me scrape together something usable from the ashes of this debacle.” Moreover, her 1,900-word concluding chapter constitutes a mea culpa confession for committing the sin of striving for objectivity in her research.

In retrospect, she believes that she should have abandoned any pretense of objectivity and instead incorporated herself into the study as a kind of white racist sinner playing her prescribed role, making her more in tune with the feelings and needs at work in what Northrop Frye identified earlier as “kinetic emotion.” This confessional alone should disqualify her from ever again being mistaken for someone engaged in serious scholarship. But on the upside for DiAngelo, those same 1,900 concluding words constitute a classic “critical white confession,” lifting her to Elysian status in today’s pantheon of white flagellants, otherwise known as “white allies” in the vernacular of critical racialism.

The Big Con of White Fragility

This experiment is the origin of the “White Fragility” that has become popular with a segment of the population eager to confirm its prejudices and to find reason for self-flagellation. It’s DiAngelo’s original sin, and her entire edifice of “White Fragility” rests on these conversations of 17 years ago among eight unsuspecting, angry white preservice teachers, whom DiAngelo and her collaborators contrived to set up in their “whiteness” experiment under less-than-honest pretenses.

This is the shaky basis for DiAngelo’s claims today to be an academic and educator. The possibility of malfeasance, unintentional perhaps, in this research project is so manifest that it should at least give pause to those salivating over the prospect of plunking down DiAngelo’s $12,000 speaking fee to hear her mouth provincial platitudes.

DiAngelo’s account of her original sin is fascinating, and you can read it for yourself, as her 2004 dissertation is available—for now—from ProQuest as an object lesson in pseudoscholarship. People may judge for themselves. It may well be restricted once interested persons begin to discover this trove. In such a case, a copy can be found at this link.

The upshot of all this is that Robin DiAngelo is revealed as little more than a shallow-thinking, provincial, modern-day jongleur who draws a huge paycheck peddling prejudice. But people who get snookered out of significant capital eventually wise up . . . and they usually aren’t happy about it.

It’s minute 14 for Robin DiAngelo, and the clock is ticking. Perhaps it’s time for her to bank those royalties, cash those speaker-fee checks, and fade out of the public consciousness.

Someone should tell her.


1 See Rod Dreher in The American Conservative, June 29, 2020: https://www.theamericanconservative.com/dreher/journalism-propaganda-press-robin-diangelo/ Scholar Heather Mac Donald is particularly harsh, calling DiAngelo a “diversity scammer.” See Heather Mac Donald in The American Mind, April 1, 2019: https://www.manhattan-institute.org/html/fake-bigotry-real-money

2 Like wayward troubadours of medieval Europe, today’s critical racialists serve as modern-day jongleurs or the medieval wandering monks called gyrovagi, who travel from discipline to discipline searching for theories to undermine, boundaries to “transgress,” premises to “interrogate,” and invisible assumptions to “demystify.” “Day after day, walking, begging, sweating, whining, on they go, rather than stay in one place, there to toil, and there abide: humble at their incoming, arrogant and graceless at their outgoing.” See Helen Waddell, The Wandering Scholars (New York: Doubleday, 1955), p. 179.

3 The “co-curriculum” is the raft of seminars and workshops that university administrators sponsor outside of the actual curriculum to avoid the necessity of academic rigor and standards imposed by faculty. It is a kind of simulacrum of the actual curriculum, distorted and bearing the trappings of academia, yet deficient in every aspect that matters—a kind of cargo cult curriculum. This is how superstition and pseudoscience gain purchase in the academy. “[T]he use of terms such as co-curricular and co-curriculum articulates academic bureaucrats’ ambition to claim equal status for the activities they sponsor. The emergence of the co-curricular transcript gives administrative form to the co-curricular bureaucracies’ claims to equal status with the professoriate in higher education.” David Randall, Social Justice Education in America (New York: National Association of Scholars, 2019), p. 154.

4 Irving L. Janis, Groupthink, 2nd ed. (Boston: Houghton Mifflin Company, 1982), pp. 174-175.

5 See Shakti Butler’s material at the University of Delaware, sourced from something called Undoing Racism: The People’s Institute for Survival and Beyond. Shakti Butler, University of Delaware Office of Residence Life, Diversity Facilitation Training, August 14 and 15, 2007. See also Tema Okun’s “dismantling racism” project based in Durham, NC. Particularly noteworthy is the provincial provenance of Ms. Okun’s racialist material, which she simply contrived as a “quick and dirty” list in a fit of pique. See Tema Jon Okun, The Emperor Has No Clothes: Teaching about Race and Racism to People Who Don’t Want to Know (dissertation, University of North Carolina-Greensboro, 2010), p. 29.

6 The literature on the concepts of thought reform and thought remolding is immense. Major works on these concepts are Robert Jay Lifton, Thought Reform and the Psychology of Totalism (W. W. Norton & Company, Inc., 1961), Theodore E. H. Chen, Thought Reform of Chinese Intellectuals (Hong Kong: Hong Kong University Press, 1960), and Hu Ping, The Thought Remolding Campaign of the Chinese Communist Party-State (Amsterdam: Amsterdam University Press, 2012). Particularly on-point is William F. O’Neill and George D. Demos, Education Under Duress: Behavior Modification Through Thought Reform (Los Angeles: LDI Books, 1971).

7 Ximena Zuniga, Biren (Ratnesh) A. Nagda, Mark Chesler, and Adena Cytron-Walker, “Intergroup Dialogue in Higher Education: Meaningful Learning about Social Justice,” ASHE Higher Education Report, Volume 32, Number 4, 2007, p. vii.

8 Ximena Zuniga, Biren (Ratnesh) A. Nagda, Mark Chesler, and Adena Cytron-Walker, “Intergroup Dialogue in Higher Education: Meaningful Learning about Social Justice,” ASHE Higher Education Report, Volume 32, Number 4, 2007, p. xi.

9 Among those who use the “conveyer belt” metaphor are Beverly Tatum and Derald Wing Sue. See Beverly Daniel Tatum, Why Are All the Black Kids Sitting Together in the Cafeteria? (New York: Basic Books, 1997, 1999), p. 11. See also Derald Wing Sue, “The Challenges of Becoming a White Ally,” The Counseling Psychologist, Vol. 45(5), 2017, p. 707.

10 “The Lenient Policy,” Appendix 1 in Edgar Schein, Coercive Persuasion: A Socio-psychological Analysis of the “Brainwashing” of American Civilian Prisoners by the Chinese Communists (New York: W. W. Norton & Company, Inc., 1961, 1971), pp. 287-288.

11 We recognize here several of these guarantor methodologies, including autoethnography, critical discourse analysis, “testimonios,” grounded theory, various forms of qualitative research, narrative, and storytelling (both true and fictional).

12 Katherine Bischoping and Amber Gazco, Analyzing Talk in the Social Sciences: Narrative Conversation and Discourse Strategies (London: Sage, 2016), p. 154. This notion of “social injustice” is rarely identified clearly, and if it is, it usually constitutes a specific state of affairs that the author(s) finds unpleasant or undesirable and whose actual origins the author(s) has no desire to discover. The battle for social justice or against social injustice becomes the reflexive justification and overarching rubric for any action, program, opinion, or activity of the moment, which may or may not have a connection to anything real.

13 Northrop Frye, Anatomy of Criticism (Princeton, NJ: Princeton University Press, 1957, 1971), p. 328.

14 H. G. Widdowson, Text, Context, Pretext: Critical Issues in Discourse Analysis (Oxford: Blackwell Publishing, 2004), p. 163.

15 “IRB approval is not only required for traditional research such as clinical trials, experimental studies with control groups, and biomedical research but also for nontraditional research conducted in the community, classroom, and health promotion programs.” Whitney Boling, Kathryn Berlin, Rhonda N. Rahn, Jody L. Vogelzang, Gayle Walter, “Institutional Review Board Basics for Pedagogy Research,” Pedagogy in Health Promotion: The Scholarship of Teaching and Learning, 2018, Vol. 4(3), p. 173.

Weekend Long Read

Orson Welles: American Maverick

Through his work, Welles reveled in the game of concealment and illumination but, in the end, revealed it to be more than just a silly game.

The scene is perfectly set, full of trickery and magic: from a seemingly abandoned train station, a towering figure of a man in a cape and a hat emerges out of a cloud of smoke. The man decides to do a few magic tricks for a couple of children and is caught by the knowing gaze of a beautiful, exotic-looking woman dressed in fur. She registers a look of slight disapproval at the man in the cape, as if she had caught a child fumbling in a cookie jar. But her disapproval quickly melts away, and she forgives the man for being playful.

“Up to your old tricks, I see,” says the beautiful woman.

“Why not?” says the man in a rather booming and distinctive voice. “I’m a charlatan.” 

With that, he gives her a mischievous smirk betraying his true sentiments. The man is not offering a confession or looking for absolution but is instead offering his own assessment of any expert who might wish to judge him a charlatan. 

The man in the cape is the great actor and director Orson Welles (1915-1985), in one of his later films, “F for Fake” (1973). Welles, a charlatan? Of course, this description is preposterous and untrue, and Welles himself said it in jest. His grin, both devious and utterly innocent, tells us quite the opposite: Welles is asserting his artistic superiority, yet he really doesn’t care what he is called, whether by the critics, “experts,” or even his loyal audience. Orson Welles cared about only one thing: making great movies.

One Man’s Fight Against the Establishment

It would be all too easy and lazy to dismiss Orson Welles as a “has-been”—someone who quickly rose to fame and then crashed, disappearing from the public eye. Welles is usually captured in the American imagination as three personalities: a radio broadcaster who delivered the infamous “War of the Worlds” broadcast that scared many people into believing an alien attack was underway in rural New Jersey; a director of the best American film ever made, “Citizen Kane” (1941); and a spokesperson for Paul Masson wine, Japanese whiskey, and frozen peas. Welles’ weight problem, the subject of many jokes (some of which he didn’t mind, as he often made fun of himself), is too often the focus of many critics and people who worked with him. These are all beside the point. They constitute such a small part of who he was that they obscure the big picture. 

In many ways, Welles lived thousands of lifetimes in just one life—whether it was in acting or in innovating new cinematic forms. Welles was the founder of what we now call “independent cinema,” even as most so-called “independent filmmakers” are nothing more than the fakers Welles so obviously abhorred. The only other American film director who can authentically claim the mantle of independent filmmaker is John Cassavetes, who, like Welles, hated the phoniness and corruption for which Hollywood was known.

Naturally, Welles was aware that he was not a “joiner.” He didn’t join clubs of any kind. He was too free and supremely confident in that freedom to submit his personhood to any authority, let alone any creative authority coming out of Hollywood. In his book What Ever Happened to Orson Welles? A Portrait of an Independent Career (2006), the Welles scholar Joseph McBride singles out some of Welles’ more vocal pronouncements about Hollywood.

Even as early as 1939, when Welles first came to Hollywood, he was considered a man to watch. His anti-establishment credentials were fully perceived, in other words. Welles recalls that he was considered a

. . . terrible maverick . . . I was sort of 40 or 30 years ahead of my time . . . a sort of ghost of Christmas future. There was the one beatnik, you know, there was this guy with a beard who was going to do it all by himself. I represented the terrible future of what was going to happen to that town. So I was hated and despised, theoretically, but I had all kinds of friends among the real dinosaurs, who were awfully nice to me. And I had a very good time. But I believe that I have looked back too optimistically on Hollywood. Because my daughter has a group of books about Hollywood that she bought, I don’t know, probably vainly looking for references of her father in them. And I took to reading them lately. And I realized how many great people that town has destroyed since its earliest beginnings—how almost everybody of merit was destroyed or diminished, and how the few people who were good who survived, what a great minority they were . . . And I take my own life out of it and see what they did to other people, I see that the story of that town is a dirty one, and its record is bad.

This is most certainly a moment of honest self-reflection on Welles’ part. He was quite capable of that, despite his constant humor and laughter, which can suggest some kind of avoidance of life’s realities. He may have fashioned himself into that figure of a magician, of the “Great Orsini,” who laughs loudly and boisterously, but do not be fooled: Welles was a true artist. He saw more clearly than the rest of humanity what this very humanity is made of: its joys and sorrows, its faults, and its attempts at perfection. The movie establishment, naturally, was not particularly interested in looking deeply into the question of what it means to be human.

The fact that Welles had no regard for anything other than his films did not help his cause, and this was especially true of his first film, “Citizen Kane” (1941). The sheer audacity to make a film based almost entirely on the life of William Randolph Hearst, the publishing magnate, was palpable in this 25-year-old dynamo. Not only did he challenge every possible convention in the business of technical filmmaking (his employment of different camera angles alone changed the trajectory of filmmaking forever), but the barely veiled correlations to Hearst’s life were what got Welles into a heap of trouble. Hearst did not like the way his mistress, Marion Davies, was portrayed in the film. In “Citizen Kane” she is an empty-headed bimbo incapable of good acting, although in reality she was quite an actress and comedienne. This caused Hearst to want to destroy Welles at any cost.

Since he had power over the newspaper industry, Hearst began circulating rumors that Welles was a Communist—an insult that at that time turned the public against the person so accused. Of course, Welles was never a member of the Communist Party. 

But there was also a psychological component of establishment Hollywood’s rejection of Welles. Any figure in any field who demonstrates the potential to disrupt the machinery churning out secure and profitable projects is bound to be hated. Individual creativity under such circumstances is often scorned, and as Joseph McBride points out, “Welles serves as a perfect whipping boy for those in the film industry and in the media who uncritically worship the imperatives and products of the commercial system.” 

Welles most definitely was a mischievous man and he had no qualms about causing certain disruptions. But these disruptions were more along the lines of explorations in art, rather than deliberate salvos aimed at film studios. After all, he needed the structure of the studio in order to make movies. But oftentimes, collectivism wins the battle over individual vision.

What is most ironic is that this brilliant American filmmaker was unable to be an individual in America. This certainly points to a sad state of affairs when it comes to individual liberty and what it means to be an artist in America. The American consciousness somehow rejected Welles even though Welles never abandoned it. When he died, many obituaries continued the utter fiction that Welles made only one great film—“Citizen Kane”—and that the rest of his life was spent in sad isolation, a tragic end to what could have been a great career.

Even his collaborator in the Mercury Theater productions, John Houseman (the infamously cranky professor in “The Paper Chase,” 1973), contributed to this unfair and untrue portrayal of Welles: “If there was a downfall, then it was entirely of his own doing. I mean, nobody stopped him from producing more ‘Citizen Kanes.’”

The dismissiveness of that statement is self-evident. How can any self-respecting artist bring himself to engage in some “wash, rinse, repeat” cycle of creating art? Such an approach is the very definition of an artistic fraud. Commenting on the various negative obituaries of Welles, Joseph McBride writes, “Scorning Welles as a tragic failure of gargantuan proportions seems to satisfy a public need (at least in America) to point a finger at an archetypal ‘spoiled artist,’ to bring genius down to the level of everyday mediocrity.”

But did the American audience truly reject Welles’ contribution to film? Or should we blame the film studios, most notably RKO Pictures, whose administrators and producers repeatedly butchered Welles’ films in order to make them more “accessible” to mainstream audiences? Welles’ second feature-length film, “The Magnificent Ambersons” (1942), certainly suffered such a fate.

This particular film was an adaptation of Booth Tarkington’s eponymous novel, which centered on one family’s magnificence and its loss. The film focuses primarily on the dark side of the family’s unraveling, which is paralleled with the invention of the automobile. As everything becomes faster and more industrialized, the Ambersons can’t seem to, or don’t want to, catch up, and instead live in a reality that resembles nostalgia more than the hard facts of life.

Just like in “Citizen Kane,” Welles excelled here at new and innovative techniques in filmmaking. By this point, that probably wouldn’t have bothered the studio executives, but after screening the entire film for a few select audiences, they weren’t pleased with the reception, and thus embarked on what ended up being the greatest mutilation of a film in Hollywood history.

More than 40 minutes were cut and a new ending was shot, one that was “happier,” since Welles’ ending was far too grim. The logic of the studio executives was that the film wasn’t going to make much money unless it had a happier ending. Pearl Harbor had just been attacked, and who wanted to watch a dark and gloomy film about a bunch of American aristocrats?

According to Joseph McBride, some of the producers on “The Magnificent Ambersons” also lied to Welles about the budget, informing him that he was over budget when in reality he was under budget. The conversations between studio executives reveal a series of paranoid and vindictive attacks on Welles, at one point even entertaining a plan to throw Welles to “the authorities,” whoever those were supposed to be. One can only imagine. The missing 43 minutes of film allegedly were destroyed; at least, the studio executives said in their conversations that the footage needed to be destroyed. Whether that happened is impossible to know at this point. It may yet turn up in the basement of some bar or in a warehouse that has nothing to do with storing film reels.

Welles faced a similar problem with his brilliant work of film noir, “Touch of Evil” (1958). When the film was initially released, it was shortened and badly edited. Welles was so angry that he sent a 58-page memo to the producers with strict and specific instructions on how the film should look. They didn’t listen. Only decades later was the film edited by Walter Murch according to Welles’ exact instructions. Murch said that Welles “. . . was a guy who was 20 to 25 years ahead of his time. That was his glory and why he had such problems. Hollywood doesn’t like people who are ahead of their time. They like people who are just ahead of their time, like six seconds ahead of their times, because those persons make the most money.”

It is generally accepted that Welles was at times difficult to work with, but he was also a consummate professional and, according to many actors, a very fast-moving film director. He may not have been Mr. Sunshine, but he certainly did not deserve to have his work thrown to the vulgar wolves who knew nothing of film as an elevated art form. Welles was a master at exploring human interiority, and no director since has achieved his incredible level of precision when it comes to the development of characters as well as cinematic innovation.

Reality or Illusion

It is often said that all of philosophy is just a series of footnotes to Plato, and the same can be said of cinema and Welles: almost every cinematic expression that came after “Citizen Kane” owes something to Orson Welles. Yet if one is continuously misunderstood and not accepted by the establishment governing one’s industry, then how does one manage to continue to create? 

Will the artist break at some point because of the slow disappearance of the essence that makes him an artist? Some might, but Welles never did.

After he directed “Citizen Kane,” Welles went on to direct 11 more feature-length films. And then there is the incredible amount of his unfinished work. He was often accused of being fearful, and it has been suggested that fear kept him from finishing many films. But this is little more than ridiculous psychologizing; it couldn’t be further from the truth. The reason many of Welles’ projects were interrupted was not that he was flighty or had some mental hang-up about completion. It was simply that he so often ran out of money. That was the reason he did all those silly commercials: to finance his films. His devotion to art was so strong that he fell in love with this “crazy profession” of movie-making and couldn’t extricate himself from the magic or the hold that movies had on him.

Although he may not have been fearful, there was one aspect of Welles’ expression as an artist that, at times, had negative effects on his process. Film critic Molly Haskell points out that Welles had an “almost debilitating dissatisfaction that sprang from the very nature of his genius: an overabundance of ideas. . . . Because he was a master of so many (too many?) of the facets of the cinema—cutting, staging, camera movement and framing, dialogue, sound, and performance—there was always some new angle to try. Every film contained many films, unrealized possibilities.”

This is not an easy predicament for an artist. On one hand, ideas and inspiration are constantly flowing, and Welles, as a director, could envision all of them. Yet the unstoppable stream of possibilities could also render him silent, incapable of realizing anything at all.

Ever since he was a boy, Welles loved magic tricks. He was fascinated by the response of the audience, while being, quite literally, the center of attention. According to one of Welles’ biographers, Frank Brady, Welles was introduced to the world of magic by Dr. Bernstein (his mother’s partner and a surrogate father). Welles took to it, and “to further Orson’s apprenticeship as a magician, Dr. Bernstein took him backstage at the theater where Houdini was performing so he could meet the master magician and escapologist. Impressed by the child’s knowledge of the art of magic through the ages and his grasp of the technical aspects of the craft, Houdini taught the boy a simple but effective trick with a red handkerchief.”

In many ways, Welles’ artistic drive and hunger had a lot to do with the way he grew up. He always had to prove himself with some kind of a performance, and later in life described himself as a very “precocious child,” which probably was not pleasant for many people.

Welles’ parents divorced very early on in his life; his mother died when he was 9 years old, and his father died when he was just 15. These were the great tragedies and sorrows in Welles’ life which, undoubtedly, he carried with him throughout his career. He was educated in an unconventional way—he did not particularly like to go to school—so his mother encouraged him to read Shakespeare, Keats, and Tennyson. 

It was said his father died alone in a Chicago hotel room, and throughout his life, Welles suspected that it might have been suicide by drink. But like many things in Welles’ life, this story may be an exaggeration. Many of the tales that Welles spun about his life turned out either to be untrue or just too vague to confirm. We are always left to wonder whether Welles, the consummate magician, is playing a trick on us. Just as we are about to get close to Welles and perhaps even to understand a small part of him, he plays a magic trick and poof! He disappears from view.

It is easy to psychologize and psychoanalyze an artist in order to try to gain a better understanding of his life, and in doing that we could conclude that Welles’ childhood was the major factor in his artistic choices. But what purpose would such a conclusion even serve? It is hard to know and easy to suppose. It is in Welles’ films that he most readily reveals and conceals himself. We must look to his creations. 

In many cases, he picked subjects that deal with action and that play with the notion of reality and illusion. Both reality and illusion can be disorienting, but Welles’ aim was never to disorient the audience. Perhaps he wanted them to wake up from the slumber of illusions, but even that assertion may be forcing Welles into a symbolic box of cinematic analysis. The truth is that he constantly evades analysis, as every true artist should.

Sometimes the illusions he created were dark and demanding, leaving one ill at ease, as in the case of Welles’ adaptation of Franz Kafka’s novel, “The Trial.” Other times, the illusions were humorous and meant to provoke, as in the case of “F for Fake.” This “documentary essay,” as Welles’ friend Peter Bogdanovich called it, is full of “trickery,” as Welles warns us at the beginning. The audience is not sure whether to accept anything as true. In the film, we follow the story of two hoaxers: Elmyr de Hory, a famous art forger, and Clifford Irving, a writer who famously wrote a hoax biography of the reclusive director, inventor, and aviator Howard Hughes. Both de Hory and Irving ended up serving time in prison for their forgeries and lies.

The film’s art rests in its editing, and it shows the power of a narrative created before our eyes. Should we trust the images that are superimposed and built up like modules, one after another? Should we take Welles, our narrator and guide throughout the film, seriously, or dismiss him as yet another artistic scoundrel? Welles wants us to ask these questions, but he is not going to spoon-feed us any answers.

The questions that arise in this film are not merely about proving the guilt or innocence of de Hory or Irving, or even Welles. They are about the nature of art in our lives. Welles wants us to ponder what things ought to be considered art. Is it something that is beautiful? Surely. But who decides what is beautiful? The so-called experts? An establishment that decides what is good or bad? 

It is hardly surprising that “F for Fake” did not go over well in the United States, though this might be primarily because there was hardly any distribution for it. When Bogdanovich tried to distribute “F for Fake,” the distributor he screened the film for fell asleep during the presentation. As Joseph McBride points out, “By 1970, the American public barely knew Welles as a director. With a myopic perspective built by the largely hostile American media, they knew him mostly as a buffoonish has-been, a cameo player in bad movies and a guest on Dean Martin’s television variety show.”

Hollywood was changing fast, and all the glamour was turned into the worship of the bearded and drugged hippie. The question of whether movies are “magical” was one to laugh and sneer at. But despite the aesthetic changes in Hollywood culture, it remained an immovable and corrupt establishment.

By the time of this film, Welles had already become an American exile in Europe. He had left America, rather reluctantly, simply because it had become harder and harder to make a film here. His vision was rejected. The more America rejected him, the more Europe loved him. The French director François Truffaut admired Welles immensely and thought “Citizen Kane” was the film of films, and “the only ‘first’ film directed by a famous man.” Truffaut’s admiration was correct and well-founded. As a great artist himself, Truffaut understood how difficult it is not only to make a film but to accomplish it fully in accordance with one’s artistic vision and intent.

Welles’ love of movies transcended his love of America, and so he made films in Croatia, Spain, and France. He was certainly not an angry American trying to bring down the structure and the essence of this country. On the contrary, Welles yearned and wished for success in America. We do have to ask whether the American rejection of Welles was truly rooted in dismissal and hatred from the American people (the audience), or whether the establishment simply pushed him out until he was deemed irrelevant. Did the entrenched powers of studios and production companies dictate American tastes so completely?

These questions are important not only in terms of Welles’ life and career but also in relation to what constitutes art and the American character. How do people respond to art? Does a film cease to be a work of art once it is admired by many and not only a few? The creation of art is most certainly not democratic, but its enjoyment should be available to all, and Welles would agree with this. On several occasions, he remarked that he truly wished he had a mass audience, but that always seemed to elude him. He was not an elitist in the sense of enjoying the exclusion of others for arbitrary or superficial reasons. He did demand perfection, however, above all from himself.

The Man in the Mirror

In one of the interviews he gave later in life, Welles admitted that he did not like mirrors. Of course, Welles meant this comment to be symbolic of his view of himself. He always evaded the past and didn’t want to delve into the details of his own life. The public continued to be curious, but every time Welles appeared on television talk shows, he maintained the persona of a magician or a celebrity who was very much relaxed in talking about anything—except his past.

According to Joseph McBride, even when the host of a show, Merv Griffin in particular, “tried prodding him in those directions, he [Welles] preferred to spend his time on ‘Merv’ indulging his fondness for performing elaborately tedious magic tricks.” Is this all an audience in America was really interested in? Perhaps, but Welles certainly was not willing to give or share anything more. He used such appearances to his advantage: first, he didn’t have to talk about his life, and second, he made money, which he then always put into his film projects.

Welles may not have liked to look in the mirror, but he used mirrors frequently in his films. The most famous example comes from “Citizen Kane,” in the scene when an old Charles Foster Kane roams the vast rooms of his massive Xanadu estate and catches a glimpse of himself in the mirror. Kane is not pleased with what he sees: a man who has everything and nothing; a failure, yet still a young boy yearning to become whole again. In this case, the mirror has power over Kane—instead of looking deeply into the distorted and lost face, he recoils in disgust, in shame, or in anger that emotions lying so deep under the skin are visible at all beneath the veneer, or mask, that has been carefully sculpted over the years.

Another famous example of the mirror comes in Welles’ 1948 film, “The Lady from Shanghai,” in which Welles stars alongside Rita Hayworth. At the time of the shooting, Welles and Hayworth were already practically divorced, and the tension is visible on the screen as well. Welles plays Michael O’Hara, a man who is hired to work aboard the yacht of a disabled man, Mr. Arthur Bannister, and gets caught in a web of deceit and corruption thanks to Bannister’s beautiful and seductive wife, Rosalie.

Typical of Welles, this film too is very different from others in the noir genre, and it leaves the viewer disoriented. And just like most of Welles’ films made through the studios, “The Lady from Shanghai” suffered at the hands of producers, who cut about an hour of footage and were not pleased with Welles’ lack of close-ups or his strange long takes. But despite these omissions, the film is still very much a Wellesian project, and the ending alone proves that point.

The mirror scene in “The Lady from Shanghai” is one of the most brilliant scenes in the history of cinema. It takes place in the Magic Mirror Maze, in a seaside funhouse, where there are mirrors upon mirrors, replicating one distorted image after another. The truth of who framed whom comes out at the end of the film, but unlike Charles Foster Kane, Michael O’Hara has power over the mirror and is willing to look at his own face. It is important to point out, however, that unlike Kane, O’Hara has no reason to feel shame—only justified anger over Rosalie’s betrayal.

In both cases, Welles plays with the illusory quality of our lives. What is real and authentic? Are we willing to look into the mirror of our souls or is it easier to look away? Is self-revelation even possible? Do we not all simply wear masks in order to forget and break away from the burdens of existence? Consider Jaques’ famous lines in Shakespeare’s comedy, “As You Like It”: 

All the world’s a stage,
And all the men and women merely players;
They have their exits and their entrances,
And one man in his time plays many parts

Surely Welles would attest to this. Shakespeare’s monologue leaves us with questions about reality: whether our lives are merely somewhat meaningful and bearable transitions, or whether the stage, the world, the exits, and the entrances are all ours to create. As we attempt to answer the question of reality and illusion, we cannot evade our very selves that we see in the mirror, or the biggest question of them all: are we free?

Accepting an American Film Institute Life Achievement Award in 1975, Welles touched on his vision as an independent filmmaker and freedom: “A maverick may go his own way but he doesn’t think it’s the only way or ever claim that it’s the best one—except maybe for himself. And don’t imagine that this raggle-taggle gypsy-o is claiming to be free. It’s just that some of the necessities to which I am a slave are different from yours.”

Welles knew that choosing to be entirely in control of his projects and holding closely and deeply to his particular vision would not be easy. Being an independent artist of any kind means that you have to pave your own way, engage in what will most likely be a constant struggle, and understand that whatever you create (even if brilliant!) may not be recognized or applauded in the way you yearn to see. I think that Welles understood and accepted this—otherwise, he would not have kept making films. It is significant, however, that he does not claim to be free.

In many interviews over the years, Welles said that his biggest regret was that he fell in love with making movies, and that he should have taken up a different profession after he completed “Citizen Kane.” This regret, and this constant need to forge new ways not only to make films but also to show us how fragile, tragic, and yet beautiful life can be, may have been what rendered Welles an imprisoned man unwilling to look in the mirror. Or perhaps he was only unwilling to let us see that he indeed had no fear of taking off the mask and seeing his own face.

There is a great sense of mystery in Orson Welles. He was composed of many parts, and many “exits and entrances.” Through his work, Welles reveled in the game of concealment and illumination but, in the end, revealed it to be more than just a silly game. It was his invitation to explore the deep, forgotten, and neglected pieces of our strange, funny, and broken selves, and to ask what it means to be human. Welles explored this time and time again with great courage and resolve, and he gave us not only a cinematic gift but also a true example of individual courage.

Weekend Long Read

This essay is adapted from “Disloyal Opposition: How the #NeverTrump Right Tried—and Failed—to Take Down the President” by Julie Kelly (Encounter Books, 240 pages, $25.99)

Weapons of Mass Collusion

For two years, as Robert Mueller tried and failed to find evidence of a criminal conspiracy, NeverTrump Republicans tended to the right flank of the Trump-Russia collusion front. But their role in pushing the hoax went much deeper.

“The Clinton campaign got a bunch of dirty cops to frame and spy on their opponent, the Trump campaign. After Trump won, they rolled this dirty tricks operation, this spying campaign, into a coup.”

—Lee Smith, author of The Plot Against the President 

The biggest scandal in American political history started with NeverTrump conservatives. Desperate to undermine Trump’s viability as a candidate, anti-Trump Republicans and Democrats joined together to convince the public that Donald Trump was working with Russia to influence the outcome of the 2016 presidential election. Articles connecting Donald Trump’s campaign to Russian interests started appearing on conservative websites as early as March 2016.

Just two days after then-candidate Trump announced his foreign policy team in the spring of 2016, the Washington Free Beacon posted a 1,100-word hit piece on Dr. Carter Page: “Energy investor Carter Page, one of Donald Trump’s handpicked foreign policy advisers, has heavily criticized what he considers American aggression toward Russia, even comparing U.S. policy to American slavery and high-profile police shootings,” Lachlan Markay wrote on March 23, 2016. “Trump’s selection of Page may indicate the reality-star-cum-politician’s opposition to U.S. policies that counter Russian interests in key global theaters.” 

Markay’s piece contained arcane details about Page’s views on Russia, including columns Page had written for obscure energy publications. (Page is a global energy consultant.) Even the most dogged reporter would have been hard-pressed to find so many specifics on an unknown campaign advisor, draft the article, and post it in less than 48 hours. How did Markay produce a lengthy article in such a short time—and why? 

According to Fusion GPS, the opposition research firm that helped concoct the Russian collusion hoax, a Republican Party elder connected Fusion with the Free Beacon in the summer of 2015. Fusion chief Glenn Simpson sent an email to a “longtime Republican politico” in August 2015 to pitch their expanding file of dirt on Donald Trump. 

The unnamed Republican immediately expressed interest in the project; a month later, Simpson’s GOP contact informed him that the Washington Free Beacon, reportedly backed by hedge fund manager and onetime Trump adversary Paul Singer, would hire Fusion for $50,000 per month. Simpson referred to his client as a “Never Trump operation.” 

The editor of the Washington Free Beacon at the time was Matthew Continetti—Bill Kristol’s son-in-law, the same Bill Kristol who, by mid-2015, was pledging to stop Donald Trump’s candidacy. 

The Free Beacon’s March 2016 article was the first to allege that Carter Page harbored an affinity for Mother Russia. It offered a platform that other anti-Trump outlets on the Right could build upon. National Review published another Page-Russia article the following month; this time, the headline and content were more brazen.

The piece, “Trump: The Kremlin’s Candidate,” cribbed many of the same links and talking points cited in Markay’s original. “Carter Page is an out-and-out Putinite,” declared Robert Zubrin in April 2016. “With Page providing Trump’s Russia policy, it is not surprising that the Donald has also attracted the support of other prominent Putinites.”

According to Lee Smith’s book The Plot Against the President, a series of proto-dossiers—compiled by the Free Beacon’s paid dirt-digger, Fusion GPS—predated the infamous Steele dossier, the centerpiece of the collusion scheme. “Fusion GPS was the Clinton campaign’s shadow war room and subsequently became its dirty tricks operation center,” Smith wrote. The Clinton campaign and the Democratic National Committee hired Fusion GPS in April 2016; the company became the nexus of the Left and NeverTrump, an alliance that would continue throughout Trump’s first term. 

The Free Beacon posted a few more smear jobs on Carter Page into July, the month the Democratic Party heavily spun its Trump-Russia collusion narrative to bury the damaging release of internal emails during the week of the convention that officially nominated Hillary Clinton as its presidential candidate. By that time, Steele had completed the first installments of his dossier; the Fusion team began pitching his work to news outlets and friendly journalists in late July at the DNC’s coronation of Hillary in Philly.

On July 21, 2016, Commentary’s Noah Rothman openly doubted Trump’s loyalty to America and suggested the business tycoon favored Russia over the United States. In his article, “Trump’s Great Russia at Our Expense,” Rothman ticked off a number of Fusion GPS–produced talking points. Then Rothman posed this ridiculous question: “In the zero-sum game of geopolitics, it long ago became crystal clear that Russia’s national interests and America’s national interests are mutually exclusive. So just whose side is Donald Trump on?”

A few days later, on July 24, the Weekly Standard published a telling piece titled “Putin’s Party?” The author explained why voters should be troubled by disturbing ties between the Kremlin and Trump campaign associates Page, Paul Manafort, and Lt. General Mike Flynn. “These indications provide sufficient grounds for Trump’s links to Putin to be further investigated.” 

The author of the piece? Bill Kristol, the magazine’s editor-in-chief at the time. His son-in-law still had Fusion on retainer for the Free Beacon. (Continetti denied any ties to the Steele dossier.) 

Kristol’s article mimicked accusations of Trump-Russia collusion hawked by Fusion GPS in the summer of 2016. Tom Nichols followed Kristol’s report with a tweetstorm sketching out Trump’s supposed fealty to Russia and questioning his patriotism. Calling Trump “Putin’s poodle,” Mona Charen penned a lengthy column about the Trump-Putin bond: “Trump bats his eyes at Putin like a schoolgirl with a crush,” she wrote on July 28, 2016. At the Washington Post that same day, Jennifer Rubin was vexed over why “Trump . . . is so deferential toward Russia’s authoritarian bully.”

The ensconced, and in some instances nepotistic, fiefdom of the anti-Trump conservative commentariat acted as its own Trump-Russia collusion echo chamber; but unlike its colleagues on the Left, NeverTrump played to an audience of nervous Republican voters.

Fusion GPS fed its anti-Trump propaganda to conservative influencers who, in turn, warned their followers about the Putin stooge at the top of the Republican ticket. As the earliest narrators of the collusion fable, NeverTrumpers—editors and writers for the Weekly Standard, National Review, Commentary, and others—were heavily invested in discouraging Republicans from voting for Trump based on the fiction that he would work in Putin’s interests and not America’s. 

NeverTrump would remain prolific peddlers of collusion hype, helping the Democrats mislead the American public for three years into believing that the Trump campaign had been in cahoots with the Kremlin prior to Election Day.

In the process, NeverTrump abetted the biggest con job in American political history while covering up the legitimate scandal, one that will be documented as the most egregious abuse of federal power ever wielded against a U.S. presidential campaign. 

Crossfire Hurricane 

The same month that Markay published his first Page hit piece in the Free Beacon, former FBI director James Comey met with former attorney general Loretta Lynch to discuss his “concerns” about the Trump campaign volunteer. As conservative commentators ginned up the public relations end of the scam, Comey and Obama’s top national security chiefs orchestrated the inside job. 

Something else consequential happened in March 2016: Florida senator Marco Rubio suspended his campaign, following in the failed footsteps of 12 other Republican candidates who had already dropped out of the race. Trump, Texas senator Ted Cruz, and Ohio governor John Kasich were the three men left standing. 

And it was increasingly obvious who would prevail. 

But a Trump presidency, no matter how unlikely, was unacceptable to the Obama White House. 

President Barack Obama held deep animus toward Donald Trump for spreading rumors about Obama’s birthplace. During the White House Correspondents’ Dinner in 2011, Obama mocked Trump, in attendance at the event, for his “birther” conspiracy theory about the then-president. After roasting The Donald for several minutes, Obama showed a cartoon of an imaginary Trump White House, ornamented with gold columns and bikini-clad women. 

A few weeks before Election Day, Obama appeared on Jimmy Kimmel’s late-night talk show to read a series of mean tweets. One was from Trump, saying Obama would go down as “perhaps the worst president in the history of the United States.” Obama, not amused, stared into the camera, declared, “At least I will go down as a president,” and dropped his cell phone.

Obama never forgave Trump for raising doubts about where the president was born. (In fact, during one of his last White House briefings, Obama’s press secretary, Josh Earnest, intimated that the entire Russian collusion scheme was revenge for Trump’s birtherism. “The president-elect and his team are suggesting that the accusations [about Russian collusion] that are being made are totally unfounded, that there’s no basis for them. This president has been in a situation in which he has been criticized in an utterly false, baseless way. And I’m, of course, referring to the president’s birthplace,” Earnest said on January 11, 2017, the day after BuzzFeed published the entire dossier authored by Fusion GPS hired gun Christopher Steele.) 

Reared in the cutthroat world of Chicago politics, where every public agency from the school system to the Department of Streets and Sanitation is leveraged for either maximum political gain or damage, Obama would have no qualms about using the federal government’s most powerful tools against his biggest rival. The Obama administration had already been caught using the Internal Revenue Service to punish Tea Party organizers before his 2012 reelection campaign.

Further, Obama and his partisan toadies who populated key agencies needed to redirect public and internal outrage over the FBI’s handling of the investigation into Hillary Clinton’s illicit email server. Although Comey concluded Clinton had mishandled classified material, he announced in July 2016 he would not recommend charges against her. 

That very same month, Comey’s FBI opened a counterintelligence probe into four individuals connected to the Trump campaign: Page, Manafort, Flynn, and George Papadopoulos, another foreign policy advisor. The operation was called “Crossfire Hurricane,” a line from the Rolling Stones song “Jumpin’ Jack Flash.” It involved deploying informants into the campaign and manipulating a secret court to get authorization to surveil Carter Page for a year. The CIA and State Department were in on the scheme, too. 

At the same time, a media blitz bolstered the FBI’s alleged suspicions about sketchy ties between Team Trump and the Kremlin. That effort was coordinated by Glenn Simpson, Fusion GPS’s co-founder, and his paid operative, former British intelligence officer Christopher Steele. 

Steele’s so-called dossier of unproven and outlandish accusations against Trump and others was not only cited as evidence in an application prepared by Comey’s FBI and submitted to the Foreign Intelligence Surveillance Court in October 2016 to obtain a warrant against Page; it was also circulated among the media and top lawmakers on Capitol Hill, both Republicans and Democrats. Dossier-sourced articles claiming senior government officials had intelligence from a former British spy that proved Trump-Russia election collusion appeared in Yahoo! News and Mother Jones before Election Day.

Thanks to Fusion GPS’s handiwork, the Trump campaign spent the last few months of the election season fending off allegations of fealty to Russia. An official statement from Obama’s intelligence community in October 2016 confirmed Russia’s plans to mess with the election. The trap had been set to smear Trump with Russian dirt; nearly everyone in the political universe, including NeverTrump, participated in the con. 

Then Trump won. The con continued—but after the election, the stakes were much higher. Removing Trump from the Oval Office on suspicions his campaign team had helped the Russians influence the outcome of the election in his favor, and worse, that his presidency would act in service to Vladimir Putin, became the Democrats’ sole crusade. 

And NeverTrump played right along. 

Post-Election Collusion With NeverTrump 

During an annual security conference in Nova Scotia shortly after the 2016 election, a few high-level officials gathered privately to discuss the outcome and Russia’s alleged influence. One person in the meeting had a deep-seated grudge against the incoming president: Arizona Senator John McCain. 

McCain huddled with former British diplomat Sir Andrew Wood and David Kramer, a McCain confidant who worked for the senator’s nonprofit, on the evening of November 16 in Halifax. Wood briefed McCain about accusations contained in the Steele dossier, which he described as “raw, unverified intelligence,” according to McCain’s 2018 autobiography, The Restless Wave.  

Wood vouched for Steele’s credibility, McCain wrote, assuring the senator that the former MI6 agent had dependable Russian contacts and a solid reputation. The group began discussing the contents of the dossier. “Our impromptu meeting felt charged with a strange intensity,” McCain recalled. “No one wisecracked to lighten the mood. We spoke in lowered voices. I was taken aback. They were shocking allegations.”

One charge—that the Russians had a tape recording of Russian prostitutes urinating in front of Trump in a Moscow Ritz-Carlton in 2013—was so preposterous that it should’ve immediately raised a red flag about the document’s veracity. 

Nonetheless, McCain directed Kramer to travel to the UK to meet with Steele. But the British operative did not give Kramer a copy of the dossier at that meeting in his London home on November 28, 2016. Instead, Steele arranged for Kramer to meet with Glenn Simpson, Fusion GPS’s chief, in Washington the next day; Simpson provided one of McCain’s top advisors with a copy of the sketchy political propaganda. (In his book, Crime in Progress, coauthored with Peter Fritsch, Simpson admitted he and Kramer had a working relationship dating back nearly a decade.) 

After Kramer gave the dossier to McCain, the senator later handed it off to FBI director James Comey, who already had the document. Forwarding partisan dirt to the head of the nation’s most powerful law enforcement agency, McCain later explained in his book, was in the country’s national security interest. “I did my duty, as I’ve sworn an oath to do,” McCain preened in his customary self-aggrandizing way. “Anyone who doesn’t like it can go to hell.”

But McCain’s imprimatur on Trump-Russia election collusion would be a crucial contribution to legitimizing the scam. Embracing his long-standing act as a “maverick,” McCain clearly welcomed the opportunity to work as Trump’s foil from the same side of the political aisle. That gave NeverTrump pundits the backing of arguably the most influential Republican senator, one who still commanded respect from rank-and-file Republicans despite his two losing presidential bids. 

As chairman of the Senate Armed Services Committee, McCain wielded his post to inflict maximum damage on the incoming administration. He wasted no time scheduling a hearing into Russia’s “attack” on the 2016 election. On January 5, 2017, as the Trump transition team planned to take control of the White House, McCain’s committee heard testimony from top government officials, including former director of national intelligence James Clapper, an architect of the hoax, about Putin’s predations.

During one exchange between McCain and Clapper, the pair implied that the Kremlin’s social media skullduggery might have changed votes from Clinton to Trump. “We have no way of gauging the impact . . . it had on choices the electorate made,” Clapper told McCain. “There’s no way for us to gauge that.” McCain further intimated that if Russian social media tinkering actually did change any votes, it would be an act of “war” against the United States. 

The message was clear: McCain goaded Clapper into saying publicly that there was a chance that Russian Facebook memes swayed people to vote for Donald Trump. The new president had been illegitimately elected thanks to chicanery from an American adversary. Enough gullible voters in Wisconsin, Pennsylvania, and Michigan had been brainwashed by weird Russian social media posts to put Trump over the finish line. And the man hinting that might have been the case was a Republican stalwart—one whom the Trump-hating Beltway media corps adored and NeverTrump revered. 

Further, the official intelligence buttressing the claim that Russia hacked the election was specious at best, sloppy and dishonest at worst. Former CIA director John Brennan and Clapper finished the report in less than 25 days in December 2016. The flimsy document hardly provided the bona fides to justify howls about Russia “attacking our democracy” after Election Day.

Either McCain knew the intelligence was thin gruel or he was duped again by intelligence officers with an ulterior motive. 

All of it started to feel eerily familiar. Sketchy intelligence touted by powerful politicians as evidence of an imminent threat to justify action against a foreign foe for domestic political purposes. I’m referring, of course, to weapons of mass destruction. It’s not a coincidence that most of the very same people, McCain in particular, pushing Russian collusion based on the thinnest trove of “evidence” also successfully convinced the American people that Iraq possessed weapons of mass destruction. 

As I noted in 2019:

In between the two scandals was more than a decade of recriminations against once-trusted experts on the Right who led our nation into battle. The Iraq war cost the lives of more than 4,400 U.S. troops, maimed tens of thousands more, and resulted in an unquantifiable amount of emotional, mental, and physical pain for untold numbers of American military families. Suicide rates for servicemen and veterans have exploded, leaving thousands more dead and their families devastated. And it has cost taxpayers more than $2 trillion and counting.

So, these discredited outcasts thought they found in the Trump-Russia collusion farce a way to redeem themselves in the news media and recover their lost prestige, power, and paychecks. After all, it cannot be a mere coincidence that a group of influencers on the Right who convinced Americans 16 years ago that we must invade Iraq based on false pretenses are nearly the identical group of people who tried to convince Americans that Donald Trump conspired with the Russians to rig the 2016 election, an allegation also based on hearsay and specious evidence.

The verbiage and tone NeverTrump used to warn the country about collusion were eerily similar to those of the WMD alarms: 

Bill Kristol in 2003: “We look forward to the liberation of our own country and others from the threat of Saddam’s weapons of mass destruction, and to the liberation of the Iraqi people from a brutal and sadistic tyrant.” 

Bill Kristol in 2018: “It seems to me likely Mueller will find there was collusion between Trump associates and Putin operatives; that Trump knew about it; and that Trump sought to cover it up and obstruct its investigation. What then? Good question.” 

John McCain in 2003: “I believe that, obviously, we will remove a threat to America’s national security because we will find there are still massive amounts of weapons of mass destruction in Iraq.” 

John McCain in 2017: “There’s a lot of aspects with this whole relationship with Russia and Vladimir Putin that requires further scrutiny. In fact, I think there’s a lot of shoes to drop from this centipede. This whole issue of the relationship with the Russians and who communicated with them and under what circumstances clearly cries out for an investigation.”

David Frum in 2002 (writing for President George W. Bush): “States like these and their terrorist allies constitute an axis of evil, arming to threaten the peace of the world.”

David Frum in 2016: “I never envisioned an Axis of Evil of which one of the members was the U.S. National Security Adviser.” 

Max Boot in 2003: “I hate to disappoint all the conspiracy-mongers out there, but I think we are going into Iraq for precisely the reasons stated by President Bush: to destroy weapons of mass destruction, to bring down an evil dictator with links to terrorism, and to enforce international law.” 

Max Boot in 2019: “If this is what it appears to be, it is the biggest scandal in American history—an assault on the very foundations of our democracy in which the president’s own campaign is deeply complicit. There is no longer any question whether collusion occurred. The only questions that remain are: What did the president know? And when did he know it?” 

Bush’s FBI director at the time publicly testified about the looming global menace posed by Iraq’s stockpile of deadly materials. “Secretary [of State Colin] Powell presented evidence last week that Baghdad has failed to disarm its weapons of mass destruction,” Mueller told the Senate in 2003. Those weapons, the FBI director warned, could be supplied to terrorist organizations around the world. 

A report issued two years after the invasion excoriated the intelligence community. “We conclude that the Intelligence Community was dead wrong in almost all of its pre-war judgments about Iraq’s weapons of mass destruction,” concluded a special commission in 2005. “This was a major intelligence failure.” (Senator McCain served on the commission.) 

The FBI director pushing the weapons of mass destruction line in 2003—Robert Mueller—would become the central figure, and arguably the most powerful man in Washington, leading the two-year investigation into whether Donald Trump colluded with the Russians before the election. History would repeat itself in an uncanny way. 

It’s Mueller Time 

Throughout the spring of 2017, the drumbeat of Trump-Russia collusion intensified along with calls for a special counsel. Lt. Gen. Michael Flynn, Trump’s first national security advisor, didn’t last a month in the West Wing. Flynn resigned on February 14, 2017, amid an orchestrated campaign between Obama holdovers in the administration and the news media that portrayed Flynn’s phone calls with the Russian ambassador as either traitorous or a violation of the Logan Act. That law, which has been on the books for 220 years without a single conviction, prohibits U.S. citizens from communicating with foreign powers to “defeat the measures of the United States.” The so-called “dead letter” law was exhumed before Election Day; beginning in the summer of 2016, Democrats regularly accused Trump of violating the Logan Act for various comments about Russia. 

McCain, breaching his own rule of not attacking military heroes, accused Flynn of “lying” to the vice president about his pre-inaugural conversations with Sergey Kislyak and said Flynn’s resignation raised “further questions about the Trump administration’s intentions toward Vladimir Putin.” 

In March 2017, James Comey finally confessed to the Republican-led House Intelligence Committee that he had opened a counterintelligence probe into the Trump campaign in the summer of 2016 based on suspicious activity with Russian interests. (Rep. Elise Stefanik would force Comey to admit that he violated House protocol by withholding that information from congressional leaders for eight months.) Comey’s sneakiness, however, was portrayed as protecting “sensitive” law enforcement activities rather than intentional deceit. 

In April, the Washington Post disclosed the FISA (Foreign Intelligence Surveillance Act) warrant against Carter Page; the government told the Foreign Intelligence Surveillance Court that Page was a foreign agent of Russia. (The reporting on both Flynn and Page was based on illegal leaks of classified government information, a felony for which no one has been either charged or convicted.) 

McCain and other NeverTrumpers insisted that a separate, full-scale investigation would be necessary. “This whole issue with the relationship with the Russians and who communicated with them and under what circumstances clearly beg, cries out for investigation,” McCain told Jake Tapper on CNN in March 2017. “We should not assume guilt until we have a thorough investigation.” “The situation begs for a bipartisan, transparent investigation,” David French wrote.

Then the coup de grace: On May 9, 2017, Trump fired Comey. The dismissal was portrayed as an attempt to stop Comey from probing Trump’s ties to the Kremlin; it quickly became the Democrats’ latest impeachment fodder. 

Deputy Attorney General Rod Rosenstein appointed Robert Mueller, a Comey pal, as the special counsel tasked with rooting out evidence of Trump-Russia collusion. (Jeff Sessions, Trump’s attorney general, ill-advisedly recused himself in March 2017 from any matters related to Russia based on his own innocuous contacts with Kislyak. This empowered the Obama-appointed Rosenstein to take control of the Justice Department’s inquiry into Trump-Putin ties.) 

NeverTrump seized the moment. Mueller, they were convinced, would doom Trump’s presidency. His unfettered inquiry, commandeered by a team stacked with partisan prosecutors, surely would produce evidence of impeachable offenses that would quickly dispatch Trump from the Oval Office. No comparison designed to underscore the gravity of the situation would be considered out of bounds: Max Boot compared alleged Russian election interference to 9/11.

For the next two years, NeverTrump tended to the right flank of the Trump-Russia collusion front. This primarily involved protecting Mueller’s investigation. 

“The investigation by special counsel Robert Mueller into Russian interference with the 2016 election is now entering a new and critical phase,” a group of NeverTrumpers wrote in a November 2017 letter addressed to Paul Ryan, then-Speaker of the House. “We would regard dismissal of the special counsel, or pardons issued preemptively to anyone targeted by his investigation, as a grave abuse of power that justifies initiation of impeachment proceedings. It is morally imperative that the Republican Party and the conservative movement stand as bulwarks of the rule of law, not enablers of its erosion and violation. Now is the time for choosing.” 

It was signed by more than two dozen NeverTrumpers, including Bill Kristol, Mona Charen, Max Boot, and Evan McMullin. 

As Trump regularly expressed his outrage at Mueller’s spiraling “witch hunt,” NeverTrump rallied around the special counsel and demanded that Republican lawmakers “protect” Robert Mueller. Kristol formed a group called Republicans for the Rule of Law, which produced television ads touting Mueller’s military valor, integrity, and legal reputation. The group bought airtime on Sunday news programs and Fox News. 

As the investigation progressed, it became hard to distinguish between NeverTrump and Democrat Adam Schiff, the leading collusion propagandist in the House, who promised for three years that “clear” evidence of collusion existed. 

It didn’t.

The End Is Near 

In embarrassing fashion given the final result, NeverTrump salivated at every rumor, accusation, interrogation, charge, arrest, and raid initiated by Team Mueller, confident that the special counsel would soon haul Donald Trump out of the Oval Office in handcuffs; perhaps a few of his children would be arrested, too. 

Hardly a day passed when some NeverTrumper didn’t chortle that Trump’s days were numbered or the walls were closing in or the end was near. 

After Comey’s June 2017 Senate testimony complaining about his firing set the stage for an impeachment case built on obstruction of justice, Jennifer Rubin warned that it was a turning point for Republicans. “Before Comey, impeachment talk was not a real concern for Republicans. After Comey, [it] surely will be a referendum on Trump, and specifically whether he should be impeached—unless, of course, Republicans decide to cut their losses and get rid of him before the midterms.”

NeverTrump frequently defended the contents of the Steele dossier and assured the public that Fusion’s Glenn Simpson, under increasing scrutiny throughout 2017, was the real deal. His former Wall Street Journal colleague Bret Stephens attested to Simpson’s sterling reputation; the White House and the president, warned Stephens, should be “terrified” about Simpson’s congressional testimony. “Glenn is a very serious, capable journalist. He’s not a partisan…If he has politics, I’m not aware of them,” Stephens said on MSNBC about the Clinton/DNC hired gun. Tom Nichols continued to insist the dossier was “raw intelligence” even after everyone else acknowledged that it was nothing more than fabricated political dirt. 

Bill Kristol was giddy after the FBI raided Michael Cohen’s office, home, and hotel room in April 2018. He could hardly contain his glee on the set of CNN. “This is war. This shows we are very close to the end game,” he assured his ecstatic CNN panelists. Kristol later would claim that “reality has changed” after Cohen’s guilty plea. Even though the charges had nothing to do with Russian collusion, Kristol questioned whether, deep down, the collusion charge was true.

“This week was the worst of Donald Trump’s presidency. But it seems likely there will be worse still,” Charlie Sykes warned when Mueller snagged both Manafort and Cohen.

David French claimed Mueller’s December 2018 sentencing memo on Michael Cohen “may well outline the roadmap for an impeachment count against the president that is based on recent presidential precedent. Donald Trump’s legal problems continue to mount.”

After the New York Times reported in July 2017 that Donald Trump Jr. and other top campaign associates met with a so-called “Russian lawyer” allegedly connected to the Kremlin a few months before the election, NeverTrumpers insisted the brief confab amounted to campaign collusion. 

David French concluded that the meeting met the definition of collusion. “To repeat, it now looks as if the senior campaign team of a major-party presidential candidate intended to meet with an official representative of a hostile foreign power to facilitate that foreign power’s attempt to influence an American election,” French wrote in National Review in July 2017. “Russian collusion claims are no longer the exclusive province of tinfoil-hat conspiracy theorists. No American—Democrat or Republican—should defend the expressed intent of this meeting.” 

(Evidence would later show that the “Russian lawyer” was working with Glenn Simpson on behalf of a Russian company in trouble with the U.S. government. Simpson and Natalia Veselnitskaya met both before and after the Trump Tower meeting. No damning information about Hillary Clinton was shared with the participants.) 

NeverTrump mocked a three-star general after he accepted a plea deal with Mueller’s team in December 2017. Mike Flynn’s guilty plea to one process crime elicited cheers from NeverTrump. “Michael Flynn going to jail? Unlike Paul, can’t make $11 Million bail. Or maybe against Trump, he’ll have to wail. This whole presidency is one big fail,” Ana Navarro snarked on Twitter.

Even the most ludicrous, unfounded charges of collusion meant doom for the president. “Big news: Mueller reportedly has evidence that Michael Cohen did travel to Prague in 2016, lending credence to Chris Steele’s reporting that Cohen secretly met a Kremlin figure there to strategize about Moscow’s election assistance to President Trump,” Evan McMullin tweeted in the spring of 2018. Mueller concluded Cohen never traveled to Prague; it was another dossier-fabricated collusion talking point.

Jonah Goldberg erroneously claimed that Trump campaign coordinator Sam Clovis sent George Papadopoulos to Russia to get dirt on Clinton and often parroted the head fake that Papadopoulos, and not the Fusion-sourced dossier, prompted Comey’s probe into the campaign. (Goldberg often got key details about “collusion” flat-out wrong. As late as December 2019, Goldberg had to correct a post on National Review that originally claimed the FBI hired a private cybersecurity firm to determine the Russians hacked the DNC server. Only after readers pointed out his mistake did Goldberg note that the DNC, not the FBI, hired CrowdStrike.) 

Conversely, anyone attempting to uncover the legitimate, provable scandal—how the world’s most powerful law enforcement and intelligence apparatus was weaponized against a rival presidential campaign—was partaking in a “conspiracy theory.” 

The very same NeverTrumpers who regurgitated every reckless charge of collusion downplayed alarming evidence of abuse at the highest level of the federal government to target Team Trump. “It’s time for partisans to ditch conspiracy theories and reach mutual agreement to follow the evidence and apply the law to the facts without regard for personal affection or policy preference. Any other approach—either by pundits or politicians—fails their audience or their constituents,” lectured David French in December 2018. French often defended the FBI’s actions, even as evidence mounted that the pretext for the probe was either phony or manufactured by the FBI itself: “The FBI wasn’t abusing its power. It was fulfilling the mission the president gave it.” 

Tom Nichols suggested that people digging into “FISAgate” were wearing “tin foil hats.” (NeverTrump repeatedly ridiculed Rep. Devin Nunes, the Republican chairman of the House Intelligence Committee, and his effort to expose numerous offenses related to the infiltration of and investigation into Trump’s campaign.)

Sen. Ben Sasse, a member of the Senate Judiciary Committee, never mentioned his concerns about the FBI’s illicit probe or expressed outrage at the behind-the-scenes activities of Comey, former deputy FBI director Andrew McCabe, former counterespionage chief Peter Strzok, or his lover, FBI lawyer Lisa Page. There were no questions about the role of Bruce Ohr, a twice-demoted Justice Department official, and his wife’s work at Fusion GPS or both Ohrs’ relationship with Christopher Steele. 

Again, NeverTrump sided with the Left: not only did they mislead the American people about a nonexistent collaboration between Trump and Putin, but they also intentionally ignored and downplayed the real scandal, dismissing it as a conspiracy theory.

Defending massive abuses of federal power, which included violating the constitutional rights of private citizens, prosecuting political opponents, breaching attorney-client privilege, and illegally leaking classified information to the news media, somehow became a conservative “principle” during the Trump era. Go figure. 

Mueller Report Bombs 

NeverTrump speculated for two years that Robert Mueller would find evidence of collusion between the Trump campaign and the Kremlin. In March 2019, Mueller submitted his long-awaited report to the Justice Department. To avoid leaks, and since Mueller had not redacted grand jury material as he was instructed, Attorney General Bill Barr released a summary of the report’s contents as it underwent the classification process. 

The bottom line: Mueller’s team of skilled, partisan, vengeful prosecutors found no evidence of collusion between the Trump campaign and Russian state actors. (The second half of Mueller’s report outlined instances of possible obstruction of justice, but Mueller declined to make a prosecutorial recommendation.) 

In April 2019, the Justice Department released a redacted copy of the 448-page Mueller Report. Its findings supported Barr’s summary. 

The Mueller probe, more accurately described as a “witch hunt” by the president and his supporters, was over. The crimes NeverTrump and the Left had hoped to see never materialized. 

NeverTrump was wrong, once again. 

Kristol, commenting on MSNBC as the new publisher of The Bulwark, the Weekly Standard’s stepchild, griped that Team Trump was acting like the “most sore winners in the world. They’re bitter and angry and want to punish people who made the mistake of thinking there might be collusion.” 

If only Kristol knew what it was like to be a winner, even a sore one. Following, again, the lead of the Democrats, NeverTrump dumped collusion and quickly embraced Mueller’s dubious and politically motivated allegations of obstruction of justice. Charlie Sykes, writing at The Bulwark, insisted the second volume of Mueller’s report was “devastating” and constituted an “open invitation to Congress to launch impeachment proceedings.” 

Some NeverTrumpers, however, had a hard time letting go of their collusion dream. As late as November 2019, Max Boot insisted that “collusion evidence remains strong.” Just as they had with faulty claims about weapons of mass destruction, NeverTrump’s Iraq War promoters refused to abandon the Russian collusion narrative they helped create. And when the government produced evidence to the contrary, just as was the case with WMDs, NeverTrump refused to concede or apologize. They moved on, no penalty paid, to the next manufactured scandal while looking for new foes.

Weekend Long Read

This essay is adapted from “United States of Socialism: Who’s Behind It. Why It’s Evil. How to Stop It,” by Dinesh D’Souza (All Points Books, 304 pages, $29.99)

Identity Socialism

Herbert Marcuse’s toxic legacy.

Socialism, a system for raising up the working class, has now largely abandoned the working class. A program for raising the condition of ordinary citizens and workers has turned into a coordinated effort to make those very citizens and workers feel unwelcome and demonized in their own country. Socialism in America today has turned black against white, female against male, homosexual and transsexual against heterosexual, and illegals against legal immigrants and American citizens.

The typical socialist today is not a union guy who wants higher wages; it is a transsexual eco-feminist who marches in Antifa and Black Lives Matter rallies and throws cement blocks at her political opponents. American socialism is concerned less with worker exploitation by the bourgeoisie and more with the race, gender and transgender grievances of identity politics. I call it identity socialism.

If Franklin D. Roosevelt were alive today, he would not recognize the modern Democratic Party he created. Nor would he recognize the progressivism and socialism that formed the ideological pillars of his party. For FDR, as for Marx, socialism was primarily a matter of class. It was the rich versus the poor. Its political base was the working class—specifically the white working class that to this day forms the majority of working class people in America. While the socialist Left still employs the old rhetoric of class warfare, it seems something of a relic. Contemporary socialism is no longer rooted in class, and moreover its oldest allies—working class white males—are now its villains and enemies.

If FDR had attended the 2016 primary debates among Democratic Party contenders, he would have heard Hillary Clinton jibe at Bernie Sanders, “If we broke up the big banks tomorrow, would that end racism?” More recently he would have encountered this outlandish tweet from Elizabeth Warren: “Thank you @BlackWomxnFor! Black trans and cis women, gender-nonconforming, and nonbinary people are the backbone of our democracy.”

FDR would probably have no idea what she was talking about. Who are these people, and how could they be the “backbone of our democracy”? They certainly seem to be the backbone of the socialist Left. At a recent meeting of the Democratic Socialists, FDR would have encountered a strange menagerie of activists calling themselves eco-socialists, Afro-socialists, Islamosocialists, Chicano socialists, sanctuary socialists, #MeToo socialists, disability socialists, queer socialists, and transgender socialists.

Typical of the new type of socialist is Stacey Abrams, the Democrat who narrowly lost the governor’s race in Georgia. “My campaign,” she says, “championed reforms to eliminate police shootings of African Americans, protect the LGBTQ community against ersatz religious freedom legislation, expand Medicaid to save rural hospitals, and reaffirm that undocumented immigrants deserve legal protections.” Only one of these four planks—the one about saving rural hospitals—would be even remotely recognizable to FDR as part of the progressive agenda.

During this year’s Democratic primary, each candidate tried to play his or her diversity card. Pete Buttigieg was white and male, but hey, at least he was gay! Cory Booker, regrettably, was a man, but fortunately for him he was a black man. Julian Castro affirmed his Latino status despite his inability to speak Spanish. This put him a notch above Beto who, after all, only had the Mexican nickname. Elizabeth Warren had the woman card but she also wanted the Native American card—making her a “two-fer”—so she faked her Indian ancestry. Kamala Harris, the winner of this sweepstakes, is both female and a person of color.

The great irony, of course, is that despite this identity parade, the guy who got the nomination is the whitest male of them all: Joe Biden. Given his semi-comatose state, he almost qualifies as a “dead white male.” As if to remedy this predicament, Biden has pledged to nominate a woman as his running mate, and he will get extra points if he picks a minority woman like Stacey Abrams. Identity socialism now defines the Democrats and carries even an old white geezer like Biden in its wake.

The implications of this go beyond party politics; they involve how the Left views the country itself. For FDR, America was an “imagined community.” I get this term from sociologist Benedict Anderson. Anderson points out that a nation is imagined because it is made up of people who have never met and don’t know each other. Yet nations seek to create a “deep horizontal comradeship” in which we identify with people we’ve never heard of. They are our “fellow citizens.”

This identification is critical because, without it, who would be willing to die for his or her country? Anderson points out that no one is willing to die for the Labor Party, the American Medical Association, or the United Way. Not only soldiers but even cops and firemen who risk their lives for “strangers” must have an imagined comradeship with those strangers. Lincoln understood this. Memorial Day was created immediately following the Civil War, and it was during that era that the American flag became a symbol of quasi-religious national devotion.

Even socialist redistribution within a country relies on some sense of solidarity among the citizens; otherwise why should my hard-earned money go to pay the medical expenses of someone I couldn’t care less about? In India I learned a proverb that may seem somewhat heartless: “The tears of strangers are only water.” The basic idea, however, is that we have an obligation to help only our own; if others have a problem, we wish them well, but it’s their problem.

For FDR the New Deal was a patriotic project. He routinely defended his programs in terms of “the greater good of the greater number.” Moreover, he appealed to this same patriotic solidarity during World War II. Martin Luther King, Jr. also spoke in terms of restoring the “beloved community.” The basic idea here is that America is a good country, based on noble ideals. The political task is to fully integrate and assimilate everyone—blacks, women, immigrants—into that America.

Today’s socialist Left, however, wants an America that integrates the groups seen as previously excluded while excluding the group that was previously included. “If you are white, male, heterosexual, and religiously or socially conservative,” writes blogger Rod Dreher, “there’s no place for you” on the progressive Left. On the contrary, it should now be expected that in society “people like you are going to have to lose their jobs and influence.”

In other words, for identity socialists and for the Left more generally, blacks and Latinos are in, whites are out. Women are in, men are out. Gays, bisexuals, pansexuals and transsexuals, together with other, more exotic, types are in; heterosexuals are out. Illegals are in, native-born citizens are out. One may think this is all part of the politics of inclusion, but to think that is to get only half the picture. The point, for the Left, is not merely to include but also to exclude, to estrange their opponents from their native land.

Consider how normalcy has been defined in America. Since whites have been a clear majority, whiteness was the norm. Since the structure of society was, however loosely, patriarchal, maleness was also seen as normative. And of course the same applied to heterosexuality, since most people are heterosexual. For the socialist Left, it’s vital to overturn this hierarchy not by leveling the playing field but by creating an inverse hierarchy. Whiteness, maleness, and heterosexuality are now viewed as pathological, as forms of oppression. In this way, the Left by design seeks to demonize white male heterosexuals and thus to make a large body of Americans feel like aliens in their own country.

How did we get here? This, I believe, is the story of the 1960s, because that was when this great shift occurred. One man, whose name few people know, was the prophet of the change. He is the one who posed the big question: how do you get socialism when the people who are supposed to want it the most don’t want it? How do you create a proletariat when the original proletariat opts out? And where do you find the replacements? To answer these questions is to discover the roots of the socialist Left that defines and directs today’s Democratic Party.

Marcuse’s Marxist Conundrum

To understand identity socialism, we must go back several decades and meet the man who figured out how to bring its various strands together: Herbert Marcuse. A German philosopher partly of Jewish descent, Marcuse studied under the philosopher Heidegger and escaped Germany prior to the Nazi ascent. After stints at Columbia, Harvard and Brandeis, Marcuse moved to California, where he joined the University of California, San Diego and became the guru of the New Left in the sixties.

Marcuse influenced a whole generation of young radicals, from Weather Underground co-founder Bill Ayers to Yippie activist Abbie Hoffman to Tom Hayden, president of the activist group Students for a Democratic Society (SDS). Angela Davis, who later joined the Black Panthers and ran for vice president on the Communist Party ticket, was a student of Marcuse and one of his protégées. It was Marcuse, Davis said, who “taught me that it is possible to be an academic, an activist, a scholar and a revolutionary.”

Marcuse egged on the activists of the 1960s to seize buildings and overthrow the hierarchy of the university, as a kind of first step to fomenting socialist revolution in America. Interestingly, it was Ronald Reagan—then governor of California—who got Marcuse fired. Still, Marcuse retained his celebrity and influence over the radicals of the time. He did not, of course, create the forces of identity socialism but he saw, perhaps earlier than anyone else, how they could form the basis for a new and viable socialism in America. That’s the socialism we are dealing with now.

To understand the problem Marcuse confronted, we have to go back to Marx. Marx saw himself as the prophet, not the instigator, of the advent of socialism. We think of Marx as some sort of activist, seeking to organize a workers’ revolution, but Marx emphasized from the outset that the socialist revolution would come inevitably; nothing had to be done to cause it. The Marxist view is nicely summed up by one of Marx’s German followers, Karl Kautsky, who wrote, “Our task is not to organize the revolution but to organize ourselves for the revolution; it is not to make the revolution, but to take advantage of it.”

But what happens when the working class is too secure and contented to revolt? Marx didn’t anticipate this; in fact, the absence of a single worker revolt of the kind Marx predicted, anywhere in the world, is a full and decisive refutation of “scientific” Marxism. In the early twentieth century, Marxists across the world—from Lenin to Mussolini—were fully aware of this problem. Fascism or national socialism represented one way to respond to it; Leninism represented another.

I’ll focus on Lenin, because his was the approach that influenced Marcuse and the New Left in the 1960s. Basically Lenin argued that the working class was never going to revolt; they might join trade unions, but that was about it. In Lenin’s diagnosis, workers could develop “trade union consciousness” but not “revolutionary consciousness.” So then what? In his famous work What Is To Be Done? Lenin insisted that the socialist revolution would not be done by the working class; it would have to be done for them.

In other words, a professional class of activists and fighters would be required to serve as a revolutionary vanguard. Lenin assembled a varied group of landless farmers, professional soldiers, activist intellectuals and attorneys, and criminals to collaborate with him in overthrowing the czar and introducing Bolshevik socialism to Russia. Although Lenin presented his approach as continuous with Marxism, it represented, as socialists around the world recognized, a radical break with and revision of Marxism.

Around the same time, in the early 1920s, the Italian Communist Antonio Gramsci made his own revision of socialist theory by introducing the concept of culture. “Hegemony” was Gramsci’s key concept. He insisted that the capitalists did not rule society solely on the basis of economic power. Rather, they ruled through “bourgeois values” that permeated the cultural, educational, and psychological realm of society. Economics, Gramsci insisted, is a subset of culture. Economics is shaped by culture no less than culture is shaped by the economic basis of society.

For Gramsci, socialist revolution under current conditions was impossible because the working class had internalized bourgeois values. The ordinary worker had no intention of toppling his employers; his aspiration was to become like them. Gramsci’s solution was for socialist activists to figure out a way to break this hegemony, and to establish a hegemony of their own. To do this they would have to take over the universities, the art world, and the culture more generally. In this way they could combat bourgeois culture “from within.”

Lenin and Gramsci provided Marcuse’s starting point. He agreed with both of them that the working class had become a conservative, counterrevolutionary force. But his greatest early influence was a third man, Heidegger. Marcuse read Heidegger’s magnum opus Being and Time and it inspired him so much he apprenticed himself to Heidegger, becoming first his student and then his faculty assistant at the University of Freiburg. Marcuse found in Heidegger a way to ground socialism in something more profound than better salaries and working conditions, in something that transcended Marx’s materialism itself.

The basic idea of Being and Time is that we are finite beings, “thrown,” as Heidegger puts it, into the world, with no knowledge of where we came from, what we are here for, or where we are going. We live in a present, yet we are constantly aware of multiple future possibilities, among which we must choose even though we can only know in retrospect whether we chose wisely and well. This radical uncertainty about our situation, Heidegger argued, produces in us anxiety—anxiety that is heightened by our knowledge of death. “Being,” in other words, is bracketed by “time.” Humans are perishable beings that for the time being are.

Yet how should we “be”? That, for Heidegger, was the big question. Not “what is it good to do?” but “how is it good to be?” Typically, we have no answer to this question; we are barely even aware of it as a question. We go through life like a twig in a current, steered by a tide of sociability and conformity. Thus we lose ourselves; we cease to be “authentic.” Authenticity, for Heidegger, means coming to terms with our mortality and living the only life we get on our own terms. We cannot rely on God to show us the way; we are alone in the world and have to find a way for ourselves. Frank Sinatra’s song “My Way” expresses a distinctly Heideggerian consciousness.

Marcuse eventually broke with Heidegger when he heard that Heidegger had both joined the Nazi Party and become an apologist for Hitler. Marcuse seems to have had no objection to Heidegger’s—or Hitler’s—national socialism, although, being partly Jewish, he was naturally less enthusiastic about the accompanying anti-Semitism. Even so, Marcuse continued to draw from Heidegger’s philosophy to illuminate the political problems he was dealing with.

Essentially his problem was the same as the one Lenin faced: if the working class isn’t up for socialism, where to find a new proletariat to bring it about? Marcuse knew that modern industrialized countries like America couldn’t assemble the types of landless peasants and professional soldiers—the flotsam and jetsam of a backward feudal society—that Lenin relied on. So who could serve in the substitute proletariat that would be needed to agitate for socialism in America?

Marcuse looked around to identify which groups had a natural antipathy to capitalism. Marcuse knew he could count on the bohemian artists and intellectuals who had long hated industrial civilization, in part because they considered themselves superior to businessmen and shopkeepers. In Germany, this group distinguished “culture”—by which they meant art—from “civilization”—by which they meant industry—and they were decidedly on the side of culture. In fact, they used art and culture to rail against bourgeois capitalism.

These were the roots of bohemianism and the avant garde. “Bohemia,” wrote Henry Murger, “leads either to the Academy, the Hospital or the Morgue.” Elizabeth Wilson, in her book Bohemians, concurs: “Bohemia offered a refuge to psychological casualties too disturbed to undertake formal employment or conform to the rules of conventional society. It was a sanctuary for individuals who were so eccentric or suffered from such personal difficulties or outright psychological disorder that they could hardly have existed outside a psychiatric institution other than in Bohemia.”

These self-styled “outcasts” were natural recruits for what Marcuse termed the Great Refusal—the visceral repudiation of free market society. The problem, however, was that these bohemians were confined to small sectors of Western society: the Schwabing section of Munich, the Left Bank of Paris, Greenwich Village in New York, and a handful of university campuses. By themselves, they were scarcely enough to hold a demonstration, let alone make a revolution.

A New Proletariat

So Marcuse had to search further. He had to think of a way to take bohemian culture mainstream, to normalize the outcasts and to turn normal people into outcasts. He started with an unlikely group of proles: the young people of the 1960s. Here, finally, was a group that could make up a mass movement.

Yet what a group! Fortunately, Marx wasn’t around to see it; he would have burst out laughing. Abbie Hoffman? Jerry Rubin? Mario Savio? Joan Baez? Bob Dylan? How could people of this sorry stripe, these slack, spoiled products of postwar prosperity, these parodies of humanity, these horny slothful loafers completely divorced from real-world problems, and neurotically focused on themselves, their drugs and sex lives and mind-numbing music, serve as the shock troops of revolution?

Marcuse’s insight was Heideggerian: by teaching them a new way to be “authentic.” By “raising their consciousness.” The students were already somewhat alienated from the larger society. They lived in those socialist communes called universities and took their amenities for granted. Ungrateful slugs that they were, they despised rather than cherished their parents for the sacrifices made on their behalf. They sought “something more,” a form of self-fulfillment that went beyond material fulfillment.

Here, Marcuse recognized, was the very raw material out of which socialism is made in a rich, successful society. Perhaps there was a way to instruct them in oppression, to convert their spiritual anomie into political discontent. Marcuse was confident that an activist group of professors could raise the consciousness of a whole generation of students so that they could feel subjectively oppressed even if there were no objective forces oppressing them. Then they would become activists to fight not someone else’s oppression, but their own.

Of course it would take some work to make selfish, navel-gazing students into socially conscious activists. But to Marcuse’s incredible good fortune, the sixties was the decade of the Vietnam War. Students were facing the prospect of being drafted. Thus they had selfish reasons to oppose the war. Yet this selfishness could be harnessed by teaching the students that they weren’t draft-dodging cowards; rather, they were noble resisters who were part of a global struggle for social justice. In this way bad conscience could itself be recruited on behalf of left-wing activism.

Marcuse portrayed Ho Chi Minh and the Vietcong as a kind of Third World proletariat, fighting to free itself from American hegemony. This represented a transposition of Marxist categories. The new working class were the Vietnamese “freedom fighters.” The evil capitalists were American soldiers serving on behalf of the American government. Marcuse’s genius was to tell leftist students in the 1960s that the Vietnamese “freedom fighters” could not succeed without them.

“Only the internal weakening of the superpower,” Marcuse wrote in An Essay on Liberation, “can finally stop the financing and equipping of suppression in the backward countries.” In his vision, the students were the “freedom fighters” within the belly of the capitalist beast. Together the revolutionaries at home and abroad would collaborate in the Great Refusal. They would jointly end the war and redeem both Vietnam and America. And what would this redemption look like? In Marcuse’s words, “Collective ownership, collective control and planning of the means of production and distribution.” In other words, classical socialism.

Okay, so now we got the young people. Who else? Marcuse looked around America for more prospective proles, and he found, in addition to the students, three groups ripe for the taking. The first was the Black Power movement, an adjunct to the civil rights movement. The beauty of this group, from Marcuse’s point of view, was that it would not have to be instructed in the art of grievance; blacks had grievances that dated back centuries.

Consequently, here was a group that could be mobilized against the status quo, and if the status quo could be identified with capitalism, here was a group that should be open to socialism. Through a kind of Marxist transposition, “blacks” would become the working class, “whites” the capitalist class. Race, in this analysis, takes the place of class. This is how we get Afro-socialism, and from here it is a short step to Latino socialism and every other type of ethnic socialism.

Another emerging source of disgruntlement was the feminists. Marcuse recognized that with effective consciousness-raising they too could be taught to see themselves as an oppressed proletariat. This of course would require another Marxist transposition: “women” would now be viewed as the working class and “men” the capitalist class; the class category would now be shifted to gender.

“The movement becomes radical,” Marcuse wrote, “to the degree to which it aims, not only at equality within the job and value structure of the established society…but rather at a change in the structure itself.” Marcuse’s target wasn’t just the patriarchy; it was the monogamous family. In Gramscian terms, Marcuse viewed the heterosexual family itself as an expression of bourgeois culture, so in his view the abolition of the family would help hasten the advent of socialism.

Marcuse didn’t write specifically about homosexuals or transgenders, but he was more than aware of exotic and outlandish forms of sexual behavior, and the logic of identity socialism can easily be extended to all these groups. Once again we need some creative Marxist transposition. Gays and transgenders become the newest proletariat, and heterosexuals—even black and female heterosexuals—become their oppressors.

We see here the roots of “intersectionality.” As the Left now holds, one form of oppression is good but two is better and three or more is best. The true exemplar of identity socialism is a black or brown male transitioning to be a woman with a Third World background who is trying illegally to get into this country because his—oops, her—own country has allegedly been wiped off the map by climate change.

These latest developments go beyond Marcuse. He didn’t know about intersectionality, but he did recognize the emerging environmental movement as an opportunity to restrict and regulate capitalism. The goal, he emphasized, was “to drive ecology to the point where it is no longer containable within the capitalist framework,” although he recognized that this “means first extending the drive within the capitalist framework.”

Marcuse also inverted Freud to advocate the liberation of eros. Freud had argued that primitive man is single-mindedly devoted to “the pleasure principle,” but as civilization advances, the pleasure principle must be subordinated to what Freud termed “the reality principle.” In other words, civilization is the product of the subordination of instinct to reason. Repression, Freud argued, is the necessary price we must pay for civilization.

Marcuse argued, however, that civilization eventually reaches a point where humans can go the other way. They can release the very natural instincts that have been suppressed for so long and subordinate the reality principle to the pleasure principle. This would involve a release of what Marcuse termed “polymorphous sexuality” and the “reactivation of all erotogenic zones.”

We are a short distance here from the whole range of bizarre contemporary preoccupations: unisexuality—people falling in love with themselves—group sexuality, pansexuality—people who do not confine their sexuality to their species—and people who attempt to have sex with trees.

Marcuse recognized that mobilizing all these groups—the students, the environmentalists, the blacks, the feminists, the gays—would take time and require a great deal of consciousness-raising or reeducation. He saw the university as the ideal venue for carrying out this project, which is why he devoted his own life to teaching and training a generation of socialist and left-wing activists. Over time, Marcuse believed, the university could produce a new type of culture, and that culture would then metastasize into the larger society to infect the media, the movies, even the lifestyle of the titans of the capitalist class itself.

Marcuse, in other words, foresaw an America in which bourgeois culture would be replaced by avant garde culture. He foresaw a society in which billionaires would support socialist schemes that took away a part of their wealth in exchange for social recognition conferred by cultural institutions dominated by the socialists. Bill Gates, Warren Buffett, and Mark Zuckerberg are three owlish geeks who were probably ridiculed in junior high school; they don’t seem to mind paying higher taxes if they can now hobnob with comedians, rock stars and Hollywood celebrities. Why only be rich when you can also be rich and cool?

Marcuse’s project—the takeover of the American university, to make it a tool of socialist indoctrination—did not succeed in his lifetime. In fact, as mentioned above, he got the boot when Governor Reagan pressured the regents of the university system not to renew his contract. In time, however, Marcuse succeeded as the activist generation of the 1960s gradually took over the elite universities. Today socialist indoctrination is the norm on the American campus, and Marcuse’s dream has been realized.

Marcuse is also the philosopher of Antifa. He argued, in a famous essay called “Repressive Tolerance,” that tolerance is not a norm or right that should be extended to all people. Yes, tolerance is good, but not when it comes to people who are intolerant. It is perfectly fine to be intolerant against them, to the point of disrupting them, shutting down their events, and even preventing them from speaking.

Marcuse didn’t use the term “hater,” but he invented the argument that it is legitimate to be hateful against haters. For Marcuse there were no limits to what could be done to discredit and ruin such people; he wanted the Left to defeat them “by any means necessary.” Marcuse even approved of certain forms of domestic terrorism, such as the Weather Underground bombing the Pentagon, on the grounds that the perpetrators were attempting to stop the greater violence that U.S. forces were inflicting on people in Vietnam and other countries.

Our world is quite different now from what it was in the 1960s, and yet there is so much that seems eerily familiar. When it comes to identity socialism, we are still living with Marcuse’s legacy.

Weekend Long Read

This essay is adapted from “The Coming of Neo-Feudalism: A Warning to the Global Middle Class,” by Joel Kotkin (Encounter Books, 288 pages, $28.99)

A New Age of Feudalism for the Working Class?

If too many of the American working class lack any hope of improving their condition, we could face dangerous upheaval in the near future.

In the past, fears of job losses from automation were often overstated. Technological progress eliminated some jobs but created others, and often better-paying ones. In the early days of the high-tech revolution, many of the pioneering firms—such as Hewlett-Packard, Intel, and IBM—were widely praised for treating their lower-level workers as part of the company and deserving of opportunities for advancement, as well as benefits including health insurance and a pension.

The labor policies of the newer generation of tech giants tend to be vastly different. Firms like Tesla have been sued for failing to pay contract workers the legally mandated overtime rates, and for depriving them of meal and rest breaks. According to workers, wages at the Tesla plant are below the industry average, and a pro-labor nonprofit reports that the risk of injury there is higher than the industry average. Because high housing prices force them to live far from the workplace, some workers sleep in the factory hallways or in their cars.

“Everything feels like the future but us,” complained one worker.

The largest tech employer today is Amazon, with 798,000 employees worldwide in 2019. Amazon tends to pay its workers less than rivals do. Many employees rely on government assistance, such as food stamps, to make ends meet. When the company announced it was adopting a minimum wage of $15 an hour, it also cut stock options and other benefits, largely wiping out the raises, at least for long-term employees.

The average Amazon worker in 2018 made less than $30,000 annually, about the same as the CEO made every 10 seconds.

Working conditions at Amazon are often less than optimal. Warehouse workers in Britain reportedly were urinating in bottles to avoid being accused of “time-wasting” for taking breaks. Amazon has also patented wristbands that track employee movements, described as a “labor-saving measure.” Those who can’t keep up the pace are written up and then fired, said one British worker. “They make it like the ‘Hunger Games.’ That’s what we actually call it.”

Apple manufactures virtually all its products abroad, mostly in China, although medical concerns and political factors might change that. In addition to its own employees there, the company relies on the labor of more than 700,000 workers—roughly 10 times its U.S. employment—to build Apple products at contractors like Foxconn. These workers suffer conditions that have led to illegal strikes and suicides; workers often claim they are treated no better than robots.

From Proletariat to Precariat

In the old working-class world, unions often set hours and benefits, but many low-status workers today are sinking into what has been described as the “precariat,” with limited control over their working hours and often living on bare subsistence wages.

One reason for this descent is a general shift away from relatively stable jobs in skill-dependent industries or in services like retail, toward occupations such as hotel housekeeper and home care aide. People in jobs of this kind have seen only meager wage gains, and they suffer from “income volatility” due to changing conditions of employment and a lack of long-term contracts.

This kind of volatility has become more common even in countries with fairly strong labor laws. In Canada, the number of people in temp jobs has been growing at more than triple the pace of permanent employment, since many workers who lose industrial jobs fail to find another full-time permanent position. The same patterns can be seen in traditionally labor-friendly European countries. From 20 to 30 percent of the working-age population in the EU15 and the United States, or up to 162 million individuals, are doing contract work. A similar trend shows up in developing countries such as Kenya, Nigeria, South Africa, Vietnam, Malaysia, and the Philippines.

Even in Japan, long known as a country of secure long-term employment, the trend is toward part-time, conditional work. Today, some 40 percent of the Japanese workforce are “irregular” workers, also known as “freeters,” and this group is growing fast while the number of full-time jobs is decreasing. The instability in employment is widely seen as one reason for the country’s ultra-low birth rate.

Many of today’s “precariat” work in the contingent “gig” economy, associated with firms such as Uber and Lyft. These companies and their progressive allies, including David Plouffe (who managed Barack Obama’s presidential campaign in 2008), like to speak of a “sharing” economy that is “democratizing capitalism” by returning control of the working day to the individual. They point to opportunities that the gig economy provides for people to make extra money using their own cars or homes. The corporate image of companies like Uber and Lyft features moonlighting drivers saving up cash for a family vacation or a fancy date while providing a convenient service for customers—the ultimate win-win.

Yet for most gig workers there’s not very much that is democratic or satisfying in it. Most are not like the middle-class driver in Uber ads, picking up some extra cash for luxuries. Instead, they depend on their “gigs” for a livelihood, often barely making ends meet. Almost two-thirds of American gig workers in their late 30s and 40s—the age range most associated with family formation—were struggling to pay their bills. Nearly half of gig workers in California live under the poverty line. One survey of gig workers in 75 countries including the United States found that most earned less than minimum wage, leading one observer to label them “the last of Marx’s oppressed proletarians.”

The reasons for their precarious situation are not hard to locate. Gig workers lack many basic protections that full-time workers might have, such as enforcement of civil rights laws. Workers without representation, or even set hours, do not have the necessary tools to protect their own position; they are essentially fungible, like day laborers anywhere. Robert Reich, former U.S. secretary of labor, has gone so far as to label the “sharing” economy a “share-the-scraps” economy. Rather than providing an “add on” to a middle-class life, gig work for many has turned out to be something closer to serfdom.

Cultural Erosion in the Working Class

The downward economic trajectory of the working class has been amplified by cultural decline. The traditional bulwarks of communities—religious institutions, extended family, neighborhood and social groups, trade unions—have weakened generally, but the consequences are most damaging for those with limited economic resources.

Social decay among the working class echoes what occurred in the first decades of the industrial revolution, when family and community structures and bonds of religion buckled and often broke. Rampant alcoholism spread “a pestilence of liquor across all of Europe,” wrote the Marxist historian E. J. Hobsbawm. In the mid-19th century, 40,000 prostitutes plied their trade in London. The physical condition of British workers was horrible: most were malnourished and suffered various job-related maladies. As late as 1917, only one-third of young males were considered to be in good health.

In America and elsewhere today, the working classes lag behind the affluent in family formation, academic test scores, and graduation rates. Marriages may be getting more stable in the upper classes, as the sociologist Stephanie Coontz has shown, but as many as one in three births in the nation occur outside matrimony. In some working-class neighborhoods, particularly those with a large proportion of ethnic minorities, four-fifths of all children are born to unmarried mothers. The rate of single parenting is the most significant predictor of social immobility across the United States and in Europe as well.

These social patterns parallel changes in economic trends. A detailed study in the United States published in 2017 shows that when towns and counties lose manufacturing jobs, fertility and marriage rates decrease, while out-of-wedlock births and the share of children living in single-parent homes increase. In addition, a variety of health problems—obesity, diabetes, diseases of the heart, kidneys, or liver—occur at much higher rates when family income is under $35,000 than when it is over $100,000. Between 2000 and 2015, the death rate increased for middle-aged white Americans with a low educational level. Anne Case and Angus Deaton attribute this trend primarily to “deaths of despair”: suicides as well as deaths related to alcohol and drugs, including opioids. In Europe likewise, a health crisis including drug addiction and drug-related deaths has emerged in old industrial areas, especially in Scotland.

In East Asia, traditionally known for strong family structures, the working class is showing signs of social erosion. Half of all South Korean households have experienced some form of family crisis, mostly involving debt, job loss, or issues relating to child or elder care, notes one recent study. Japan has a rising “misery index” of divorces, single motherhood, spousal and child abuse—all of which accelerate the country’s disastrous demographic decline and deepen class division.

An even greater social challenge may emerge in China, where some authorities are concerned about the effects of deteriorating family relations, particularly in care for aging parents. The government has started a campaign to promote the ideal of “filial piety,” a surprising revival of Confucian ideals by a state that previously attempted to eradicate them.

The problem of family breakdown is especially severe in the Chinese countryside. The flow of migrants into the cities in search of work has resulted in an estimated 60 million “left behind children” and nearly as many “left behind elderly.” The migrants themselves suffer from serious health problems, including venereal disease at rates far higher than the national norm, but the children left behind in rural villages face especially difficult challenges. Scott Rozelle, a professor at Stanford University, found that most of these children are sick or malnourished, and as many as two in three suffer from anemia, worms, or myopia. Rozelle predicts that more than half the left-behind toddlers are so cognitively delayed that their IQs will never exceed 90. This portends a future of something like the Gammas and Epsilons of Brave New World.

The Gentrification of the Left

In developed nations, as the middle classes are being proletarianized and the working classes fall further behind, the longstanding alliance between the intellectual Left and the working class is dissolving.

Already in the 1960s, New Left radicals such as C. Wright Mills and Ferdinand Lundberg disparaged the mental capacity of average Americans. Most of the population, according to Lundberg, were “quite misinformed, and readily susceptible to be worked upon, distracted.” The general acceptance of capitalism by the working class, as well as questions of race and culture, led many on the Left to seek a new coalition to carry the progressive banner. For its part, the working class has moved away from its traditional leftist affiliation not only in the United States but also across Europe and the United Kingdom.

“The more than 150-year-old alliance between the industrial working class and what one might call the intellectual-cultural Left is over,” notes Bo Rothstein, a Swedish political scientist. He suggests that a “political alliance between the intellectual left and the new entrepreneurial economy” could replace the old “class struggle” model and provide a way to “organize public services in a new and more democratic way.”

Across Europe, traditional parties of the Left now find their backing primarily among the wealthy, the highly educated, and government employees. Germany’s Social Democrats, France’s Socialists, and the British Labour and Australian Labor parties have been largely “gentrified,” as has America’s Democratic Party, despite the resurgence of “democratic socialism” as part of its ideology. They have shifted their emphasis away from their historic working-class base, toward people with college and graduate degrees.

Even more than disagreements over immigration and cultural values, differences in economic interests have driven a wedge between the established Left and the working class. The agenda promoted by the leftist clerisy and the corporate elite—on immigration, globalization, greenhouse gas emissions—does not threaten their own particular interests. But it often directly threatens the interests of working-class people, especially in resource-based industries, manufacturing, agriculture, and construction. Environmental policy in places like California and western Europe has tended to ignore the concerns of working-class families.

The continuing heavy use of coal, oil, and other fossil fuels—still increasing in countries like India and China—may present a danger to humanity’s future, but it has contributed greatly to wealth creation and the comfort of the working class since the 18th century. Plans for a drastic reduction in the use of carbon-based energy by 2050 would force middle-class Americans to be more like North Koreans in their energy consumption.

In Europe, green energy mandates have caused a spike in energy costs. As many as one in four Germans and more than half of Greeks have had to spend 10 percent or more of their income on energy, the threshold commonly used to define “energy poverty,” and three-fourths of Greeks have cut other spending to pay their electricity bills. These mandates have far less impact on the wealthy.

In their zeal to combat climate change, the clerisy have taken aim at things like suburban homes, cars, and affordable airfare. The lifestyles of the middle and working classes are often criticized by the very rich, who will likely maintain their own luxuries even under a regime of “sustainability.” A former UK environment minister said that cheap airfare represents the “irresponsible face of capitalism.” Apparently the more expensive travel done by the wealthy, including trips by private jet to conferences on climate change, is not so irresponsible. New regulations and taxes on fuel imposed by France’s aggressively green government sparked the gilets jaunes uprising, as well as the previous bonnets rouges protests in Brittany.

Those in today’s intellectual Left are concerned about the planet and about international migrants, but not so much about their compatriots in the working class. The French philosopher Didier Eribon, a gay man who grew up in a struggling working-class family in provincial Reims, describes a deep-seated “class racism” in elite intellectual circles toward people like his family.

Working-class voters in France were joyful at the socialist victory in the 1981 election, but then found themselves supporting a government whose priorities turned out to be “neoliberalism,” multiculturalism, and modernization. One result is widespread cynicism toward the political establishment. Eribon recalls his socialistically inclined mother saying, “Right or Left, there’s no difference. They are all the same, and the same people always end up footing the bill.”

Realignment

As the major left-leaning parties in high-income countries have become gentrified, the political orientation of working-class voters is realigning. Populist and nationalist parties in Sweden, Hungary, Spain, Poland, and Slovakia have done particularly well among younger voters. In fact, many of the right-wing nationalist parties are led by millennials. American millennials too are surprisingly attracted to right-wing populism. In November 2016, more white American millennials voted for Donald Trump than for Hillary Clinton. Their much-ballyhooed shift toward the Democratic Party has reversed, and fewer than half now identify as Democrats.

More broadly, a sense of betrayal among those being left behind by progress is leading to defections from mainstream parties of both Right and Left. Among the working classes and the young, there is a steady growth of far-Left opposition to the established liberal order, as well as strong support for the far Right. This increasing movement away from the center and toward the fringes is not an ideal formula for a stable democratic society.

As Tocqueville put it, we may be “sleeping on a volcano.”

Peasant Rebellions

Will the world’s working classes accept their continuing decline? We are already seeing what might be described as “peasant rebellions” against the globalist order that is being constructed by the oligarchs and their allies in the clerisy. In recent years, an insurrectionary spirit has surfaced in the Brexit vote, the rise of neonationalist parties in Europe and authoritarian populists in Brazil and the Philippines, and of course the election of Donald Trump.

At the core of these rebellions against the political mainstream lies the suspicion among the lower classes that the people who control their lives—whether corporate bosses or government officials—do not have their interests at heart. The slow-growth economy that emerged from the Great Recession benefited the financial elite and property speculators, but did little for the vast majority of people. Firms like Apple have profited from soaring stock prices and low-wage Chinese production while less capital-rich businesses have struggled.

These lopsided economic results have prompted attrition from the traditional mainstream political parties in many countries.

In multiparty democracies, a reaction against economic globalization and mass immigration, among other policies, has resulted in pronounced movement to the political fringes. One Harvard study found that anti-establishment populist parties across Europe expanded their share of the electorate from 10 percent in 1990 to 25 percent in 2016. At the same time, center-Left parties are losing ground to far-Left parties or candidates.

Is this only a prelude to a more serious kind of rebellion—one that could undermine democratic capitalism itself?

A Brief History of Peasant Rebellions

Admirers of medieval feudalism highlight the concept of mutual obligation between the classes. The upper clergy and the military aristocracy practiced a kind of noblesse oblige that provided a floor (albeit often insufficient) for the lower classes. But the obligations of the lower to the higher classes may have been no more voluntary than those binding the Cosa Nostra.

The medieval poor did not always accept their miserable situation quietly. Uprisings broke out as early as Charlemagne’s reign in the 9th century, and became more common in the later Middle Ages. Violent peasant armies actually bested aristocratic knights in the Low Countries in 1227, in Northern Germany in 1230, and in the Swiss Alps in 1315. The brutal 14th century brought a rash of peasant rebellions and urban insurrections. French peasants burned down manors of the wealthy in the Jacquerie of 1358, aiming to “destroy all the nobles and gentry in the world and there would be none any more.” After being routed by armies of nobility and gentry, the insurgents were subjected to a campaign of reprisal that cost an estimated 20,000 lives.

In England, a labor shortage following the great plague resulted in higher pay and more mobility for laborers, but Parliament and big landowners took measures to hold down wages and keep peasants on their estates. Then a new poll tax sparked a large-scale uprising led by Wat Tyler in 1381. A radical priest named John Ball traveled up and down England stirring up peasants, and in a speech outside London he famously asked: “When Adam delved and Eve span, who was then the gentleman?” The rebels’ demands included abolition of serfdom and feudal service, an end to market monopolies and other restrictions on buying and selling, and confiscation of clerical property.

Violent uprisings of peasants or urban poor also broke out in many other places, including Flanders, Florence, Lübeck, Paris, Transylvania, Croatia, Estonia, Galicia, and Sweden. But the biggest social upheaval before the French Revolution was the great Peasants’ Rebellion of 1525 in Germany. Among the demands presented in the “Twelve Articles of the Peasantry” were the abolition of serfdom, restrictions on feudal dues, the right to fish and hunt, and the right of peasants to choose their own priest. The rebels took inspiration from Martin Luther’s doctrine of a “priesthood of all believers,” but Luther himself became horrified by their violence. The rebellion was put down so savagely that it dissuaded further uprisings in Germany.

Only rarely did such rebellions prove successful, as the Swiss peasants’ did. The ruling powers sometimes used treachery to quell uprisings, offering pardons that were eventually revoked. In 17th-century England, Cromwell’s “respectable revolution” quashed the efforts of the Levellers to extend Parliament’s war against the monarchy into a radical egalitarian reordering of society. Southern and western France endured frequent rural protests through much of the seventeenth century.

Peasant rebellions also occurred in other parts of the world, often with greater ferocity. Japan had numerous ikki or peasant uprisings, particularly in the fifteenth century; the consolidation of power under the shogun in 1600 finally put an end to the disturbances. There were numerous uprisings and revolutions in Mexico, but it was only in the early 20th century that the peones finally overturned the quasi-feudal regime left over from the Spanish legacy. They achieved significant land reform, but at the cost of well over 1 million lives.

In Russia, with its overwhelmingly rural society, peasant rebellions were commonplace by the 17th century. A revolt among Ural Cossacks under Emelian Pugachev threatened the czarist regime in 1773, during the reign of Catherine the Great. The rebellion failed, as did some 550 others, but in 1917 the peasants rose up to support Lenin’s seizure of power. When the Soviet regime began to confiscate land for collectivization, the property-loving muzhiks rebelled, only to be put down ruthlessly.

Arguably the most powerful peasant rebellion occurred in China, beginning around 1850. After failing civil service exams several times, Hung Hsiu-ch’uan read some Christian tracts and connected their message with hallucinations he had experienced. He designed his own religion, in which he was part of the Holy Trinity, but with doctrines based mainly on the Ten Commandments, and he preached it to destitute laborers. His Taiping Rebellion called for the overthrow of the Manchu Ch’ing dynasty, land reform, improving the status of women, tax reduction, eliminating bribery, and abolishing the opium trade. The rebellion was finally put down more than a decade later, with massive loss of life. Some of the Taiping program would later be adopted by Sun Yat-sen, who would overthrow the imperial regime, and then by Mao Tse-tung and the Communists.

The Revolt Against Mass Migration

The contemporary versions of peasant rebellions, particularly in Europe and the United States, are in large part a reaction against globalization and the mass influx of migrants from poor countries with very different cultures. The numbers of international migrants worldwide swelled from 173 million in 2000 to 258 million in 2017; of these, 78 million were living in Europe and 50 million in the United States.

Mass migration from poorer to wealthier countries seems all but unstoppable, given the great disparities between them. According to a Gates Foundation study, 22 percent of the people in sub-Saharan Africa live in extreme poverty, defined as subsisting on less than $1.90 a day. By 2050, the region will be home to 86 percent of the world’s poorest people, and about half that number will live in just two countries, Nigeria and the Democratic Republic of the Congo. For the extremely poor in such countries, who see little to no chance of improving their condition at home, a dangerous trek to Europe or some other wealthy place would seem worth the risk.

Many people in Europe have welcomed migrants from poorer countries, including former colonies. Political and cultural elites in particular have elevated cosmopolitanism and “diversity” above national identity and tradition. Tony Blair’s “Cool Britannia” was an effort to highlight cultural diversity as a central part of modern Britain’s identity. Herman Lebovics, in Bringing the Empire Back Home: France in the Global Age (2004), pondered how to redefine what it means to be French in a multicultural age.

When Germany’s chancellor, Angela Merkel, flung the doors wide open to a huge wave of refugees and migrants from the war-ravaged Middle East in 2015, many ordinary Germans were eager to show Gastfreundschaft, or hospitality, as were many people elsewhere in Europe. By the end of that year, nearly a million refugees had entered Germany alone, and the public welcome turned cold. Merkel’s decision came to be widely unpopular with Germans and the vast majority of Europeans.

A year after the rapid influx of refugees began, Pew Research found that 59 percent of Europeans thought immigrants were imposing a burden on their country, while only a third said that immigrants made their country a better place to live. Among Greeks, 63 percent said that immigrants made things worse, as did 53 percent of Italians. In 2018, Pew found 70 percent of Italians, almost 60 percent of Germans, half of Swedes, and 40 percent of French and British citizens wanting either fewer or no new immigrants; barely 10 percent wanted more.

In the years following Merkel’s decision to set out the welcome wagon, virtually all European countries—including such progressive ones as the Netherlands, France, Denmark, Norway, and Germany itself—have tightened their immigration controls. This has been done chiefly to counter the populist (and at times quasi-fascist) nativist movements growing in many countries: Hungary, Poland, Austria, France, the Netherlands, Sweden, Finland, Slovakia, and, most importantly, Germany.

Much of the support for populist parties comes from the working class and lower-middle class, who are more exposed to the disruptions and dangers that the migrants have often brought, and are generally more burdened by the public expense of accommodating them. Even in Sweden, where the citizens have long prided themselves on tolerance, there is widespread anger about rising crime and an unprecedented level of social friction in a formerly homogeneous country.

Some of the anti-immigrant movements that have sprung up espouse racist views, but others are far less odious, being simply opposed to the globalizing policies of elites and their indifference to the concerns of average citizens. Some have found inspiration in the Middle Ages, such as the example of the Frankish leader Charles Martel, who defeated Muslim invaders in the 8th century. Fans of Donald Trump presented images of him as a Crusader clad in chainmail with a cross embroidered on the front.

The conflict over immigration divides largely along class lines. There is a huge divergence between elite opinion, which generally favors mass immigration, and that of majorities in the working and middle classes. Emmanuel Macron, then France’s economy minister and now its president, acknowledged this divergence in 2015 when he said, “The arrival of refugees is an economic opportunity. And too bad if [it] isn’t popular.”

If political elites in Europe regard open borders as good for the economy, corporate elites in the United States are eager to import skilled technicians and other workers, who typically accept lower wages. The tech oligarchs in particular like to hire from abroad: in Silicon Valley, roughly 40 percent of the tech workforce is made up of noncitizens. Steve Case, the former CEO of America Online, has suggested that immigrant entrepreneurs and workers could offset middle-class job losses from automation. Some conservative intellectuals have even thought that hardworking newcomers should replace the “lazy” elements of the working class. Some of the earliest opposition to the Trump Administration focused on his agenda of curtailing immigration.

Somewheres vs. Anywheres

Ironically, the people who most strongly favor open borders are welcoming large numbers of immigrants who do not share their own secular, progressive values. That is particularly true in Europe, where migrants and refugees from Muslim countries often hold very conservative or reactionary views on things such as homosexuality and women’s rights; many even support female genital mutilation. Some European politicians and other leaders, including the archbishop of Canterbury, have proposed that elements of Muslim sharia law, such as a prohibition of blasphemy, could be applied on top of existing national standards.

Gilles Kepel, one of France’s leading Arabists, observes that Muslims coming to Europe tend to possess “a keen sense” of cultural identity rooted in religion, while the media and academia tend to promote the “erasing of identities,” at least for the native population. Rather than defend their own values, Europeans and others in the West have been told by their leaders that “they must give up their principles and soul—it’s the politics of fait accompli.” This “erasing of identities” is not widely popular among the working and middle classes.

The British writer David Goodhart describes a cultural conflict between the cosmopolitan, postnational “anywheres” and the generally less educated but more rooted “somewheres.” If the media and most high-level government and business leaders in Europe have an “anywhere” perspective, people in less cosmopolitan precincts outside the capital cities tend to remain more strongly tied to national identities, local communities, religion and tradition. These divisions were particularly evident in the vote on Brexit and the Conservative sweep in 2019.

The “somewhere” sentiment has repeatedly been expressed in votes concerning the European Union. In addition to the Brexit referendum of 2016, French, Danish, and Dutch voters have opted against deeper or broader EU ties, preferring a stronger national “somewhere.” Less than 10 percent of EU residents identify themselves as Europeans first, and 51 percent favor a more powerful nation-state, while only 35 percent want power in Brussels to be increased.

As long as the political and economic elites ignore these preferences, populist rebellions against establishment parties will likely continue and could become more disruptive. Elite disdain for traditions of country, religion, and family tends to exacerbate class conflict around cultural identity. “Liberalism is stupid about culture,” observed Stuart Hall, a Jamaican-born Marxist sociologist.

In the United States, discontent with the globalist and open-borders agenda of the oligarchs and the upper clerisy resulted in strong working-class support for Donald Trump in 2016. He won two out of every five union voters and an absolute majority among white males. Like his European counterparts, Trump ran strongest in predominantly white, working-class and lower-middle-class areas—precisely the areas hardest hit by globalization. He appealed most to people who work with their hands, own small shops, or are employed in factories, the logistics industry and energy sector; those who repair and operate machines, drive trucks, and maintain our power grid. Among white voters at least, he did poorest with well-educated professionals.

To many voters, Trump was “a champion for forgotten millions.” When surveyed, these voters put a high priority on bringing back manufacturing jobs, protecting Social Security and Medicare, and getting conservatives on the Supreme Court—ahead of building a wall to keep out undocumented immigrants, who are widely seen as cutting into labor wages for American citizens. Even though he came from the business elite, Trump met almost universal opposition from the dominant classes. Instead, he won over voters who see big corporations as indifferent to the well-being of working people. Like some of the populist movements in Europe, the American populist Right has adopted many of the class-based talking points, although usually not the policies, associated with the pre-gentrified Left.

In the higher echelons of the clerisy, the response to the populist revolt has mostly been revulsion. “It’s Time for the Elites to Rise Up Against the Ignorant Masses” was the title of an article by James Traub in Foreign Policy in the summer of 2016. A former New York Times writer, Traub asserted that the Brexit vote and the nomination of Donald Trump, among other developments, indicate that the “political schism of our time” is not between Left and Right, but “the sane vs. the mindless angry.” Larry Summers, a former Obama Administration official, took a more astute view of the matter: “The willingness of people to be intimidated by experts into supporting cosmopolitan outcomes appears for the moment to have been exhausted.”

Is There a Mass Insurrection in the Making?

In the late 1920s and early 1930s, the proletarianization of the middle class resulted in widespread support for Communism, Fascism, and National Socialism. Today, as in Europe before World War II, people on both right and left often blame financial institutions for their precarious situation. Anger at the financial services sector gave rise to the Occupy Wall Street movement in New York City and the many spinoff Occupy protests in 2011-12. Marching under the slogan “We are the 99 percent,” protesters around the world decried the heavy concentration of wealth in a few hands.

Alienation from the political mainstream today is resulting in strong support for far-Left parties and candidates among youth in various high-income countries. In France’s presidential election of 2017, the former Trotskyite Jean-Luc Mélenchon won the under-24 vote, beating the more youthful Emmanuel Macron by almost two to one among that age group. In the United Kingdom, the Labour Party under the neo-Marxist Jeremy Corbyn won more than 60 percent of the under-40 vote in 2017, while the Conservatives got just 23 percent. Corbyn won the youth vote similarly in 2019, even amidst a crushing electoral defeat. In Germany, the Green Party enjoys wide support among the young.

A movement toward hard-Left politics, particularly among the young, is also apparent in the United States, which historically has not been fertile ground for Marxism.

In the 2016 primaries, the openly socialist Bernie Sanders easily outpolled Hillary Clinton and Donald Trump combined among under-30 voters. He also did very well among young people and Latinos in the early 2020 primaries, even as other elements of the Democratic Party rejected him decisively. Support for socialism, long anathema in America, has gained currency in the new generation. A poll conducted by the Victims of Communism Memorial Foundation in 2016 found that 44 percent of American Millennials favored socialism while 14 percent chose fascism or Communism. By 2024, Millennials will be the country’s biggest voting bloc by far.

The core doctrines of Marxism are providing inspiration for labor unrest in China today, particularly among the younger generation of migrants to the cities. Activists often find themselves prosecuted for threatening “the social order.” Communist officials have been put in the awkward position of cracking down on Marxist study groups at universities, whose working-class advocacy conflicts with the policies of the nominally socialist government.

Democratic capitalist societies need to offer the prospect of a brighter future for the majority. Without this belief, more demands for a populist strongman or a radical redistribution of wealth seem inevitable. A form of “oligarchic socialism,” with subsidies or stipends for working people, might stave off destitution while allowing the wealthiest to maintain their dominance. But the issue boils down to whether people—not just those with elite credentials and skills—actually matter in a technological age.

Wendell Berry, the Kentucky-based poet and novelist, observed that the “great question” hovering over society is “what are people for?” By putting an “absolute premium on labor-saving measures,” we may be creating more dependence on the state while undermining the dignity of those who want to do useful work.

The future of the working class should concern us all. If too many lack any hope of improving their condition, we could face dangerous upheaval in the near future.

Weekend Long Read

The Big Red Fake News Machine

Independent media outlets offer an important counterbalance to prevailing mainstream media now compromised by corporate conflicts of interest. But the “independent” label is often contrived as foreign interests and even foreign governments drive the agenda. The Real News Network is one example.

In the late 1990s, Latin America underwent a seismic shift away from its northern neighbor as a result of the domineering and interventionist policies of successive U.S. administrations dating back to the 19th century. This led to the 1998 election of Hugo Chávez Frias as president of Venezuela, and a chain reaction of similar governments of varying leftist ideological stringency coming to power in nations such as Nicaragua, Bolivia, Uruguay, Brazil, and Ecuador. 

The focus of Chávez’s Bolivarian revolution, first and foremost, was on creating a new bloc of anti-imperialist nations to end U.S. hegemony in the Americas. 

But although this was the stated primary ambition, the revolution itself was not limited to Latin America. A dormant far Left in the United States, briefly sidelined after the fall of the Berlin Wall and the Soviet Bloc, quickly found in this movement a new kindred spirit. They were ecstatic that Chávez was elected in “free and democratic” elections, effectively displacing the two established centrist parties COPEI and Democratic Action, and that he survived a CIA-sponsored coup in 2002 thanks to his genuine popularity among the poor. 

Conveniently omitted from this history is that in 1992, prior to his election, Chávez, then an army lieutenant colonel, and his colleagues twice attempted to overthrow the government of his predecessor Carlos Andrés Pérez, at the cost of almost 200 soldiers’ lives. Along with other middle-ranking military officers, Chávez had formed the revolutionary leftist MBR-200 group in the 1980s and conspired for years to overthrow the duopoly ruling his country, by force if necessary.

As seems always to be the case with socialism, real events that harm the reputation of the movement are either ignored or excused by true believers. 

The results of the Bolivarian revolution in Venezuela and its offshoots took more than 15 years to become fully apparent, emerging only after an oil glut wiped out the strategic advantage of the nation’s reserves. They are now obvious to all as an example of an unsustainable experiment that ignored the basic rules of supply and demand.

But that part of the story is well known. Now is the time to shed more light on the Canadian and American members of the anti-capitalist Left who not only supported Chávez, but in some cases became paid mouthpieces and accomplices in the scheme to export Bolivarianism to the land of the yanquis.

Charm City Apparatchiks

In 2007, as the rot and decay of the George W. Bush Administration began to manifest themselves, a Canadian filmmaker launched a project he hoped would apply his perspective to news reporting after years of toiling in obscurity.

Paul Jay had been a producer, director, and journalist at CTV, Canada’s largest private television network, as well as at the state-run CBC, and had produced several documentary films ranging from “Return to Kandahar,” about the war in Afghanistan, to “Hitman Hart: Wrestling with Shadows,” about professional wrestler Bret Hart.

While many in the media have left-leaning political sympathies, Jay’s leftist convictions run deep. His uncle, Ted Allan, had served with the Lincoln Brigades in the Spanish Civil War and became a renowned screenwriter. The Lincoln Brigades were made up primarily of anglophone Communists supporting the Republican government of Spain. Allan would later write the biography of Dr. Norman Bethune, a fellow Canadian volunteer in Spain, a surgeon who went on to serve the Communist Chinese before dying of sepsis in 1939. Allan was personally close to the Canadian Communist Party leader Tim Buck.

Jay’s new creation would be called The Real News Network (TRNN), and he located the company in Toronto. In 2013, TRNN moved its main headquarters to Baltimore and has made that city and its community a focus of its reporting ever since.

TRNN’s building at 231 Holliday Street, two blocks from City Hall, sold for $1.3 million in 2012. The Real News operates on several digital platforms, notably YouTube and Roku. Despite massive annual investments from its backers, The Real News remained a niche news source under Jay until his 2019 departure. It continues to operate as a nonprofit without accepting corporate advertisements, a structure that is similar to that of The Nation, In These Times and other digital left-wing publications. It has close to 400,000 YouTube subscribers and an unknown number of Roku viewers. 

Though these are respectable numbers for an independent creator, they would not begin to financially sustain a full network with both on-screen and production staff.

While claiming to promote real journalism, The Real News Network—perhaps to its credit—quite openly draws its presenters and reporters almost exclusively from members of the far-left activist community. 

For example, former Black Panther Eddie Conway, whose conviction for the murder of a Baltimore police officer was overturned in 2014 due to “irregularities,” hosts a series on the network called “Rattling the Bars,” concerning criminal justice issues. Like many TRNN shows, this one is more of an editorial presentation, made worse by the fact that Conway is clearly not confident in front of the camera after more than 40 years in prison. In one episode, Conway clumsily confronted, to no good effect, a group of Donald Trump supporters led by Scott Presler cleaning streets in West Baltimore.

Local and Global

While some of TRNN’s reporting does focus on the decay and corruption of Baltimore, most of the network’s efforts have been spent highlighting issues that are completely disconnected from the city. 

A large proportion of its stories are hostile to the governments of Brazil, Israel, and India. As Jay himself is an anti-Zionist Jew, he is deeply linked to various pro-Palestine movements and media figures such as Grayzone’s Max Blumenthal, the son of former Hillary Clinton advisor Sidney Blumenthal and an obsessively hostile critic of Israel.

Similarly, TRNN’s main commentator on India is Vijay Prashad, a Marxist at Trinity College in Connecticut who runs the Tricontinental Institute, an organization that claims to support the “Non-Aligned Movement” of anti-imperialist states that met in Havana in 1966 at the Tricontinental Conference. The original Tricontinental continues to function in Cuba as the Organization of Solidarity with the People of Asia, Africa and Latin America (OSPAAAL), the vehicle for exporting its revolution abroad, and Prashad’s group openly admits to collaborating with this organization. Paul Jay’s second-in-command was Sharmini Peries, a Sri Lankan-Canadian with views similar to Prashad’s. At times it seems the only competition that TRNN engaged in was whether to hate Narendra Modi or Benjamin Netanyahu more. 

But Peries and others have a connection that is much stronger than either Palestine or India—Venezuela. 

At one point, Peries was a direct advisor on economics and trade to Hugo Chávez. The overlap between TRNN and Venezuela’s state-supported media is so blatant that TRNN almost functions as a branch office with minor local variations. And while Americans as a whole have had to deal with crippling social media censorship thanks to media-generated moral panics, this unabashed fifth column has operated unhindered and without general public acknowledgment.

Thanks to the freedoms enshrined in our Bill of Rights, these activists masquerade as reporters, play the part of martyrs, and live out the youthful fantasies emblazoned on their sweatshop-made Che shirts. The Real News Network and a network of related groups provide them that stage.

Guerrilla TV Superstars, Uploaded from Caracas

The progressive media landscape is littered with the shards of the shattered leftist chandelier that never seemed to illuminate much to begin with. Far-left politics is notoriously schismatic, especially when it comes to applying the theories of Karl Marx and other socialists who died in the 19th century and have been interpreted in many different ways. Many progressives are apologists for Communism while denying that they are believers, such as Cenk Uygur of “The Young Turks,” who gives mealy-mouthed defenses of capitalism but in substance supports a corporate nanny state. 

But The Real News and its affiliated organizations rarely allow such ambiguities to stand. On the ideological spectrum, TRNN occupies a space somewhere between “democratic” socialism and revolutionary Marxism-Leninism. The main dispute between these two schools of thought concerns whether violent revolution is necessary in order to eliminate class divisions.

Portions of TRNN’s programming have been sourced directly—and often unedited—from TeleSUR English, a television station owned by the Venezuelan, Cuban, and Nicaraguan governments. Founded in 2005, TeleSUR is the mouthpiece of Hugo Chávez’s successor Nicolás Maduro and the nation’s ruling United Socialist Party of Venezuela (PSUV).

During the aborted 2002 coup against Chávez, he and his supporters recognized that the commercial media in Venezuela were blatantly sympathetic to the military and corporate interests intent on ousting him. As a result, they formed not only TeleSUR but also international blocs such as the Bolivarian Alliance for the Peoples of Our America (ALBA) with Cuban and Caribbean allies and the Unasur transnational union in South America, and later brought Venezuela into the Mercosur economic union.

But all of these entities, including TeleSUR, are political instruments of Chavista governments and political parties. Contributing nations in TeleSUR at one time included Argentina and Ecuador, which pulled out after changes of government and a souring of relations with Venezuela. In 2018, Daniela Vielman, one of its Spanish-language anchors, resigned in a huff and described the mistreatment, extortion, and political coercion that she and other staff had endured while covering Venezuela’s many political crises and civil disturbances.

Many personnel from TRNN also have been prominent commentators on TeleSUR:

  • Tariq Ali has been featured on The Real News to discuss topics relating to Britain, Pakistan, and the Iraq War. He also hosted the TeleSUR program “The World Today”; in a 2015 episode, he heralded a bright red future for the UK under his close friend and colleague Jeremy Corbyn.
  • Abby Martin arrived at TeleSUR after a falling out with Russia’s RT America over the Ukraine crisis. Prior to that, she was a San Diego-area organizer for 9/11 Truth and supported the controlled-demolition theory, but she later distanced herself from that community and today rarely speaks on the topic. She hosted “Empire Files” on TeleSUR until 2018, when the station could no longer fund it; the show now produces new episodes using a donation model. TRNN has featured full episodes of the program, such as a 2016 episode with Ecuador’s Marxist foreign minister Guillaume Long; its credits list two other TRNN staff, Oscar León and executive producer Paul Jay.

Besides TeleSUR personnel, TRNN also employs or publishes content by other Maduro government apologists:

  • Gregory Wilpert is the founder of Venezuelanalysis.com, a news website dedicated to rationalizing Venezuela’s economic meltdown as the result of sanctions. His wife is a senior Venezuelan diplomat and former ambassador to Ecuador.
  • Lucas Koerner writes for Wilpert’s website and functions as a media critic in defense of the Maduro regime. Like Paul Jay, he is an anti-Zionist Jew and was once arrested for attempting to disrupt the Jerusalem Day parade. 

Those last two, in particular, illustrate the depths to which TRNN will sink in order to defend Venezuela in the name of being “anti-war.” Both Koerner and Wilpert have written for Fairness and Accuracy in Reporting (FAIR), a leftist media watchdog that is blatantly sympathetic to the Chavista government. 

In 2006, Wilpert authored a story disputing the notion that Chávez was corrupt, claiming that the metrics used by Transparency International and other corruption monitors were skewed by public opinion. As it turns out, the late president’s daughter, María Gabriela Chávez, became a multi-billionaire and the richest woman in the country while serving as its alternate ambassador to the United Nations, with most of her assets held in American and Andorran banks.

Koerner is an even more dedicated supporter of Maduro. In 2019, Gabriel Hetland, a professor of Latin American studies at the University at Albany, published an article for the North American Congress on Latin America (NACLA) and Jacobin criticizing the Maduro government for its incompetence and abuses. In the article, Hetland placed three values above all others: “non-interventionism, self-determination, and solidarity with the oppressed.” NACLA and Jacobin are both staunchly left-wing socialist organizations and media outlets, and Hetland himself had appeared with Peries on TRNN in 2016, downplaying the severity of the humanitarian disaster in Venezuela. Joining him was Koerner’s colleague Rachael Boothroyd, also of TeleSUR and Venezuelanalysis.com.

Nevertheless, Koerner wrote a response a year later for FAIR claiming that Hetland’s piece was part of a campaign to legitimize regime change in Venezuela, despite Hetland’s explicit statements to the contrary. Hetland responded by pointing out that he supported neither regime change in Venezuela nor the November 2019 overthrow of its allied government in Bolivia. In the exchange FAIR published, Koerner castigated Hetland for not offering “solidarity” and an “unqualified defense” of deposed Bolivian leader Evo Morales.

The infighting suggests a growing rift on the anti-capitalist left between those who still champion the Maduro government and those who no longer do. Unfortunately, it is difficult to gauge which of these players is being sincere and which is abandoning a sinking ship out of self-interest.

Party Like There’s No 1989

The Real News Network also shares TeleSUR’s coverage of historical topics, not just current events. It is apparent from the network’s choice of programming that it’s a proponent of ideological dogma and orthodox Cold War-era Marxism. 

In 2017, on the 100th anniversary of the October Revolution that brought the Bolsheviks to power, Abby Martin interviewed Brian Becker in a very flattering retrospective of that monumental event. Without addressing the Red Terror, the famines, or the reconquest of the Caucasus and Central Asia that subjugated many non-Russian minorities, the two created the impression that this bloody introduction to tyranny in the name of progress was a necessary speed bump on the road to their goal.

Becker hailed the creation of the Soviet of Nationalities, a legislative body that supposedly gave all Soviet peoples representation in government. In practice, both it and the other chamber, the Soviet of the Union, were rubber-stamp parliaments for Communist Party organs such as the Politburo and Central Committee. Also omitted was the fact that Becker is the founder and leader of the Party for Socialism and Liberation (PSL), a pro-Venezuela Marxist-Leninist group, and the head of the ANSWER Coalition. Like Martin, who once hosted a show on RT America, Becker has hosted a program on the Russian-sponsored Sputnik Radio, formerly known as RIA Novosti, since 2015.

Paul Jay’s own TRNN program, “Reality Asserts Itself,” has often featured episodes discussing the virtues of Marxism and Soviet Communism. A 2018 episode was devoted to Marx’s 200th birthday; another from that year discussed the present-day relevance of Communism with the Russian anti-revisionist Marxist Alexander Buzgalin.

Most TRNN segments on Venezuela have denied that the Chávez and Maduro governments mismanaged the country, removed all constitutional protections, and created a totally petroleum-dependent economy. When discussing the Venezuelan crisis in 2019 in the context of whether socialism is a failure, Jay hedged by claiming that foreign sanctions had crippled the Chávez-Maduro experiment. In so doing, he made several false statements, such as that Venezuela was not an industrialized country. In fact, for many decades it was one of the richest and most developed nations in the Western Hemisphere.

Is the support for this failed state merely ideological, now that the ties to Caracas no longer appear to be financial? How can a network dedicated to ending the fossil fuel economy and Wall Street plutocracy owe so much of its financing to both?

In the early 1990s, as a younger documentary filmmaker, Jay produced “Albanian Journey” in the midst of Albania’s transition to democracy. While papering over the crimes and oppression of Enver Hoxha’s Communist regime, which clung to Stalinism even after China abandoned it following Mao Zedong’s death in 1976, he did portray the decay and downfall of socialism in the Mediterranean state. It seems as if The Real News Network and its embrace of Bolivarian socialism were an attempt by Paul Jay to hold on to the new hope of the “21st Century Socialism” that Chávez promised. As with the last century’s version, however, its Western supporters are content to enjoy the comparative luxuries of a decadent capitalist economy as it falls apart.

Caracas, Venezuela, November 14, 2019: A protester shouts insults at riot police during demonstrations, after university students answered the Venezuelan opposition’s call to remain in the streets indefinitely. (Roman Camacho/SOPA Images/LightRocket via Getty Images)

Green for the Screen

The great irony of The Real News Network’s fawning coverage of Venezuela, a petrostate now economically beholden to its creditors, is the network’s simultaneous push to replace fossil fuels and to ban extraction of oil and natural gas in any way possible.

Throughout its history, TRNN has promoted “climate justice,” calling it an imperative as early as 2011. During the 2020 Democratic primary season, one of its correspondents asked at the woke Netroots Nation summit whether candidates’ climate plans were “climate justice plans.”

Mere hypocrisy is too common to be the only reason to highlight this contradiction; what TRNN is actually doing is much worse. It is using a manufactured global crisis to support the economic interests of a hostile foreign government. While the domestic U.S. oil industry is naturally subject to hazards such as spills and air pollution, in that respect it is no different from other oil producers like Saudi Arabia, Iran, Russia, Norway, or Venezuela. The fracking boom of the 2010s and the further expansion of offshore drilling under President Donald Trump, however, have created downward pressure on oil prices and hurt the bottom line of every oil-dependent economy—including Venezuela’s, which for decades has sold its gasoline in the United States through the stations of its subsidiary Citgo.

By advocating for policies such as the Green New Deal, TRNN seeks the economic equivalent of unilateral nuclear disarmament. The United States would abandon fossil fuel extraction prematurely and would then have to be supplied by states like Venezuela once it became evident that renewables cannot meet the energy needs of American citizens. This has already occurred in Germany, which was once the vanguard of green technology and is now attempting to compensate with a natural gas pipeline from Russia.

The red-green alliance around climate change is grounded in fear and guilt, not science. These are not revolutionaries with microphones fighting Wall Street; the money trail shows them to be thoroughly conflicted, hiding their dirty laundry behind red flags.

While progressive media have embraced several agendas over the decades to buttress their economic and social vision, none has greater potential than climate change to shape the future of society.

The benefit of focusing on this area is that target audiences beyond the far Left are reluctant to challenge a movement claiming to derive its legitimacy from the scientific community, the United Nations, and Hollywood. It is a common refrain that the economy’s dependence on fossil fuels will doom humanity and the planet at large. But behind this agenda lies a serious paradox that is difficult for activist journalists to get around: oil and natural gas are so ingrained in the economy that those pushing the renewable energy agenda are typically themselves backed by, or invested in, fossil fuels.

In 2019, presidential hopeful Beto O’Rourke found this out fast when his otherwise green-friendly campaign was attacked by the Sunrise Movement for having an insufficiently aggressive climate change plan and for accepting contributions from fossil-fuel company executives. 

Progressives deem as heresy anything short of a full embrace of the panic-driven climate change movement—and TRNN does its best to stay near the tip of the spear on the subject. 

After the first Democratic debate in June 2019, TRNN talking heads griped that only 15 minutes had been dedicated to climate change. In August 2019, correspondent Dharna Noor interviewed a plaintiff in Juliana v. United States, a landmark class action that attempted to hold the federal government liable to young Americans for its alleged “inaction” on climate change. (The Ninth U.S. Circuit Court of Appeals dismissed the case in January.) More recently, TRNN news host Marc Steiner has attempted to tie the coronavirus outbreak to climate change.

Fossil Fuel Skeletons in the Vault

Cloaked in a façade of caring for the future of the planet and opposing Wall Street, TRNN hides its own reservoir of past fossil fuel interests. Its leaders are not merely ideologically committed to supporting the Venezuelan government; until 2018, the organization, through its funding apparatus, had financial ties to the regime and its oil conglomerate.

Its editorial line remains sympathetic to the regime, and the details of Jay’s and Peries’ departures remain undisclosed. The network’s best-known investigative reporter, Aaron Maté, one of the central Russiagate skeptics, left in 2018 under similarly mysterious circumstances; since then he has appeared more often on China’s CGTN America network. Much of how and why TRNN came about is known only to these insiders, but, at least in the group’s media materials, it remains staunchly supportive of the Maduro government.

Puzzle pieces relating to its background can be gathered from TRNN videos as well as tax disclosures. According to a now-deleted 2014 video, the organization’s Baltimore headquarters was purchased by the “small family” Quitiplas Foundation as part of a $3 million investment. According to real estate records, the property at 231 N. Holliday Street was purchased for $1.3 million in 2012, and in TRNN’s 2015 IRS form 990 disclosure the property was still listed as belonging to Quitiplas. The foundation has no website, and like many corporations and organizations that want to avoid scrutiny, it is incorporated in Delaware. Its name comes from the quitiplas, a percussion instrument originating in Venezuela.

An organization called “Son of Quitiplas,” registered under Paul Jay’s name and listing the same property as its address, is currently “not in good standing” according to the Maryland Department of Assessments and Taxation. It is unclear whether this organization is directly linked to Quitiplas, but the status suggests Jay’s entity has been dissolved.

Delving into the foundation’s 2015 list of grantees, one finds some familiar names:

  • Fairness and Accuracy in Reporting received $5,000.
  • The North American Congress on Latin America received $10,000. 
  • Dwarfing all of the others was “Independent World Television,” which received $650,000.

Independent World Television is the legal name of The Real News Network. IWT’s own 2015 form 990 listed Jay as the corporation’s CEO, along with Hollywood actor and Hugo Chávez supporter Danny Glover as a member of its board of directors (a board that overlaps significantly with that of Quitiplas). Glover co-wrote a eulogy for Chávez in 2014 with fellow board member James Early, an assistant provost and author at the Smithsonian Institution.

Another board member, Thomas M. Scruggs, is listed as the president of Quitiplas. Scruggs is by background an ethnomusicologist (one who studies the folk music of different cultures) and has taught at a number of institutions, including the University of Iowa and Florida International University. From 2004 to 2006, however, he was a guest teacher at the University of the Andes in Mérida, Venezuela, on a Fulbright fellowship. His best-known work is a foreword to a pamphlet honoring the musician Víctor Jara, a Chilean Communist and supporter of President Salvador Allende; both men died at the hands of General Augusto Pinochet’s regime.

In 2012, Scruggs was featured by TRNN on a report about elections in Venezuela from his home in Berkeley, California. Scruggs continues to serve on the boards of both TRNN and Quitiplas. Another board member for both organizations, Dmitri Lascaris of Montreal, is a pro-Palestine activist and current candidate for leader of the Green Party of Canada.

The tax disclosures also show that Quitiplas held financial assets directly tied to the Venezuelan and Argentine governments, as well as massive oil and gas assets. In 2013, it had over $800,000 invested in 5 percent bonds of Petróleos de Venezuela (PDVSA; pronounced Pe-de-ve-sa), the state-owned oil monopoly of Venezuela. On top of that, the foundation owned almost $3 million in 8.75 percent Argentine bonds; Argentina would default on its debt in 2014. In 2016, TRNN’s Greg Wilpert condemned new Argentine President Mauricio Macri for paying out the vulture funds that had held these very same bonds.

Quitiplas also has investments in SandRidge Mississippian Trust, an American fund that holds royalties on oil and gas properties in states like Oklahoma and Kansas. Quitiplas’ investment in Venezuelan and Argentine public debt dates all the way back to 2008, the second year of its existence. That year it also funded IWT with $100,000 and the pro-Venezuela think tank Center for Economic and Policy Research with $500,000. CEPR’s co-director Mark Weisbrot famously wrote in The Guardian in 2013 that Venezuela’s economy was not “the Greece of Latin America.” He also co-wrote the screenplay for “South of the Border,” Oliver Stone’s fawning 2009 documentary about Chávez and the Latin American “Pink Tide.”

Opposition demonstrators and riot police clash during a protest against President Nicolás Maduro in Caracas on July 6, 2017. The political and economic crisis in the oil-producing country spawned often violent demonstrations demanding Maduro’s resignation and new elections; the unrest left 91 people dead after April 1. (Juan Barreto/AFP via Getty Images)

Reality Re-Asserts Itself

As of 2018, the last year for which filings are available, it appears that the sovereign debt investments such as the PDVSA bonds had been sold off. Yet Quitiplas continued to hold investments of tens or hundreds of thousands of dollars in Facebook, Amazon, Alphabet (parent company of Google), Apple, discount store chain Dollar General, and defense contractor Honeywell.

Another investment Quitiplas has consistently held is in the hedge funds Alphakeys Millennium I and III, despite TRNN’s condemnation of hedge funds for capitalizing on Puerto Rico’s public debt and for hindering medical innovation.

Of course, TRNN portrays Wall Street as a whole as a malevolent force. Jay claimed in a 2018 video that “the billionaire class is not fit to rule.” Also in 2018, TRNN reporter Marc Steiner interviewed Matt Taibbi about the restrictions that Google and Facebook were putting on the privacy and freedom of users while simultaneously calling the recently unpersoned Alex Jones “vile.” Perhaps Steiner was unaware of where his employer’s finances originated. 

Did the dumping of PDVSA and other state-owned assets occur to square the organization’s finances with its philosophy, or was that merely a reaction to the declining profitability of the bond market in Latin America and the oil glut of the mid-2010s? 

The most likely explanation is that the foundation dumped the assets to comply with the 2017 sanctions on Venezuela, which applied to PDVSA-related debt holdings specifically. Unlike other years, the organization’s 2016 form 990 disclosure does not itemize any investment holdings, and the 2014 disclosure is completely unavailable.

A Dim Future?

The editorial line of The Real News Network has not changed with the unexplained departures of Jay and Peries, and it consistently portrays Venezuela as the victim of oppressive U.S. sanctions. Like its TeleSUR partner, TRNN sees the United States as harboring designs for regime change in Venezuela and worldwide; unlike TeleSUR’s, however, its content is not labeled on YouTube as funded by the Venezuelan government.

Today, more than 22 years after Chávez rose to power, the pretension of supporting his successor Maduro in the name of fighting U.S. imperialism is ludicrous. Last year, Venezuela’s sovereign debt rose to $156 billion, some of it held not by hedge funds but by allies such as China and Russia. With worldwide crude oil prices falling, the country faces new economic pressure unless China agrees to restructure Venezuelan debts yet again.

In March, the U.S. Justice Department indicted Maduro and several of his most senior officials for narcoterrorism, charging them with leading the “Cartel of the Suns” in trafficking cocaine to the United States by air and maritime routes. The investigation began in 2015 thanks to careless talk by the “narcosobrinos,” two nephews of Maduro’s wife Cilia Flores, who boasted of their state support to a fellow drug trafficker who was in fact a DEA informant.

While TRNN continues to cheer on the Venezuelan communes, regime figures have looted the country and squirreled away assets overseas. As millions of their countrymen starve, the boliburgueses (Bolivarian bourgeoisie) like Alejandro Betancourt benefit from state contracts with PDVSA while living among the gringo imperialists in Miami. 

Far from promoting a brighter future and a more equal world, PDVSA is today a basket case run by Major General Manuel Quevedo. In 2018, the oil company quelled riots at its own cafeteria using national guardsmen. Citgo, its U.S. subsidiary, was found guilty in 2007 of violating the Clean Air Act in a U.S. federal court case. In 2019, the U.S. government legally prohibited PDVSA from profiting from Citgo, and most recently the Supreme Court ruled it liable for a 2004 oil spill. Meanwhile, the U.N. High Commissioner for Refugees reports that 4.5 million Venezuelans are currently refugees or migrants abroad.

As TRNN attempts to survive in a difficult digital media market, questions remain unanswered concerning its relationship with the Venezuelan government. Who donated the seed money through Quitiplas to create TRNN in 2008 and to buy the Baltimore building five years later? What are the sources of the foundation’s finances since then, and do they include foreign state funds? How formal was the partnership between TRNN and TeleSUR? And finally, what triggered the replacement of Paul Jay and Sharmini Peries, neither of whom has made any public statement since? These are riddles that Venezuelans living abroad, Baltimore residents, and even TRNN’s own confused viewers may find fascinating and troubling.

Weekend Long Read

An excerpt from “Progressivism: The Strange History of a Radical Idea,” by Bradley C.S. Watson (University of Notre Dame Press, 260 pages, $45).

The Revolt Against the American Order

Progressive theorists, statesmen, and theologians embraced a notion that material and spiritual fulfillment can be found in and through the good graces of the state. It represented, in theory and practice, a stunning transformation of American politics, morality, and constitutionalism.

Common experience, and modern psychology, validate the truism that people tend to see what they are looking for. In the professional realm, confirmation bias—that is, the tendency of investigators to seek and elevate that which confirms their preexisting hypotheses—is likely to constrain the gaze of even the most determined and experienced souls, and perhaps especially the most determined and experienced. Déformation professionnelle, as the French call it, is a condition that can afflict only the well trained, or at least the long inured.

Economists, meanwhile, use the phrase regulatory capture to describe the observable phenomenon of knowledgeable groups with concentrated interests swaying or “capturing” the determinations of regulators who are supposed to act impartially and for the public good. The public’s interest, alas, is dispersed. A captured agency might well be more harmful to the public good than no agency at all. Its influence can be pernicious and can go largely unnoticed by everyone except the very few in the know.

Professional academics, nominally dedicated to objectivity, have not proved immune to deformation, or outright capture by professional interests, in their efforts to regulate the ebb and flow of respectable opinion. The American academy, long enjoying various forms of insulation and privilege, is uniquely positioned to generate moral hazard in the realm of ideas. A case in point is the idea of progressivism as it was transmitted by American academics, especially historians, from the middle part of the 20th century onward. The progressive idea, simply put, is that the principled American constitutionalism of fixed natural rights and limited and dispersed powers must be overturned and replaced by an organic, evolutionary model of the Constitution that facilitates the authority of experts dedicated to the expansion of the public sphere and political control, especially at the national level.

By the middle part of the 20th century, historians were reporting that progressivism had never existed. By so doing, they certainly could not be accused of exaggerating its death. In 1971, Peter Filene of the University of North Carolina wrote an obituary for progressivism and for attempts to chronicle a phantom. It was as if scholarly ghostbusters for decades had carefully planted their cameras in the countless rooms of the haunted mansion of American history, only to come up with nothing—or at least nothing clearly identifiable as progressivism once the videotapes had finally been scrutinized by more dispassionate, technically adept observers.

Despite the intentions of scholars to airbrush progressivism from American history, the progressive idea seemed real enough to those who first expounded and developed it. As a recent observer notes, “No one at the time thought Progressivism so various and contradictory as to be meaningless, much less nonexistent, though its adherents battled furiously over its political agenda.” Furthermore, each of the three main presidential candidates in the election of 1912 claimed the label.

The reality of American Progressivism comes into view only in relation to what it rebelled against, which was nothing less than the American constitutional order and especially the political philosophy on which it rested.

The Real Presence of Christ 

As progressives mobilized intellectually and politically around the inadequacies and injustices of the founders’ Constitution and the modern economic order, they did so with a fervor for, and faith in, the social sciences, which they thought could remedy injustice. The intensity of their fervor and faith can be traced to the influence of religion.

At the dawn of the Progressive Era, American Christianity still buttressed the constitutional order by linking human fallenness to the need for political moderation, individual rights and responsibilities, and limited government, which in turn reflected what historian Johnathan O’Neill refers to as “the long-established view that maintenance of a political regime involves ideas and sensibilities associated most readily in the Western tradition with religion.” Scholars have also shown that this view of religion and morality, pointing to fidelity to a Constitution embodying immutable truths, informed the thinking and constitutional interpretations of pre-progressive Supreme Court justices. So for the progressives, regime change necessarily meant religious change, and vice versa. Christian progressives held that a new era had dawned, based on a new conception of religious obligation. A reconstituted worldly Christianity called for the expansion of the state in the name of moral and theological progress.

This reconstitution accounted for the zeal of many progressives, confident as they were not only of the direction of history but of their own rectitude. As Christian progressives directed their minds to what they saw as the new problems confronting America, they exhibited various degrees of millenarianism, which accounted for the power of their thought and its ability to capture the hearts and minds of a growing cadre of true believers. 

Throughout the Progressive Era, religious language was common at political gatherings at the local, state, and national levels, including even national conventions. But the fervor of Christian progressivism was unlike that of prior American religious awakenings. Instead of concentrating on individual moral failings and the special need for individual reformation, Christian progressives concentrated their gaze almost exclusively on matters of social and economic justice. 

By the first decades of the 20th century, both Protestant social gospelers and Catholic reformers were vigorously attempting to shift the center of gravity of mainline Christianity toward applying what they claimed to be true Christian ethics in the here and now. It was clear that they understood their project to be both radical and political, and a very sharp break from the Christianity of their fathers. According to the scholar Ernst Breisach, they “prided themselves on having freed Christianity from the shackles of the past—asceticism, dogmatism, and ceremonialism—and on having transformed it into a message befitting the future—brotherly love in a truly democratic society.” For these progressives, Christian churches placed too great an emphasis on the salvation of souls and the life of the world to come. The real presence of Christ came to take on whole new meaning.

Historians of progressivism have occasionally observed this phenomenon but have been divided on its origins and significance. Some have noted that, along with more purely economic notions like “antimonopolism” and “efficiency,” the language of “social bonds” ran through most strains of progressivism and was juxtaposed against homo economicus, and especially the notion of man as the autonomous wielder of property rights. Scholar Daniel T. Rodgers notes this was the language “most tightly attached to the churches and the university lecture halls. Its roots stretched toward Germany and, still more importantly, toward the social gospel. When progressives talked of society and solidarity the rhetoric they drew upon was, above all, the rhetoric of socialized Protestantism.” Richard Hofstadter goes so far as to trace the roots of progressivism to Protestant guilt and the need to atone:

In evangelical Protestantism the individual is expected to bear almost the full burden of the conversion and salvation of his soul. What his church provides him with, so far as this goal is concerned, is an instrument of exhortation. In Catholicism, by contrast, as in some other churches, the mediating role of the Church itself is of far greater importance and the responsibility of the individual is not keyed up to quite the same pitch. A working mechanism for the disposal and psychic mastery of guilt is available to Roman Catholics in the form of confession and penance. If this difference is translated into political terms, the moral animus of Progressivism can be better understood.

But such psychological and theological reductionism cannot adequately account for what Protestant progressives claimed was the essentially social and political nature of the Christian enterprise, or for the strains of progressivism that animated leading Catholic thinkers—including, for example, Fr. John Ryan. 

In A Living Wage, Ryan, like his Protestant counterparts, sought human solidarity and heavenly justice through economic policy. And in this quest, he sought to turn Catholicism—as the social gospel movement had turned Protestantism—against the American system of constitutionally limited government, private property, and capitalism, in the search for a more rational scientific state that would support nothing less than the Kingdom of God on earth.

The roots of the modern administrative state thus run deep in the soil of Christian progressivism. But one might go further and argue that religious reformers drew on notions of moral duty running from Aristotle through the medieval Catholic intellectual tradition, albeit often infused with an anti-prudential Kantian moralism. And as a practical matter, Protestant progressives allied with both Catholics and Jews, whose understandings of law and morality antedated modernity. While rejecting the natural rights tradition of the American founders, religious progressives—unlike their secular confreres—at least formally asserted versions of a natural moral order, and even natural rights, which purported to be timeless. They were not willing to reduce “nature” merely to physical or biological laws.

In short, one needs to take religion more seriously than many historians have been prepared to do. The centrality of serious and wide-ranging religious sentiment to progressive ideology should not be underestimated. Christian progressives joined forces with economists such as Richard T. Ely and political scientists like Woodrow Wilson against what they claimed were the new economic and social realities that had been fully unleashed by the modern industrial age. They generally glossed over, and sometimes deliberately understated, the fundamentally anti-constitutional character of their arguments and the reforms to which they pointed. Secular and Christian progressive thinkers together pressed for an expansion of state power, and especially national state power, at the expense of constitutional limits. And in the case of the theologians, it was also at the expense of the sacred, even as the essential revelations and rituals of Christianity were of vital importance to them. Theirs was a natural law that did not limit government in principle but rather vouchsafed its protean expansion as it simultaneously reduced Christian faith to a set of economic and political demands.

From a contemporary perspective, it seems ironic that social Christianity of both the Protestant and Catholic varieties helped lay the foundations for the modern administrative state, as nowadays religious faith is frequently associated with political conservatism and opposition to progressive goals. But it was not always so. And to the extent that a secularized millenarianism is evident in the rhetoric of contemporary liberalism, it can trace its origins to the rather insistent piety of the early progressive religious thinkers.

Richard T. Ely on the Border Land

In the thought of Richard Ely—Progressive economist and expounder of the social gospel at the end of the 19th century—one can find a compact explication of the overlapping intuitions and arguments that the new breed of social scientists shared with Christian theologians. Ely was a professor of political economy first at Johns Hopkins—the institution that most channeled German Hegelian understandings onto American intellectual shores—and then at Wisconsin, which would become a bastion of progressive thinking throughout the 20th century. 

Along with his intellectual antagonist William Graham Sumner, Ely was arguably the most influential economist of his age, laying the intellectual groundwork for, and anticipating the reforms of, both the Progressive Era and the New Deal. But it was in the views of Ely the armchair theologian that the era—if not the century—that he foreshadowed was most comprehensively limned. 

Not only did he decisively influence both the social gospel of Walter Rauschenbusch and the Catholic social thought of Ryan, he also served as practical exemplar and theoretical explicator of the power of faith to move social science, as well as the obligation of faith-based social science to move the levers of power. As the 20th century wore on, the faith that animated social science and justified governmental power shifted from its roots in Christianity to a fully secular millenarianism. But the leap was perhaps not that great once American elites had fully internalized the worldliness of Ely’s version of Christianity—what he called Christianity’s inherently “manward” side.

Ely concentrates on the ethical obligations of Christians in the industrial age. He makes clear that his writings deal with the “border land” where theology, ethics, and economics meet. He claims that only Christianity can provide the Archimedean point on which a proper political and economic ordering can rest. Christianity—albeit with a new worldly emphasis—provides immeasurable advantage to those committed to social change. It is the most powerful social force known to man; it need only be harnessed and directed toward its proper end.

Ely recognizes that modern social science cannot provide answers to normative questions, and he claims it leaves “too much in the air” to give progressive thinkers a firm or confident motive for their reformist ambitions. Enter Christianity, which Ely claims is unique among religions in the nature and extent of the civic and secular obligations it imposes. Remarking on Matthew 22:34–40, where Christ reduces the law to loving the Lord and loving thy neighbor as thyself, Ely says no merely human teacher would place the duty to man on an equal plane with the duty to God. He claims that such a juxtaposition of duties exists in no other religious system. Personal salvation is not the end of religion, though it is the beginning: only when the individual Christian is in right relation with God can he get on with the ultimate task of being in right relation to his fellows. Christianity alone provides a stable ground for humanitarianism.

The history of ethics, according to Ely, confirms the view that Christianity is unique: classical philosophers did not know of benevolence. Benevolence, for Christians, is a form of divine service, and piety is identified with pity. Prior to the Reformation, this fact was obscured, and the separation of “right life” from religion was a scandal to the church. 

“Some have gone so far as to make salvation consist in ceremonies, obedience to the dictates of priestcraft, in some sort of magic, or in a feeling of the emotional nature . . . even in intellectual assent to a species of metaphysics. What have all these things to do with conduct?” But Ely argues there is much work still to be done, and much that Protestants can learn from Catholics, for the Church of Rome provides the greatest opportunities for renunciation and sacrifice of the self, thus overcoming one of the errors of Protestantism.

Ely elsewhere notes with regret that modern hymns are almost exclusively oriented to individual rather than social salvation. So different are they from the Psalms, which tend to be “social and national” and don’t “contain an I or me except when the words are put into the mouth of the Lord.” The hymns thereby deny or downplay our common humanity, united in God, of which we are reminded by the visible witness of Baptism. 

Likewise, the Lord’s Supper, though it draws us to heaven, reminds us of the “manward” side of Christianity in the food and drink—bread and wine—that so sublimely express human fraternity. And yet even this sacrament is degraded by the use of individual communion cups. Ely asks, “Is our earthly life so precious that it must be so saved at all hazards?”

The rituals and revelations of Christianity point to our unity and interdependence in the tribulations of this world. All of Christ’s words must be read in light of the doctrine of “social solidarity,” which makes us all responsible for the sin and suffering of our fellow men. An entire city is guilty of a murder that occurs in one of its slums. This is a truth confirmed by social science, which can show us the determining power of heredity and environment. We develop true “individuality” only by bringing ourselves into harmony “with the laws of social solidarity.”

Christ separated good men from bad on the basis of their respective performance of “social duties,” which makes true Christianity unique in the extent to which man serves God by serving man. Other religions tell men they may serve God by injuring their fellows. Christianity, by contrast, exalts man. Through his second great commandment to love your neighbor as yourself, Christ introduces sociology, or the science of society, to the world. It is therefore incumbent on the church to embrace research in social science; her failure to do so has encouraged communism to become infidel, and socialism to become materialistic rather than spiritual.

Ely goes so far as to suggest that half the time spent in theological seminaries—which should be the intellectual centers of sociology—should be devoted to social science education. While social science cannot point to ends, it can provide the means to achieve them. As political scientist Luigi Bradizza argues, Ely’s social science “becomes practical Christianity,” and its confident pursuit is an implicit rejection of the inherent imperfection of this world. But the 20th century would provide ample evidence that social science, on Ely’s terms, could not long serve Christianity. The table would soon be turned, and Christianity swept from it.

For Ely, the welfare of man is the point of the “most fundamental laws” of the church, and social utility is their test. There is, Ely insists, only one law taught by Christianity on its “manward” side: that is the law of love, which finds expression through social service and its test in social welfare. “Christianity and ethical science agree perfectly.” Ely tends to ignore biblical passages that cut against a worldly Christianity, or at least he glosses them to support it.

In a statement that is characteristic of the progressive mind, Ely expresses profound confidence in the power and utility of expertise, so long as it is wielded by the right sort of people. “Philanthropy,” he claims, “must be grounded in profound sociological studies. Otherwise, so complex is modern society that in our efforts to help man, we might only injure him. Not all are capable of research in sociology, but the church should call to her service in this field the greatest intellects of the age.” The purpose of the American Economic Association, of which Ely was a founder, is nothing less than “to study seriously the second of the two great commandments on which hang all the law and the prophets, in all its ramifications, and thus to bring science to the aid of Christianity.” Ely goes on to express something approaching bewilderment that not one in ten Christians would contribute to the association (a fact of which he had personal knowledge as its secretary).

Because of the indifference of Christians to the second great commandment, wage workers feel increasingly alienated from the church, Ely notes with regret. This is destined to be, so long as the church fails to understand its true mission and fails to see that “nearly everything in the words of Christ applies to the present life.” Ely makes many arresting claims, but perhaps none more than this: “Christianity is primarily concerned with this world, and it is the mission of Christianity to bring to pass here a kingdom of righteousness and to rescue from the evil one and redeem all our social relations.” 

At the end of the 19th century, Ely was pointing to the similarities between early Christianity and socialism. Each appealed mainly to the masses, grew rapidly, and had an international, cosmopolitan character. Each demanded universal dominion, neither had been slowed by persecution, and each commanded of its adherents a religious devotion. Ely claims that socialism is attractive not for its materialism but for its ethical ideals, which parallel Christianity’s, and which inspire “fiery zeal” for labor and sacrifice on behalf of the masses. While the influence of the Bible on the average Christian has waned over the centuries, socialism retains a power to guide the lives of its followers similar to that of early Christianity. In fact, socialism is Christianity for the modern age insofar as it promises to realize the brotherhood of man by creating a social system dedicated to the maxim “One for all, all for one.” 

Christian socialism arises out of the belief that Christianity must be real and vital, applying in the marketplace as well as the pews, and recognizing the fact of social solidarity: that all interests are intertwined and that the prosperity of any one depends on the prosperity of all. 

In this context, it must be understood that private property is a useful and exclusive right but never an absolute one, for property has a “social side.” Individual claims, essential to “thrift and industry,” must nevertheless give way to social claims on the understanding that all property is a trust to be administered in accordance with the will of God. The land of Israel was not the property of the nation, let alone of the individual, but remained always God’s property, assigned to the use of families “under national regulation.” It remains the task of just societies to find some political mechanism to make the Christian doctrine of stewardship real. In practice, this involves “public agencies” exercising regulatory power. In fact, passing “good laws” in the cities is as much a religious service as preaching the gospel.

Despite Ely’s assertion of a right to property, it seems clear that title to property creates a social obligation more than a right to exclusive use. Ethical behavior, from a Christian point of view, depends much on coercion, or at least law that attracts true believers as it cajoles those who need guidance. Drawing on the insights of classical political philosophers, Ely sees law as education as well as force, enlightening the conscience. 

The subjects that Ely imagines the law might effectively compass and the lessons he imagines it might teach are in no way limited to those matters over which reasonable men might, after due deliberation, agree. Law is unmoored from any grounding in nature and is instead directed at moving—more or less in unison—the consciences of men toward particular policy conclusions concerning the regulation of the conditions of industrial life. 

Ely’s view of property relations, like economics as a whole, is distinctly historicist: all policies must change depending on time, place, and cultural particularities. And government, animated by the essential moral teaching of Christ, is the primary agent of change and direction. 

The progressive state is valorized along with the things of the world. With the growth of such an understanding, the only things on which the morally earnest man need concentrate are those things that are within the purview and control of the state—those that can be manipulated through the application of law and administrative expertise.

To further these ends, lawyers and judges must become social scientists in order to do away with the messiness and corruption of American republican institutions. Well before it became a commonplace observation, Ely recognized that judges in effect exercise legislative authority, but he saw little problem with that so long as it was well exercised. Judges should be selected with explicit reference to their social and economic philosophies and should decide the limits of police powers in a scientifically (as opposed to constitutionally) appropriate manner. 

Unlike Tocqueville, Ely was not willing to sacrifice some order, along with predictability and high conceptions of moral propriety, for the sake of self-government. Viewed retrospectively, Ely’s understandings of judicial competence and power seem refreshingly honest, if not exactly true to the American constitutional and common law tradition. But if they were articulated as clearly and honestly today, such understandings might at least have the benefit of preventing judicial confirmation hearings from turning into the comic kabuki dances they have become.

In language that sounds remarkably contemporary (save perhaps for its grounding in the duty of Christians), Ely stresses the importance of fulsome tax payments, for he who neglects to pay his “fair share” does so on the backs of the “weaker elements in society, such as the widow or orphan.” A “great body” of “attractive laws” must be formulated by thinking Christians to keep the ways and means flowing toward the government without complaint, paving the way to a brighter future. Private philanthropy will not suffice for this comprehensive task, for the “great lines of social reform must be the concern of agencies which work steadily and persistently.” 

Property distribution must be manipulated by the state for the good of all, though not all property must be owned collectively. So Ely, while no friend of capitalism, was not strictly speaking a socialist, or at least not a very comprehensive one. Distribution and regulation of private property, however, must be undertaken fairly regularly, and without the counterproductive and artificial constraints that would be imposed by traditional constitutional understandings. A constitution grounded in natural rights and expressing limitations on government power is an obstacle to social Christianity. 

So what might be called the default position of the Founders’ regime—that a central purpose of government is to protect property as a natural right, rather than to distribute it as a contingent one—is flatly rejected by Ely. And in this he seems to ignore the possibility of factional conflict over governmental distribution of spoils, not to mention the dangers posed by the imperial overreach of ambitious politicians and the consequent discrediting of government itself.

A developed, innate moral sense of social obligation is something for which Ely hopes, but he believes it is not something on which he, or his fellow Christians, can rely. Freedom is not the absence of restraint but is found in service to others and therefore eschews self-interest. And this freedom needs external guidance. 

Our individuality must be directed toward others—rather routinely, one might say—in a manner that is contrary to both the letter and the spirit of the Founders’ Constitution. This is so because of Ely’s implicit denial of Madison’s observation that the causes of faction are sown irreducibly in the nature of man and that the simultaneous unleashing and checking of unequal interests, opinions, and passions can conduce to the public good far better than the high-minded moralism of the state. For Ely, rather, freedom comes in pursuing the common rather than individual good and in overcoming what Madison calls self-love, which routinely limits and degrades man’s higher faculties. Ely is confident that man shall know the truth and that the truth shall set him to concentrate on social goals.

The State and Social Ethics

Ely claims his conception of the state derives from Christian social ethics, which rejects the “English” philosophy of individualism. A proper reading of the Old Testament confirms that the nation, in its law-making capacity, is nothing less than “God’s instrument for the establishment of universal righteousness.” God consistently deals with nations and reaches individuals only through them. Ely insists that this “co-operative institution” of the state is merely the means to a proper political economy that is in harmony with religion. In Ely’s scheme, the practical morality man needs is the morality embodied in and expressed through the state. At best, this seems to result in muddying the relationship between ethical ends and means.

Ely understands the state to be an organic whole rather than a product of the conscious will of man. No social contract created it, nor can it be dissolved through the deliberate choices of men. Christ himself recognized the state’s divine character, with powers ordained of God. Ely strikingly insists that yet another outcome of the Protestant Reformation was “the exaltation of the state,” overcoming the Roman Catholic Church’s insistence on the distinction between the cities of God and man. 

With poor laws and the curtailment of the functions of the ecclesiastical courts, Protestant nations achieved something analogous to the merger of English courts of common law and equity, each with their own bodies of substantive law, but overlapping and interconnecting at various points. In each case, the goal was unity under a higher, more complete understanding of justice. The clergy and the special prerogatives of the church gave way to universal law expressed through the sovereign unity of the state, which is a truer representation of God’s will. Ely goes so far as to make the arresting claim that “religious laws,” broadly understood, “are the only laws which ought to be enacted.” 

So the state must be understood to be divine in idea and intention, if not in practice. To the extent that the political life of the United States is “unworthy,” it is because “the nature of offenses against the purity of political life as offenses directly against God has not in recent years been adequately emphasized.” The state is not quite God—but woe unto that man through whom offense to the state cometh.

Although the New Testament replaces the nation with a “world-wide” society and extends our duties accordingly, the nation-state is still, practically speaking, the instantiation of universal Christian truth. As Moses said nothing of the future life, so Christ, even in his resurrection and immortality, reminded us that “eternal life begins in this world.” Even the injunction to render unto Caesar is nothing more than an admonishment to submit to sovereign authority, even if it is established by conquest. 

Christ condemns not the world but the worldliness of self-interest and seeks always national righteousness: 

We must have a feeling for our city, for our country, like that which is inculcated in the Bible. Our Jerusalem must be so dear to us that we can say with the psalmist, “If I forget thee, O Jerusalem, let my right hand forget her cunning.

“If I do not remember thee, let my tongue cleave to the roof of my mouth; if I prefer not Jerusalem above my chief joy.”

When we reach this point, then we shall attain civic reform; then our commonwealths will be regenerated; then shall we see our nation a new nation, exalted by righteousness.

Ely was routinely bold to pray, “Thy Kingdom come, thy will be done on earth as it is in heaven.”

Father John Ryan and a Roman Catholic Political Economy

Like his Protestant counterparts, Ryan was an influential scholar, professor, and activist with an overriding interest in matters of economic justice, resting on a belief that religion, ethics, and economics could not be divorced. He taught first at St. Paul Seminary, and then at the Catholic University of America. Unlike Rauschenbusch, he attempted to ground or at least embed his arguments in a larger natural law theory. And he rejected, at least formally, the idea that the church’s primary objective should be anything other than the salvation of souls. Also unlike Rauschenbusch, he lived through and directly influenced the New Deal period, so he was able to see his moral theology come to fruition in very concrete ways. 

New Deal initiatives like minimum wage laws, Social Security, and labor legislation are all enactments of various elements of Ryan’s plan. And Ryan was a political actor himself when circumstances called for it. In several states, he testified in favor of the passage of minimum wage laws. The Progressive Party platform of 1912 incorporated his “living wage” language. And by the 1930s, he had become a vehement supporter of the New Deal, on the basis that it found a Christian middle ground: “neither individualism nor socialism.”

Ryan’s most influential contribution to the intellectual ferment of his times was his argument in favor of a living wage. But it would be a mistake to construe his efforts narrowly. His case for the living wage amounts to a social welfare version of the natural law, as well as an argument against what he sees as the rampant individualism of the American polity. His doctoral dissertation was first published in 1906 as the book A Living Wage and was widely reviewed in America and abroad. The book was introduced by none other than Richard Ely, whom Ryan had first read as a young seminarian and to whom he sent a prepublication copy.


In the book, Ryan shared Rauschenbusch’s confidence that a new day was finally dawning in Americans’ understanding of the ends, and injustices, of their economic system. In his 1919 preface to a revised edition, he asserted what he claimed was almost “universally accepted” by “all intelligent and disinterested persons”: a laborer has a distinct moral claim to a decent living wage. And Ely, in his introduction, suggested the main purpose of the book was to stimulate the conscience of Christians as to their palpable duties, including supporting a Christian doctrine of wages. But the book’s subject matter was yet broader than that, according to Ely. It was in fact “the first attempt in the English language to elaborate what may be called a Roman Catholic system of political economy.”

In the words of Ryan’s mid-20th-century biographer Francis L. Broderick, “More than any other single figure in the Catholic Church in America, he is responsible for the progressive stands adopted by official Catholic spokesmen in our time. Some of these men are former students of his; many were trained in an atmosphere he helped create.” 

When, in 1919, the American bishops issued their “Program for Social Reconstruction,” Ryan in effect enjoyed the support of the American Catholic hierarchy for the reforms he had long championed. The document, for Ryan’s purposes, “created another standard to set beside Rerum Novarum when he appealed to the conscience of Catholic America.” The effect was to shift the burden of proof on economic matters—more or less permanently, as it turns out—from progressives to conservatives within the church. The American church, while making room for conservative clergy and laymen, has itself spoken the language of economic progressivism, in its official voice, since Ryan’s time.

Insisting on his Christian bona fides, and, beyond that, his religious orthodoxy and commitment to the Holy See, Ryan is at pains in A Living Wage to acknowledge the influence of Pope Leo XIII’s 1891 encyclical Rerum Novarum as the document that “converted the Living Wage doctrine from an implicit into an explicit principle of Catholic ethics.”

Opposing socialism and materialism as well as exploitation of labor, Leo argued for the dignity of workers and wage justice, as well as for a wide sphere of state action—things that accorded with Ryan’s views even before he read the encyclical. Ryan notes that Protestantism, in its individuality, has less pronounced and uniform teachings on these matters, but it is nonetheless true that Protestant denominations have never signaled approval of “unlimited bargaining.” And he also notes that the Federal Council of Churches had just made a formal demand for a living wage enforced by the state. 


Paradoxically, despite his orthodoxy, Ryan, like most progressives, could never escape his fascination with modern science and its tendency to direct human attention away from eternity and toward the here and now.

Ryan wastes no time in arguing that his notion of a living wage is derivative from natural law. In this sense, his work is less dependent on a parsing of the Gospels than is Rauschenbusch’s. Ryan asserts that the labor question cannot be solved without religion, but “Neither will religion suffice in the absence of a detailed application of moral principles to the relations of employer and employee.” 

With Rauschenbusch, Ryan recognizes that men might be religious in a conventional sense but blind to moral wrongs because of their false commitment to an individualist, competitive ethical code. In fine, business ethics instead of Christian ethics govern their lives. Clergymen must therefore give more attention to preaching a living wage and less to “other duties that are no more important.” 

Moral and religious suasion—including using one’s ecclesiastical position to “deprive recalcitrant employers of the church privileges that are ordinarily denied to persistently disobedient members”—is important, but it is not everything. For Ryan, philosophical reason looms much larger as a source of influence on Christians than it does for Rauschenbusch.

The laborer, Ryan emphasizes, has an individual natural right to a living wage that belongs to him personally, not simply to him as a member of society. It is something he possesses at birth and is in no way a creature of the positive law. The “absoluteness” of the right is meant in the sense that it does not depend on the will of another, not that it cannot be subjected to reasonable limits. Or, as Ryan puts it, it is absolute in existence, though not in extent. Men’s natural rights are equal in number and embrace a minimum of goods, which minimum is determined by the reasonable needs of human “personality.” The catalog of natural rights to which Ryan refers includes not only life, liberty, and property but livelihood, marriage, religious worship, and education.

But rights are not ends in themselves; they are means to the end of the “welfare of the person,” which is an inviolable fact of the natural order. Happiness and dignity are alternative expressions of this welfare. And in turn, it is the “development” of “personality” that allows for welfare to be achieved. 

As we are morally obliged to order our lives to pursue human welfare, so we have a natural obligation not to interfere with the natural rights of others. We know what conduces to human welfare by knowing first what constitutes man’s nature—“his essential constitution, relations and end.” Ryan claims that academic opposition to natural rights doctrine is a result of the doctrine’s “exaggerated and anti-social form”—its Rousseauist form—which can be found among both European and American theorists (though Ryan’s tendency is to conflate the two). 


According to this form of natural rights theory, nature refers not to what is permanent in man but to what can be found in his primitive state. “State of nature” theory for Ryan seems to always point to a denial of nature that allows the strong to oppress the weak through legal mechanisms. He seems therefore not to allow that a robust natural rights theory—one that is self-limiting and oriented toward protecting the rights of the minority from the tyranny of the majority—is embedded in a social contractarian view of government.

He claims his doctrine is the antidote to the dangers of antisocial natural rights theories, a middle ground between revolutionary, fundamentally Rousseauist views and legal positivism. Individuals must be understood to be endowed by nature, and God, with rights that are requisite to the development of personality. The extent of the rights must be worked out in time, according to social circumstances. No right can be understood to interfere with the state’s obligation to adjust conflicting claims in the name of social welfare. “The true formula is, that the individual has a right to all things that are essential to the reasonable development of his personality, consistently with the rights of others and the complete observance of the moral law.” Ryan claims this middle ground will guarantee that man does not become a mere instrument of the state.

Following Pope Leo, Ryan argues that the right to property is in fact natural rather than conventional but that it is also contingent. Private property is a right not for its own sake but insofar as it conduces to the satisfaction of genuine human needs, and especially the needs of the family. It is, again, a means rather than an end. It best enables the realization of the primary right of man to use nature for the development of personality—physical, intellectual, moral, and spiritual. 

“Adjustment” is necessary because, though men are equal “generically,” they are unequal “individually,” each having different powers and needs. A decent livelihood varies from time to time, place to place, and individual to individual. Hence the need for elasticity and, most importantly, expertise in determining just what constitutes such a livelihood. During his time teaching at St. Paul Seminary, Ryan tellingly devoted more than a quarter of his course in moral theology to economic history and political economy.

The difficulties of making such complex economic determinations, while daunting, should not deter. The right to a living wage can be asserted only against members of the industrial community where the worker lives, which is something Ryan admits can be defined only approximately. But the complexity of modern economies, while serving to obscure economic rights, should not halt confident action. Even traditional rights doctrines interfere with a proper understanding of natural rights, which are more akin to the Christian doctrine that private ownership is not absolute but a form of stewardship. The capaciousness of Ryan’s understanding of stewardship is notable. He favored using the “superfluous” goods of the wealthy to subsidize the needs of the poor—from labor unions, to education, to hospitals and housing.

In an early version of equal pay for work of equal value, Ryan observes that women deserve the same living wage as men, assuming their efficiency is the same. But he grounds this in a concern not only for distributive justice but for the family. Paying women less than men would tend to drive the latter out of an occupation and thereby increase the proportion of female workers, which he does not see as a good. 

As man by nature needs the permanent love and companionship of the opposite sex, a living wage must be sufficient to support family life. In an interesting admixture of what might be called contemporary individualist and Catholic communitarian arguments, Ryan claims the majority of men cannot achieve appropriate “self-development” outside the conjugal state. For the average man, “celibacy is not normal” and cannot be the measure of his natural rights. But, in his search for some limiting principle, Ryan claims that a laborer cannot in justice demand a wage to support his parents because in the normal course of things parents should have taken precautions to secure themselves financially. Rights, he asserts, “are not to be interpreted by the abnormal and exceptional exigencies of existence.” 

And again, in his efforts to make economic life compatible with the life of the nuclear family, Ryan argues that the family living wage is due to every male laborer, based on “average” rather than exceptional circumstances. Even those who are unmarried are due this wage, for to deny it to them would create an increased demand for their labor, to the ultimate destruction of the family. It would place a premium on “a very undesirable kind of celibacy.” The basis for estimating the family living wage is in relation to a family containing the average number of children found in a workingman’s home—about four to five. While Ryan admits this formula is not perfect, this is the best that can be done in present circumstances to preserve “the intrinsic worth and sacredness of personality.” 

Writing elsewhere—shortly after President Theodore Roosevelt warned Americans, in 1903, that the nation’s best citizens were insufficiently fecund—Ryan echoed TR’s concerns that moral decadence and demand for luxurious living were leading to a dangerous decline in the birth rate. 

In A Living Wage, Ryan goes further to argue that aversion to marriage fosters selfishness that leads to indolence and inertia, and therefore that arguments for “sexual self-restraint” as a means to aid the working class are misplaced. They are “immoral and anti-social,” bad for both society and the individual personality. What is needed is not misguided moralizing—exhortations directed at encouraging fundamentally unnatural lives—but “social action,” especially in the realms of government regulation and labor organization. Positive rather than negative freedom is needed. In a summative statement of his conception of the social order—which is at once rights-based and organic—Ryan states:

the obligation to pay a Living Wage falls upon the employer as a reasonable consequence of his position in the economic organism. From this responsibility he cannot free himself by appealing to the labor contract or to the productivity of labor; for the former is consistent with extortion, while the latter is usually unknowable, and is always inferior to needs as a canon of distribution. Inability to perform the obligation suspends it, but inability must not be so interpreted as to favor the superfluous needs of the employer at the expense of the essential needs of the laborer. The employer’s right to obtain interest on the capital that he has invested in his business is subordinate to the laborer’s right to a Living Wage.

The state, therefore, has both a “right” and “duty” to require a living wage, for its very purpose is “social welfare,” or assisting the individual in attaining earthly ends. And this state activity can be thought of as protecting natural rights. A minimum wage law is both an urgent necessity and a dictate of natural law reasoning, and the Constitution—long thought to protect freedom of contract—cannot remain a barrier to natural rights. While the expression of these rights is new, they are rights that in Ryan’s estimation predate and supersede the flawed Enlightenment conceptions of negative liberty so mistakenly elevated by America’s founders.

Ryan considered his 1916 book Distributive Justice to be his most important work, though it was less well known, in his own day and subsequently, than A Living Wage. The relative obscurity of the former is no doubt due to its being both drier and considerably more ponderous than the latter. It attempts to discuss “systematically and comprehensively the justice of the processes by which the product of industry is distributed” among landowners, capitalists, businessmen, and laborers—all with an eye to the morality of the processes and outcomes. Based on a sweeping survey of the morality of private land ownership, private capital, profits, and wages, the book reiterates familiar themes. The role of the state is substantial, and little to no regard is given to questions of legal or constitutional constraint. On the whole, Ryan was guided by Ely’s view that socialism could be severed from materialism and that elements of the socialist program—if not complete public ownership—were essential to a Christian commonwealth.

Ryan claims private ownership of land is preferable to socialism, but the landowner’s right to rent is a moral claim no stronger than the capitalist’s right to interest, and neither is as strong as the tenant’s right to live decently or the laborer’s right to a living wage. Public ownership of valuable lands should be maintained or expanded, and increases in land value should be severely taxed, to the point of breaking up exceptionally large or valuable estates.

With respect to capital and interest, it is wrong to claim, as the socialist does, that the capitalist has no claim to interest. But the right to collect it is conventional: “The State is justified in permitting the practice of taking interest.” The “right” exists only when it is socially useful. The best practical hope for reducing the “burden of interest” is a wider diffusion of capital through cooperative associations in key fields like banking, agriculture, distribution, and manufacture.

When it comes to profits, “needs, efforts and sacrifices, productivity, scarcity, and human welfare” must be taken into account. Only businessmen who use “fair methods of competition” have the right to all the profits that come their way. And Ryan predictably claims that “remedies for unjust profits are to be found mainly in the action of government”—in the form of public ownership and legal regulation of monopolies. Ryan also believes progressive taxation and inheritance taxes play an important role. His book was written just three years after the ratification of the 16th Amendment, granting Congress broad powers to lay and collect taxes on incomes.

Finally—and almost incidentally—“The possessors of large fortunes and incomes could help to bring about a more equitable distribution by voluntarily complying with the Christian duty of bestowing their superfluous goods upon needy persons and objects.” With respect to laborers, a living wage is a right to be vouchsafed through minimum wage laws, unionization, and cooperative enterprises in which workers have a substantial voice in the conditions of their employment. Ryan concludes with a reiteration of the importance of faith: “For the adoption and pursuit of these ideals the most necessary requisite is a revival of genuine religion.” 


Ryan’s view of the Declaration of Independence is at once expansive, partial, and particular. He sees republican government as but one means among many to pursue social welfare and therefore claim the mantle of legitimate government. But he fails to note the apparent incompatibility of this view with the limited and precise conception of natural rights found in the Declaration, which stems from what Jefferson claims to be the self-evident truth of human equality. 

According to Ryan, democratic forms can claim legitimacy along with monarchic or aristocratic ones, depending on circumstances. And even in democracies, the people are not the source of political authority but only its depositories. Linked to Ryan’s gloss on political equality is his view that the state should ideally recognize the one true religion, that professed by the Catholic Church, and prevent the introduction of new forms. He allows that Catholic states where other denominations are already established should generally tolerate them as a matter of prudence. But no rights are absolute in the sense of being ends in themselves, including freedom of speech. 

All aspects of the state should be understood to be the means to human welfare. And so Ryan leaves to the good judgment of Christian rulers vast amounts of discretion as to what constitutes public welfare, even in matters of conscience. And he appears to deny that freedom of conscience is, in principle and nature, an essential incident of human welfare. It therefore easily follows that he would view lesser things—such as the right to property—as not entitled to inviolable protections, despite their apparent grounding in what he understands to be nature.

When the purpose of government is seen in such broad terms—that is, the furtherance of the general welfare of man in light of God’s purposes—natural rights are bound to be understood as less natural, less fixed, and less protective of irreducible spheres of human thought and activity than would have been acceptable to America’s founders—on grounds of either principle or prudence. In the language of contemporary academic discourse, we can say that Ryan’s Catholicism, while not hostile to republican government, is in tension with it. In less couched terms, we can say it is indifferent to it.


The Liberal Millennium

And so we have come full circle. We have seen how progressive theorists, statesmen, and theologians alike embraced a notion that material and spiritual fulfillment can be found in and through the good graces of the state. They shared a sense of the possibilities for an organic political wholeness that was coupled with a deep suspicion of anything they saw as too individualist—or, in other terms, too Newtonian or Lockean. All this represented, in theory and practice, a stunning transformation of American politics, morality, and constitutionalism.

Ely’s “ethical ideal” of political economy led him to advocate “‘such a distribution of economic goods’ as would nurture the ‘growth of all the higher faculties,’” including even love itself, as seen in religion, art, and literature. The heavenly city on earth was indeed a possibility, if only the Gospels were understood to condemn individualism, and individuals could be made to act on this teaching.

For his part, Woodrow Wilson tried to Americanize his Hegelianism and tame his social Darwinism through comforting versions of an increasingly familiar Christian theology. As Charles Kesler notes, “Wilson, whose father was a Presbyterian minister and his mother the daughter of a Presbyterian minister, fluently incorporated religious language and sentiments in his Progressivism. That was the era of the Social Gospel movement, a tributary of Progressivism, so it was common to encounter millenarian religious longings translated into calls for social work and social justice.” Even Wilson’s emphasis on the patriarchal origins of the Aryan races is very revealing as to his view of the relationship of politics to Christianity (not to mention what it says about the race consciousness of leading progressives). His claim that the state is the family writ large is the precursor to contemporary liberalism’s assertion that it takes a village to raise a child. 

For Wilson, the order and authority of the patriarchal family is the analogue to the order that the modern administrative state provides. As Kesler has noted, this understanding, at once an old and new dispensation, suggests that “we need not fear government’s increasing power to be our keeper . . . because it operates as merely the most efficient instrument of our brotherly and sisterly duty to care for one another.” And according to this political theology, our duty of care extends less to concern for the soul, but neither is it limited to mere life. Instead, it encompasses most facets of human existence that can be touched by the brave new world of centralized administration. In fact, concern for the soul is not the proper purview of the state, for spiritual progress is not measurable, whereas material progress—in the form of material equality—is. The state concerns itself only with those things that it can measure and manipulate, or that can be measured and manipulated by the expert scientific classes on which it relies for guidance.

Man becomes a creature of the state, rather than a political animal free to order the state according to his deliberative choices. To the Protestants and Catholics who were influenced by such a teaching, religion became an enemy of natural rights and limited government and a friend to the state. “Conscience,” far from being threatened by an unlimited state, could instead be followed—but only by influencing the mechanisms of the state in the interests of social justice. 

Christian progressives seemed unconcerned that, in a larger sense, the realm of conscience—not itself measurable or manipulable by the state or by modern social science—seemed by those very facts destined to play second fiddle to all those things of which the modern state could take cognizance and thereby directly superintend. What after all can be the status of Christian conscience to those who know the trajectory of History, including what will be revealed to every good Christian in the fullness of time? No one should be free to reject true progressive enlightenment, for to do so would be a form of slavery. When the fullness of time was come, God sent the administrative state.

And so, while the early Progressives were motivated by faith, their children and grandchildren became increasingly secularized. One can see a distinct and unbroken line of descent from progressivism, to the New Deal, to the Great Society. But as each of these waves of liberalism crested, it became apparent that the underlying force and motivating energy of each was different. The millenarianism of the early progressives was driven, thanks to Rauschenbusch and others, by a genuine if idiosyncratic sense of Christian purposes. This Christian sensibility was already on the wane by the 1930s. Franklin Roosevelt in effect secularized the phenomenon while maintaining some degree of recognizably Christian language: “When Roosevelt, as sensitive a barometer of his times as could be imagined, expressed the higher ethical life to which liberalism pointed, he did so in relatively unassuming, vaguely Protestant and vaguely Progressive terms that could appeal to almost everyone.”

The Great Society, by contrast, was characterized by its all-encompassing confidence in the power of government to do pretty much anything and everything. And so its premises sowed the seeds of its demise.

As Kesler argues, “Its soaring expectations, its utopian promises, could not be fulfilled in ten years or a hundred years. What it proffered was the satisfaction, in principle, of all material and spiritual needs and desires. But human desires are infinite. They cannot be satisfied, unless first governed or moderated by reason and morality.” And certainly by the late 1960s, the spiritual needs for which people demanded satisfaction had lost even the attenuated connections to the next world that could be seen in the longings of the early progressives.

But these insights, and more, would not play a role in most scholarly accounts of progressivism until well into the 21st century. They had to await a new generation of political theorists to bring them to the surface. The historians of the 20th century had very different stories to tell.

Weekend Long Read

This essay is a revised and expanded version of two stories that first appeared on American Greatness in July 2018.

The Monstrous Lie Behind CrowdStrike

There’s a simple explanation for the Democratic National Committee’s unwillingness to let outsiders have a peek at evidence its servers were infiltrated by the Russians in 2016: There isn’t any. The Russian hacking that’s caused so much division and turmoil at home and abroad never really happened. It was all a ruse.

Robert Mueller’s investigation into the 2016 presidential election was predicated largely on the claim that Russian intelligence had hacked the Democratic National Committee’s servers ahead of the November election. Russia’s guilt is such an article of faith among our political class that a Republican-controlled Congress imposed sanctions on Russia and President Trump signed on, substantially worsening relations with an important and potentially dangerous nation.

Only after those sanctions were imposed did Mueller’s team confirm the Russian espionage they were meant to punish. And since its publication last year, the Washington establishment has treated the Mueller report almost as a sacred document.

Outside the Acela Corridor, however, one finds more skepticism.

A lot of ordinary folks just can’t stop wondering why the DNC wouldn’t let any federal investigators examine their servers. Only CrowdStrike, an independent contractor on the DNC’s payroll, was allowed to do so. CrowdStrike executive Robert Johnson appeared on “60 Minutes” to address concerns that his firm hadn’t been completely forthcoming with its findings. But he only succeeded in raising more questions by claiming that the “FBI got what it needed and what it wanted.”

Even if the self-proclaimed “hard-hitting” investigators at “60 Minutes” couldn’t be bothered to spend 30 minutes researching such an important story, Johnson himself had to know he wasn’t telling the truth.

On no fewer than three occasions before President Trump fired him, FBI Director James Comey testified to Congress about the DNC’s strange unwillingness to let his agency examine their servers in a case they were simultaneously hyping as akin to “an act of war.” Comey testified that the DNC rejected the FBI’s “[m]ultiple requests at different levels” to collect forensic evidence. 

A week before Comey testified in January 2017, the DNC had already tried palming off the same line Johnson would later repeat, and it was sternly contradicted the very next day. A senior FBI official told The Hill that his agency “repeatedly stressed to DNC officials the necessity of obtaining direct access to servers and data, only to be rebuffed until well after the initial compromise.” According to The Hill’s source, far from getting everything the bureau wanted, “the FBI [had] no choice but to rely upon” CrowdStrike.

Johnson also must know the FBI wasn’t even the only federal agency that ran into a brick wall when it took the DNC’s hysterical spiel about Russian espionage seriously. Obama Homeland Security Secretary Jeh Johnson told Congress he couldn’t even get the DNC to discuss the case with anyone from his agency, even though election security falls under its official purview. The homeland security chief was so disconcerted that he twice told Congress he “should have brought a sleeping bag and camped out in front of” the party’s headquarters. 

But Congress never got the chance to ask anyone from CrowdStrike about the peculiar circumstances surrounding its “investigation.” For some strange reason, the executives representing the only entity to inspect the DNC servers refused to discuss the matter under oath.

The crack team of investigative journalists at “60 Minutes” also somehow failed to uncover that, just six months after accusing the Russians of hacking the DNC, CrowdStrike issued a report accusing the very same alleged Russian hackers of having penetrated Ukrainian artillery software. That report was so riddled with errors that the company was forced to retract it. Perhaps the “60 Minutes” team was too busy telling the rest of us how awesome they are to learn that other actors were known to have been in possession of the malware to which CrowdStrike claimed Russian intelligence had exclusive access since 2015.

Among other problems with the technical aspects of CrowdStrike’s story, the malware the company claims was used to broadcast Ukrainian artillery positions to the Russians turned out not even to “use GPS,” nor does it “ask for GPS location information.” Jeffrey Carr, the cybersecurity consultant who exposed CrowdStrike’s bogus accusations against the Russians, wryly noted, “[t]hat’s a surprising design flaw for custom-made malware whose alleged objective was to collect and transmit location data.”

The gaslighting by “60 Minutes” only succeeded in confirming that the program’s self-proclaimed reputation for fierce and thorough investigation is a joke. And it underscored ordinary folks’ concerns about the DNC’s refusal to cooperate with federal officials.

Moreover, a bunch of not-so-ordinary folks who know a thing or two about computers think there’s a simple explanation for the DNC’s unwillingness to let outsiders have a peek at the evidence: There isn’t any. The Russian hacking that’s caused so much division and turmoil at home and abroad never really happened. It was all a ruse concocted by CrowdStrike.

One such skeptic is an anonymous journalist and computer aficionado who goes by the pseudonym “Adam Carter.” Carter has spent the last few years cataloging evidence, unearthed by himself and others, that CrowdStrike engaged in a disinformation campaign, inventing not just a fake Russian hack but also a fake hacker called “Guccifer 2.0.” Much, but by no means all, of Carter’s evidence is technical. And he’s unquestionably found an inconsistency in the Russia narrative that ought to raise doubts in even the most computer-illiterate congressman’s mind.

Julian Assange’s Warning

But first, why on earth would a private contractor hired by the DNC engage in such tactics? For motive, we need to go back to June 12, 2016, when Wikileaks founder Julian Assange made an announcement that was sure to strike panic in the hearts of Hillary Clinton and her closest advisers:

We have upcoming leaks in relation to Hillary Clinton . . . We have emails pending publication. 

A little less than three months earlier, on March 19, hostile actors had gotten ahold of all the emails in campaign chairman John Podesta’s main Gmail account. You may have heard that Podesta’s emails were “hacked,” but they weren’t. There were no faraway cyber-nerds searching for some vulnerability in the DNC network. He fell for a common “spear phishing” scam. A fake email from Google arrived, saying he needed to change his password and providing a link. The link was also fake. Instead of changing his password, Podesta gave it away—along with all of his campaign emails.

Whoops!

The Clinton campaign learned of Podesta’s blunder almost immediately and must have feared that the emails Assange was threatening to release were his. Moreover, on that date, a lot of the revelations contained therein would have been very salient—and not in a good way.

Just six days before, with Clinton still 570 delegates short of the 2,382 needed to win the Democratic nomination, the Associated Press angered Bernie Sanders and his supporters by claiming that she’d already won. The New York Times, CNN, NBC News, USA Today, and The Washington Post all followed suit, declaring Sanders’ loss a fait accompli.

But it wasn’t.

The AP had arrived at its numbers by polling unpledged superdelegates, who couldn’t vote until the convention and were free to change their minds until then or even to deceive the AP.

Sanders supporters had been angry about the role superdelegates played in the nominating process for months. Sanders himself complained about it just one week before Assange’s announcement and a day before the media started writing his campaign’s obituary:

My problem is that the process today has allowed Secretary Clinton to get the support of over 400 superdelegates before any other Democratic candidate was in the race.

The next day’s headlines prematurely declaring Clinton’s victory brought Sanders supporters’ long-simmering anger to a boil. His spokesman blasted the corporate media’s “rush to judgment”:

Secretary Clinton does not have and will not have the requisite number of pledged delegates to secure the nomination. She will be dependent on superdelegates who do not vote until July 25 and who can change their minds between now and then.

For the rest of the week, the big election story was whether Sanders would exit the race gracefully and encourage his followers to forgive, forget, and rally round Hillary Clinton. But just 12 hours after Assange’s announcement, Sanders emerged from a meeting with his top advisors, refusing to concede and reiterating his determination not to let the media gaslight his candidacy into a lost cause:

[W]e are going to take our campaign to the convention with the full understanding that we’re very good in arithmetic and that we know who has received the most votes up until now.

The Immensity of Podesta’s Blunder

John Podesta’s blunder had the potential to destroy Hillary Clinton’s already precarious reputation with voters regardless of their feelings about Bernie Sanders. In some of the emails, Podesta had revealed that Clinton’s most senior advisors—including Podesta himself—denigrated her abilities and her ethics, commented on her poor health, made disparaging remarks about Catholics, Muslims, blacks, and Latinos, and complained that Clinton wanted “unaware and compliant” voters.

Many of Podesta’s emails also contradict claims made in defense of the private email server Clinton used as secretary of state. Others reveal that the FBI investigation into the matter was anything but unbiased. At a minimum, the emails prove Clinton’s campaign knew from the beginning that she was breaking the law.

It’s easy to forget how serious an issue Clinton’s unsecured server was when Assange issued his warning. James Comey’s surprise announcement exonerating her was still three weeks away, on July 5, 2016. A few weeks earlier, the State Department had sharply rebuked Clinton for violating department rules, generating unpleasant headlines such as “Hillary Clinton’s email problems just got much worse.”

A June 1 Morning Consult poll found that about half of voters thought her private email server was “illegal, unethical and a major problem.” Even a quarter of Democrats agreed. There’s little question that Assange’s threat would have made the poll disturbingly salient to Clinton and her top advisers.

But, given Sanders’ supporters’ cresting anger on the very day Assange issued his warning and Clinton’s need for their enthusiastic support to prevail against Trump, her team would have been more concerned about emails revealing her disdain for the kind of voters who flocked to Sanders and some of their most beloved progressive policies.

How would Sanders’ passionate and ideological followers react upon learning, at the very height of their anger, that Clinton secretly opposed gay marriage and supported fracking? The Democratic nomination was almost within her grasp and those revelations alone might have made it impossible for Sanders to graciously concede and put the weight of his campaign behind hers. 

All the more so when his followers discovered that she and other top campaign officials routinely mocked both Sanders and them. Making matters worse, if Assange released Podesta’s emails they would also find out that CNN contributor Donna Brazile had given Clinton at least three questions in advance for her debates with Sanders. And an extraordinary number of emails confirm Sanders supporters’ long-standing complaints that the DNC and the mainstream media had been colluding with Clinton to torpedo his candidacy from its inception.

But perhaps the most troubling of Podesta’s emails would have been those containing passages from speeches Clinton gave to Goldman Sachs and other big-money outfits at $225,000 per appearance. In these speeches, Clinton downplayed Wall Street’s role in the 2008 recession. She even assured the wealthy bankers enriching her that they themselves ought to be the ones writing any legislation necessary to make sure such a crash didn’t reoccur.

Clinton’s Wall Street benefactors also heard her confess to being “obviously” out of touch with the struggles of middle-class voters. She further admitted to having distinct public and private positions on political issues. Finally, though it wouldn’t bother many of Sanders’s followers, moderate voters wouldn’t be happy to learn that Clinton assured her wealthy patrons that she secretly favors open borders.

As with the controversy over her private email server, Clinton weathered this storm so well that it’s hard to remember how much her unreleased speeches alarmed Sanders’ supporters, to whom she was little more than a corporate shill. For months, Sanders himself had been using his stump speeches to mock the extraordinary sums Clinton’s Wall Street patrons had paid to hear her speak and to suggest that they must have been getting more than just talk for their money:

If you’re going to give a speech for $225,000 it’s gotta be really, don’t you think an extraordinarily brilliant speech, I mean why else would they pay that kind of money? . . . Must be a speech written in Shakespearean prose. So I think, if it is such a fantastic speech, the secretary should make it available to all of us.

To make matters worse, three weeks before Assange’s announcement, Clinton released a mandatory financial statement that brought her Wall Street speeches to the forefront of campaign news, yielding disastrous headlines like, “How corporate America bought Hillary Clinton for $21M” and “The massive scale of the Clintons’ speech-making industry.”

A few days later, reporters even annoyed President Obama at a G7 summit in Japan by pestering him about whether she ought to release her speeches. On June 1, just 11 days before Assange’s warning, a Morning Consult poll had 64 percent of voters saying she needed to do so, including two-thirds of independents and even almost half of Democrats.

Many readers have likely forgotten the many serious political storms Hillary Clinton was navigating in the week preceding Assange’s June 12 announcement and how desperately she needed to placate Sanders’ increasingly angry supporters. If you weren’t too distracted by the Russian hacking narrative, however, you probably remember some of the revelations from Podesta’s emails recounted above, revelations that would have made placating them impossible had Assange not given Clinton’s camp so much time to prepare.

By October 7, when Wikileaks finally began releasing Podesta’s emails, Democrat voters had been taught to tune them out by angrily reciting the mantras “Putin” and “Russia.” They were warned by CNN that it was illegal for folks who didn’t work for CNN or some other CNN-approved corporation to so much as look at Podesta’s emails. Trump couldn’t push Wikileaks’ disclosures because doing so immediately rebounded on him, raising worries he might be “Putin’s puppet,” rather than reflecting poorly on Clinton.

Clinton Uses the Russian-Hacking Narrative to Great Effect

Whether or not Adam Carter is right that the Russian DNC hack was a ruse designed to deflect the damage if it turned out Assange’s warning meant he had Podesta’s emails, there’s no question Clinton and her surrogates were instantly prepared to use it that way.

Within hours of WikiLeaks’ October 7 release, Podesta himself made a transparent attempt on Twitter to tie the disastrous revelations caused by his own bone-headed blunder to a dastardly Russian scheme perpetrated on Trump’s behalf:

While I’m in pretty good company with Gen. Powell & Amb. Marshall, I’m not happy about being hacked by the Russians in their quest to throw the election to Donald Trump.

Throughout the campaign, Clinton had avoided, as much as possible, any situation in which she’d have to take questions. So she managed to put off publicly addressing any of the disclosures in Podesta’s emails until her third debate with Trump, 12 days after they appeared. 

She was asked about the secret preference for open borders she’d revealed in a speech to a group of Brazilian bankers and the $225,000 they paid for the privilege of hearing about it. After a few nonsensical words claiming that she’d meant open borders for electricity, not people, Clinton quickly shifted to her real defense: 

But you are very clearly quoting from WikiLeaks. What is really important about WikiLeaks is that the Russian government has engaged in espionage against Americans. They have hacked American websites, American accounts of private people, of institutions. Then they have given that information to WikiLeaks for the purpose of putting it on the internet. This has come from the highest levels of the Russian government. Clearly from Putin himself in an effort, as 17 of our intelligence agencies have confirmed, to influence our election. So, I actually think the most important question of this evening, Chris, is finally, will Donald Trump admit and condemn that the Russians are doing this, and make it clear that he will not have the help of Putin in this election.

A more transparently rehearsed attempt to deflect the damaging revelations in Podesta’s emails by branding them with the words “Wikileaks,” “espionage against Americans,” “Putin,” and “Donald Trump” would be impossible.

So, by the time Assange released them on October 7, tainting the publication of Podesta’s emails as a Russian scheme perpetrated out of love for Donald Trump was demonstrably the Clinton campaign’s go-to strategy. But a Washington Post story about the DNC hack published just two days after Assange’s June 12 warning shows the strategy was prepared much earlier.

CrowdStrike’s Perplexing Announcement 

The June 14 Washington Post article marks the first time the DNC went public about the alleged Russian hack. It includes the detail that the Russians stole a file of Trump opposition research, which, though no ordinary reader could have known it at the time, would turn up months later when Wikileaks released Podesta’s emails.

Indeed, this detail is also the article’s big takeaway, as it’s mentioned in both the lead sentence and even its headline: “Russian government hackers penetrated DNC, stole opposition research on Trump.”

The story extensively quotes CrowdStrike president Shawn Henry, who previously was in charge of FBI cyber operations. Henry just so happens to have been promoted to that position by none other than Robert Mueller when he ran the agency. CrowdStrike’s co-founder and chief technology officer, Dmitri Alperovitch, is also featured prominently. Born in Russia, Alperovitch fled the country with his family when he was fourteen; he is now a senior member of the vehemently anti-Russian Atlantic Council.

All information for the Washington Post story was provided voluntarily by CrowdStrike and the DNC. According to Alperovitch, the DNC “decide[d] to go public…about their incident and give us permission to share our knowledge.”

So, why on June 14, 2016, had the DNC wanted everyone to know the embarrassing fact that the Russians had penetrated their servers and the content of one particular pilfered file?

Alperovitch says the DNC wanted to “help protect even those who do not happen to be [CrowdStrike] customers.” It’s hard to understand how telling the world Russia had stolen a file of Trump opposition research from the DNC servers did anything to help those not fortunate enough to be able to rely on CrowdStrike. But, even if sense could be made of the philanthropic motives Alperovitch ascribed to the DNC, they must have had more self-interested ones to, once again, publicly connect Hillary Clinton’s name to lost emails and unsecured servers while her already existing troubles concerning such matters were still a very live issue.

Clinton’s team must have suspected that Assange had Podesta’s emails and they certainly knew the file of Trump opposition research was among them. So announcing that the Russians had stolen it two days after Assange’s warning is, in hindsight, either an incredible coincidence or the first step in a strategy designed to taint the damaging information in Podesta’s emails with Russian perfidy.

But CrowdStrike and the DNC weren’t the only ones calling attention to that file of Trump opposition research in the days following Julian Assange’s fateful warning.

The Russian Spy Who Wasn’t

The day after CrowdStrike’s announcement, a new actor dramatically took the stage announcing himself as “Guccifer 2.0.” His name was supposed to pay tribute to a hacker who’d gone by the nom de guerre Guccifer, famous for having plagued Hillary Clinton. 

Guccifer 2.0 expressed his intention to take up his imprisoned namesake’s mantle by boldly claiming to be the very hacker whose existence Alperovitch and Henry had just announced on the front page of the previous day’s Washington Post!

And, to prove it, he posted 230 pages of Trump opposition research on his newly minted blog and emailed copies to Gawker and The Smoking Gun.

If you hadn’t known it was all real, you might have thought all this sensational news coincidentally emerging on the heels of Assange’s warning was coming from a script. 

We’re supposed to think that G2 (as he’s called for short) was a Russian spy who passed documents he’d hacked from the DNC servers to Wikileaks. In fact, though hardly anyone is aware how crucial the allegation is, G2’s alleged role as WikiLeaks’ source is the only evidence we’ve ever seen that the DNC emails WikiLeaks published really did come from Russian intelligence.

But if G2 really is a Russian spy, Putin ought to be pitied rather than feared.

When he debuted claiming to be the hacker featured on the front page of the previous day’s Post, G2 made no attempt to deny he was a Russian spy. Anyone who read his first blog post and was familiar with the Washington Post story was given no reason to doubt that G2 was an agent of Russia, just as Alperovitch and Henry had claimed. Would a real Russian spy connect himself to a report outing him as a Russian spy without denying it? 

Why on earth would he connect himself to such a report at all?

Would a real Russian spy trying to hide his nationality end the second sentence of his first blog post with “)))”, the symbol Russians use in place of our “lol”? G2 did.

Would a real Russian spy on a secret mission to sabotage Hillary Clinton reveal his purpose by naming himself after someone famous for having already done so? The story in the previous day’s Washington Post hadn’t given any indication whatsoever that Clinton was his target. Why was G2 so anxious that we know? 

And, why would a Russian spy using WikiLeaks as a clandestine front announce that he’d sent the documents he’d stolen to WikiLeaks? G2 gave the whole game away in that very first blog post:

I’ve been in the DNC’s networks for almost a year . . . The main part of the papers, thousands of files and mails, I gave to Wikileaks. They will publish them soon. 

Is it at all credible that a spy sent by Vladimir Putin on a secret mission to control the outcome of the U.S. presidential election would start a blog a day after his espionage had been reported in the Washington Post in order to take credit for it and to inform the public of crucial facts about his operation that hadn’t yet been exposed, like the identity of both his target and his secret accomplice? 

Shawn Henry, Dmitri Alperovitch, James Comey, James Clapper, and Robert Mueller are all asking you to believe that it is.

Mueller uses absurdly expurgated quotes from alleged communications between G2 and WikiLeaks to prove he was the source of their DNC emails. If Mueller’s insidious gaslighting hadn’t caused so much damage, his neglecting to mention that G2 announced he was WikiLeaks’ source in his very first blog post would be comical. Mueller is, of course, also silent about the 11 other occasions, during G2’s brief time in the public spotlight, on which G2 made public statements explicitly connecting himself to WikiLeaks.

Mueller also wants you to believe that G2 immediately denied he was Russian—by no means Mueller’s only blatant lie.

G2 first denied being Russian only when explicitly questioned about his nationality in an interview six days after his debut. But by then it was too late. No one believed him because it had already emerged that there were “Russian fingerprints” all over the documents he’d released. That was odd enough by itself, given the “superb operational tradecraft” attributed to him by Alperovitch and the fact that he was conducting one of history’s most significant clandestine operations.

Russian intelligence must run hundreds of cyber operations every year that go entirely undetected. Yet, when agents are sent directly by Vladimir Putin himself to control the outcome of the U.S. presidential election, they announce their presence to the world and leave a half-dozen clues identifying them as Russian spies, which are found before they even have time to deny it.

But it gets worse.

Putin Must Not Be Sending His Best

The first evidence of Russian involvement was found within hours of G2’s June 15 debut. Someone at Gawker opened the metadata in the files he sent and, what do you know? Sitting there plain as day for anyone to see was the name of Soviet secret police founder Felix Dzerzhinsky! 

Even though the name is hardly a household word in the United States, it was still impossible to miss its significance since it just so happened to be written in the Russian alphabet. All that was missing was a link to Wikipedia to save anyone the trouble of googling “Феликс Эдмундович.” 

The five files G2 sent out when he debuted all later turned up in Podesta’s emails—absent any Russian names in their metadata, of course. The metadata in the versions released by G2, however, shows that the Russian spymaster’s name appeared because their content was cut and pasted from somewhere else into a Russian template from Microsoft Word with “Феликс Эдмундович” set as the username. 
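
The beauty of this particular clue is that anyone can verify the mechanism for themselves. What follows is a minimal sketch in Python—my own illustration, not anything from Carter’s work. A .docx file is just a ZIP archive, and its docProps/core.xml entry records, among other things, the username of the last person to modify the document, which is the kind of field in which a name like “Феликс Эдмундович” shows up. The file name in the example is hypothetical.

import zipfile
import xml.etree.ElementTree as ET

# XML namespaces used by docProps/core.xml inside any .docx file.
CORE = "{http://schemas.openxmlformats.org/package/2006/metadata/core-properties}"
DC = "{http://purl.org/dc/elements/1.1/}"

def docx_authorship(path):
    # A .docx file is an ordinary ZIP archive; the docProps/core.xml
    # entry inside it holds the document's authorship metadata.
    with zipfile.ZipFile(path) as archive:
        root = ET.fromstring(archive.read("docProps/core.xml"))
    return {
        "creator": root.findtext(DC + "creator"),
        "last_modified_by": root.findtext(CORE + "lastModifiedBy"),
    }

# Hypothetical usage; "1.docx" stands in for any of the released files:
# print(docx_authorship("1.docx"))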

Editing the documents couldn’t have served any legitimate purpose since the files G2 released were identical in content to the versions that later turned up in Podesta’s emails. Moreover, the needless cut-and-pasting, which also caused Russian error messages to appear in various places just in case no one bothered looking at the metadata, was done the very same day G2 released the files!

Is it at all credible that a Russian spy sent by Vladimir Putin on a secret mission to control the outcome of a U.S. presidential election would go to the trouble of editing documents he was sending to the press as a Word file with a famous Russian spymaster’s Russian name set as username, causing it to appear in the metadata? Would he cut and paste the documents’ content into a Russian template, causing Russian language error messages to pop up when the journalists to whom he was sending them tried reading the files? Is it credible that he’d do all that the same day he sent the documents out even though he didn’t alter their content at all and, hence, had no reason whatsoever to edit them?

Shawn Henry, Dmitri Alperovitch, James Clapper, James Comey, and Robert Mueller are all asking you to believe that it is.

In fact, they’re insisting that you do.

Even had G2 altered the content of the files, it’s preposterous to suppose that a Russian spy on the most serious mission imaginable would be so careless as to leave clues revealing his identity to a Gawker reporter within hours of his sending them to her. But, since the content of the documents wasn’t altered at all, the procedures which caused the “Russian fingerprints” to immediately appear could only have been designed to do exactly that.

If we weren’t so desperate for sensational news, a Gawker reporter finding evidence that G2 was a Russian intelligence agent in the files he’d sent her, mere hours after his debut, would by itself have raised enormous red flags.

But, believe it or not, that’s not all Henry, Alperovitch, Comey, Mueller, and their intelligence community cohorts expect you to swallow.

G2 also chose to use a company based in Russia to cloak his real IP address, which left him with a conspicuously Russian one. Even then, there are plenty of email providers that would have concealed that Russian IP address. Yet G2, whom Hillary Clinton suggested “clearly” took orders directly from KGB prodigy Vladimir Putin, somehow chose one that didn’t.
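
To see why the choice of provider mattered, note that some email services stamp the IP address the sender connected from directly into the message headers, where any recipient can read it simply by viewing the raw source. Here is a minimal sketch, assuming a provider that uses the common “X-Originating-IP” header; the addresses shown are placeholders of my own, not details from the reporting.

from email import message_from_string

# Raw source of a received message, exactly as a recipient can view it.
raw = """From: sender@example.com
To: reporter@example.com
Subject: documents
X-Originating-IP: [203.0.113.7]

files attached
"""

msg = message_from_string(raw)
# The provider recorded the IP the sender connected from; a VPN exit
# node in Russia would leave a Russian address here for all to see.
print(msg["X-Originating-IP"])  # -> [203.0.113.7]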

If G2 had simply done nothing, there would have been nothing connecting Wikileaks to Russian intelligence and no one would have been the wiser. Instead of doing nothing, he went out of his way to create the only evidence we’ve seen that any of the emails Assange released in the run-up to the 2016 election came from Russian intelligence. 

Yet, somehow, we’re supposed to believe he was sent by Putin on a mission to sabotage the Clinton campaign. Apart from G2’s self-undermining announcement that Clinton was his target, neither the Trump opposition file nor any other file he ever released contained anything damaging to her. 

So, on top of all the other completely preposterous nonsense, a Russian spy intent on getting Trump elected released 230 pages of damaging information on Trump but nothing negative about Hillary Clinton.

Viewed in quick and haphazard slices, G2’s debut may look like a collaboration between Putin and Assange. But Russian spies trying to hide their identity don’t openly confess to crimes the Washington Post attributed to Russian spies the day before.

Nor do they use Russian emoticons.

Nor do they publicly announce their mission and name their accomplices.

Nor do they send documents to reporters containing clues that they are Russian spies which are discovered within hours.

And they most certainly don’t go out of their way to plant such clues.

And when Russian spies release 230 pages of negative information about Trump, you can bet that it’s Trump, and not his enemies, they are trying to harm.

When we widen our view, the only question becomes whom Alperovitch, Henry, Mueller, and their cohorts are insulting more grossly: Russia’s intelligence agencies or the American public’s intelligence.

Where Did Guccifer 2.0 Get the Trump File?

Hindsight, together with Adam Carter and his crew’s hard work, shows that G2, rather than trying to harm Clinton, worked to manufacture a fake connection between Assange and Russian intelligence. This fake connection could later be used by Clinton as a shield to deflect the avalanche of damaging information in Podesta’s emails onto Trump should Assange release them. The moment he did, the fake connection allowed her to claim he’d done so at Putin’s behest and, therefore, that Putin not only wanted Trump in the White House but had perpetrated dirty Russian espionage designed to put him there.

Putin had attacked not just her campaign but all of America on Trump’s behalf, Clinton scolded. That was the real story voters needed to focus on, not all the proof of her corruption and incompetence Julian Assange had tried to bring to their attention. In fact, it was every American’s patriotic duty to ignore the fact that they’d been given irrefutable evidence, in her own words and those of her closest advisors, that she was grossly unfit for office. Not ignoring it would make you complicit in a filthy Russian attack on America and likely a piece of vile Russian-loving scum yourself.

It was a message perfectly designed to appeal to the tolerant souls without a trace of bigotry in their loving hearts who make up the Democratic Party’s base.

The Washington Post headline announcing that the Russians had hacked a Trump opposition file from the DNC set the stage for its delivery. But the article made no mention of Assange or WikiLeaks. Alperovitch and Henry could say they’d found Putin’s minions infesting the DNC servers. That was no problem, since Comey was running the FBI and could be counted on to say whatever words they decided to put in his mouth.

But nothing they could plausibly claim they’d discovered examining the DNC servers would be able to connect the little Russian devils they were going to say they found there to Assange.

So, considered alone, the Washington Post story they would use to get the ball rolling had zero potential to discredit anything he might release.

G2 forged the crucial link to Assange the next day by taking credit for the Russian hack Alperovitch and Henry had announced in the Washington Post and saying he’d turned over the spoils to WikiLeaks. The fact that he released the Trump opposition research file mentioned in the Post’s headline confirmed that he really was the hacker CrowdStrike’s executive duo had credited with stealing files from the DNC, and not some prankster merely pretending to be him. If Assange did release Podesta’s emails, as the Clinton campaign surely must have feared he would, the fact that the Trump opposition file G2 had released was among them could also be used to connect G2 directly to their theft, should narrative reinforcement become necessary.

Absent G2 bringing WikiLeaks into the picture, the Washington Post story would have informed voters of an embarrassing Russian DNC hack of some Trump opposition research, without any way to connect those Russians to Julian Assange and thereby taint anything he might publish.

So the information released to the Post served no purpose and, indeed, could only have harmed the DNC, unless Alperovitch and Henry knew G2 would immediately enter the fray to shift attention away from the poor internet security that had allowed Russian spies to breach the DNC servers and towards speculation about their connection to WikiLeaks.

But there’s another more conclusive reason to think that G2 had to be working with CrowdStrike and Hillary Clinton.

Remember, on June 15, Guccifer 2.0 emailed the Trump opposition file along with four other documents to Gawker and The Smoking Gun and posted them on his blog. But, apart from the Russian fingerprints he planted, every one of those files was found among Podesta’s emails when Assange released them four months later.

So, how did G2 get ahold of five files from John Podesta’s Gmail account? That’s what Adam Carter wants everyone to start asking.

Given how hard G2 worked to discredit WikiLeaks, it’s impossible that he got the files from them.

Since G2 manifestly isn’t the implacable foe of Hillary Clinton he pretended to be, it’s unlikely that he hacked the DNC servers as claimed. Indeed, since none of those first five files G2 released appeared in the DNC emails later published by WikiLeaks, we’ve no reason to suppose they were even on the DNC servers to be hacked. 

We know they were attached to emails in Podesta’s Gmail account, which means they were on Google’s servers. None of them were sent to him from a DNC email address, nor did he send any of them to one, nor were they copied to any. So we have no reason to think they were on the DNC servers at all. Moreover, Carter and other experts say the methods G2 claims he used to hack the DNC make no technical sense and couldn’t have worked anyway.

Even putting aside both the fact that CrowdStrike’s announcement of the DNC hack makes no sense unless its executives knew G2 would emerge to bring WikiLeaks into the picture, and the question of how G2 got hold of files the Clinton campaign knew would appear as attachments to Podesta’s emails when they were released, it’s grossly implausible that G2’s operation wasn’t coordinated with CrowdStrike. The effort G2 made to make it look as though anything damaging to Clinton that Assange might publish had come from Russian intelligence would be bizarre if he were just some random stranger who decided to step in and help Clinton in her time of need.

Moreover, even if that very unlikely hypothesis somehow turned out to be true, Alperovitch, Henry, Mueller, Clapper, Comey, and a host of others would still be guilty of perpetuating G2’s hoax as a means to falsely substantiate that the DNC had been hacked by Russia and the spoils passed to Assange.

And the fact that they used a hoax to substantiate both the Russian DNC hack and the claim that Assange’s DNC emails had been passed to him by Russia indicates that those claims must also be hoaxes. Of course, it would be an incredible coincidence if Alperovitch and Henry perpetrated one hoax and G2 came along and perpetrated a different hoax that just so happened to be exactly what the CrowdStrike executives needed to make theirs successful.

But the fact that G2 somehow got ahold of files from John Podesta’s Gmail account seems inexplicable, given everything else we now know, unless someone very high up in the Clinton campaign gave them to him because that person knew those files were stolen with John Podesta’s emails and would be released along with them. G2’s having released them together with all the clues he’d planted indicating he was with Russian intelligence would provide a means to reinforce the idea that Podesta’s emails had been stolen by Russia should it become necessary.

Given everything we know, G2 couldn’t have been in possession of files the Clinton campaign knew would turn up in John Podesta’s stolen emails unless he was part of a CrowdStrike disinformation campaign designed to protect Hillary Clinton from the consequences of Podesta’s blunder.

But even if G2 just happened to come along and perpetrate a hoax that perfectly met Hillary Clinton’s needs, Alperovitch, Henry, Mueller and the rest would have still used that hoax to deceive Americans into believing that Julian Assange is a Russian puppet and Trump owes his 2016 victory to Russian espionage.

The absurdity of the claim that Guccifer 2.0 was a Russian spy, and the way the narrative that the WikiLeaks releases were part of a Russian plot to help Trump rests on that claim, mean that everyone who promoted the story was pushing a monstrous lie.

It also means that Robert Mueller’s two-year, $32 million investigation, the sanctions Congress placed on Russia, and all the unbelievably nasty political strife Americans have suffered since Trump was elected were all predicated on the very same monstrous lie.

Let’s hope our political class notices and the culprits are finally punished.

The monstrous lie has reigned for far too long.

Weekend Long Read

This essay is adapted from a talk earlier in February delivered at the Center for the Philosophy of Freedom and the American Culture and Ideas Initiative at the University of Arizona.

Wokeness, Free Speech, and the Role of Education

Conservatives have rightly lamented the assault on free speech that is such a conspicuous and disfiguring reality of life in America today. But that loss only achieves its true significance in the context of a more fundamental erosion: the erosion of a shared political consensus that gives life to “We, the People.”

Back in New York, we have recently started an informal reading group at The New Criterion and Encounter Books. If that sounds dull, let me add that I have combined the reading with a little seminar on wine appreciation. At the moment, our palates are padding around Bordeaux, learning to discriminate reliably among Pauillac, Saint-Estèphe, and Saint-Julien. Soon we’ll move east to the Right Bank and then further afield.

At the same time, we are in the midst of reading Plato’s Republic, a book about nearly everything, including a major theme of my remarks today: the role of education. 

I thank my host Dan Asia for supplying the title of my talk, and I will get around to touching on all of its elements. In the meantime, I want to point out a certain ambiguity or incompleteness about the phrase “the role of education.” One immediately wants to know, “the role of education” in what? In free speech? In the perpetuation of wokeness? Perhaps this is the place to issue a trigger warning to the effect that this talk is definitely not “woke.” Anyone anxious about being offended may leave with impunity.

In what follows, I am basically going to follow some hints in the Republic, which inquires into the role of education in several senses: into what it means for individuals, to start with, and also what it means for society at large. Socrates signals the importance of education early on when he tells Glaucon, Plato’s elder brother and one of the chief characters in the dialogue, that “it is no trifling matter we are discussing, but the right conduct of life.” 

I think that’s right. Education, rightly understood, is important business. And it is worth noting that, traditionally, a liberal arts education involved both character formation and learning. It was, as the word “liberal” suggests, an education for freedom, for liberty. It might incidentally teach you how to plot a trajectory, dissect a frog, analyze a poem, or construct a pie chart. But at the end of the day, the aim of a liberal arts education was thoughtful reflection about the question “How should I live my life?” The goal was to produce men and women who, as Allan Bloom put it in The Closing of the American Mind, had reflected thoughtfully on the question “‘What is man?’ in relation to his highest aspirations as opposed to his low and common needs.” 

I am sure I do not need to point out to this educated audience that by “man,” Bloom meant anthrōpos, not anēr: human being, not just the male of the species. 

Bloom’s ideal seems very old-fashioned now. Since the 1960s, in fact, colleges and universities have more and more been home to what the literary critic Lionel Trilling called the “adversary culture” of the intellectuals. Now the goal was rejection, not reflection. Colleges and universities increasingly became laboratories dedicated to social and political transformation, not learning or the cultivation of free and responsible citizens.

There were many reasons for this transformation. One reason has to do with the erosion of liberalism in the face of rising moral certitude. If you believe that you are in possession of a higher virtue that trumps the pedestrian wisdom of ordinary mortals, then you are likely to be impatient with their pleas for pluralism. 

The intoxication that follows from moral certitude is one important reason that the modern academy is increasingly inimical to free speech and everything that surrounds the cultivation of free speech: free inquiry, free action, and free minds. 

The dissemination of political correctness, subordinating the pursuit of truth to the imposition of political dogma, sacrifices freedom on the altar of virtue, or supposed virtue. It’s not so much that the academy has turned its back on its traditional raison d’être—the pursuit of truth and the propagation of civilization. No, it’s worse than that. The academy has increasingly embraced an ethic that is positively inimical to its founding principles. As an illustration, consider the news from Northwestern University. Just a couple of days ago, I read that after a student proposed a resolution to protect free speech and civil discourse at the school, it was voted down by the students at large.

The phenomenon is reminiscent of what the 20th-century Marxist philosopher Herbert Marcuse called “repressive tolerance.” It took a Marxist to come up with that idea. Our ordinary sense of tolerance, Marcuse said—an idea summed up in such phrases as “live and let live”—was not only wrong but evil, and it was evil because it tended to reinforce the moral structures of bourgeois society. Marcuse advocated instead what he called “liberating tolerance,” that is, “intolerance against movements from the Right, and toleration of movements from the Left.” Think about that. 

The classical liberal (who is also the contemporary conservative) championed tolerance partly because he knew that his own vision was limited and incomplete, partly because it helped maintain a space for civilized disagreement. Many of you will recall hearing sentences like this: “I disagree with you but I support your right to voice your opinion.” How quaint that now sounds! 

The modern social justice warrior abominates disagreement as a form of heresy. Accordingly, he rejects tolerance in favor of enforced, indeed totalitarian, conformity. It is the antithesis of what a liberal-arts education is all about, which is why its installation at the center of our erstwhile liberal-arts institutions makes for such a sad irony.

The Renaissance philosopher Nicholas of Cusa touched on an important aspect of this irony in his discussion of the “coincidence of opposites.” Unpacking exactly what Cusa meant by that arresting phrase would take us deep into the thickets of metaphysical speculation. But we see pedestrian examples of that strange coincidence everywhere. Indeed, one of the great tests of our wokeness is the extent to which many things have mutated into their opposites—not awake but awoke. In short, inversion is a dominant principle of our social life.

Restoring the Old Ignorance

We see this with particular vividness in the vast petri dish that is the contemporary university. 

Consider the demand for diversity. You cannot set foot on a college campus these days without being regaled about its commitment to diversity. Everywhere you turn, from the curriculum to the institution’s admission policy, diversity is hailed as the highest, and sometimes it seems the only, value. Other values—the value, for example, of embracing a common moral and intellectual tradition—are drastically undervalued where they are not ignored entirely.

And the irony is, the diversity that is so lovingly proclaimed turns out to be a sham. It turns out that in every case the demand for diversity really means strict intellectual and moral conformity on any contentious issue. 

To be diverse is to subscribe to a menu of orthodox opinions on subjects ranging from abortion to the environment to race, sexuality, and Donald Trump. Again, dissent from the orthodoxy is regarded not as another opinion, with which one might argue, but as heresy, which one must silence. According to this view of diversity, everything that is not mandatory is prohibited. The principle of inversion turned the virtue of diversity into its opposite. 

Once upon a time, and it was not so long ago, colleges and universities were institutions dedicated to the pursuit of truth and the transmission of the highest values of our civilization. Today, most are dedicated to the repudiation of truth and the subversion of those values. In short, they are laboratories for the cultivation of wokeness. This is especially true, with only a handful of exceptions, of the most prestigious institutions. The tonier and more expensive the college, the more woke it is likely to be.

There are two central tenets of the woke philosophy. The first is feigned fragility. The second is angry intolerance. The union of fragility and intolerance has given us that curious and malevolent hybrid, the crybully, a delicate yet venomous species that thrives chiefly in lush, pampered environments.

The 18th-century German aphorist G. C. Lichtenberg observed that “Nowadays we everywhere seek to propagate wisdom: who knows whether in a couple of centuries there may not exist universities for restoring the old ignorance.” 

Doubtless Lichtenberg thought he was being clever. How astonished he would have been to discover that he was a prophet, not a satirist.

Surely many of you have heard about the Twitter sensation Titania McGrath. She is the author of many extravagant woke pronouncements. A personal favorite is this: “If you don’t think exactly the same way as me, then you’ve clearly got a lot to learn about diversity.” Is that satire? Or is it a bulletin from the front? I doubt that any triggered academic could put it better. 

The world recently learned that Titania’s real name is Andrew Doyle and that all those woke observations were in jest. A certain amount of hilarity ensued. But the serious point is this: McGrath’s sly tweets are indistinguishable from what is actually, seriously being propagated today in academia—and not only in academia. The mantra is “Diversity.” The reality is strictly enforced conformity about any ideas that might disturb the heavy moral slumber of wokeness. Consider this gem: “It’s a broken kind of democracy that allows a majority of voters to impose their wishes on the rest of us.” I suspect that Adam Schiff would agree. 

But here’s an irony that underscores the theme of inversion: when the free speech movement started at Berkeley’s Sproul Hall in 1964, it was a left-wing movement that demanded tolerance and challenged conventional behavior and mores. Today the Left espouses the opposite—not tolerance and free speech but conformity, censorship, and intolerance. 

In my book Tenured Radicals, I included a section on “academia and infantilization.” But when I wrote, in 2008, the rhetoric of “safe spaces,” “microaggressions,” and “trigger warnings” had not yet blazed its destructive path through the hearts and minds of students. Women back then made a point of declaring their independence, their ability to stand on their own two feet and make decisions for themselves. They would have rejected with contemptuous ridicule the idea that a college dean or “diversity officer” should police or protect their sex lives.

Nowadays, of course, victimhood is a badge of election. I will not attempt to plumb the depressing reasons for this unlovely development other than to note that it represents another side of that infantilization I mentioned a moment ago. 

The crybully, who has weaponized his coveted status as a victim, was first sighted in the mid-2000s. He has two calling cards, race and gender. By coincidence Lawrence Summers, then president of Harvard University, was involved in the evolution of both.

Race came first. In 2001, Summers made headlines when he suggested that Cornel West—then the Alphonse Fletcher, Jr., University Professor and an eminence in the African and African American Studies Department at Harvard—buckle down to some serious scholarship. (West’s most recent production had been a rap CD called “Sketches of My Culture.”) Summers also suggested that the professor take the lead in fighting the scandal of grade inflation at Harvard, where one of every two grades was an A or A-.

A national scandal erupted. Black professors at Harvard threatened to leave—West himself soon decamped to Princeton—and the New York Times published a hand-wringing editorial criticizing Summers, who quickly recanted, noting that the entire episode had been “a terrible misunderstanding.”

Then came gender. In 2005, Summers spoke at a conference called “Diversifying the Science and Engineering Workforce” at MIT. He speculated on why there aren’t more women scientists at elite universities. He touched on several possibilities: Maybe “patterns of discrimination” had something to do with it. Maybe most women preferred to put their families before their careers. And maybe, just possibly, it had something to do with “different availability of aptitude at the high end.”

What a storm that last comment sparked! “I felt I was going to be sick,” wailed Nancy Hopkins, a biology professor at MIT, who had walked out on Summers. “My heart was pounding and my breath was shallow, low,” Hopkins said. “I was extremely upset.” To adapt Helen Reddy, “I am woman, hear me whine.” 

Once again, Summers recanted. He published an open letter to the Harvard community. “I deeply regret the impact of my comments,” he wrote, “and apologize for not having weighed them more carefully.” It was too late. By May his faculty had returned a vote of no confidence, 218-185, with 18 abstentions. By the following February he had been forced to announce his resignation.

These two incidents, partly because they involved such a high-profile institution, marked an important turning point. The pleasures of aggression were henceforth added to the comforts of feeling aggrieved. The crybully was slouching towards campus to be born. 

The Farce of the Crybully’s Birthing Pains

The toxic fruits of this development are on view throughout the higher-educational establishment, where spurious charges of “systemic racism,” “a culture of rape,” and sundry other imaginary torts compete for the institutional budget of pity, special treatment, and financial reparation. 

Many of you will remember the Halloween Hijinks at Yale from a couple of years ago. The MacGuffin of the insanity turned on Halloween costumes. Erika Christakis, then associate master of a residential college at Yale, courted outrage by announcing that “free speech and the ability to tolerate offense are the hallmarks of a free and open society” and it was not her business to police Halloween costumes.

To people unindoctrinated by the sensitivity training that is de rigueur on most campuses today, these sentiments might seem utterly unobjectionable. But to the delicate creatures at Yale’s Silliman College they were an intolerable provocation. What if students dressed as American Indians or Mexican mariachi musicians? Angry, hysterical students confronted Nicholas Christakis, Erika’s husband and at that time master of Silliman. They screamed obscenities and demanded that he step down because he had failed to create “a place of comfort, a home” for students. The episode was captured on video and went viral.

Of course, the sickness affects not just institutions like Yale and Harvard. At the University of Missouri a couple of years back, Jonathan Butler, the son of a wealthy railroad executive, went on a hunger strike to protest what he called “revolting” acts of racism at Mizzou. Details were scanty. Nevertheless, black members of the university football team threatened to strike for the rest of the season unless Tim Wolfe, Mizzou’s president, stepped down. A day or two later, he did.

Emboldened, student and faculty protesters physically prevented reporters from photographing a tent village they had built on public space. In another shocking video, a student photographer is shown being forced back by an angry mob while Melissa Click, a feminist communications teacher at Mizzou, shouts for “muscle” to help her eject a reporter.

What is happening? Is it a reprise of the late 1960s and 1970s, when campuses across the country were sites of violent protests? There are some similarities. But again, the principle of inversion is at work. What we are seeing unfold has in many ways turned that radicalism on its head. Karl Marx touched on the central irony when he noted that history tends to repeat itself, the first time as tragedy, the second time as farce. The advent of the crybully reminds us of the important truth that what is preposterous can still be malevolent.

The response of university administrations has not been encouraging. At Yale, cringing capitulation has been the order of the day. Yale President Peter Salovey told a group of aggrieved students who complained that they did not feel “safe” at Yale that “we failed you.” At one of the several hours-long public meetings on campus, the Yale Daily News reported, Jonathan Holloway, dean of Yale College, found himself “surrounded by a sea of upturned faces and fighting back tears” as he apologized for the administration’s silence on allegations of racial discrimination.

There are a lot of tears at Yale these days. When the conservative lawyer Amy Wax spoke at the Yale Political Union, a group of students stood up, turned their backs on her, and raised their fists in the air in protest. “Several students,” the Yale Daily News reported, “cried during her speech.”

A few days after enduring the hysterics of his students, Nicholas Christakis, accompanied by Dean Holloway and other university administrators, met with about 100 students at his home and abased himself. “I have disappointed you and I’m really sorry,” he said.

The confrontation “just broke my heart,” Christakis added. “I care so much about the same issues you care about. I’ve spent my life taking care of these issues of injustice, of poverty, of racism. I have the same beliefs that you do . . . I’m genuinely sorry, and to have disappointed you. I’ve disappointed myself.”

Perhaps he thinks such groveling will allow him to salvage his position. Not a chance. The revolution always eats its own. At midnight shortly after the Halloween Hoedown at Silliman College, a group of students marched to Salovey’s house to complain about “institutional racism at Yale” and to present six demands, including “a University where we feel safe,” the renaming of Yale’s Calhoun College, the abolition of the title “Master,” and the erection of a monument acknowledging that Yale was built on land stolen from “indigenous peoples.” They also demanded that Nicholas and Erika Christakis be removed from their administrative positions. 

I do not know whether the monument has been erected, but Calhoun College has been renamed Grace Hopper College, the title “master” has been retired, and President Salovey has earmarked $50 million for such initiatives as the Center for the Study of Race, Indigeneity, and Transnational Migration. Oh, and Nicholas and Erika Christakis were quietly removed from their positions at Silliman College.

The fatuousness of these episodes—many of which might have been plucked from the annals of Maoist public-shaming events—underscores the surreal quality of life at many American colleges these days. Peter Salovey came to his office several years ago with a ringing defense of free speech. He has bravely endeavored to continue that support, but has also chained his carriage to a conflicting, indeed a contradictory, ethic: the mendacious gospel of political correctness, according to which reality must take second place to ideology. Salovey, like academic administrators around the country, hopes that he can safeguard free speech while also acceding to demands that the university be a “safe space” where no one’s feelings are hurt. It is an impossible project.

A More Sensible and Courageous Approach

Academic administrators would be better advised to take a page from the robust philosophy of Teddy Roosevelt, leavened with a little clear-eyed truth-telling from Aristotle. In Roosevelt’s autobiography, we read that “The one absolutely certain way of bringing this nation to ruin . . . would be to permit it to become a tangle of squabbling nationalities.” Teddy then warned against the destructive vogue for “hyphenated Americans.”

Back then, it was German-Americans, Irish-Americans, Italian-Americans. Today we speak of “Native-Americans,” “African-Americans,” and the like, and the terms tend to be wielded in a way to claim both special protected status and unearned privilege. The result is a tangle of national squabbling that is like nothing Roosevelt could have imagined. Perhaps this is the place to confess that I have always thought of myself as a “native American.” I was born in Shaker Heights, Ohio. Can a more native American venue be imagined?

The truth is that American universities are among the safest and most coddled environments ever devised by human ingenuity. The idea that one should attend college to be protected from ideas one might find controversial or offensive could only occur to someone who had jettisoned any hope of acquiring an education. Many commentators have been warning about a “higher education bubble.” They have focused mostly on the unsustainable costs of college, but the spectacle of timid moral self-indulgence also deserves a place on the bill of indictment.

There are some encouraging signs. When a dean at Claremont McKenna College resigned after being accused of racism because of a carelessly worded email, some brave students at the Claremont Independent published a dissenting editorial in which they berated hypersensitive students for bringing spurious charges of racism, and the dean and the president for cowardice in not standing up to the barrage.

“Lastly,” they wrote, “we are disappointed in students like ourselves, who were scared into silence. We are not racist for having different opinions. We are not immoral because we don’t buy the flawed rhetoric of a spiteful movement.”

And this is where Aristotle comes in. Courage, Aristotle pointed out, is the most important virtue, because without it you cannot practice the others. Courage has been in short supply on American campuses. Those independent-minded students at Claremont provided a breath of fresh air. It will be interesting to see if it penetrates the fetid atmosphere that has settled over so much of American academic life.

A couple of years ago at Encounter Books, I was proud to publish The Demon in Democracy by the Polish philosopher Ryszard Legutko. A prominent theme in that book is the persistence of totalitarian impulses in putatively liberal societies. Last spring, as if to illustrate that thesis, Middlebury College suddenly rescinded an invitation to Legutko to speak. Why? Because a handful of student snowflakes decided that Legutko’s ideas were not in perfect harmony with their own.

Middlebury, of course, is the institution that covered itself in shame two years ago when protestors there loudly and violently prevented the social scientist Charles Murray from speaking and then, in the resulting melee, sent a female faculty member to the hospital. And here’s the kicker: Middlebury is not some wacko exception. On the contrary, its malignant embrace of woke identity politics is the rule in the American educational establishment, and, increasingly, in the American workplace. I see that Murray is scheduled to make a return visit to Middlebury soon to discuss his new book Human Diversity: The Biology of Gender, Race, and Class. I hope that he has some bodyguards in tow. 

The suppression of free speech by the wardens of wokeness has prompted many conservatives to champion free speech as an all-purpose antidote. I sympathize with that endeavor, and have written probably dozens of articles defending a robust idea of free speech. In my view, if you say “I am for free speech, but not ‘hate speech’ or speech that offends Mohammed or speech that insults Greens or speech that mocks, satirizes, ridicules, and laughs at some P.C. icon,” etc., then you are not for free speech at all. Your “but” is merely a species of capitulation pretending to be redemptive conceptual nuance. Free speech is by nature offensive speech, at least potentially. If it couldn’t offend, if it couldn’t insult, it also couldn’t enlighten.

That said, I’d like this evening to take a stab at putting the debate over free speech into a larger context.

What Is Free Speech?

The fact that the Left celebrated free speech in 1964 and now abominates it as a token of white supremacist ideology suggests the issue is not really, or not only, free speech. 

Like all freedoms, free speech is defined by the responsibilities it embraces and the culture in which it thrives. Some advocates of free speech maintain that, when it comes to the free expression of ideas, anything goes. No ideas, they say, should be off-limits. They say that. But I do not think that they really believe it, since one can easily produce a long list of ideas that they would be horrified to see circulating.

But that in turn suggests that the whole debate over free speech needs to be seen in the context of its larger purpose: its role in the metabolism of education, first of all, but also the place of education in the social-political dispensation of our country.

For assistance in making this point, I’d like to introduce you to a once potent, now largely forgotten political thinker named Willmoore Kendall. Kendall was an important mentor of William F. Buckley at Yale in the late 1940s. He was a founding editor of National Review. Leo Strauss said he was the most important political theorist of his generation.

Among other things, Kendall saw deeply into the dialectic of disagreement and free speech. It is understandable that conservatives should react to woke intolerance by celebrating free speech. After all, the criminalization of policy differences that underwrites woke culture is an alarming development. But I think that Kendall was right when he contended that “by no means are all questions open questions.”

To explain this, Kendall points out that all societies are founded on a “consensus,” what he calls “a hard core of shared beliefs.” This is especially true, he notes, for the United States, whose founding principles are of recent vintage and are clearly and deliberately set forth. 

Freedom of thought and expression are important, Kendall acknowledges, but only “within limits set by the basic consensus.” Should that consensus be challenged by something “with genuine civil war potential,” the proper response is not debate but interdiction. Edmund Burke made a similar point in his Reflections on the Revolution in France, as did James Madison when he spoke of “that veneration” for tradition—what he called “the prejudices of the community”—which even the wisest societies abandon at their peril. Abraham Lincoln, in his stalwart prosecution of the Civil War, demonstrated his agreement with Kendall’s insight.

Kendall was writing at a moment when international Communism posed an existential threat to the United States. With that in mind, he argued, “Some questions involve matters so basic to the consensus” that, in declaring them open, a society would in effect “abolish itself [and] commit suicide.” Accordingly, Kendall outlined two views of free speech. The first, dedicated to the proposition that “no truth in particular is true,” holds that all questions are open and that no one position is to be preferred to another. 

The second view, his view, turns on two words: “We” and “truth,” as in the phrase “We hold these truths” from the Declaration of Independence. The identity of that “We” and the substance of those truths mark the limits of interrogation.

Legal historians will note the similarity between what Kendall says and a famous observation made by Justice Robert Jackson in his dissent in Terminiello v. City of Chicago in 1949. The Bill of Rights, Justice Jackson said, is not “a suicide pact.” In other words, when it comes to free speech the choice “is not between order and liberty. It is between liberty with order and anarchy without either.”

Conservatives have rightly lamented the assault on free speech that is such a conspicuous and disfiguring reality of life in America today. But that loss only achieves its true significance in the context of a more fundamental erosion: the erosion of that shared political consensus, that community of sentiment, which gives life to the first-person plural, that “We, the People,” which made us who we are. Should we lose that, we shall have lost everything.

Weekend Long Read

Showdown at Fort Miamis 

Before America could be great, it first had to secure its territory, fend off enemies foreign and domestic, and maintain an unsteady peace with the most powerful empire on Earth. Here is the forgotten story of how George Washington and his administration navigated perilous diplomatic waters in the nation’s earliest days.

Are we to think of the United States of America as a republic or an empire? In particular, are we to think of the struggling young United States that Michael Taylor and I discuss in our new book, An Independent Empire: Diplomacy and War in the Making of the United States, as a republic or an empire? 

Are we to think the principal goal of the early republic was to become, as the historian Eliga Gould has said, a “treaty-worthy nation,” a respected member of an international community of states that governed its relations according to the law of nations? Or was it an aspiring regional hegemon whose lodestar in foreign policy was the desire to dominate North America beyond all fear of challenge, become the arbiter of affairs in both North and South America, and thereby avoid the entanglements of balance of power politics and separate its future from the futures of European empires and the European state system?

The Anglo-American crisis of 1794 displays the United States both as a rising empire and as a revolutionary and subversive power. George Washington needed no instruction about American interests from the revolutionary diplomacy of Edmond Genêt, the Girondist French minister to the United States and apostle of world revolution. Yet Washington also needed no instruction in revolutionary diplomacy, or in the subversive revolutionary substitutes for diplomacy.

Origins of the Jay Treaty Gambit

To tamper with the loyalty of a foreign people to their prince or to their duly constituted government was a violation of the law of nations, in war as well as in peace. 

Edmund Burke, in an appendix to his December 1793 memorandum for George III, “Remarks on the Policy of the Allies with Respect to France,” quoted Emer de Vattel saying “it is a violation of the law of nations to persuade those subjects to revolt who actually obey their sovereign, though they complain of his government.” Burke claimed that no decent state would attempt to weaken a rival by subverting the loyalty of the rival’s people to a government that had hitherto secured their obedience. A state that had constant recourse to such indecency had no place, Burke argued, in the international order of a civilized world.

Consider, then, Washington’s secretary of state, Edmund Randolph, on May 6, 1794, instructing Chief Justice John Jay, envoy extraordinary of the United States to Great Britain: “A full persuasion is entertained that, throughout the whole negotiation, you will make the following its general objects . . . to prevent the British ministry, should they be resolved on war, from carrying with them the British nation.”

Plan A in Randolph’s instructions to John Jay was to make a treaty with Britain to resolve a host of issues left over from the 1783 Treaty of Paris. Most pressing for Americans was the British occupation of posts in territory Britain had ceded to the United States by the Treaty of Paris—the “Northwest,” between the Ohio River and the Great Lakes, west of the Appalachians—and the consequent British interference with the Indians there. As Jay’s mission was getting underway, word had come to Philadelphia that the British in April 1794 had occupied a new post within that territory, Fort Miamis, about 70 miles south of Detroit.

Washington also hoped that Jay would negotiate a resolution to differences raised by Britain’s war against France, conflicts arising out of British seizures of American vessels and out of the British definition of blockade and contraband that justified these seizures in Britain’s admiralty courts. More broadly, Washington hoped that Jay would succeed in settling the modes under which Americans would trade with Britain and her empire. The Washington Administration and its supporters were especially concerned with American trade in the Caribbean, which in the 1790s was in the last flush of its fiscal, and therefore political, centrality in global politics.

Yet Washington had Randolph instruct Jay in a Plan B: if negotiations should fail, and the British should seek to block American expansion in the Northwest and American trade even by resort to war, Jay was to do his best to divide the British people from their government in the hope that these divisions would hamper the war effort against the United States.

The Indians in the Northwest Territory

Keep in mind that tensions coming out of the War of the First Coalition on the Atlantic, and Lord Dorchester’s meddling with the Indians, were such that the alternative really was negotiation or war. Though the 1783 Treaty of Paris had stipulated that the British should evacuate their military posts in the region, the redcoats, clinging to the excuse that the United States would not honor prewar debts or protect loyalists, had not budged.

There was also a strategic rationale behind this British obstinacy; for by maintaining their posts they not only could keep watch on a rival (if embryonic) empire but they could also sustain Native American alliances that could be reactivated in the event of another war with the United States. Some British commanders even felt a duty to care for indigenous allies who had been betrayed by the diplomats at Paris. 

The Earl of Carlisle, who had led that ill-fated peace commission of 1778, regarded the Treaty of Paris as nothing better than the betrayal of Britain’s Native American allies. “Twenty-five nations of Indians,” he told the House of Lords, had been “made over to the United States.” This had happened without “the smallest apparent advantage resulting to Great Britain” and, worse, without “that solitary stipulation which our honor should have made us insist upon, and have demanded with unshaken firmness: a place of refuge for those miserable persons before alluded to, some port, some haven, for those shattered barks to have been laid up in quiet.”

Other Britons conceived of a “special relationship” with the Indians and therefore resented direct American communication with their former allies. The British commander at Fort Niagara, Allan Maclean, conceded the Americans’ right to cultivate diplomatic relations around the globe: “The Americans being now Independent States,” he wrote to Detroit, “will say they have a right to send Ambassadors or Emissaries to whom they please, without our consent—no doubt they may to all nations that we know of.” Yet in the case of the Indians, Maclean was “of a different opinion, it being clearly an exception to the Rule.”

The Indians, he believed, were natural British allies, bound to the Crown. They got “from the King’s Stores the bread they are to eat tomorrow, and from his magazines the clothing that covers their nakedness.” The Indians were, in short, “not only our allies, but they are a part of our Family; and the Americans might as well . . . attempt to seduce our children & servants from their duty and allegiance, as to convene and assemble all the Indian Nations.”

By keeping their Northwest posts, by smuggling arms to the Native Americans, and by the operation of Canada’s Indian Department, the British deliberately hindered the expansion of the United States. In what amounted to a serious insult, they also offered to mediate negotiations between the United States and the Indians on U.S. land. The redcoats then refused to allow American commissioners to meet with the Indian Council at Detroit on what was, according to the Treaty of Paris, sovereign American soil. 

When Gouverneur Morris was sent to London in 1790 to persuade the British to honor the peace terms, he encountered nothing but obfuscation. Unwilling to uphold the terms of 1783, the British instead offered to negotiate anew. Morris spat back at William Pitt the Younger, “You wish to make a new treaty instead of complying with the old one,” and the prime minister conceded that such was “in some sort” his plan. By May 1792, Alexander Hamilton was so riled by British obstruction that, despite having stoutly resisted commercial warfare, he was ready for “actual” war. 

London was told that continued possession of the military posts would be intolerable to the Americans: “Any plan, which comprehended anything like a cession of territory or right or the allowance of any other power to interfere in the disputes with the Indians, would be considered by this government as absolutely impracticable and inadmissible.” British officials, though, were committed to their Northwest conspiracy.

One such official was the Crown superintendent of Indian affairs, Sir John Johnson, the son of Sir William. Sir John was an ex-loyalist who was the subject of a bill of attainder in New York. Another was Guy Carleton, Baron Dorchester, the last British commander-in-chief of the Revolutionary War and the governor-general of Canada. In the latter role, Dorchester had inflamed tension between Britain and the United States by ordering the construction of a new British fort on the banks of the Miami River in present-day Ohio, and by denouncing the expansionary politics of the American empire. 

In what Dorchester thought was a confidential letter to the Western Confederacy, but which was intercepted and leaked to the Americans, he declared that “From the manner in which the people of the States push on, and act and talk on this side, and from what I learn of their conduct towards the sea, I shall not be surprised if we are at war with them in the course of the present year: and if we are, a line must then be drawn by the warriors.” Eleven years after the Treaty of Paris, the British in North America had no intention of honoring its terms.

By the end of 1793, the British government seemed to think war with the United States was inevitable, so it had no qualms about wholesale depredations on American commerce. Americans, for their part, agreed that even if war was inevitable, it was better to defer it as long as possible, until American might was more certain. As Hamilton wrote in July 1795: “if we can avoid war for ten or twelve years more, we shall then have acquired a maturity, which will make it no more than a common calamity.”

Catalyst of the Conflict with Britain

War between Britain and the United States may have seemed inevitable to many at the beginning of 1794, but it was deferred for not 10 or 12 but 18 years. To understand how and why, let us return to the principal issue of 1794, the Northwest Posts and the triangle of Anglo-American-Indian relations. The Americans and the Indians had been fighting since 1785.

The first major campaign was launched in autumn 1790 when Washington and his secretary of war Henry Knox ordered General Josiah Harmar to journey into the lands of the Miami and the Shawnee. Harmar was to exact retribution for Indian assaults on American settlers and to raze the principal Miami settlement of Kekionga. He did not succeed—the Harmar Campaign resulted in crushing defeat for the United States. 

First, at the Battle of Heller’s Corner, a reconnaissance mission led by John Hardin and James Fontaine was deceived, led into swampland, and routed by Native Americans commanded by the Miami chief Little Turtle. The next day, October 20, Philip Hartshorn was ambushed by an Indian force some eight miles outside Kekionga. With morale diminishing rapidly, Hardin advanced on Kekionga with 350 men. Outnumbered almost three-to-one, he sent an urgent request to Harmar for reinforcements, but General Harmar, who allegedly was drunk, refused and arranged his troops into a defensive formation around his own camp. This left Hardin in an impossible position. When the Indians attacked, all he could do was resist; after three hours he fell back, having lost 150 men.

The steam that rose from the American scalps is said to have reminded the Indians of hot squash in the cool autumnal air, so the encounter is known as the Battle of the Pumpkin Fields. With winter approaching, Harmar concluded he could no longer attack and he retreated in disgrace. 

On receiving the dispatches from the field, Washington was crestfallen. “My mind,” he wrote, “is prepared for the worst; that is, expense without honor and profit.” Harmar’s Defeat also acted as a catalyst for further Indian aggression. In January 1791, the Big Bottom Massacre saw 11 American settlers killed by Lenape and Wyandot Indians in the southeast of present-day Ohio; the next week, the Siege of Dunlap’s Station saw 30 Americans attacked by 500 warriors of the Western Confederacy.

Washington reacted to Harmar’s calamities and the Indian insurgency by ordering Major General Arthur St. Clair, the governor of the Northwest Territory, to assemble another force for another campaign. It did not begin auspiciously. It took months for St. Clair to recruit the necessary troops, and before the campaign had even started about a quarter of his force had deserted. By the autumn of 1791, St. Clair was ready at last. Once more, the American objective was to destroy the Miami village of Kekionga. 

By early November, however, several hundred more soldiers had deserted and St. Clair, hobbled by gout and incapable of imposing discipline, had only 920 troops—and 200 ancillary followers—at his disposal. On the night of November 3, his bedraggled party made camp at the present-day location of Fort Recovery, Ohio. At dawn, as St. Clair’s troops ate breakfast, Little Turtle and Blue Jacket led a thousand Native Americans from the surrounding woods. Over the next few hours and during the flight from battle, the United States Army suffered one of the worst defeats in its history and the worst United States defeat in the course of two centuries of Indian wars. Of the 920 soldiers, 632 were killed and 264 were wounded. Only 24 were unharmed. Nearly all the 200 camp followers were killed too.

On his return to Philadelphia, St. Clair requested a court-martial that he might be exonerated; the courtesy was refused and he was forced to resign. The House of Representatives then took St. Clair’s defeat as the subject of its first special committee investigation. The fallout was such that Washington summoned for conference the secretaries of all governmental departments: St. Clair’s humiliation thus “inspired” the first meeting of the United States cabinet. When the same news was received in London, the British were elated. Still ensconced in their posts, still furnishing arms to the Indians, Pitt the Younger’s government began to contemplate an Indian “buffer state” between the United States and Canada.

Yet St. Clair’s defeat was also a turning point in the Northwest Indian War and the history of the United States Army. In March 1792, Congress voted for the establishment of more regiments, for longer enlistments, and for better pay for soldiers. “Mad” Anthony Wayne, a Revolutionary War general from Pennsylvania and a former congressman unseated over claims of electoral fraud, was made senior officer of the Army and ordered by Washington to create a regular military force that could at last pacify the Northwest. Recruited and trained in Pittsburgh, and combining infantry, cavalry, and artillery, Wayne’s force was named the Legion of the United States.

“The Americans must certainly be a restless People,” observed one Detroit trader, “for no sooner is one army destroyed than another springs up in its place.”

The Choice at Fort Miamis

Wayne’s Legion soon reversed the course of the Indian War. Establishing Fort Recovery at the precise location of St. Clair’s defeat and building fortifications throughout the Northwest Territory, the Legion’s campaigns culminated in August 1794 at the Battle of Fallen Timbers. Here, a force of 2,000 United States soldiers conclusively defeated the Native Americans and a company of Canadian militiamen.

In the prelude to the battle, Wayne’s Legion marched northward; Blue Jacket’s Indians took a defensive position along the Maumee River, where scores of trees had been uprooted by recent storms, hence “Fallen Timbers.” The battle itself was anticlimactic. Wayne’s infantry launched a bayonet charge and his cavalry outflanked the Native Americans, who were routed. The defeated Indians fled to Fort Miamis, which by April 1794 had been rebuilt and reoccupied by the redcoats. 

This was significant, even momentous, because in June of that year, Henry Knox—still secretary of war—had authorized an American military assault on the fort.

“If . . . in the course of your operations,” Knox had written to Wayne, “it should become necessary to dislodge the party at the rapids of Miami, you are hereby authorized in the name of the President of the United States to do it.” These orders represent a remarkable volte-face. As late as March 31, Knox had ordered Wayne to abstain “from every step or measure which could be . . . construed into any aggression on your part against England or Spain.” Yet in his orders of June 7, explicitly countermanding the previous orders, Washington, through Knox, had authorized a military assault on this new British fort. 

Washington and Knox knew the potential cost of the mission, so Wayne was told that “no attempt ought to be made unless it shall promise complete success.” The “pernicious consequences” of the assault, whether successful or not, would likely have been open and declared war between the United States and Great Britain for, as Wayne had advanced, Upper Canada’s Lt. Governor John Graves Simcoe had written to London, begging permission to unleash his troops and Indian allies to attempt reconquest of the United States.

After the Battle of Fallen Timbers, when Wayne followed the defeated Native Americans to Fort Miamis, he was presented with a dilemma. If he chose to attack the British fort, could he guarantee success? Fort Miamis was commanded by Major William Campbell, who opened the gates to the Canadian militiamen only: he too feared war, so he would not defy Wayne by sheltering Blue Jacket’s Indians. Lieutenant William Clark, who later joined Meriwether Lewis in the Corps of Discovery, recorded that Wayne’s soldiers were “all full with expectation & anxiety, of storming of the British Garrison,” but Wayne himself remained unsure of what to do.

Sorely lacking in artillery, Wayne first tried other means of compelling a British withdrawal. He wrote to Campbell, demanding that he “immediately desist from any further act of hostility . . . by forbearing to fortify, and by withdrawing the troops, artillery, and stores under your orders and direction, forthwith.” Campbell’s reply was terse: “I certainly will not abandon this post.”

Wayne then tried to lure the British and Canadians from their position by destroying their property outside the fort’s walls and by parading within range of the palisade. When this “showdown at Fort Miamis” failed to provoke a reaction, Wayne decided against an assault. Yet at no time between 1781 and 1812 were the United States and Great Britain closer to the resumption of open warfare.

Content to declare his own victory, Wayne marched back along the Maumee River to await the peace missions of the beaten Indians. The envoys came soon enough, and by August 1795 they had agreed to the Treaty of Greenville. Signed by the Wyandots, the Delawares, the Shawnees, the Ottawas, the Chippewas, the Potawatomis, the Miamis, the Eel River Tribe, the Weas, the Kickapoos, and the Kaskaskias, the treaty substantially diminished the danger of the Western Confederacy to the northwestern United States. Trade was opened with the tribes, who were permitted “to hunt . . . without hindrance or molestation, so long as they demean themselves peaceably, and offer no injury to the people of the United States.”

The federal government meanwhile disavowed any American citizen who presumed to settle upon Indian lands, the extent of which—and therefore of American territory—was defined at length. But most important, in return for $20,000 and an annual stipend of $9,500 (paid in kind with “useful goods”), the tribes relinquished to the United States the title to all the land beyond a line that ran, in present-day terms, south from Cleveland to the Portage Lakes, along the Tuscarawas River to Bolivar, and then southwest to Fort Loramie. The line then ran gently northwest to Fort Recovery before turning abruptly southwest toward Carrollton, Kentucky. 

The Greenville Treaty Line, as it came to be known, became the effective border between the United States and the Western Confederacy. This would not be the end of hostilities in the American Northwest—far from it—but the treaty brought an unprecedented degree of security to the region and, most significant of all, it undercut British plans for future intrigue and subversion.

Stopping British Subversion and Securing the Peace

The defeat of the Native Americans at Fallen Timbers meant that the Western Confederacy had failed despite British support, while the escalation of the war in Europe meant that Britain was desperate to prevent the Americans from honoring their still-extant alliance with France.

The Pitt Administration, as Gouverneur Morris told John Quincy Adams, was now “well disposed” to the United States: “They have made their arrangements upon a plan that comprehends the neutrality of the United States, and are anxious that it should be preserved.”

Reconciliatory sentiment prevailed in Philadelphia, too. The Federalists, who controlled the cabinet and the Senate, were anxious to strike a deal not with France but with Britain, a deal that might foster the commerce needed by the Hamiltonian system.

Jay’s position was compromised from the start. To gain concessions from Great Britain, he could have threatened American participation in the League of Armed Neutrality, the Russian-led alliance of northern European states that strove to uphold the immunity of neutral shipping. Yet British spies had learned that American flirtation with the League was far from serious; the British also knew from their diplomatic network in Europe that the League did not even want American membership. Jay’s only leverage came from Wayne’s recent victory in the Northwest and British anxiety to maintain American neutrality. He could not, therefore, press the British on several key issues, such as the impressment of American sailors into the Royal Navy. 

Even so, a treaty was signed on November 19, 1794, by Jay and Lord Grenville, the British foreign minister, and its terms resolved several issues festering between the nations. Most significantly, Britain engaged to withdraw all “Troops and Garrisons from all Posts and Places within the Boundary Lines assigned by the Treaty of Paris to the United States.” Commissions would be established to settle the northwestern and northeastern borders with British Canada, while the entire border would be renegotiated if the Mississippi was found not to extend into British territory. Further commissions would resolve claims for unpaid British debts and plunder on the high seas.

Finally, American rights in British trade and vice versa were fixed, abolishing defunct “colonial” privileges and opening British ports in Europe and the East Indies to American vessels. Even the British West Indies were opened to American commerce, but only so narrowly that the Senate struck out the provision during ratification.

Upon its receipt in the United States, the treaty became a source of violent partisan controversy, not least because it failed to outlaw the impressment of American sailors. Moreover, Jay had accepted significant limits to American participation in the West Indian trade. The British were also allowed to seize French goods from American ships, and there was nothing about compensation for slaves who had been freed by the British during the Revolutionary War: possessed of antislavery sympathies, Jay was never likely to press that point. 

For these reasons, many Americans interpreted the treaty as a form of genuflection to the British, a betrayal of fellow republicans in France, and a repudiation of the principle of “free ships, free goods.” Public meetings in Boston, New York, Philadelphia, Baltimore, Richmond, Charleston, and Savannah voted to condemn the treaty. Jefferson’s Republican Party, which opposed the treaty, swelled its numbers accordingly.

The Jay Treaty thus became a decisive factor in the development of American partisan politics. “Pro-British” Federalists demanded its ratification, with Alexander Hamilton and Rufus King defending the treaty under the pseudonym of “Camillus.” In the first of their essays, “The Defense,” they argued that Jay had covered “in a reasonable manner the points in controversy between the United States and Great Britain.” No “improper concessions” had been made, nor were any “restrictions which are incompatible with their honor” laid upon the Americans. 

Compared to the “other commercial treaties” of the United States, Jay’s effort was “entitled to a preference”; in fact, the Americans had obtained “concessions of advantages . . . which no other nation has obtained from the same power.” Most pointedly, Hamilton suggested that “the too probable result of a refusal to ratify [the Jay Treaty] is war” and so “violations of our rights” would go “unredressed and unadjusted.”

Conversely, the Republicans derided Jay’s work, which they labeled the “Grenville” Treaty (as opposed to the Treaty of Greenville that Wayne had extorted from the Indians). Republicans explained Jay’s treaty as the product of addled Anglomania. Writing to the Italian physician and gunrunner Philip Mazzei, Jefferson balked at the symbiotic relationship between Federalism and pro-British policy. 

“The aspect of our politics,” he wrote, “has wonderfully changed since you left us. In place of that noble love of liberty and republican government which carried us through the war, an Anglican, monarchical, and aristocratical party has sprung up.” 

Their “avowed object,” wrote Jefferson, “was to draw over us the substance as they have already done the forms of the British government.” Within this “Anglican” party Jefferson identified most of American political society. “Against us,” he listed “the Executive, the Judiciary, two out of three branches of the legislature, all of the officers of the government, all who want to be officers, all timid men who prefer the calm of despotism to the boisterous sea of liberty, British merchants and Americans trading on British capital, speculators and holders in the banks and public funds.” These merchants and speculators had conspired “for the purposes of corruption and fear” to involve the American people in “the rotten as well as the sound parts of the British model.” Jefferson warned Mazzei that he would suffer “a fever were I to name to you the apostates who have gone over to these heresies, men who were Samsons in the field and Solomons in the council, but who have had their heads shorn by the harlot England.”

Such was the partisan rancor that the Treaty was ratified by the Senate without a vote to spare: the necessary two-thirds majority, 20 to 10, reflected the parties’ share of seats precisely. Even after the Jay Treaty became law it remained the subject of dispute. The prescribed commissions on borders and debts had to be financed by the Republican-controlled House of Representatives, so House Republicans held up proceedings while they tried to undermine public confidence in the Treaty, the defining symbol of Federalist foreign policy. 

Washington and Federalists such as the Massachusetts Congressman Fisher Ames, though, built support assiduously. In one memorable speech in the House, Ames rose despite serious illness to plead that rejection of the treaty—that is, rejection of Britain’s offer to surrender its military posts—meant war against the Indians without the means of peace. 

“Until the posts are restored” to American possession, Ames declared, “the treasury and the frontiers must bleed . . . By rejecting the posts, we light the savage fires—we bind the victims . . .  The voice of humanity issues from the shade of their wilderness. It exclaims that, while one hand is held up to reject this treaty, the other grasps a tomahawk.” In the end, Ames succeeded and the necessary grants were made by April 1796.

Peace With Britain and Rancor at Home

The Jay Treaty fomented American partisan division more than any other event of the 1790s. As “Camillus” reflected, “There was no measure in which the government could engage so little likely to be viewed according to its intrinsic merits.” 

Support of and opposition to the Hamiltonian system of federal finance had been the first polarizing factor; reaction to the French Revolution and Citizen Genet might well have been the second; but the Jay Treaty and the associated debates about American foreign policy had a much vaster effect in mobilizing the wider political public around the elite partisan factions. Thus was the division between Republicanism and Federalism simplified into the division between support for the French Republic and support for Great Britain.

Now to come back, finally, to the comparison of Burke’s strictures to King George III and Randolph’s instructions to John Jay.

If in the Age of Federalism there was limited American mucking about in British North America, Ireland, or Britain herself, this was not just because American capacities for troublemaking were limited, but because after Jay’s treaty, the principal threats to American interests came from France and Spain. Federalists aided revolutionary forces in the nominal French colony of Saint-Domingue, and considered aiding them throughout Spanish America—Jefferson did his bit to help stir unrest in Spanish-ruled Louisiana. The United Irishmen in America organized for their revolution against King George, and many Irish returned from America to fight and die in Ireland. It was James Monroe who aided Theobald Wolfe Tone, but it was Hamilton who talked of raising an American party in England, and Washington who through Randolph instructed Jay to prepare the revolution card in England in the face of British intransigence. Because Jay secured a treaty that Americans could live with, there was no need to embark on plan B in Britain.

In short, no American public man felt the qualms Burke expressed about the prospect of fostering revolutions in other people’s countries. Such is the verdict of Palmer’s comparative study of the world revolution in the Age of Federalism: “John Adams was not much like Edmund Burke, even after [Adams] became alarmed by the French Revolution; and Alexander Hamilton never hoped to perpetuate an existing state of society, or change it by gradual, cautious, and piously respectful methods.” 

American statesmen certainly disagreed about which foreign revolutions, if any, were worthy of American support. Jefferson and Pickering disagreed about the wisdom of supporting, say, the revolt of the slaves against their masters in Saint Domingue. Party strife about foreign affairs spread in large part because all of America’s political elite were committed to the revolutionary idea that foreign policy should be a matter for public deliberation, that the public should be the subjects of diplomatic action. 

As Washington wrote to Marshall, “the mass of our citizens require no more than to understand a question to decide it properly.” It was to enlighten Americans as to how to manage their country’s foreign affairs that Washington transmitted his political testament, the Farewell Address, not to his would-be heirs among the elite, but to the public at large through the newspapers.

Online Censorship • Weekend Long Read

The Undifferentiated Human Matter of Replacism

Absent intact and confident national Western cultures that know where they came from and who they are, the immigrant waves that retain the most confidence in their collective identity will overwhelm those cultures that do not. And that may not end well for anyone or anything, including the Davos-cracy, including modernity itself.

Just over a year ago, an English translation was published of You Will Not Replace Us, a 2012 book by Renaud Camus, a French author and political thinker. It was intended as a condensed summary of lengthier volumes he’d already published on the subject of culture and demographics.

The phrase “you will not replace us” gained notoriety in August 2017 when it was chanted by an assortment of right-wing protesters who had shown up in Charlottesville, Virginia, to protest the planned removal of Confederate monuments in that town.

There is no excusing the violent extremists who were among those present in Charlottesville, much less the unforgettable and tragic outcome. And it is unlikely that many of the protesters in Charlottesville had any idea that a relatively obscure French writer had coined the phrase they were shouting as they marched across the University of Virginia campus.

But Renaud Camus, whose literary career began in the 1980s as a “pioneering gay writer,” in more recent years has become, as described in The Nation, “the ideologue of white supremacy.” In March 2019, The Washington Post referenced Camus’ book as the inspiration for the mass murder of Islamic worshipers that had just happened in Christchurch, New Zealand. In September 2019, the New York Times described Camus as “the man behind a toxic slogan promoting white supremacy.”

It’s always problematic to discuss anything questioning the demographic transformations sweeping the West. It’s easy and politically acceptable to celebrate diversity, and even gleefully to anticipate the permanent political ascendancy of the global Left in Western democracies, as the demographic character of the electorate inevitably shifts as a result of mass immigration. But to ask whether or not this shift is desirable invites accusations of racism, xenophobia, and white nationalism. It even invites accusations that to open this discussion is to encourage extremist violence.

Given these stigmatizing constraints, the only reason to bother exploring the potential downside of “diversity” is that behind the term “diversity” is possibly the most unexamined, voluntary, abrupt and profound transformation of a civilization in the history of humanity. And what if suppressing this discussion, pretending nothing of consequence is happening, and censoring voices of caution is actually what encourages extremism and violence?

In a New Yorker article written about Camus in 2017 by Thomas Chatterton Williams, entitled “The French Origins of ‘You Will Not Replace Us,’” the Frenchman is described as “a kind of connective tissue between the far right and the respectable right,” who can “play the role of respectable reactionary because his opposition to multicultural globalism is plausibly high-minded, principally aesthetic, even well-mannered.”

That description offers a broader perspective on Camus than one of someone merely motivated by xenophobia or racism. Camus is reacting against globalism as an economic nationalist and as a cultural preservationist. He claims that what he calls a “Davos-cracy” has deemed cultures secondary to having a critical mass of consumers, and that it considers all humans interchangeable. The phrase he’s selected to drive his point home, and repeated throughout his book, is “Undifferentiated Human Matter,” or UHM.

Replacers, Replacists, Replacees, Replacism, Anti-Replacism

Camus begins his book by declaring “replacing is the central gesture of contemporary societies.” But he isn’t just talking about people; he’s talking about everything. Claiming “the world itself is fast becoming just another amusement park,” he describes the process of replacism in all-encompassing terms. In an extended explanatory passage, he writes:

Faux, simili, imitation, ersatz, simulacrum, copies, counterfeiting, fakes, forgeries, lures, mimics, are the key words of modern human experience. Stone masonry is being replaced by ferroconcrete, concrete by plaster, marble by chip aggregate, timber by PVC, town and countryside by the universal suburb, earth by cement and tar….literature by journalism, journalism by information, news by fake news, truth by fallacy, last name by first name, last name and first name by pseudonyms….history by ideology, the destiny of nations by plain politics, politics by economics, economics by finance, the experience of looking and living by sociology, sorrow by statistics, residents by tourists, natives by non-natives, Europeans by Africans….peoples by other peoples and communities, humanity by post-humanity, humanism by transhumanism, man by Undifferentiated Human Matter.

What Camus is defending is more than preserving an indigenous ethnic majority in his country. He is defending, as he puts it, “an order, a prosperity, a sense of generosity in terms of social benefits and safety nets, the sound functioning of institutions which have been achieved through centuries of nurturing efforts, trials and tribulations, cultural transmission, inheritance, sacrifices and revolutions. What makes countries, continents, cultures and civilizations what they are, what we admire or regret, are the people and the elites who have fashioned them….man is not, or not quite yet, some undifferentiated matter that one can spread indiscriminately, like peanut butter or Nutella, anywhere on the surface of the Earth.”

Rejecting most conventional terms, Camus has built his own nomenclature around what he believes are fundamental mega-trends that are not adequately described with existing vocabulary or commonly understood polarities: liberalism vs conservatism, globalism vs nationalism, capitalism vs socialism. Instead, he has come up with the ideology of “replacism,” with three protagonists, “the replacists, who want to change the people and civilization, which they call multiculturalism, the replacers, mostly from Africa and very often Muslims, and the replacees, the indigenous population, whose existence is frequently denied.” He then divides the “replacees” into two groups, the consenting replacees, and the unwilling replacees.

Is France Actually Destined to Replace Its Population?

The concept of demographic replacement brings with it an assortment of tough questions, largely ignored, dismissed, or even censored by the establishment media and mainstream politicians. In France, the government collects no census or other data on the race or ethnicity of its citizens, which means any tracking of alleged “replacement” of the native population has to rely on estimates. Estimates, however, reveal dramatic shifts in just the past two decades.

An article published by the Brookings Institution in 2001 estimated that five percent of the French population was non-European and non-white. From what information can be found since then, that percentage has changed at a blistering pace. According to World Population Review, “when statistics were released in 2008, it was reported that 11.8 million foreign-born immigrants and their immediate descendants were residents in the country; a figure which accounted for around 19% of the total population of the time.”

While a rise from 5 percent to nearly 20 percent in less than a decade is a stunning statistic (even allowing that the two figures measure somewhat different populations), it may actually understate the magnitude of the so-called replacement, because it doesn’t take into account birthrates. For example, a chart on the Wikipedia page “Demographics of France,” quoting data available (in French) from the “Institut national de la statistique,” reports that in 2014, an estimated 29 percent of all births in France were to parents of whom at least one was foreign-born. Moreover, of the 71 percent of births in that year to parents who were both born in France, it is probable that a significant portion were to second- or third-generation immigrants of non-European origin.

A 2017 article appearing in the Washington Times, referencing a study published (in French) by the “Institut des Libertés,” offers projections based on known population demographics and birthrates in France. The study predicts that within 40 years, or barely after mid-century, the white population in France will become a minority. This forecast extrapolates from a white birthrate in France of 1.4 children per woman, compared to a Muslim birthrate of 3.4 per woman. If these birthrate disparities persist, France is destined to become a Muslim-majority nation within just a few decades, even if immigration were stopped entirely. Among the younger generations of French, that threshold will be reached much sooner.
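
The arithmetic driving such forecasts is simple compounding, and a rough sketch makes it easy to check. In the illustrative Python below, the 1.4 and 3.4 birthrates are the figures cited above, while the starting population shares (90/10), the 30-year generation length, and the closed population with no migration or intermarriage are hypothetical simplifications, not data from the study.

# Illustrative sketch: how a fertility gap compounds generation by generation.
# The birthrates (1.4 vs. 3.4) are the figures cited above; the starting
# shares, generation length, and closed population are assumptions.

REPLACEMENT = 2.0  # roughly two children per woman holds a cohort steady

def project(a=0.90, b=0.10, fert_a=1.4, fert_b=3.4, generations=3):
    """Print the higher-fertility group's share of each new generation."""
    for gen in range(1, generations + 1):
        a *= fert_a / REPLACEMENT  # each cohort scales by fertility / 2
        b *= fert_b / REPLACEMENT
        print(f"Generation {gen} (~{30 * gen} years): {b / (a + b):.0%}")

project()
# Generation 1 (~30 years): 21%
# Generation 2 (~60 years): 40%
# Generation 3 (~90 years): 61%

Even under these crude assumptions, the crossover in births arrives within two or three generations; a younger immigrant age structure, which static birthrate figures do not capture, is what pulls the study’s horizon closer to 40 years.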

Is Integration Possible in France and How Is Mass Immigration Justified?

According to Camus, several false narratives are being spread in France by the “replacists” to dismiss the significance of the current migration by saying it is nothing new. Camus argues that it is preposterous to say that “France has always been a country of immigration,” because “for about fifteen centuries the French population has been remarkably stable, at least in its ethnic composition.” To the extent there was immigration, it was always thousands of people, of European stock and Christian faith, compared to millions today who “have almost all been African and more often than not Muslim.”

Whether or not Camus is a white supremacist is debatable, but his skepticism towards the possibility of integration is unambiguous. He writes, “Their African culture and Mahometanism make it a much stronger challenge for them to become integrated into French culture and civilization, all the more so because most of them show no desire whatsoever to achieve any such integration, whether as individuals or communities.” Sadly, without honest, balanced, and well-publicized research into this very question, it is impossible to dispute this assertion.

Other popular narratives, according to Camus, also designed to justify mass immigration, include the claim that France was liberated from the Germans in 1944 by Northern and Central Africans recruited by the Free French. Anyone familiar with the battles of World War II would dispute this: the main invasion, at Normandy, was carried out by American and British forces. While units of the Free French army did land along with other Allied forces in Southern France two months after D-Day, this later invasion came after the Germans had begun withdrawing their forces to fight in the north, and in any case, only about one-third of the Free French troops were of African origin.

Another popular myth that Camus claims is promoted by France’s multiculturalists, or replacists, is that North African workers reconstructed France after World War II. This is clearly inaccurate since France’s post-war reconstruction was completed well before the 1970s, which is when mass migrations began from Africa into France.

Possibly the argument replacists would consider most compelling in favor of mass migration is that it serves as recompense for the depredations of the French as colonial occupiers. But if the colonial era were so horrible, Camus asks, why is it that millions of Africans “appear to nurture no plan more clearly and cherish no higher ambition than to come to France and live with the French?”

Camus makes an important distinction between European colonialism and mass migration into Europe from Africa, one that calls into question both mainstream claims: that integration is possible and that mass migration is justified. As he puts it, “France and Europe are much more colonized by Africa, these days, than they ever colonized it themselves.” His point is that the Europeans imposed a military, administrative, and economic occupation on their overseas territories, but “this type of colonialism, developed in a political framework, is much easier to end—all that is required is for the conqueror’s army to withdraw.” What is happening in France today is what Camus refers to as “settler colonialism,” which is far more difficult to undo, if not impossible.

If the immigrant vs native French interactions Camus writes about are typical—“making life impossible or an unbearable ordeal to the indigenous people….through aggressive gazes, overbearing posturing to force passers-by down from the sidewalk….the creation in the citizenry of a general feeling of fear, insecurity, dispossession and estrangement….unprecedented forms of hyper-violence up to full-blown terrorist acts and massacres….which in the process secure under their rule additional chunks of territory for themselves”—then eventual integration may be very unlikely, and his characterization of mass migration as a foreign occupation may be more descriptive.

The Case for “Undifferentiated Human Matter”

Criticizing the double standard applied by most online and offline media on topics relating to race gets dismissed as “whataboutism,” as if double standards don’t matter, as if differing sets of moral criteria should apply depending on what group or worldview is being examined. This double standard is in effect throughout the West, enforced in matters ranging all the way from online censorship to offline criminality. Camus notes countless Christian church desecrations in France, rarely prosecuted, and compares them to the heavy sentences levied on protesters who unfurled a banner on the roof of the “Great Mosque” of Poitiers during its construction.

In France, Camus writes, “non-European youngsters by the thousands can post horrible and very disturbing messages on Twitter or Facebook about European or White people in general without the slightest threat to have their social network accounts suspended or be interrogated by the police; while opponents to mass migration are the permanent target of the most finicky censorship.”

Camus marvels at the fact that contemporary Western Civilization is the first in history to be lenient “towards those who want its eradication while it relentlessly persecutes those who would put up efforts to defend it and work for its salvation.” But what is Western Civilization? Is it bound up with ethnicity, or is it something more intangible yet more profound?

In France, the very notion of “race” has been deleted from Basic Law texts. The conventional explanation for this transformation, implemented in the 1970s, was that it reflected the revulsion the French people felt towards Nazism and their horrific experience under German occupation when Jews were being deported to German death camps. Undoubtedly, this is true, but Camus focuses on how the termination of the concept of race fulfills the goals of the replacists.

Mocking the mainstream scientific dogma that proclaims races do not exist, Camus takes the position that “race” embraces “social, literary, or poetic, or taxonomic creations of such considerable impact that proclaiming they do not exist is tantamount to seriously testing the meaning of the verb to exist.” He uses “race” interchangeably with “a people” and argues that denying race by conflating biology with culture is to suggest that Europe does not exist, that European civilization never existed, that there is no such thing as French culture, no such thing as a French people—that there are only people with a French passport.

“In industrial and post-industrial societies, especially those where the main industry is the industry of Undifferentiated Human Matter, where man is the producer, product and consumer at once, there is no such thing as a genuine product.”

The “Anti-Racist” Paradox: The True Agenda of the Anti-Racists

If everyone is undifferentiated human matter, and races—biological or cultural—do not exist, how can racism exist? And if races do not exist, why must anti-racists so aggressively enforce a drive to achieve perfect equality among races; why must they insist that all races are equal?

This logical flaw is inexplicable, according to Camus, until you consider how the meaning of anti-racism has changed. Anti-racism no longer means a stance against racism as historically understood; it now denotes a stance against the existence of races and a willingness to have them disappear. Camus considers this evolution of the term anti-racism, impelled by the paradoxical concept that races both do not exist and are all equal, to have been a critical enabling condition for the Great Replacement.

As he puts it, “Paradoxically, without the non-existence of races, the change of race would not be possible . . . since there are no races, there can be no substitution of races . . . change was obvious, and rather unpleasant, but it was not taking place. How could it occur, since it was scientifically impossible?” But why? Who benefits?

It is here that Camus’ opening remark, “replacing is the central gesture of contemporary societies,” comes back into play, addressing a phenomenon of which mass migration is only a part, albeit a very, very big part. If the native French are being replaced by settler colonials, then who is orchestrating this, and why? Camus claims “what we are dealing with here is a delegated form of colonization, a colonization by proxy, and that the forces that want it, and who organize it, are not the forces who actually accomplish it.”

This two-fold colonization, orchestrated by the very rich and implemented by the very poor, is part of the destruction of culture that began before the mass migrations. As he writes, “no people that knows its own classics would accept numbly and without balking to be thrown into the dustbins of history . . . this numbness had to be created.” Here and elsewhere, Camus is not talking about a conspiracy, but rather “powerful mechanisms” created by the combination of ideals and interests. The main ideal: equality. The main interests: “normalization, standardization, similarity, sameness.”

What Camus calls a “powerful mechanism” can indeed explain the rise of globalism without resorting to conspiracy theories. For global investors and multinational corporations to achieve maximum growth and profit, the prerequisites are standardization, free trade, open movement of people and capital, and a growing mass of consumers in every economic zone—dependent, destitute, it doesn’t matter. But to justify all this, to make it a virtue, even a populist cause, the ideologies of equality and anti-racism are, in turn, prerequisites.

This erasure of high culture, this popular contempt for a cultivated class that might perpetuate reverence for traditions and greatness, this devolution, suits the ideology of the anti-racists. But it is useful as well to global commercial and financial interests. In an irony of history, Lenin’s useful idiots, the leftist movements in Western nations, are now serving not the international communists, but global capital.

It isn’t just France, of course, where traditional culture and proud national histories are being deconstructed and disparaged by the Left. In the name of anti-racism, the history of Western Civilization is now being taught in America, increasingly, from elementary school through graduate school, as an unending saga of oppression and exploitation. In the name of equality, SAT scores, and even grades, are being dispensed with in schools and universities, and double standards are established through racial quotas in academia and business, because race does not exist, yet all races are equal. All this paves the way for an erasure of peoples, the replacement of culture and identity with undifferentiated human matter.

The Genealogy of Replacism

On page 138 of the English edition of You Will Not Replace Us, Camus offers a family tree of sorts that pulls together the historical events and ideological evolution which led France, and by extension the West, to its present state. It not only attempts to illustrate the origins of replacism, but also the cultural devolution that he believes made replacism possible. Here, described in painstaking detail, is the “marital status” of replacism: “Son of Anti-Racism and High Finance (themselves, respectively son of Egalitarianism and Anti-Fascism, and daughter of Taylorization and Ultra-Liberalism, granddaughter of Industrial Revolution and Capitalism), marries Petite-Bourgeoisie, daughter of Democratization and Welfare State, grand-daughter of French Revolution and Proletariat.”

The logic of this genealogy makes a lot of sense. Replacism is ideologically justified by anti-racism at the same time as it serves the interests of High Finance. “Taylorism,” loosely synonymous with “Fordism,” is the system of factory management that evolved in the late 19th and early 20th centuries to break production into standardized, repetitive tasks, greatly improving the efficiency of manufacturing while making it possible to hire far less-skilled workers for less money and to treat them as easily interchangeable. Ultra-liberalism is liberal ideology as originally conceived, devoted to the virtues of free trade and the free movement of capital.

By marrying replacism to the petite bourgeoisie, Camus shows the synergy between a loss of higher culture and the replacist agenda. By depriving Western Civilization of its “cultivated class which is indispensable to culture in the old sense of the word,” by allowing respect for Western Civilization to slowly disappear, indeed by demonizing all vestiges of privilege, by glorifying the most popular, lowest common denominators of human experience, by democratizing education to the point where everyone and nobody is educated anymore, and by mass-producing simulacrums of culture designed to appeal to the most universal and primal ambitions, there is no longer a people, no longer a unique culture, no longer history, tradition, pride, or identity; the nation becomes an economic unit and nothing more.

Another fascinating aspect of the genealogy Camus has described is that it is not just logical; perhaps some of what he is describing is also inevitable. In hindsight, where would the human path have deviated from these outcomes? Is it much of a stretch to say the industrial revolution was inevitable, or the innovation of mass production and standardization? Is it unreasonable to suggest the rise of workers and unions in response to the abuses that characterized the first hundred years of industrialization may have been inevitable? Is all that Camus really has to say mere sentimentality, mere nostalgia? Is this just a primal scream of a book, and the movement it represents merely the last mad roar of a primitive nationalism whose time has come and gone?

Nostalgia and sentimentality may well inform the millions who merely wish that things could go back to the way they were, but for Camus, at least, stronger emotions and reason inform his motivation. First of all, he would probably deride it as thoughtless and typical for his critics to think that objecting to the destruction of Western Civilization, in all of its traditions and values, is mere reactionary nostalgia and sentimental longing for the past. But he also would remind us of the threat we face, not only at the hand of the replacists, but when the replacers eventually confront the replacists.

Replacism, for all its deplorable sameness, for all its drive to conquer and merge all cultures in the name of anti-racism and in the interests of high-finance, at least has a new world to offer. It may be grotesque and shallow, hedonistic and common, replete with addictive gadgets that pass for fulfillment and while away lifetimes, but there is profit, there is order, bread, circuses. There is still civilization, after all, cheapened, flattened, filled with undifferentiated human matter. But what if the replacers have a different agenda entirely?

Camus believes the combination of leftist morals and traditional right-wing business interests gives a unique power to replacism. He writes that it is “as if the ruthless power in the upper district of Metropolis, had, to top it all and make it worse, the capacity to project to the world the gentle image of the soft social order found in the Alpine pastures of The Sound of Music.” He describes replacism as a totalitarian ideology devoted to promoting the replaceability of everything, man included. But he also claims that the only totalitarian ideology capable of rivaling replacism in the world today is radical Islam. What a choice.

Neither Conspiracies Nor Scapegoats Account for Replacism

The phrases “conspiracy theorist” and “conspiracy theory” have recently been weaponized by globalists throughout the West. Wielded along with the more established word-weapons “racist” and “denier,” “conspiracy theorist” is now used as a verbal bludgeon to silence anyone who questions globalization or replacism.

Camus has much to say on this and the related topic of scapegoating. He writes, “The theory of conspiracy theory is one of the most effective, catchy and brilliant inventions of the ideological power and its executive clique, the media, to discourage any reflection on its own workings, on the nature of its power and on the crimes it might have committed. The theory amalgamates all conspiracy theories into one, whose model are the most eccentric views about the attacks of September eleventh against the Twin Towers and the Pentagon. But just as being paranoid does not mean you have no enemy, accusing everyone whose views differ from yours of being an adept of some conspiracy theory does not mean there is no plot and no conspiracy.”

Having made that assertion, Camus backs away from alleging there is a conspiracy. Dismissing attempts by others to blame replacism on the European Union, Wall Street, the International Monetary Fund, or Jews, he suggests, in fact, it is “some enormous, bizarre and complex process, so intricate that no one can understand perfectly how they work and why, and no one can master and stop them once they are started.”

This makes more sense than it may initially seem. It returns to the idea of a logical and almost inevitable flow of history. Only at pivotal historical moments can that flow be willfully directed through the exertions of a united people, because so much of its momentum is mechanical. And clearly that is what Camus is calling for, when he writes “it is for us to break the machines which churn out men like others churn out cookies, or Nutella, or surimi.”

Camus explicitly challenges the theory, not his, but prevalent among some right-wing factions, that Jews are providing the money and brains behind replacism. He correctly notes that in Europe they are the first victims of the Great Replacement. He discusses at length how “the change in the population of Europe has made daily life very difficult, if not impossible, for a number of Jews who are almost permanently exposed to very strong Muslim aggressiveness, modern anti-Zionism flourishing both as a form of exasperation and as an excuse, a more decent cover, for very classical Arab and Muslim anti-Semitism.”

While identifying Muslim immigrants as the source of revived anti-Semitism in Europe, Camus dismisses the role of “classical occidental European anti-Semitism,” referring to it metaphorically as “a derelict shop in the dilapidated historical downtown, now entirely driven out of business, and fashion, by the enormous shopping malls in the banlieues.” He notes that many Jewish communities in Europe that survived the Holocaust are not going to survive the Great Replacement, with thousands of Jews now being driven out of France every year.

The experience of European Jews today in the face of mass immigration of Muslims has led Camus to conclude that, while some prominent Jews are involved in promoting the Great Replacement, such as George Soros and others less known, the proportion of replacist Jews to anti-replacist Jews has in recent years almost reversed, with anti-replacists now predominating. And he makes a claim, similar to sentiments observed by Churchill a century earlier, that “Jews are very much divided on that issue [replacism], which makes them no different than any other community.” It may be fair to say that Camus sees the Jewish community, certainly in Europe, as a microcosm, split on the polarizing issues of our time in a way reasonably proportional to the rest of the Western elites.

And perhaps from this will come a recognition that Zionism is only one form of nationalism, and Jews and Gentiles alike throughout the West will begin to coalesce in support of preserving the peoples and cultures of all Western nations. Camus writes, “Israel belonging to the Jewish People, with Jerusalem as its capital, is the model and the essential reference, at least in Western culture and civilization, to all sense of belonging. If those three did not belong to each other, it would be the end of all belonging. If Jerusalem were not Jewish there would be no reason for Paris or Saint-Denis to be forever French, for London or Winchester to be English, or indeed for Washington or Concord to be American.”

The Flight 93 Civilization

If you believe even half of what Camus has to say, Western Civilization is all but doomed. It is to be replaced either by a generic replacist world consisting of undifferentiated human matter, or an Islamic world, which would take shape in the aftermath of a cataclysmic conflict in which the replacers overthrew the no longer useful replacists. What can be done?

Towards the end of his book, Camus calls for “remigration” of immigrants out of France and back to their nations of origin. To accomplish this, he views the European Union, currently controlled by replacist interests, as something that could potentially be taken over by anti-replacists. As he puts it, “The continent is being invaded, the nations which are part of it should stick together and resist, not try and find salvation one by one, in dispersion and isolation.” But he reemphasizes how what threatens European civilization is bigger even than colonization, writing “when we Europeans started to be subjected to another, more brutal and direct colonization, we were submitted to an Islamisation of our Americanization.”

American cultural power, such as it is according to Camus (populist, egalitarian, flattened, petit bourgeois), is almost—stress, almost—a proxy for the globalism sweeping away the unique cultures and peoples of the world. Camus might say that America, when it comes to replacism, is as much a culprit as a victim.

Which brings us to America, where, just as in Europe, resurgent nationalism—unwilling replacees—contends with a daunting coalition of replacists, replacers, and willing replacees. The eventual outcome hangs by a thread, and no matter what the outcome, so much can go wrong.

In 2016, an influential essay entitled “The Flight 93 Election” compared the presidential contest between Hillary Clinton and Donald Trump with the choice passengers faced on the doomed Flight 93 on September 11, 2001. As its author put it, “2016 is the Flight 93 election: charge the cockpit or you die. You may die anyway. You—or the leader of your party—may make it into the cockpit and not know how to fly or land the plane. There are no guarantees. Except one: if you don’t try, death is certain.”

Written by Hillsdale College research fellow Michael Anton, who went on to serve for a time as a senior adviser in the Trump White House, the essay addresses the same issues of replacism, in the broadest sense of the term: the dispossession of the American people, culturally, economically, and eventually through actual physical replacement. Anton manages to make his points without inviting quite the opprobrium that Camus has attracted, but his words—a breath of fresh air to many but an unforgivable transgression to others—were so frank and so incendiary that he initially wrote under the pseudonym “Publius Decius Mus.”

What Camus has dubbed the Davos-cracy, Anton called the “Davoisie,” as he implicates America’s conservatives as “sophists who rationalize open borders, lower wages, outsourcing, de-industrialization, trade giveaways, and endless, pointless, winless wars.” Anton went on to reserve an entire section of his essay for the “other” issue, writing that “The sacredness of mass immigration is the mystic chord that unites America’s ruling and intellectual classes.”

Anton’s description of America under a Clinton administration is almost synonymous with how Camus describes France under Macron, differing only in the particulars. “A Hillary presidency will be pedal-to-the-metal on the entire progressive-left agenda, plus items few of us have yet imagined in our darkest moments. Nor is even that the worst. It will be coupled with a level of vindictive persecution against resistance and dissent… We see this already in the censorship practiced by the Davoisie’s social media enablers; in the shameless propaganda tidal wave of the mainstream media; and in the personal destruction campaigns—operated through the former and aided by the latter—of the Social Justice Warriors. We see it in Obama’s flagrant use of the IRS to torment political opponents, the gaslighting denial by the media, and the collective shrug by everyone else.”

Three years after Trump’s stunning upset victory, the power of the Left in America remains pervasive and growing. Under the twin ideological poles of anti-racism and climate action—which is a proxy for economic replacism—they have more or less consolidated their hold on academia, and continue to expand their influence in government at all levels along with most major corporations. Imagine if Trump had lost.

Characterizing the U.S. election of 2016 as a last chance to have a chance, a last chance to avoid certain death, was accurate. Now the battle is joined, but the odds remain stacked against the anti-replacists. The Davoisie in all its power is doing everything it can to quiet the passengers and regain full control of the cockpit. The Flight 93 Civilization remains fitfully airborne, but for how long?

To the extent Renaud Camus fights a lonely battle, with the smug opinion-makers of the world stigmatizing him and everyone like him as a “white supremacist,” chances are France will become a nation of undifferentiated human matter, or an Islamic state, or some hybrid of the two. But France will no longer be France.

The Inchoate Rebellion Against the Ruling Class

Across the United States and Europe, a rebellion is brewing that lacks coherence or unity. Indeed, many of the rebellious groups are battling each other even as they share a rage against the Davos-cracy. In France, the Yellow Vest Movement, which has gripped that nation for over a year, has attracted both far-left and far-right demonstrators.

While the Yellow Vest Movement in France was sparked by rising fuel taxes, the duration and intensity of the protests bespeak years of frustration. What unifies the participants is the punitive cost of living in France, but there is no apparent agreement on the cause. For the Right, immigration is the primary factor; for the Left, global capitalism. In fact, both are correct.

The unemployment rate among immigrants in France in 2018 was 15.3 percent, nearly twice the 8.3 percent rate among non-immigrants. This ratio has been virtually unchanged for over a decade. While it is now almost impossible to find reports connecting the Yellow Vest protests to anger over immigration—which means nothing—even President Macron has agreed to new, tougher immigration enforcement. In November 2019, the New York Times quoted Macron as saying, “The bourgeois live in areas with few immigrants and do not encounter immigration in their daily lives. It is France’s working classes that live with the difficulties of immigration, and have thus migrated to the far right.”

On the other hand, huge sectors of the French economy have been devastated since the introduction of the Euro in 1999, and this consequence of globalization would have happened with or without immigration. Two searing, pessimistic visions of where this is leading are found in books by the bestselling French author Michel Houellebecq. His 2015 book, Submission, describes a bloodless transition in France from a secular republic into an Islamic theocracy. His 2019 book, Serotonin, includes chapters describing how France’s agriculture industry, which for centuries was a vital, productive, diverse ecosystem comprising hundreds of thousands of independent farmers, was within just a few years nearly wiped out by foreign imports and corporate takeovers.

It would be simplistic and inaccurate to characterize the Yellow Vest Movement as either Right or Left, just as it would not be accurate to describe Marine Le Pen’s National Rally political party as right-wing. The Yellow Vest Movement is a populist reaction to replacism, for mostly economic reasons. The National Rally candidates are a nationalist reaction to economic and cultural replacism.

This illustrates how Camus has invented a term, replacism, that not only transcends conventional definitions but creates space for new combinations of political ideologies to form. Why should the anti-replacists be capitalists instead of socialists? Capitalism has been the justification to impoverish the middle class and fill the nation with foreigners. Globalist (or international) capitalism has been rejected by all within the otherwise inchoate Yellow Vest Movement. Is there such a thing as nationalist capitalism? And if not, is the battle taking shape one between national socialists and international socialists? That would make sense.

The Rise of the Bronze Age Mindset

If Renaud Camus now plays the role of “respectable reactionary,” no such label fits Bronze Age Mindset, a book that has quietly sold its way into influence and infamy. Self-published in 2018 by a pseudonymous author writing as “Bronze Age Pervert,” which he typically shortens to “BAP,” it disrespects pretty much everything about modern life. Instead, the author exhorts readers to aspire to become the piratical, fearless figures of Bronze Age antiquity. Talk about reactionary!

The author, who in his book periodically dispenses with grammar, recently surfaced to publish a response to Michael Anton’s review of Bronze Age Mindset. Both the review and the response are valuable reading for anyone trying to understand the evolving mindset of the anti-replacists, because closely linked to the reactionary resistance to both cultural and economic annihilation is, obviously, a rejection of the so-called ruling class. This sentiment, and little else, unites the Yellow Vest Movement in France. A feeling of being betrayed by the ruling class also informs movements in the United States that are otherwise bitterly opposed to one another. BAP writes:

What you are witnessing is the unraveling of the postwar American regime—or what is mendaciously called by its toadies the ‘liberal world order’—in a way that is far more thorough than the disturbances of the 1960s, and with consequences that will be far more dire. The ‘altright’ doesn’t exist and has nothing to do with the media representations of it as a form of ‘white nationalism,’ or even—and here is what is crucial to understand—just ‘white males’ or just the ‘right wing.’ The same phenomenon is taking place on the left, and there is much more crossover than older people realize: there is much more involvement also by nonwhite youth and particularly by Latino, Asian, and multiracial youth in this phenomenon than people want to admit.

In BAP’s essay, titled “America’s Delusional Elite is Done,” he accuses the conservative intellectual establishment of failing to oppose “the violent racial hatred and other forms of unprecedented insanity coming from the new left,” including “the destruction of the family, and the new push to groom children on behalf of transsexualism and other supposed sexual identities.” He points out that “this one crucial matter extends the appeal of the ‘frog people’ far beyond that of any one racial or ethnic group.”

So where Camus saw cultural deconstruction as a prerequisite to ethnic replacement, to be resisted, BAP sees resistance to cultural deconstruction as something that unifies various ethnicities. Economic globalism and cultural deconstruction may have left France open to ethnic replacement and ethnic conflict, but in the United States, these same two mega-trends could forge a reactionary and multiethnic solidarity. The difference is that the Yellow Vest Movement unifies a diverse assortment of factions based, so it appears, purely on economic grievances. In the United States, by contrast, among the still-gestating Bronze Age resistance, the economic factors are present, but the cultural grievances are equally unifying.

In the long run, France and the United States face very different challenges with respect to mass immigration. Compared to America, France is a nation poorly equipped culturally to absorb and assimilate millions of immigrants, and—can we say this?—the immigrants entering France are not easily assimilated, insofar as they are mostly African and mostly Muslim. Moreover, France’s mostly secular native population will not find much common ground with the social conservatism practiced by Muslims, whereas a far higher percentage of white Americans are Christian, practicing variants of Christianity that overlap almost completely with those of immigrants to the United States from Latin America.

Until very recently, America’s dominant culture emphasized the importance of assimilation, and even in its atrophied, discredited current state, America’s ability to assimilate its immigrants remains robust. Asian immigrants entering the United States typically come from successful, developed nations, bringing a strong ethic for higher education and entrepreneurship. America’s Muslim immigrants constitute a far smaller fraction of America’s immigrant population, and on average they have more education and skills than the waves of Muslim immigrants entering France. For these reasons, America is far more likely than France to eventually absorb its immigrants while leaving its culture relatively intact.

But BAP isn’t done. Perhaps he offers further encouraging words to those conservative nationalists whose demographic awareness has made them give up, when he writes the following: “Conservatives pretend to be able to recruit Latinos to their cause with the degraded ideology of Jack Kemp but Latinos see David French call forced ‘drag queen’ visits for schoolchildren ‘part of free life,’ and want nothing to do with it. We are far better at recruiting Latinos, and as the example of Bolsonaro among many others shows, this new, energetic and popular form of the right is a Latino movement, and it is the future.”

And where is the Davos-cracy in all of this leftist debauchery and conservative cowardice? BAP is one with Camus in implicating the “large monopolies that promote mass immigration, mass surveillance, and the most bizarre type of speech restrictions, not only on its own employees, but now on American society at large.” In America, the NeverTrumpers and Libertarians, and all of what Michael Anton may have been the first to call “Conservatism Inc.,” have been worse than useless; they have been puppets of the Davoisie.

Finally, BAP’s observations are in accord with Camus on how the meaning of “equality” has been entirely perverted by the replacists. BAP writes:

It is indeed possible to oppose this vicious and exterminationist hatred on purely liberal and racially egalitarian grounds. But this didn’t happen, which puts the lie to the claims that traditional conservatives care about equality under the law or about any of the ideals they claim to espouse. We are now faced with a left that has embraced a dialectic of racial and class destruction in a context where belief in absolute human equality is professed at the same time that no one believes in it anymore.

In the 21st century, the United States and Europe, France in particular, face increasingly radicalized, politically disenfranchised, economically abandoned, and embittered masses. What mindset they adopt and what alliances they form may be the surprise of the century.

The Solution to Replacism Is a Community of Nations

Camus considers an “orderly and peaceful” remigration of millions of French immigrants back to their nations of origin to be the only way to preserve French culture. It is hard to imagine how this could ever happen. But it is probably true that either assimilation or remigration will be necessary in France to avoid either civil war or submission to Islam. Houellebecq’s novel of that name is not in the least far-fetched, although if it came to pass it would prefigure a larger eventual clash, since an Islamicized West would still have to deal with China and other Asian nations that remain committed to preserving their own cultures.

Which raises the question: What does it take for a nation to be willing to fight to again assimilate its immigrants? In France, the economic strains caused by globalization have already sparked the Yellow Vest Movement, which led to dramatic recent shifts in immigration policy by Macron. But can France, and the other European nations, recover sufficient belief in their own history, traditions, and identity to demand that others assimilate to their ways, instead of the other way around?

In his 2017 book, The Strange Death of Europe, British conservative author and journalist Douglas Murray suggests that those forces still extant in Western societies that resist the leftist derangements of our time—the secular and the religious—put aside their differences and unite to save their civilization. That’s an interesting idea not only because it might enable a critical mass of resistance to arise, but because it represents a new synthesis of Western culture that might help defuse the mutual resentment of Right and Left. They’d better get busy.

Nothing BAP discusses, either in his book or in his essay addressing Michael Anton’s review, offers a solution. BAP describes his work as samizdat, the censored underground publications that Eastern Bloc dissidents reproduced and distributed in defiance of the regime. Anton, for his part, adheres to the ideals of the American Founding Fathers. To which BAP responds, “he [Anton] should admit that this form of government would today be called white supremacism or white nationalism, as would Lincoln’s later revision of it, as would indeed the America of FDR and Truman, not to speak of Theodore Roosevelt.”

Indeed it is. By the Left.

So where does Camus cross the line? How is Camus the “ideologue of white supremacy”? Why did Michael Anton have to use the pseudonym “Publius Decius Mus” when writing candidly about the Davoisie’s embrace of mass immigration into the United States? Why is Bronze Age Mindset written by “Bronze Age Pervert” instead of whoever lives behind that name?

Camus answers this repeatedly in his book: anti-racism has come to mean anti-white. Examining the phenomenon uncovers endless examples and makes a strong case for the truth of that claim. Neo-commissars, variously titled Chief Equity Officers, now infest public and private bureaucracies in departments of “Diversity, Equity, and Inclusion.” They manage aggressive staffs, expensive and empowered, micromanaging everything from micro-aggressions to the precise ethnic proportions represented in the personnel headcounts of every institution in America. This is authoritarian, totalitarian fascism, bureaucratized and masquerading as anti-fascism. It is explicitly racist, yet it markets itself as anti-racist. That is already a reality in much of America, and it’s spreading fast.

In Europe in general, and France in particular, the same applies. If you question the future of your nation, based on utterly indisputable facts—consistent and immutable voting patterns by ethnicity, leading societal indicators by ethnicity, demographic reality—you are branded a “white supremacist” and the consequences are swift. In ascending order: Unwelcome in polite society. Banned or suppressed online. Fired from your job. Denied various public and private services. Prosecuted and fined. Imprisoned.

And yet the movement of anti-replacists isn’t necessarily “white” at all. The Yellow Vest Movement isn’t white, and it is ideologically heterogeneous. The rising Bronze Age reactionaries in the United States aren’t ethnically pure, and their ideology remains very much in flux. For these reasons, practical nationalism—centrist but honest, faithful to culture and tradition, having expectations of immigrants instead of the other way around, willing to protect national industries in defiance of the libertarian Davos-cracy, able to put the national interest first—still could have a future in the West. And it may have nothing to do with “whiteness” at all.

The alternative, prosecuted by the Left and condoned by a cowardly Right establishment, is Balkanization based on race and gender, even though “race and gender are social constructs.” It is enforced equality according to race and gender, even though all races and cultures are already equal and, in any case, “race and gender are social constructs.”

The alternative, prosecuted by the Davos-cracy, is to flatten the world, erase borders in the interests of commerce, and reduce humanity to undifferentiated human matter. How does this square with the “celebration of diversity” that informs every coopted institution of the Davos-cracy, from mainstream media to monopolistic multinationals? It doesn’t, until you return to one of the first points Camus makes, where he emphasizes that replacism aims not merely to turn humanity into undifferentiated human matter but to replace genuine culture with simulacra of culture. The iconic buildings and monuments and historic plazas of Paris or London will be faint and boring ruins compared to the neon recreations of those same places around the planet, in cities turned into theme parks. The commodification of high culture is the essence of replacism.

Understanding this fact, that replacism is a holistic repatterning of all national cultures and a wholesale erasure of national economies, is crucial to refuting the claim that to be anti-replacist is to be a white supremacist. The journey into the future, with technology and globalization whipping forward faster than anyone can fully track or comprehend, changing everything in decades and then changing everything yet again, and again, will not be weathered without strong national cultures whose members embrace, cherish, and share a common faith, tradition, values, patriotism, a sense of being part of something.

Absent intact and confident national Western cultures that know where they came from and who they are, the immigrant waves that retain the most confidence in their collective identity will overwhelm the cultures that do not. And that may not end well for anyone or anything, including the Davos-cracy, including modernity itself.

To the extent Renaud Camus fights a lonely battle, with the smug opinion-makers of the world stigmatizing him and everyone like him as a “white supremacist,” chances are France will become a nation of undifferentiated human matter, or an Islamic state, or some hybrid of the two. But France will no longer be France.

Weekend Long Read

This essay is adapted from “Trump’s World: GEO DEUS,” by Theodore R. Malloch with Felipe J. Cuello (Humanix Books, 336 pages, $27.99)

Davos: Peering Behind the Elite Curiosity Curtain

At Davos, investment dollars flew like sand in the desert wind. The Chinese wanted factories, and they lined up from Nortel to Motorola just to shake hands. Big Pharma met and colluded on patents and pricing. Want to sell airplanes? Autos? You name it, even armaments. It was a global bazaar of high-altitude wheeling and dealing with high price tags.

Thomas Mann, the German Nobel laureate and author of The Magic Mountain, made Davos famous for its mystical and curative powers. For him, it was a sanatorium to overcome the disease, psychological stress, and damage inflicted by modern life. In some ways, it remains so.

These are my personal impressions as a former executive board member, an insider’s view, albeit but a snapshot in time. Davos is many things to many people, but it remains a curiosity. It was also, according to many accounts, the place where Bill Clinton got the bug and started his own Clinton Global Initiative, seeing gold in them thar hills.

Today, on its 50th anniversary, Davos is synonymous with a different kind of cult: the cult of business celebrity. Elites from every avenue of life, every industry, every country, leaders and wannabes alike, will do anything to be seen there, especially during the last week of January, when the World Economic Forum conducts its annual meeting.

They pay over $70,000 just to be invited, or $1 million to be members, according to The Guardian. Davos has become the hub of political, economic, cultural, and every other kind of power imagined by postmodern globalist man. In fact, it is about the emergence of what the ringmaster at Davos calls “Davos Man” (a phrase he actually borrowed from the late Harvard professor Samuel Huntington), a kind of ubermensch who can transform the world. Nietzsche would be proud.

This wasn’t always the case. After World War II, the German-speaking region in Switzerland’s far east, up in the rugged mountains, was underdeveloped. Skiing, new hotels, and better train service brought in more tourists, but it wasn’t until a half-nutty, half-brilliant professor of business policy brought his European Management Forum there in 1971 that the place started to take off.

In its own words, the WEF is on a mission: “The World Economic Forum is an independent international organization committed to improving the state of the world by engaging leaders in partnerships to shape global, regional, and industry agendas.” Over the course of its history, the World Economic Forum has compiled a limited record of accomplishment on some key issues of global concern. It has also positioned itself as the epicenter of New Age globalism, a new ideology. Globalism is not the same as the gradual process of globalization, which sees countries involved in more and more trade and investment across borders. Globalism is a movement toward, and a belief in, one-world government.

The WEF logo itself places the organization at the very center of the globe’s sphere; and Herr Professor Dr. Schwab is the “Wizard” of this “Oz,” the man behind the curtain who makes the whole thing run, just as in the movie.

The Davos Model

Every year now, for five decades, high in the wintry alpine resort of Davos-Klosters, Switzerland, the world’s elite convenes under the auspices of the World Economic Forum. They have what is termed “convening power.” It’s all over the news. But not much is really known about the organization—the convener. Everyone sips schnapps and talks about the future of the globe under the banner “Rethink, Redesign, Rebuild.” The “Re-” word is always the operative phrase! Be sure to use it in every sentence and you can pass “Go.”

True, Davos can be cynical and trite. The best thing to be said for it is perhaps, as I once put it in the Weekly Standard, that it does not really believe the answer to the world’s problems is more Marx. But it does come close. Davos phrases abound: rethink economics, redesign governance, put (European) socialist values back in business, promote financial literacy, the future of this and that, risk abatement, and on and on. Frankly, while the WEF is full of suggestions, most of them are half-baked.

Here is a sample insight from Davos: “At times of panic credit markets have a tendency to freeze.” Here is another: “The bubble forms when expectations exceed reality.” Cue the applause from the civics class.

The WEF applauded the public rescue of banking and government-inspired guarantees (bailouts), and its mantra has been “print more money” and, when in doubt, “strengthen regulatory measures.” We also need much more “coordination” to defeat systemic risk, according to the Davos line. Did Keynes really get it right? Is Big Government good government? Do markets always fail when left to their own devices? These questions are verboten in Davos, for the hallmark of all believers here gathered is that government is the solution, perhaps assisted by some special council formed, of course, of “FOK”: friends of Klaus (Schwab).

Davos wants to tame the “animal spirits” of the market, which it regards as dangerous. Its authority on this is none other than the turncoat George Soros, a former robber baron and greenmailer who saw the light. Radical stakeholder capitalism enters stage left, with improved statistics and a Sarkozy-style (now Macron-style) Commission on the Measurement of Economic Performance and Social Progress. We can change accounting as we know it and have a perfect “global solution”: a super-International Monetary Fund. At Davos it is always a globalist solution. Besides dumping the dollar, the world must also have “international consensus,” since the United States has been so naughty; learning to share power and give up control can be difficult, but it is necessary.

For Davos Man (and occasionally now a woman or two), an all-powerful global central bank will run money, ignoring notions of national interest; but like gun ownership, the spread of capital will also need to be controlled. We will also need a Tobin tax and a carbon tax collected by the United Nations. Bill Gates’ version of “creative capitalism” flies well here, and he goes to Davos every year, where he preaches that we must all “give back” and invest entirely on a social basis. His zeal is perennially featured these days, now that he’s retired from bad, old Microsoft.

Ironically, in the end at Davos, powerful and lucre-filled as it is, money is the great taboo; it’s what leads to subprime lending and to bad capitalism. Since the love of money is the “root of all evil,” a “competent global economic citizenry” must fight the inherent flaws of capitalism. If we don’t fight capitalism, we are warned, we could end up with Chinese-style authoritarianism.

At Davos, it’s repeatedly said, we can’t do “business as usual” any longer, and most certainly America, which started all this money madness and interventionism, cannot dictate, since the United States is no longer a “hegemon.” A thin vein of anti-Americanism runs through a lot of the content at Davos. It is a cabal of multilateralism with an impresario professor as its progenitor.

It’s interesting to note that, through all the sermonizing and flagellation at Davos, short shrift is given to the classical virtues and religion. Instead, the underlying credo here is the need for more confidence in global government, since finance is an imperfect tool for managing risk in an uncertain world.


Schwab’s Intentions

I went to Davos for the first time in 1988 as a special guest of Herr Dr. Schwab (K-man to his friends). It was fascinating and certainly involved many leaders and business types, mostly from Europe and especially the Third World. Some were on the make and others on the take.

Throughout 1989 Schwab courted me, had me to dinner over and over, invited me to meetings, and pressed me for advice on how to stretch the Forum’s goals and involve more CEOs, particularly top Americans from all walks, across all sectors and every major industry group. In his thick German accent he would say, “Vell, Ted, kunt vie change die velt?” He wouldn’t stop, and at one point he stuck his lieutenants on me as well. One was an attractive woman with long, dark hair, an American named Gail Bidwell. She was good looking and bright, and she and her German counterpart, who was ill with cancer, both kept calling on me in my office at the U.N. Schwab had Lester Thurow (economists refer to him as “less than thorough” for his popularizing tendencies) over from MIT. He invited me to lunch. The minister of finance was in from Pakistan; could I spare time for “an interesting” dinner? The head of the central bank of XYZ was here; could I convene in a few hours? No end.

By 1990 he had asked me to serve on some loony council and to help prepare the agenda for the next annual meeting. What were the “veally big questions, mit ein Q” going to be? he asked in his dreary, thick accent.

His staff consisted mostly of low-level flunkies and hangers-on: very young, with no higher degrees, just yes-men, plus plenty of women (many were sexy and randy with the professor, I later discovered). They reported directly to the Wizard of Oz, as we referred to him behind his back. I was pitching in, adding key names, and eventually they asked me to moderate some sessions at the big confab.

My boss at the U.N. in Geneva, Hinteregger, and others were slightly jealous, as they were not invited. The head of the U.N. was there, and increasingly more and better CEOs were attending. With my invitations we got some of the top Americans to join the ranks. Even Coca-Cola came, and they brought so much Coke with them it could have filled entire lakes with that fizz.

Schwab also had an in-house rag, a glossy vanity magazine with lots of pictures of leaders at his meetings and articles by those same leaders. It had a goofy look and name: World Link. The idea was to link world leaders permanently. He also devised an electronic system meant to do the same, well before its time; it failed. He called it Welkom. Way too Germanic, I thought. Schwab got me to write a few pieces for his publication, on the U.S. economy and on reform in Eastern Europe, and published them with my picture. It was all rather flattering.

The organization was, however, far too Eurocentric, and he knew it and wanted to break out and step up. He knew of my background, history, and work in academia, industry, on Wall Street, in politics, and as a diplomat. He wanted my help and convinced me that we could work together. Pronounce that in German three times!

When we talked privately, Schwab said we were both “thinkers and doers.” He liked to ask people that trick question: which are you, a thinker or a doer? Pick one and you were wrong. Now, I was warned that Schwab used people, ran through directors like water, and was a first-class name-dropper. Some said there was no substance in his doings, just frills, a media fest. To some he was a pompous windbag. In checking him out I found some things I didn’t like. He was a German (born in Ravensburg in 1938, when Hitler was in power), not Swiss, reportedly with strong ties to those who had fled to Switzerland. Like Waldheim, he had Nazi youth in his résumé and tried to hide it. Davos has always been known as a place for the rich and the sick; that much is well established. Not much has changed!


The Nazi Party in Switzerland was headquartered in Davos. In 1936 the assassination of Wilhelm Gustloff, the top Nazi in all of Switzerland, by a Yugoslav Jew named David Frankfurter made international headlines and left the Nazi machine irate. The Nazi connection in Davos is noteworthy given the now-established Swiss complicity in the German war effort, the hidden accounts and stolen goods of Holocaust victims, and the anti-Zionism that continues to this day. The World Economic Forum even in recent years called for a boycott of Israel (before retracting it). Schwab was in total cahoots with the Swiss government. In fact, the Swiss Federal Council paid many of his bills.

Why? Because the WEF strategy was to get people to Switzerland to invest there, to bank there, and to use its central location and supposed neutrality. It was all a clever public relations tool or ploy for the Swiss.

Schwab lacked good American connections and didn’t sell particularly well in the CEO corner offices, with his thick accent, professorial look, and all this mystical (we called it Davosian) talk about a better world and partnerships for this and that, which sounded like, and mostly were, “you give us money and we make you a member; more money and you can be a higher-order member; more yet, and you’re on some board.” He was, simply put, what some people call an old-fashioned snake oil salesman.

But the companies were buying, from Arthur Andersen to A.T. Kearney to Booz Allen Hamilton and hundreds more. The membership consisted of more than 1,001 companies; some were only midsized, but they came from all over the world. He sold memberships as a way for members to meet other members. Clever. Too clever? He took some hits as a grandstander and then many more from the anti-globalists on the Left and Right. Protests mounted in the tiny ski village, and he had to get Swiss troops in to guard everything.

At a certain point, at Christmas 1990, Schwab had me to dinner at his own house in Cologny with his wife and children. It was one of those Swiss chalets on the lakeside, quite large and immaculate. He asked me at dinner if I would consider some arrangement whereby I could join him and go onto the executive board full-time. Money was no problem; he said they would match what I was being paid, although that amount was the highest paid to any employee, so I should keep it quiet. I would get six weeks’ vacation, home leave, and could travel anywhere in the world I needed to go. The offer was interesting, but I had a job, and its term was set. I said I’d think about it and wanted to work with him in some fashion. Klaus is a hard person to say no to, as he is so fawning. He also makes it appear that the noble mission to save the world that he has created is, well, missionary work.

I had to tell Hinteregger, and I knew it would break his heart. So we worked out a deal whereby I could spend a portion of my time working on WEF affairs and gradually shift over. By the end of the following year, I would switch teams, play for Schwab and the Davosians, and appease the gods of business. Schwab meantime had to do some fast moves to get me a Swiss work visa and to tell the U.N. he would not poach any more people. He also went to his Geneva bankers and arranged for me not only to get a mortgage but also the right to buy a house in the Canton de Geneva. These were not small things. The WEF, as a Swiss foundation, has lots of pull with the cantons, especially where it is based and in Graubünden, where Davos is situated. It is not a lightweight by any means.

I became not just involved in but central to the Davos planning and helped set the themes and choose the speakers. Klaus and I made many trips together to the United States and other capitals to get heads of state and captains of industry on board.

I was brought into the super-secret worldwide brainstorming on future forecasts of the global economy. Those sessions brought together chief economists from leading organizations and banks, and certain economic ministers, to spin a story about what lay ahead and where the challenges lay.

At Davos itself, I moderated a half dozen panels and on the big stage held forth as questioner or respondent in the major economic sessions. My favorite included the chairman of the U.S. Fed, the CEO of a global bank, the CEO of Salomon Brothers (my old pal John Gutfreund), the new head of the bank set up for Eastern Europe, a leading French intellectual, the president of the World Bank, and the CEO of Moody’s, the rating agency. I beat up on each of them but let John off the hook. I ended by having each of them play the role of one of their counterparts and tell the “honest” truth. It was a hoot and brought the house down in laughter.

I suggested to Schwab that, because we had so many bankers from around the world coming to Davos, we should create a World Financial Services Forum meeting as a subset. He was afraid of it for some reason, I think because he did not speak “financese.” We decided to have a governors’ meeting with CEOs alone and then, on the last day, open it up to the entire financial services industry. It succeeded wonderfully and completely sold out. I chaired both sessions and played Phil Donahue at the latter, with a roving mike, sticking it literally in people’s faces to get instant responses. Everyone wanted to go to Davos, and this was a new way to include more people and, most critically, collect their lucrative fees. The head of Citibank wondered out loud why nobody had done this before. He knew it was a cash cow.


Bringing in Cold War Adversaries

The two countries that were weakest in representation at Davos were the United States and the USSR (until it broke up in 1991).

I was given a mission. The United States was the easy part. Getting the right people, the stars, the CEOs, and the think tank heads and members of Congress and the administration was just a matter of pecking away and showing them the materials and noting the benefits: personal and institutional. The toughest sell was their most precious commodity—their time itself. But with spouse programs, superb skiing, Audi driving schools, and all the socializing and partying, who wouldn’t want to join the world’s greatest schmooze fest in an Alpine village? Besides, it was tax-deductible, and the fees were paid to a foundation!

The most powerful elites in the history of the world all gathered in one place? And that place is Davos? The media certainly ate it up. They enjoyed themselves and the after-hours drinking and dancing more than the participants themselves. They came in droves. It made their jobs easy having so many world leaders in one small town, captive to give “exclusive” interviews.

We let companies break stories there. Countries could do the same, but usually only those that had paid some huge tab to sponsor a reception or a gala (complete with famous rock bands), or that initiated some new policy by announcing it to the world. Turkey’s then prime minister asked one year for admission to the EU, which caused quite a stir; the Turks even made peace with the Greeks. The Alpine countries announced an initiative to save the Alps another year. The Aga Khan announced his new Central Asian University. The West Germans announced unification there, and the bold one-mark policy. It brought the roof down. The U.N. unleashed a program for corporate citizenship there. Every year the U.N. or the World Bank came up with some new, far-fetched proposal. Most of these initiatives lasted about a year, some two, and then fizzled out, soon to be replaced with a new, far more urgent one. Those, too, fizzled in about the life span of a newt.

The other country that was underrepresented was the USSR. They were suspicious of market capitalism and didn’t quite know how to use such a forum. But when glasnost hit and the leaders bent to the West, the doors swung wide open.

I was sent to Moscow three times, twice with the perky Maria Livanos, Klaus’s go-to girl: a rich, bossy, very organized Greek who lived for the Davos energy boost. She was a real groupie. We talked the Soviets into both sending a high-level delegation of top ministerial leaders to interact at Davos and doing what we called a “country forum” in Moscow that would bring hundreds of investors and their companies to learn more about the opportunities and changes sweeping the country. We said cash in the form of foreign direct investment would flow the next week. They ate it up. We ate too much caviar! I even bought—well, traded Marlboros for—a few extra pounds of the fish eggs to eat at home. A box of cigs would buy just about anything in the USSR in those days.

The first Soviet delegation appeared at Davos in 1990, and I was asked to be its official host. I went out on the Zurich airport tarmac to greet the arriving Aeroflot flight on a red-carpeted runway. When he stepped off the plane, the delegation head, the all-powerful Arkady Volsky, head of all industry in the USSR, gave me a bear hug and presented me with the most beautiful Russian red fox hat you have ever seen. He greeted me in a dacha-laced Russian accent. With him were 20-odd CEOs of all the giant Kombines: oil and gas, autos, agriculture, steel, timber, minerals, you name it. A few of their top pro-market economists who spoke good English were also along for the ride and the free show.

The Forum paid their freight completely, and boy could these guys (all men, but for one female translator) drink! It was a demanding group, but we bonded, and everyone wanted to meet and hear from them. Volsky’s sole demand was that they be put up at a good hotel with a swimming pool. He was a daily swimmer. We accommodated. We also organized a giant powwow with a Soviet at each table of eight, and the room was so overflowing that people gathered along the outer walls. Everyone at Davos wanted to know what the market opening meant for him or her and for their corporate interests in a future Russia.

Six months later we put on the show in Moscow, and more than 250 Western business leaders seeking to do business in the new Russia eagerly attended and paid big bucks (OK, Swiss francs) to be there, to have dinner in the Kremlin, and to seize the day. Deals were struck and relationships forged.

At that time many of the pundits wondered about the future of perestroika and of Gorbachev, our man in Moscow. One of my favorite little stories from these visits to Moscow, in the dead of frozen winter, concerns the accommodation we were given. Volsky’s people met us at the airport, for safety and effect, in a chauffeured ZIL limousine. Rushed off to the elite Little Oktoberist hotel, we were greeted like VIP party members of the Politburo.

I don’t think any foreign dignitaries had stayed at this small, elite, off-limits Party hotel before. It was posh and filled with goods in a non-Soviet sort of way, but it was meant for upper-echelon apparatchiks from the nomenklatura. After a late dinner and the obligatory Stoli, I turned in for the night. At about 2 a.m. my phone rang, and I awoke from a deep sleep. I couldn’t figure out who would be calling me at such an hour. It was a soft woman’s voice, and she said with a delicate Russian accent, “Do you vant kompanie?” My brain lit up, and I spontaneously shot back, “No, and I don’t want photos, either,” and hung up. It was still the old Soviet Union.

At lunch the next day, I had a reserved table with Boris Fyodorov, who had received a Western education at, of all places, the University of Glasgow (where Adam Smith once taught), and who had just become head of the central bank. He was polite, had good questions, and seemed somewhat embarrassed.

We had caviar, the best from the Caspian Sea, the finest Soviet champagne, and Georgian red wine. We had three delightful courses of fish, beef, and pork with roasted fresh potatoes and many vegetables. Dessert was a fine chocolate torte, served with strong coffee. It was the best meal I had ever had in the Soviet days. When the bill came, he took it after I pleaded to pick it up. The cost was 78 rubles. The exchange rate may have been one to one officially, but we got ours for exactly 78-to-1. The elegant lunch cost all of one dollar. How long could this last? I wondered. Was the USSR ready to implode?

Behind the scenes at Davos and in the various national capitals, however, real business got done. At Davos and at the country forums around the world, real businesspeople, top executives, paid hard cash, and not just to be photographed or for bragging rights. Well, not entirely: they came and spent money to get access to important people and to do deals.


The Global Bazaar of the Bizarre

I was involved in dozens of those country forums: in Eastern Europe, India, Brazil, and, most prominently, the United States. The U.S. forum had been poorly attended and deadly dull. It was hard to get top speakers; these events are a dime a dozen in Washington and happen nearly every other week. So we reinvented the U.S. country forum, and I got all of my old pals and their bosses, and their bosses’ bosses, to come, if only for a few hours. It had a who’s who cast. With these players we were able to bait the hook and pull in all the gullible Europeans and Third Worlders who badly wanted closer access to the real American leaders and power brokers. It worked and became another cash cow. We held it at the Willard Hotel, and it oozed power.

Behind closed doors is where all the collusion and cartelization took place, ha-ha. At Davos, two bankers from UBS and Swiss Bank Corp. met, and later you read of a merger. The steel company of Holland sold out to its counterpart in India. Investment dollars flew like sand in the desert wind. The Japanese wanted plants in America; hello, Mr. Governor (you give tax holiday, let’s shake on it). The Chinese wanted factories, and companies from Nortel to Motorola lined up just to shake hands. Big Pharma met and colluded on patents and pricing. Did I see that? I swear the oil companies had a cartel going. And ADM, it was said, cooked the price of corn fructose right there. Want to sell airplanes? Autos? You name it, even armaments. It was a global bazaar of high-altitude wheeling and dealing with high price tags. And Schwab got not just praise but perhaps a cut, or at least more sponsors, in the process.

There were closed dinners only for Goldman Sachs clients and lunches with Price Waterhouse where the latest and greatest author on some exotic subject held sway. The Business Exchange office had people waiting to get in to make appointments with a potential supplier, vendor, or joint-venture partner. There was a fee for that service; did I mention it? You could rent a Soviet reformer or a university president; any and everything was for sale. Jeff Sachs, the notorious Harvard economist, was there rounding up country clients for his reform and anti-IMF packages. Whenever we heard he had signed someone up, it was time to “short” that country’s debt, as I knew his advice would lead in just one direction: down.

Bono and the movie-star set were parading as intellectuals and begging for donations for their favorite causes. Angelina Jolie in a hot tub, some swami in a headdress talking about inner spirituality, a German theologian talking interfaith dialogue: it was all so Davosian.

It was all there, like a marketplace, the Agora. Mr. Zia, have you met Mr. Singh? Oh, you two are enemies? Well, not here in Davos. We all get along and do business. Jews and Arabs not allowed to meet? No one knows that here. Everyone had a so-called “project” to sell. And all the while the cash registers went cha-ching for the impresario, the Wizard of Oz. They were not just stroking his ego and bowing to Swiss acumen but coughing up fees, donating again and again.

There was a lounge in the upper reaches of the huge concrete Congress Hall, and a busty Texan, a former Miss Texas, I believe, worked it, serving coffee of every delight and catering to the “needs” (need one ask?) of the delegates as well. Massages, rubdowns, she knew how to please, and the lounge always seemed full for some odd reason, even at eight in the morning.

It was hard to keep going at that pace for days on end. There was partying, receptions, and dancing late into the wee hours of the morning. Is that the young Mr. Baja I see dancing with the Swissair stewardess(es)? Did I mention that Klaus handpicked the prettiest stewardesses, who were assigned to Davos as escorts? A lot of older men had that “thank you, madam” look on their faces in the morning briefings. There was considerable one-upmanship, too: who has the biggest wallet, deal, and penis kind of talk.


One final tale of Davos involves a former employee, an Irish-American journalist from Chicago, who drank far too much. He also laughed a lot.

John had taken a job at the International Labor Organization in Geneva as a press officer after his Davos stint, and he invited me to dinner one night at a less-than-reputable restaurant in Paquis near the red-light district. He had someone I “had” to meet. When I got there, we had drinks at the bar, and he took me to a backroom to meet Sergei, a Russian. We broke bread and exchanged pleasantries.

Near the end of the meal, Sergei said, “I have an offer for you; would you be willing to work for us in exchange for money? We like your access to people, leaders, and businessmen, and it could be of use to us.” I got up and said on my way out, “No thanks, and I don’t appreciate such KGB solicitations.” John seemed disappointed, as he clearly was on the take.

The two most secret items at the World Economic Forum were the budget and the VIP list and its attachment, noting their “guests,” i.e., who they were sleeping with. The budget was not a public document, and it showed the income at well over $100 million. Less than half came from membership fees; more than half was an outright line-item gift from the Swiss federal government, and that didn’t even include the vast sums of money that were spent on security, military, and otherwise.

When some protests materialized one year, those costs went through the roof. Best of all was the super-secret list I mentioned above. I once joked that I had mistakenly given that list to the press. Had it been leaked to the equivalent of People magazine, the world would have known the next day who was in bed with whom, both boys and girls, and would have noticed that many were, well, not exactly married. No one thought it was funny, but of course, I was only kidding.

The other favorite story I can personally relate is the battle over pricey real estate. Naturally, the biggest CEOs and heads of state wanted the best rooms. What’s new? But there are only so many of them to go around in a small ski village like Davos, or in next-door Klosters, which was viewed as second-class. They came at a steep price, and priority went to the loudest complainers. The president of Peru was lodged in prime top-floor space in the best hotel, the Belvedere (and with a mistress, I might add). When the then-CEO of Salomon Brothers arrived, his accommodations were shabby, unfit for the king of the money game. Under normal diplomatic protocol, a head of state outranks a CEO, but not at Davos. We kicked the president out of his room with apologies so we could please and satisfy the CEO and his perky wife, Susan. Money talks and power walks. Wicked Solly traders probably shorted Peru’s debt the next day just to rub it in.

The lesson in this Swiss power tale is, simply, never trust before you verify. It worked well for the Gipper, after all.

Weekend Long Read

This essay is adapted from “How to Keep From Losing Your Mind: Educating Yourself Classically to Resist Cultural Indoctrination” (TAN Books, 384 pages, $24.95)

There Are Great Books

All lists measuring greatness are subject to reconsideration—the truly greats remain on the list with the passing of centuries.

Those classics that are called the Great Books are most closely associated with Mortimer J. Adler and Robert Hutchins.1 When Hutchins became president of the University of Chicago in 1929, he hired Adler to teach philosophy in the law school and the psychology department. Upon arriving, Adler, rather brashly he admits, recommended to Hutchins a program of study for undergraduates using classic texts. Adler had taught in the General Honors program at Columbia University, begun in 1921 by professor John Erskine. Hutchins asked him for a list of books to be read in such a program. When Hutchins saw the list, he told Adler that he had not encountered most of them during his student years at Oberlin College and Yale University. Hutchins later wrote that unless Adler “did something drastic he [Hutchins, referring to himself] would close his educational career a wholly uneducated man.”2 Hutchins remained president for 16 years before serving as chancellor until 1951, and the year after he stepped down, they did something drastic.

In 1952, Adler and Hutchins published the Great Books of the Western World in 54 volumes.3 Adler and Hutchins included the 74 authors they considered most important to the development of Western Civilization.4 The influence of their Great Books movement on American culture was considerable for several decades and continues to this day.

Their selection of books from over a half-century ago has held up rather well. For example, I compared it to the 2007 list published by journalist and cultural critic J. Peder Zane, who asked 125 leading writers to list their favorite works of fiction.5 The 20 most common titles named by the writers were:

Anna Karenina, Leo Tolstoy (1877)
Madame Bovary, Gustave Flaubert (1856)
War and Peace, Leo Tolstoy (1869)
Lolita, Vladimir Nabokov (1955)
The Adventures of Huckleberry Finn, Mark Twain (1884)
Hamlet, William Shakespeare (1600)
The Great Gatsby, F. Scott Fitzgerald (1925)
In Search of Lost Time, Marcel Proust (1913-27)
Stories of Anton Chekhov (1860-1904)
Middlemarch, George Eliot (1871-72)
Don Quixote, Miguel de Cervantes (1605, 1615)
Moby Dick, Herman Melville (1851)
Great Expectations, Charles Dickens (1860-61)
Ulysses, James Joyce (1922)
The Odyssey, Homer (9th century B.C.)
Dubliners, James Joyce (1914)
Crime and Punishment, Fyodor Dostoevsky (1866)
King Lear, William Shakespeare (1605)
Emma, Jane Austen (1816)
One Hundred Years of Solitude, Gabriel Garcia Marquez (1967)

Adler and Hutchins included all these books except for the two by Nabokov and Marquez. In spite of their absence, modernity is well-represented in the Great Books by Bertolt Brecht, Samuel Beckett, and William Faulkner, among others.

Zane’s survey refutes the claim that lists of “greats” reflect only the opinions of middle-aged white men. The 125 writers interviewed by Zane would satisfy any diversity requirement. If someone asked the same number of philosophers, historians, or scientists about their favorite books, I predict the results would be much the same: the new list would contain a majority of acknowledged classics with the addition of some more recent and specialist books.

Poetry

Classic poetry is well represented in the Great Books: Homer, Virgil, Shakespeare, Dante, Milton, and Eliot are there, among others. But poetry read in the context of the Great Books can be approached merely as a source of ideas, another link in the history of ideas. That is a mistake. Poetic language is a linguistic fusion of form and content, a creation that resists being plucked for concepts to fill in the philosopher’s timeline.

Someone might object that poetry is no longer relevant, since no one reads poetry anymore except when it is assigned in a classroom. However, the poet and critic Dana Gioia reports that poetry has undergone a cultural revival outside the academy, where poets have often found a steady paycheck. Gioia calls it “a tale of two cities”; a new generation of poets is finding its voice in the real world:

They work as baristas, brewers, and bookstore clerks; they also work in business, medicine, and the law. Technology has made it possible to publish books without institutional or commercial support. Social media connects people more effectively than any faculty lounge. An online journal requires nothing but time. Any person with an iPhone and a laptop can produce a professional poetry video. Any bookstore, library, cafe, or gallery can host a poetry reading.6

A 2017 study by the National Endowment for the Arts shows that 11.2 percent of American adults, 28 million people in the United States, still read poetry.7 But young adults in particular, ages 18 to 24, are leading the return, with 17.5 percent reporting regular poetry reading, more than double the 8.2 percent found in the previous such study in 2012. Regardless of how many poetry books you have on your shelves, or how many you see at your local bookstore, poetry thrives. Human beings need to sing, to express themselves beyond the limits of discursive reasoning.

Like music, poetic language engages the reader at an emotional level that goes untouched by philosophical reasoning. Before the philosophers, it was Homer who instructed the Greeks about gods and heroes. But his epics were sung, not read. The Iliad and the Odyssey were sung by bards who held them in memory for centuries before they were written down.


Wilfred Owen

If someone assigned me the job of introducing poetry to neophytes, one of the first books I would assign my students is the poetry of Wilfred Owen (1893-1918). His life was short because he went to war, dying exactly one week before the end of World War I. After college, Owen went to Bordeaux, where he taught both English and French. He witnessed the beginning of the war and two years later returned to England, where he was commissioned as a second lieutenant. His experience in battle is recorded in his poetry, which was inspired, in part, by time spent in a hospital with the already established poet Siegfried Sassoon.

Owen could have stayed home but returned to the trenches where he died four months later. I’m amazed at what Owen wrote before turning 26. In “Disabled,” he writes about a soldier returned home without his legs, in a wheelchair, watching football from the sidelines:

About this time Town used to swing so gay
When glow-lamps budded in the light blue trees,
And girls glanced lovelier as the air grew dim,—
In the old times, before he threw away his knees.
Now he will never feel again how slim
Girls’ waists are, or how warm their subtle hands.
All of them touch him like some queer disease.8

In “Strange Meeting,” Owen imagines a soldier jumping into a crater in no man’s land and finding the corpse of an enemy soldier staring at him: “By his dead smile I knew we stood in Hell.” The live soldier addresses the dead one: “Strange friend,” I said, “here is no cause to mourn.” But the corpse interrupts:

“None,” said that other, “save the undone years,
The hopelessness. Whatever hope is yours,
Was my life also; I went hunting wild
After the wildest beauty in the world”9

The glory of war as told by Homer, Virgil, Herodotus, Thucydides, and Caesar was read in school by soldiers on both sides of the trenches. What this soldier found instead was “the pity of war, the pity war distilled.” With his thoughts of glory extinguished by death, he imagines himself back in battle:

“Then, when much blood had clogged their chariot-wheels,
I would go up and wash them from sweet wells,
Even with truths that lie too deep for taint.
I would have poured my spirit without stint
But not through wounds; not on the cess of war.
Foreheads of men have bled where no wounds were. 

“I am the enemy you killed, my friend.
I knew you in this dark: for so you frowned
Yesterday through me as you jabbed and killed.
I parried; but my hands were loath and cold.
Let us sleep now. . . .”

A spiritual sense pervades these lines: “I would have poured my spirit without stint.” The bloody “chariot-wheels,” a reference to Homer’s Iliad, are cleansed “from sweet wells,” like Jacob’s Well (John 4:5–6), a pilgrim site in the ancient city of Nablus for centuries. The dead soldier tells the one living, “I am the enemy you killed, my friend,” and then offers him his forgiveness with the words, “Let us sleep now.” At the end of this ghastly encounter, Owen concludes on a note of nobility and common cause. 

Reading Owen answers our questions about what men experience in battle, how they are able to face death, and how they live with what they have seen. The best literature takes us to places and circumstances we can only vaguely imagine and gives us access to the interior lives of people we would otherwise never know.

Making Lists

An indispensable guide to the classics is The Western Canon: The Books and School of the Ages by literary scholar Harold Bloom. He organizes his book around 26 select authors, including the poets Shakespeare, Dante, Chaucer, Milton, Goethe, Whitman, Dickinson, Neruda, and Pessoa. His appendices, however, contain catalogs of other books he considers canonical, organized by era (Theocratic, Aristocratic, Democratic, and Chaotic) and by country. Bloom’s book is one of the best sources I have found for becoming familiar with the names and works of important writers around the world, and his lists are remarkably helpful to the reader.

Although classic texts are included in some high school and college curricula, it’s the rare student who can deeply appreciate King Lear or Macbeth as a teenager or young adult. The worldly profundity of Flaubert’s Madame Bovary or Edith Wharton’s Age of Innocence, for example, is lost on all but a few teenagers, as it was lost on me. Books like Moby Dick, The Scarlet Letter, and The Great Gatsby pose the same challenge. We want to introduce young readers to the classics, but, frankly, these, like many other classics, are books for grown-ups.

Philosophy and theology are central to any version of the Great Books; they address discursively the questions that have arisen in the lives of every person since Adam and Eve: meaning, morality, truth, justice, love, death, and eternity. Some philosophers and theologians, however, are more easily approached than others. There are always technical terms to master; for example, in Greek philosophy, the concept of Logos (“word,” “reason,” or “order”), which also plays a central role in Christianity: “In the beginning was the Word” (John 1:1, λόγος, Logos). Every philosopher and theologian wrote within a historical tradition. Readers who pick up, for example, St. Thomas Aquinas will quickly see that he quotes from Scripture, the Greek philosophers, the Patristic Fathers, Roman writers, and Arab theologians. However, with some patience and access to online reference works, readers can acquire enough background knowledge to read Aquinas intelligently.

Later philosophers such as Kant, Hegel, and Heidegger are more difficult and test the patience of the nonspecialist. With the reader in mind, I discuss mainly the ancients and medievals in How to Keep From Losing Your Mind. These works are foundational for understanding Western civilization, and their influence is seen throughout the philosophy and theology that followed. 


Greats and Classics

Greatness is measured in many ways, and any list of greats should be subject to criticism. I remember asking my then-college dean at a dinner party to name his top 10 novels, and he answered that “top 10 lists” were “nonsense.” A bit surprised, I replied, “But they are such great conversation starters!” He reluctantly agreed, but I had made a more important point than I realized at the time. Reliable lists are the answer to “Where do I go next?” 

Let’s imagine a situation that I am sure has happened over and over: You’re listening to the car radio, flipping through the channels; you hear a snatch of music that makes you stop, and you listen, enthralled, to the end. (This has happened to me more than a few times.) You wait to hear the announcer name the piece and the composer. You hear, “That was the ‘Violin Concerto’ of Samuel Barber.” “Who is Samuel Barber?” you ask yourself. What else did he write? Does anyone else write music that sounds like that? The Internet has made the answers very easy to find. You can read about Samuel Barber (1910-1981) and see a list of his works and the best available recordings. Search further and you can find other composers who, like Barber, wrote music in a “late-Romantic” style. Good lists are invaluable: they tell me what I don’t know.

Barber’s “Violin Concerto” inspires me to offer my description, not definition, of greatness. A book, a film, or a musical composition is great when you think to yourself, “I want to listen to all the music (or read the books and watch the movies) by this artist right away.” You may consider this too subjective, but I know I’m not alone in having that thought after reading Tolstoy, Shakespeare, Dostoevsky, Proust, Homer, Dickens, or Jane Austen; listening to Brahms, Dvorak, or Stravinsky; or watching the films of Kurosawa, Welles, or Eisenstein. As the late Harold Bloom put it, “I think that the self, in its quest to be free and solitary, ultimately reads with one aim only: to confront greatness.”10

Let me clarify one thing: I am using the words great and classic as though they were interchangeable. There’s a distinction. Take, for example, All Quiet on the Western Front by Erich Maria Remarque. It’s a well-known classic novel about World War I. Remarque portrays the absurdity of the war for the soldiers on both sides who were expected to “go over the top” day after day. Remarque’s novel, published in German in 1928, had the good fortune of being translated into English the following year. Then the novel was made into an Oscar-winning 1930 film, “All Quiet on the Western Front,” directed by Lewis Milestone.11 Remarque’s book is still very readable, a classic novel about war and the First World War in particular.

However, when you compare Remarque’s novel to The Magic Mountain (1924) by Thomas Mann, the limitations of Remarque’s novel become evident. Whereas Remarque explores the experience of life in the trenches of World War I, Mann’s scope is more universal, possessing layers of meaning about the shattering of European civilization as a result of the First World War. The Magic Mountain depicts a turning point in Western culture through the fate of one man, Hans Castorp, who lives in a sanatorium for seven years trying to recover his health.

The most important criterion to use in determining greatness is the opinion of experts. Everyone has personal favorites; arguing about why, say, one film is better than another is part of the delight of filmgoing. Experts, however, are qualified to make the hard call: to answer the question, where does this film or that book rank in comparison with the others? When I want to buy a new car, I ask the opinion of the mechanic who has been working on my cars for 20 years. Anyone who knows what is required to be an expert at anything will recognize the depth of knowledge needed to measure a book, a movie, or a musical composition against all that has come before.

But it should be said, experts are not always right. Consider the list of Nobel Prize winners for literature. The first literature prize, given in 1901, went to the French poet René François Armand (Sully) Prudhomme (1839–1907) “in special recognition of his poetic composition, which gives evidence of lofty idealism, artistic perfection and a rare combination of the qualities of both heart and intellect.”

Prudhomme was a strange choice given the competition. In the preceding decades, Dostoevsky had published The Brothers Karamazov (1880), and the next year Henry James published The Portrait of a Lady, followed by The Bostonians in 1886. Twain’s Huckleberry Finn appeared in 1884, Robert Louis Stevenson’s Dr. Jekyll and Mr. Hyde in 1886, and Thomas Hardy’s Tess of the d’Urbervilles in 1891. In addition, Tolstoy published his “Kreutzer Sonata” in 1890. Searching for Prudhomme, I found only one of his books in English translation, the 1875 Les vaines tendresses.

Imagine being Sully Prudhomme when he received the letter from the Nobel committee and realized he had beaten out Dostoevsky, Tolstoy, Twain, Stevenson, Hardy, and Henry James. He may also have thought of other writers active at the time: Guy de Maupassant, Emile Zola, Rudyard Kipling, W. B. Yeats, Paul Verlaine, Arthur Rimbaud, August Strindberg, Henrik Ibsen, and George Bernard Shaw. In the work of these “also-rans” we find inexhaustible stories of the human condition, universal in scope, all told with faultless command of language. All lists measuring greatness are subject to reconsideration—the truly greats remain on the list with the passing of centuries.

Endnotes

1) There were precursors to Adler and Hutchins’s Great Books. For example, in 1886, Sir John Lubbock published his list of “The Best Hundred Books, by the Best Judges” in the Pall Mall Gazette. See W. B. Carnochan, “Where Did Great Books Come From Anyway?” Stanford Humanities Review, vol. 6, 1995. Sir John’s list can be found in Alex Johnson, “The Book List: Meet Sir John Lubbock, Godfather of the Must-Read List,” Independent, April 24, 2018.

2) Mortimer J. Adler, Philosopher at Large: An Intellectual Biography (New York: Macmillan Publishing Co., Inc., 1977), 129.

3) Mortimer J. Adler and Robert Hutchins, Great Books of the Western World, 54 vols. (Chicago: Encyclopædia Britannica, Inc., 1952). A complete list of books can be found at “Adler’s Great Book List.”

4) I had the privilege of knowing and working with Dr. Adler later in his life, and I contributed several essays to his series of volumes, The Great Ideas Today.

5) J. Peder Zane, The Top Ten: Writers Pick Their Favorite Books (New York: W. W. Norton & Company, 2007).

6) Dana Gioia, “Introduction,” Best American Poetry 2018 (New York: Scribner, 2018).

7) Sunil Iyengar, “Taking Note: Poetry Reading Is Up—Federal Survey Result,” National Endowment for the Arts, June 7, 2018.

8) Wilfred Owen, The Collected Poems of Wilfred Owen (New York: New Directions Publishing Company, 1993), 67.

9) The Collected Poems of Wilfred Owen, 35.

10) Harold Bloom, The Western Canon: The Books and School of the Ages (New York: Riverhead Books, 1994), 485.

11) Hilton Tims, Erich Maria Remarque: The Last Romantic (New York: Carroll & Graf Publishers, 2003), 55–60, 69–72.

Weekend Long Read

This essay is adapted from “America’s Revolutionary Mind: A Moral History of the American Revolution and the Declaration That Defined It”
(Encounter Books, 447 pages, $32.99)

Americanism and the Spirit of American Liberty

The time has come for Americans to rediscover the philosophy of Americanism, a philosophy which says that, despite our differences of race, ethnicity, class, sex, gender, sexual orientation, religion, or place of origin, all men and women are equally free, morally sovereign, and self-governing.

In 1782, just as the American War of Independence was coming to an end, Hector St. John de Crèvecoeur, who had come to North America from France in 1755 and by 1765 had settled in New York, published Letters from an American Farmer. In it, he asked a fascinating and enduring question: “What then is the American, this new man?” Crèvecoeur’s question suggests that 18th-century Americans were somehow different from all other peoples, and thus he invites us, some 230 years later, to reflect on the nature and meaning of America.

Crèvecoeur’s new man was the existential embodiment of Thomas Jefferson’s “American mind.” He practiced and made real the principles expressed in the Declaration of Independence. The moral, political, social, and economic philosophy associated with the American mind is sometimes reduced to a single word: “Americanism.” The “ism” suggests that being an American is part ideology, part way of life, part attitude, and even part personality. Broadly defined, Americanism is that philosophy which identifies the moral character and sense of life unique to the people of the United States, and which, under distinctive conditions, was translated into practice by millions of ordinary men and women in late 18th- and early 19th-century America.

Interestingly, the idea of Americanism has no foreign counterpart. No other nation has anything quite like it. We may speak of a French, an Italian, or a Persian culture, but there is no Frenchism, Italianism, or Persianism. Americanism, by contrast, is more than just a culture steeped in historically evolved folkways (i.e., the forms and formalities associated with speech dialects, food, music, dress, architecture, etc.). America’s traditional folkways are no doubt different from those of any other nation, but such cultural accouterments do not capture the essence of the American mind. My book attempts to explain the revolution in thought that culminated in the creation of what Jefferson called the American mind. We now conclude with a brief overview of the world created by Crèvecoeur’s new man—the world later described by Alexis de Tocqueville in his magisterial Democracy in America.

As we now know, the content of the American mind was synonymous with the self-evident truths of the Declaration of Independence. The Declaration forever associated the American way of life with a social system that recognized, defined, and protected as sacrosanct the rights of individuals. The greatest achievement of the American Revolution was to subordinate society and government to this fundamental moral law.

The radical transformation in thought and practice that followed would have enormous implications for the development of a new American society over the next century. The revolutionaries’ ethical individualism promoted the idea that human flourishing requires freedom—the freedom to think and act without interference, which means security from predatory threats against one’s person or property. Freedom requires government, but only government of a particular sort—the sort that protects individuals from force and coercion and that defines a sphere of liberty in which individuals are free to pursue their own welfare and happiness. Within that protected sphere, American revolutionaries and their 19th-century heirs created a new world unlike anything anywhere else.

The revolutionaries’ natural-rights republicanism was the product of a relatively recent revolution in thought that had its source in 17th-century England, originating in the Enlightenment ideas of Bacon, Newton, and, most importantly, Locke. These ideas were first injected into the intellectual life of the colonies in the early 18th century through the universities and the book trade; polemical writings such as Cato’s Letters, by John Trenchard and Thomas Gordon, then democratized these ideas through the newspapers. The radical individualism associated with the natural-rights philosophy armed the Americans with an entirely new morality that would provide the foundation for an unprecedented political, social, and economic system.

The moral philosophy of the American Revolution was closely associated with the idea of self-government—that is, with the idea that individuals must govern their own lives in the fullest sense of the term. Prior to the American Revolution, wrote John Taylor of Caroline, “the natural right of self-government was never plainly asserted, nor practically enforced; nor was it previously discovered, that a sovereign power in any government was inconsistent with this right, and destructive of its value.”

Ultimate sovereignty rests with the individual, not with the government. After the Revolution, “the natural right of self-government” was made “superior to any political sovereignty.” The Americans now believed, said Tocqueville, “that at birth each has received the ability to govern himself.”

In this new world, the individual replaced the government as the primary unit of moral and political value. This meant sovereign power began with self-governing individuals and extended outward in concentric circles of voluntary association, but never beyond the reach of a man’s control. Thomas Jefferson described the relationship between individual self-government and the various layers of political government this way:

The way to have good and safe government, is not to trust it all to one, but to divide it among the many, distributing to every one exactly the functions he is competent to. Let the national government be entrusted with the defense of the nation, and its foreign and federal relations; the State governments with the civil rights, laws, police, and administration of what concerns the State generally; the counties with the local concerns of the counties, and each ward direct the interests within itself. It is by dividing and subdividing these republics from the great national one down through all its subordinations, until it ends in the administration of every man’s farm by himself; by placing under every one what his own eye may superintend, that all will be done for the best.

All government in postrevolutionary America (local, state, or federal) was grounded on the free political association of individuals who retained ultimate authority and sovereignty over its power. Political power was pushed down to the local level. The Americans, Tocqueville observed, “have a secret instinct that carries them toward independence . . . where each village forms a sort of republic habituated to governing itself.” Government was to have no power that was not explicitly delegated to it by the people and for specific purposes. Or, as John Taylor put it, the “sovereignty of the people arises, and representation flows out of each man’s right to govern himself.”

The ideal of individual self-government set in motion forces that weakened the centralizing tendencies of government power. “What has destroyed liberty and the rights of man in every government which has ever existed under the sun?” asked Jefferson. His answer was clear: “The generalizing and concentrating all cares and powers into one body.” The men who designed America’s constitutional system understood and accepted the truth that Lord Acton’s famous maxim would much later capture: “Power tends to corrupt and absolute power corrupts absolutely.” In 1788, James Madison wrote in Federalist 48 that “power is of an encroaching nature, and . . . ought to be effectually restrained from passing the limits assigned to it.” A few months later, Jefferson noted pithily, “The natural progress of things is for liberty to yield and government to gain ground.” 

Thus the great question confronted by America’s revolutionary constitution-makers was this: How could the grasping power of government be tamed and harnessed in a way that would serve the legitimate functions of government? The Founders’ revolutionary solution to the problem posed by the expansionary nature of power was to subordinate governments (the rule of men) to constitutions (the rule of law). By constitutionalizing their governments, they would constrain arbitrary political rule with the rule of law—laws universal and objective, known and certain. Government officials would be denied discretionary power in applying the law, and the law applied to one man would apply to all men. 

“In questions of power,” Jefferson declared, men were not to be trusted, and so they should be bound “from mischief by the chains of the Constitution.”

Between 1765 and 1788, American revolutionaries invented and then implemented the architectonic idea of the American Revolution: the idea of a written constitution as fundamental law. Written constitutions would capture and guide liberty-promoting subsidiary principles, such as the separation of powers, bicameralism, federalism, judicial review, bills of rights, and various limitations on executive, legislative, and judicial power. These were the principal means by which individual rights and the rule of law would be protected and promoted. By explicitly and exactly defining both the power that may be exercised by government and the rights of individuals, written constitutions would create protected spheres of human action that were knowable and predictable.

The Founders’ vision of government was the original version of what is sometimes called the “night-watchman” state—a government strictly limited to a few necessary functions, supported by low taxes, a frugal budget, and minimal levels of regulation. Ideally, government’s role was to protect individuals in their rights by serving as a neutral umpire, sorting out and judging conflicting rights claims. Even Alexander Hamilton, the founding generation’s greatest advocate of energetic government, saw the purpose and power of the national government as strictly limited to a few functions: “the common defence of the members—the preservation of the public peace as well against internal convulsions as external attacks—the regulation of commerce with other nations and between the states—the superintendence of our intercourse, political and commercial, with foreign countries.”

Jefferson offered the classic statement of the limited purpose of government in his First Inaugural Address: “Still one thing more, fellow citizens—a wise and frugal government, which shall restrain men from injuring one another, which shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned. This is the sum of good government.” The classical liberals of the early republic supported a form of government that would ensure their liberty and property by prohibiting murder, assault, theft, and other crimes of coercion and fraud. James Madison summed up the entire revolutionary generation’s definition of a “just government” as one that “impartially secures to every man whatever is his own.”

Jefferson was particularly sensitive to the tendency of government officials to intervene in both the spiritual and material lives of their fellow citizens. This is why, on the one hand, he claimed that the “opinions of men are not the object of civil government, nor under its jurisdiction,” and, on the other, that the acquisition, production, ownership, and trade of men’s property is not the proper purview of government. Jefferson therefore supported both the separation of church and state as well as the separation of economy and state. He did not think that government should be in the business of religion, nor did he think it should be in the business of business. He strongly inclined toward supporting a policy of religious and economic laissez-faire.

Jeffersonian Republicans envisioned a government that would function without a standing army, that would eliminate debt and dramatically reduce federal taxes and tariffs, that would shun public works projects and internal improvements, and that would reduce controls and regulations on the economy. The Founders’ emerging view of the purpose and role of government was most clearly described by William Leggett, one of the great antebellum Locofoco individualists. “Governments,” Leggett announced, “possess no delegated right to tamper with individual industry in a single hair’s-breadth beyond what is essential to protect the rights of person and property.”

Like Leggett, most Americans of his time distrusted political power, believing that a good society was defined by the paucity of its laws. Accordingly, in the 18th and 19th centuries, there was little government in America relative to the major countries of Europe. In fact, government at all levels before the Civil War was Lilliputian compared to what followed in the postbellum period. Political power—what little of it there was—was concentrated in the states and localities.

In 1839, John L. O’Sullivan, editor of The United States Magazine and Democratic Review, memorably captured the postrevolutionary view of government:

The best government is that which governs least. No human depositories can, with safety, be trusted with the power of legislation upon the general interests of society, so as to operate directly . . . on the industry and property of the community. Legislation has been the fruitful parent of nine-tenths of all evil, moral and physical, by which mankind . . . since the creation of the world has been self-degraded, fettered and oppressed.

The only proper purpose of legislation, according to O’Sullivan, was to protect individual rights. In domestic affairs, the action of legislatures

should be confined to the administration of justice, for the protection of the natural equal rights of the citizen, and the preservation of social order. In all other respects, the voluntary principle, the principle of freedom . . . affords the true golden rule. The natural laws which will establish themselves and find their own level are the best laws. This is the fundamental principle of the philosophy of democracy, to furnish a system of administration of justice, and then leave all the business and interests of society to themselves, to free competition and association—in a word, to the voluntary principle.

Government in America before the Civil War had limited power: Its primary responsibilities were to protect the nation from foreign invasion, to preserve the peace, and to adjudicate disputes among citizens. Much beyond that, it dared not go. William Leggett summed up the prevailing political worldview with the following maxim, which he recommended “be placed in large letters over the speaker’s chair in all legislative bodies”: “do not govern too much.”

Indeed, too much government was not a feature of life in the early republic. As William Sampson, a recent émigré from Ireland, observed, “the government here makes no sensation; it is round about you like the air, and you cannot even feel it.” Americans, said Leggett, were an independent lot who wanted “no government to regulate their private concerns; to prescribe the course and mete out the profits of industry.” They wanted “no fireside legislators; no executive interference in their workshops and fields.” In America, wrote the 19th-century individualist Josiah Warren, “Everyone must feel that he is the supreme arbiter of his own [destiny], that no power on earth shall rise over him, that he is and always shall be sovereign of himself and all relating to his individuality.” America’s new-model man mostly just wanted to be left alone.

Wherever there was a frontier in the early republic, government was especially thin, light, and weak. American pioneers, having broken free from the mother country, began a process of declaring independence from their own national and then their state governments, and, finally, from each other as they migrated in ever-increasing numbers to the western frontier, which continued to move toward the setting sun until the close of the 19th century. What was happening politically in late 18th- and early 19th-century America was unlike anything else seen anywhere in the world.

In the end, the new world order created by America’s Founding Fathers asked only three things of its citizens: first, that they not violate each other’s rights; second, that they live self-starting, self-reliant, self-governing lives by practicing certain uniquely American virtues and character traits (e.g., independence, initiative, industriousness, frugality, enterprise, creativity, adventurousness, courage, and optimism); and third, that they deal with each other by means of persuasion and voluntary trade. In return, the free society made certain promises to those who lived by the American creed: it promised to protect all citizens’ freedom and rights from domestic and foreign criminals; it promised to govern by the rule of law; and it promised a sphere of unfettered opportunity that made possible their pursuit of material and spiritual values undreamed of in other societies.

The changes wrought by the Revolution were truly momentous. The individual-rights revolution of 1776 launched the greatest moral, social, and political transformation not just in American history but also in world history. A new civilization—a republican civilization—was born, free from the dead weight of the past, free from the encrusted hierarchies of old-regime Europe, free from artificial privilege and haughty arrogance, free from ostentation, decadence, and corruption, free from vicious, medieval laws, free from overweening state power, and free from the cynicism of low expectations.

The society Tocqueville discovered in America did not experience a brutal revolutionary upheaval after 1776. There were no guillotines or revolutionary calendars that began with Year One. Instead, the moral, social, political, and economic revolution that followed the end of the War of Independence and the Treaty of Paris was unlike anything ever seen before. The revolution in thinking, principles, and sentiments that preceded 1776 resulted in a gradual, evolutionary, but thorough transformation in American life that blended the Revolution’s libertarian philosophy and the circumstances of life on an ever-expanding frontier.

The American Mind in Practice

The American Revolution began as a revolution in ideas, but its ultimate success required that theory be translated into practice. As Alexis de Tocqueville noted, the “American mind turns away from general ideas; it does not direct itself toward theoretical discoveries.” The whole purpose of the Declaration’s ideas was to liberate men to act.

The way of life associated with the American spirit of liberty was thus born of a fortuitous meeting between the ideas of men like Thomas Jefferson and James Madison and the actions of men like Daniel Boone and Davy Crockett. As the ideas of the Revolution spread westward through the Cumberland Gap, they were lived day by day on the frontier. Over the course of a century, the American idea of freedom and the experience of life on the frontier worked together to create and define the uniquely American spirit—a spirit defined by honesty, adventure, energy, daring, industry, hope, idealism, enterprise, and benevolence. American-style frontier republicanism was unlike anything ever seen anywhere in the world.

In Tocqueville’s America, “hardy adventurers”—avatars of Crèvecoeur’s new man—left the shelter of their “fathers’ roofs” and plunged “into the solitudes of America,” where they sought a “new native country.” They marched westward toward the “boundaries of society and wilderness.” Late 18th- and 19th-century American pilgrims chased a frontier that followed the direction of the setting sun. Living alone and far from the comforts of civilization, the “pioneer hastily fells some trees and raises a cabin under the leaves.” While all “is primitive and savage around him,” he brings with him the ideas that freed him to leave in the first place: he “plunges into the wilderness of the New World with his Bible, a hatchet, and newspapers.”

Through this process, according to Tocqueville, the Americans are habituated “little by little to govern themselves.” Frontier life was partly defined by the absence of government (including legislatures, courts, police, and armies), all of which eventually followed. Until the end of the 19th century, a decent, law-abiding frontier American could pass through life and hardly see or feel a trace of government beyond the post office and the marshal. For the most part, the state left men and women alone. Despite the poverty and barbarism of his condition, America’s new man knows “what his rights are and what means he will use to exercise them.”

In the half-century following the Revolution, these pioneering adventurers—many of whom, at least in the first wave, were veterans of the War of Independence—created a society the likes of which had never been seen before. The Americans destroyed the remnants of the ancien régime, with its artificial hierarchies and unchosen duties, regulations, and social stasis; in its place they created a dynamic society defined by equal rights, freedom, the pursuit of happiness, competition, and social mobility. They built both that society and its governments on the premise that individuals are self-owning, self-making, and self-governing.

Once men came to believe that they owned and controlled their own lives, free from the burden of overbearing government power, they began to pursue their own self-interested values and to explore new ways of conducting their lives. Freedom became the rallying cry for those seeking to challenge all forms of authority and to tear down traditional social, political, and economic barriers. In this new world, society preceded government, and the individual preceded society.

The new man who developed along with this new kind of political society was one of entrepreneurial energy and creativity. Nothing contributed more to this explosion of social vitality than the twin principles of freedom and rights. These conjoined ideas represented the most radical and most potent philosophical force let loose by the Revolution.

Within a couple of decades following the Declaration of Independence, the United States became—at least in the northern states—the freest nation in world history (at the same time, paradoxically, that the existence of slavery made it one of the least free). The Revolution brought new producers and consumers into the emerging market economy. It aroused and liberated previously dormant acquisitive impulses, and it freed the “natural aristocracy” promoted by Thomas Jefferson to build a new kind of hustling and bustling society.

It was a society of individuals constantly on the move. The people of the early republic were restless, rootless, and sometimes homeless. It was not uncommon for individuals and families to move—almost always westward—every few years. Nor was it uncommon for them to change jobs and professions. When Tocqueville toured the country, he encountered Americans “who [had] been successively attorneys, farmers, traders, evangelical ministers, doctors.” In Tocqueville’s America,

a man carefully builds a dwelling in which to pass his declining years, and he sells it while the roof is being laid; he plants a garden and he rents it out just as he was going to taste its fruits; he clears a field and he leaves to others the care of harvesting its crops. He embraces a profession and quits it. He settles in a place from which he departs soon after so as to take his changing desires elsewhere.

Should his private affairs give him some respite, he immediately plunges into the whirlwind of politics. And when toward the end of a year filled with work some leisure still remains to him, he carries his restive curiosity here and there within the vast limits of the United States. He will thus go five hundred leagues in a few days in order to better distract himself from his happiness.

In 1817, George Flower, an Englishman recently arrived on the Illinois prairie, was not convinced that the American people always lived up to the moral principles of the Declaration, but he was certain that the open space of the frontier environment aided in spreading freedom: “The practical liberty of America is found in its great space and small population. Good land, dog-cheap everywhere, and for nothing, if you will go for it, gives as much elbow-room to every man as he chooses to take,” Flower wrote. He continued: “Poor laborers, from every country in Europe, hear of this cheap land, are attracted to it, perhaps without any political opinions. They come, they toil, they prosper. This is the real liberty of America.” The distinctively American ethos associated with frontier life held that individuals are morally sovereign and that they therefore must be self-starting, self-governing, and self-reliant in order to succeed in life. They just needed, as Flower noted, a little elbow room.

Life on the frontier unleashed in America’s new man a primordial energy that would conquer a broad and wild continent and build a new kind of meritocratic society, defined by the natural aristocracy of ability, inventiveness, daring, and hard work. The new frontier ethos broke down Old World social barriers and hierarchies, replacing them with a social order that judged men not by their circumstances at birth but by what they made of their lives. The American frontier was the refuge where ambitious men and women could escape their past and the burden of living for others—the guilt, the pressure, and sometimes the compulsion to live one’s life for family, tribe, church, king, or state. It was the place where men and sometimes even women could reinvent themselves. Only in America could a man who came from nothing prove his ability and worth and become a man of accomplishment and wealth. Only in America could there be such a creature as the “self-made man.”

The ideal of the self-made man was a reality for many 19th-century Americans. Ironically, the best exposition of the self-made man as ideal and fact is found in the speech of a runaway slave, Frederick Douglass. In an 1859 lecture titled “Self-Made Men,” the former slave defined in unmistakable terms the story and the qualities of the quintessential American:

Self-made men . . . are the men who owe little or nothing to birth, relationship, friendly surroundings; to wealth inherited or to early approved means of education; who are what they are, without the aid of any favoring conditions by which other men usually rise in the world and achieve great results. In fact they are the men who are not brought up but who are obliged to come up, not only without the voluntary assistance or friendly co-operation of society, but often in open and derisive defiance of all the efforts of society and the tendency of circumstances to repress, retard and keep them down. They are the men who, in a world of schools, academies, colleges and other institutions of learning, are often compelled by unfriendly circumstances to acquire their education elsewhere and, amidst unfavorable conditions, to hew out for themselves a way to success, and thus to become the architects of their own good fortunes. They are in a peculiar sense, indebted to themselves for themselves.

Douglass observed America’s self-made men all around him, and of course, he was the living embodiment of the ideal. Notably, he did not think that the success of the self-made man was due to accident or good luck. Instead, success in life could be explained, he insisted, “by one word and that word work! work!! work!!! Work!!!!”

A few Europeans who came to America were nonplussed by what they saw. In 1787, Charles Nisbet, a Scottish academic recently arrived in Pennsylvania, described the American Revolution as having “commenced on just and solid grounds.” It was “carried on,” he continued, “by honest, enlightened, noble-minded patriots” who were “prompted by a sincere love of rational liberty.” Still, this Old World professor did not quite fully understand or appreciate the new world created by the Revolution, which was made up of “discordant atoms, jumbled together by chance, and tossed by inconstancy in an immense vacuum.” Less than impressed, Nisbet complained that America lacked “a principle of attraction and cohesion.” He was mistaken.

This new American creed of “rational liberty” did not mean that its practitioners lived alienated and crabbed lives in atomistic isolation from one another. It did not mean that Americans were indifferent or unneighborly to each other, that they did not help each other during times of crisis or distress. Quite the opposite. These rugged American individualists joined together in bonds of civic friendship as they experienced and lived through seemingly never-ending disasters like floods, fires, tornadoes, earthquakes, native attacks, and diseases such as smallpox, measles, tuberculosis, yellow fever, and influenza. The moral and political philosophy by which they lived their lives was no antisocial creed that confined men to their own spiritual cages. Together, as friends and neighbors, the westward-moving Americans built—literally—cabins, houses, barns, roads, canals, libraries, schools, colleges, villages, towns, and cities. Freedom produced unparalleled social cooperation.

From the Revolution to the Civil War, American society developed its own principles of attraction and cohesion that naturally melded its individual atoms into a common culture. The country was unified through a commercial system of natural liberty and a harmony of economic interests. Instead of anarchy, the natural system of liberty encouraged and generated new associations and bonds of civil cooperation.

Tocqueville observed that “Americans of all ages, all conditions, all minds constantly unite.” Ordinary Americans voluntarily united with each other to form all kinds of benevolent associations in order to improve their material and spiritual lives. According to Tocqueville, the Americans not only have “commercial and industrial associations in which all take part, but they also have a thousand other kinds: religious, moral, grave, futile, very general and very particular, immense and very small; Americans use associations to give fêtes, to found seminaries, to build inns, to raise churches, to distribute books, to send missionaries to the antipodes; in this manner they create hospitals, prisons, schools.”

This, then, was the great paradox of American society: it united radical individualism with tight bonds of civil association. The former was responsible for the latter. It was e pluribus unum.

What made this revolutionary society unique was that the force and authority of government and the ties of land and blood were not what held it together, as was true of most countries of the Old World. The American people were united instead by self-interest, rights, freedom, money, benevolence, voluntary associations, and—most importantly—by a common moral ideal that was expressed so eloquently in the ringing phrase: “We hold these truths to be self-evident . . . ”

The American experiment in self-government truly was a novus ordo seclorum.

An Afterthought

Tragically, though, the revolutionary society founded in 1776 with the Declaration of Independence and refounded in 1863 with the Emancipation Proclamation is now fraying. Americans are deeply divided.

America is more fractured today than at any time since the Civil War. The American people are so polarized in 2019 that we might now speak of the “Disunited States of America” or the “United States of Hate”! 

Americans are irredeemably divided over Donald Trump, impeachment, capitalism, socialism, democracy, pronouns, abortion, marriage, immigration, climate change, reparations, Brett Kavanaugh, the Covington kids, free speech, drag queen reading hour, political correctness, and many other topics.

All of our cultural institutions—the schools, Boy Scouts, the NFL, the Oscars, soap operas, late-night television, Broadway, stand-up comedy—have become politicized and weaponized. We can’t even come together over the flag and national anthem.

From Charlottesville to Berkeley, street riots in the last two years have turned into violent pitched battles between armed gangs of masked street thugs representing the so-called alt-Left and the alt-Right. Ideologically motivated mass shootings are taking place in our schools, synagogues, churches, malls, and nightclubs. Some of our democratically elected politicians are calling for violence and some are the targets of harassment and violence. We are on the verge of lawlessness.

To make matters worse, few Americans believe that our political institutions are working. Just about half the nation thinks that the election of Donald Trump was illegitimate, and the other half thinks the Democratic Party is engaged in a silent coup to overturn the democratic election of 2016.

It is not an exaggeration to suggest that liberal and conservative Americans hate each other. There are now two Americas and the division is not between “haves” and “have nots” or between whites and blacks. The coastal, blue state, Ivy-educated ruling class has contempt for flyover, red state, trailer park deplorables and vice versa.

And where is all this leading us? This much is certain: to paraphrase Abraham Lincoln, a nation that hates itself cannot stand.

The time has come for Americans to rediscover the philosophy of Americanism, a philosophy which says that, despite our differences of race, ethnicity, class, sex, gender, sexual orientation, religion, or place of origin, all men and women are equally free, morally sovereign, and self-governing. This is the philosophy that inspired hundreds of millions of people from around the world to immigrate to America.

America once was and one hopes still can be a nation for the ambitious, hard-working, creative, productive, adventurous, and entrepreneurial. That is the meaning of Americanism and the spirit of American liberty.

Weekend Long Read

A Science-Based Case for Ending the Porn Epidemic

We know what porn does to the brain, because the medical science is solid. Because social science is much softer, we can’t know for certain what causal impacts porn has on society, if any. But once we realize that we have to be much more humble in this area, we can still make prudential judgments.

They say the first step is admitting you have a problem. I think many readers of this article will respond with outrage, and many will find that it says things they already knew to be true—and I think these two groups will largely overlap. The most powerful obstacle to confronting a destructive addiction is denial, and collectively we are in denial about pornography.

Since it seems somehow relevant, let me state at the outset that I am French. Every fiber of my Latin, Catholic body recoils at puritanism of any sort, especially the bizarre, Anglo-Puritan kind so prevalent in America. I believe eroticism is one of God’s greatest gifts to humankind, prudishness a bizarre aberration, and not so long ago, hyperbolic warnings about the perils of pornography, whether from my Evangelical Christian or progressive feminist friends, had me rolling my eyes. 

Not anymore. I have become deadly serious. A few years ago, a friend—unsurprisingly, a female friend—mentioned that there was strong medical evidence for the proposition that online pornography is a lot more dangerous than most people suspect. Since I was skeptical, I looked into it. I became intrigued and kept following the evolving science, as well as online testimonies, off and on. It didn’t take me long to understand that my friend was right. In fact, the more I delved into the subject, the more alarmed I became.

The central contention of this article is that, however we might feel morally about pornography in general, pornography as it has actually existed for the past decade or so—since the emergence of “Tube” sites providing endless, instant, high-definition video in 2006, and the proliferation of smartphones and tablets since 2007—is fundamentally different from anything we’ve previously experienced.

A scientific consensus is emerging that today’s porn is truly a public health menace: its new incarnation combines with some evolutionarily-designed features of our brain to make it uniquely addictive, on par with any drug you might name—and uniquely destructive. The evidence is in: porn is as addictive as smoking, or more, except that what smoking does to your lungs, porn does to your brain. 

The damage is real, and it’s profound. The scientific evidence has mounted: certain evolutionarily-designed features of our neurobiology not only make today’s porn profoundly addictive; the resulting addiction—which, at this point, must afflict a majority of males—has been rewiring our brains in ways that have had a profoundly damaging impact on our sexuality, our relationships, and our mental health.

Furthermore, I believe that it is also having a far-reaching impact on our social fabric as a whole—while it is impossible to demonstrate any cause-and-effect relationship scientifically beyond a reasonable doubt when it comes to broad social trends, I believe the evidence is still compelling or, at least, highly suggestive.

Indeed, it is so compelling that I now believe that online porn addiction is the number one public health challenge facing the West today.

If the evidence is so strong and the damage so deep and pervasive, why is nobody talking about this? Well—why did it take so long for society to admit, and respond to, the evidence on the harms of smoking? In part because, even when emerging scientific evidence is quite solid, in the best of worlds there is always a lag between specialists making a discovery and academic gatekeepers embracing it, thereby granting it the social stamp of authority of scientific consensus. In part it is because, for many of us, our background assumption is that “porn” means something similar to Playboy and lingerie catalogues. In part, it is because of widespread (and, in my view, mistaken) assumptions about what important values like free speech, gender equality, and sexual health entail. In part it is because moneyed interests have a deep stake in the status quo. And in very large part, it is because most of us are now addicts—and like good addicts, we are in denial.

Porn Is the New Smoking

I’ve been a smoker since my early 20s. I have said things like, “I can quit any time,” “I just do it because I enjoy it,” “My grandmother smoked for decades and she’s perfectly healthy,” while feeling secret shame for not being able to climb a flight of stairs without losing my breath. No form of delusion is more powerful than self-delusion. 

Anti-porn advocates like the phrase “porn is the new smoking.” Call today the beginnings of the “Mad Men” stage of the process, then: the time when most people still see smoking as harmless, but the scientific evidence is starting to pile up, and the drip-drip-drip of new data is just starting to be heard beyond specialist circles of academia and the few kooks who had a hunch all along that this was nastier than it looked. We can hope, some time not too long from now, we will look at today’s jokes about PornHub with the same mix of bafflement and shame we feel when we see 1950s ads with slogans like “More Doctors Smoke Camel Than Any Other Cigarette.”

So, what is this new scientific data?

The first step is to look at the evidence on the effect of porn on the chemistry of the brain. It is an understatement to say that mammals, particularly males, are wired by evolution to seek out sexual stimulation. When we get it, a deep part of our brain called the reward center, which we share with most mammals and whose job it is to make us feel good when we do things we are evolutionarily designed to seek, releases the neurotransmitter dopamine. 

Dopamine is sometimes called “the pleasure hormone,” but this is an oversimplification; it would be more accurate to call it “the desire hormone” or “the craving hormone.” Crucially, the release of dopamine starts not with the reward itself, but with the anticipation of reward. The reward center’s job is to make us crave those things which we are evolutionarily designed to crave—starting with sex and food.

It’s not exactly a scoop that humans are wired to seek out sexual stimulation, is it? No, but today’s internet porn plays differently with our reward system. The design of mammals’ reward system causes something scientists call the Coolidge Effect. 

It is named after an old joke: President Calvin Coolidge and the First Lady are separately visiting a farm. Mrs. Coolidge visits the chicken yard and sees the rooster mating a lot. She asks how often that happens, and is told, “Dozens of times each day.” Mrs. Coolidge responds, “Tell that to the president when he comes by.” Upon being told, the president asks, “Same hen every time?” “Oh, no, Mr. President, a different hen every time.” “Tell that to Mrs. Coolidge.”

Hence, the Coolidge Effect. If you place a male rat in a box with several female rats in heat, the rat will immediately begin to mate with all the female rats, until it is utterly exhausted. The female rats, still wanting sexual congress, will nudge and lick the drained animal, but at some point he will simply stop responding—until you put a new female in the box, at which point the male will suddenly awaken and proceed to mate with the new female. 

It’s a good (albeit corny) joke. But the Coolidge Effect is also one of the most robust findings in science. It has been replicated in all mammals, and most other animals (some species of cricket don’t have it). The evolutionary imperative is to spread genes as widely as possible, which makes the Coolidge Effect a very suitable adaptation. Neurochemically, this means that our brain produces more dopamine with novel partners. And—this is the crucial bit—on Tube sites, our brain interprets each new porn scene as a new partner. In one study, researchers showed the same porn film repeatedly to a group of men and found that arousal declined with each successive viewing—until a new film was shown, at which point arousal shot right back up to the level recorded at the first viewing.

This is one of the critical ways in which today’s porn is fundamentally different from yesterday’s: unlike Playboy, online porn provides literally infinite novelty with no effort. With Tube sites and a broadband connection, you can have a new clip—what your brain interprets as a new partner—literally every minute, every second. And with laptops, smartphones and tablets, they can be accessed everywhere, 24/7, immediately.

This can be likened to what Nobel laureate Nikolaas Tinbergen called a superstimulus: something artificial that provides a stimulus that our brains are evolutionarily wired to seek, but at a level way beyond what we are evolutionarily prepared to cope with, wreaking havoc on our brains. Tinbergen found that female birds could spend their lives struggling to sit on giant fake, brightly-colored eggs while leaving their own, paler eggs to die. An increasing number of scientists believe the obesity epidemic is the result of a superstimulus: products like refined sugar are textbook examples of an artificial version of something we’re designed to seek, in a concentrated form that doesn’t exist in nature and that our bodies aren’t prepared for. 

Evolution could not prepare our brains for the neurochemical rush of an always-on kaleidoscope of sexual novelty. This makes online porn uniquely addictive—just like a drug. Some scientists believe that the reason why chemical drugs can be so addictive is that they trigger our neurochemical reward mechanisms linked to sex; heroin addicts often claim that shooting up “feels like an orgasm.” A 2010 study on rats found that methamphetamine use activated the same reward systems and the same circuitry as sex.

(Along with dolphins and some higher primates, rats are among the only mammals that mate for pleasure as well as reproduction; and humans’ sex reward system is neurologically basically the same as rats’, since it is one of the least evolved parts of our brains. These factors make the little critters excellent test subjects for experiments on the neurochemistry of human sexuality. Yes, when it comes to sex, we men are basically rats. The more you know . . . )

What’s more, no one is born with a reward circuitry wired in their brain for alcohol, or cocaine—but everyone is born with a hardwired reward system for sexual stimulation. Addiction research has shown that not all people have a predisposition to addiction to chemical substances—only if you have a genetic predisposition can your brain’s reward system be tricked into mistaking a particular chemical for sex. This is why some people become alcoholics even after being exposed to moderate amounts of alcohol, while others (like me) can drink heavily without developing an addiction, or why some people can have just one cigarette at a party and then not worry about it while others (like me) must have their nicotine fix every day. By contrast, all of us have a predisposition to addiction to sexual stimulus. 

Another well-established evolutionary mechanism is something called the bingeing effect. We evolved under conditions of resource scarcity, which meant it was evolutionarily advantageous to have a reward system programmed to give us a very strong drive to binge whenever we hit a motherlode of something. But putting mammals wired for the bingeing effect in an environment of abundance can wreak havoc on their brains. (The bingeing effect has also been linked to obesity.)

If our reward system interprets each new porn clip as the same thing as a new sexual partner, this means an unprecedented sort of stimulus for our brain. Not comparable to Playboy, or even ’90s-era dial-up downloads. Even decadent Roman emperors, Turkish sultans, and 1970s rock stars never had 24/7, one-click-away-access to infinitely many, infinitely novel sexual partners.

The combination of a pre-existing natural circuit for neurochemical reward linked to sexual stimulus and the possibility of immediate, infinite novelty—which, again, was not a feature of porn until 2006—means that a user can now keep his dopamine levels much higher, and for much longer periods of time, than our brains can possibly handle without real and lasting damage.

Theory vs. Practice in Today’s Porn

So, that’s the theory. What about the practice? The evidence has been gradually piling up; at this point, we can say that the scientific evidence that online porn works on our brains just like cocaine or alcohol or tobacco, while recent, is very strong. 

A consensus has been slow to emerge in part because of a broader issue: addiction researchers traditionally have been reluctant to use “addiction” as a label for behaviors that don’t involve chemical substances, understandably so since our therapeutic culture tends to put many things under the label “addiction.” We all collectively rolled our eyes when prominent men felled by #MeToo piously blamed “sex addiction” and announced their intention to go to rehab, and we were right to.

But our cultural need to put all sorts of dysfunctional behavior under the addiction label (“shopping addiction”!) is not the same thing as the science of addiction, and advances in brain imaging techniques have tilted the scales in favor of the view that addiction is a brain disease, not a chemical disease.

A landmark 2016 paper by Nora D. Volkow, director of the National Institute on Drug Abuse, and George F. Koob, director of the National Institute on Alcohol Abuse and Alcoholism, in the New England Journal of Medicine, went over new neuroscience and brain imaging data and concluded that it supports the “brain disease model of addiction.” The scientific definition of addiction is shifting to one that looks at specific things happening inside the brain causing people to exhibit certain patterns of behavior, as opposed to whether a patient is hooked on a particular chemical compound.

Online porn fits this model. Slowly, the evidence has been piling up, and it looks, by now, overwhelming: porn does do the same things to our brains as addictive substances.

A 2011 study on the self-reported experiences of 89 males found “parallels between cognitive and brain mechanisms potentially contributing to the maintenance of excessive cybersex and those described for individuals with substance dependence.” A 2014 Cambridge University study scanned people’s brains in an MRI machine; Valerie Voon, the study’s lead author, summarized the findings thus: “There are clear differences in brain activity between patients who have compulsive sexual behaviour and healthy volunteers.”

Another Cambridge University study the same year, this time comparing porn addicts’ responses to psychological tests to the responses of normal subjects, found that “sexually explicit videos were associated with greater activity in a neural network similar to that observed in drug-cue-reactivity studies.” Almost all of the neuroscience studies on this topic find the same result: online porn use does the same things to our brains as drug addiction. 

But don’t take my word for it. Scientists have done many reviews of the literature. Only one review that I am aware of, from 2014, disputes the idea of online porn addiction; it’s the only review that doesn’t look at brain and brain-scan studies, and combines studies from before the Tube era and after. Meanwhile, a thorough 2015 review of the neuroscience literature on internet porn found that “neuroscientific research supports the assumption that underlying neural processes (of online porn addiction) are similar to substance addiction” and that “Internet pornography addiction fits into the addiction framework and shares similar basic mechanisms with substance addiction.” Another 2015 review found that “Neuroimaging studies support the assumption of meaningful commonalities between cybersex addiction and other behavioral addictions as well as substance dependency.” A 2018 review found the same thing: 

Recent neurobiological studies have revealed that compulsive sexual behaviors are associated with altered processing of sexual material and differences in brain structure and function. . . . existing data suggest neurobiological abnormalities share commonalities with other addictions such as substance use and gambling disorders.

In January 2019, a team of researchers published a paper straightforwardly titled “Online Porn Addiction: What We Know and What We Don’t—A Systematic Review” which concluded, “as far as we know, a number of recent studies support (problematic use of online pornography) as an addiction.” It’s hard to call this anything but overwhelming evidence.

The studies have been done in numerous countries and using various methods, from neuroimaging to surveys to experiments, and, to varying degrees, they all say the same thing.

All right, you might respond, online porn addiction may be a real thing, but does that mean we need to freak out? After all, smoking and heroin will kill you, serious cannabis addiction will melt your brain, alcohol addiction will wreak havoc in your life—compared to that, how bad can porn addiction be?

The answer, it turns out, is: pretty bad.

Let’s start with what we all know about addiction: you need more and more of your drug to get less and less of a kick; this is the cycle which makes addiction so destructive. The reason for this is that addiction simply rewires the circuitry of our brain. 

When the reward center of our brain is activated, it releases chemicals that make us feel good. Mainly dopamine, as we’ve seen, but also a protein called DeltaFosB. Its function is to strengthen the neural pathways that dopamine travels, deepening the neural connection between the buzz we get and whatever we’re doing or experiencing when we get it. DeltaFosB is important for learning new skills: if you keep practicing that golf swing until you get it right, you feel a burst of joy—that’s dopamine—while the accompanying release of DeltaFosB helps your brain remember how to do it again. It’s a very clever system.

But DeltaFosB is also responsible for making addiction possible. Addictive drugs activate the same nerve cells activated during sexual arousal, which is why we derive pleasure from them. But we become addicted to them when DeltaFosB, essentially, has reprogrammed our brain’s reward system, originally written to make us seek out sex (and food), to make it seek out that chemical instead. This is why addiction is so powerful: the addict’s urge is really our most powerful evolutionary urge, hijacked. And since online pornography is a sexual stimulus to begin with, we are all predisposed, and it takes much less rewiring for consumption to cause addiction.

As we’ll see, this neurobiological feature of our brains has far-reaching implications for the effect porn addiction has on us: on our sexuality, on our relationships, and even on society at large.

Porn Kills the Urge for Real Sex

Porn is a sexual stimulus, but it is not sex. Notoriously, heroin addicts eventually lose interest in sex: their brains are rewired so that the reward system meant to seek out sex seeks out heroin instead. In the same way, as we consume more and more porn—which we must, since it is addictive and we need more to get the same kick—our brain is rewired so that the reward system that is supposed to respond to sex—to a human in the flesh, to touching, to kissing, to caressing—responds instead to porn.

Which is why we are witnessing a phenomenon which, as best as anyone can tell, is totally unprecedented in all of human history: an epidemic of chronic erectile dysfunction (ED) among men under 40. The evidence is earth-shattering. Since the Kinsey report in the 1940s, studies have found roughly the same, stable rates of chronic ED: less than 1 percent among men younger than 30 and less than 3 percent in men aged 30 to 45.

As of this writing, at least ten studies published since 2010 report a tremendous rise in ED. Rates of ED among men under 40 ranged from 14 percent to 37 percent, and rates of low libido from 16 percent to 37 percent. No variable related to youthful ED has meaningfully changed in the intervening decades, except for one: the advent of on-demand video porn in 2006. It’s worth repeating: we went from less than 1 percent of erectile dysfunction in young men to 14 to 37 percent, an increase of well over an order of magnitude.

Online forums are full of anguished reports from young men about ED. An agonizing story is eerily common: a young man has his first sexual experience; his girlfriend is willing, he loves her or at least is attracted to her, but finds himself simply unable to sustain an erection (though he is perfectly able to maintain one when he watches porn). Many more report a milder version of the same problem: during sex with their girlfriend, they must visualize pornographic movies in their heads to sustain their erection. They are not fantasizing about something they like more: they want to be present, want to be aroused by a real woman’s scent and touch. They understand perfectly well how absurd it is to be more attracted by the substitute than by the real thing, and it distresses them. Some must put hardcore pornography on in the background in order to be able to have sex with their girlfriends (and, incredibly, the girlfriends agree to this). 

Fred Wilson, an internet venture capitalist and thought leader, commenting on digital natives’ uncanny ease with new technology, once quipped that there are only two kinds of people: those who first got access to the internet after they lost their virginity, and those who got it before. My family got the internet in the late ’90s when I was a preteen, and so I belong to the latter category, and yet I feel like Grandpa Simpson when I read those testimonies and compare them to my early sexual experiences (which were, I assure you, quite unremarkable). Then again, back in my day, cars got 40 rods to the hogshead, and online pornography meant an endless maze of text link directories and broken search engines with dead links, slow-to-load images, short video clips you had to download, frustrating paywalls guarding the “good stuff”—not Tube sites with infinite, immediate, streaming, high-definition video, 24/7, in your pocket, for free, driven by powerful algorithms designed by data scientists to maximize user engagement. 

Imagine that we discovered that some bacteria were causing ED to jump from 1 percent to 14 to 37 percent—there would be a national panic, cable news networks would go wall-to-wall, Congress would be holding hearings every day, state and federal prosecutors would be on a hunt for perpetrators to make the Mueller and Starr investigations look like an Amazon customer satisfaction survey. Collectively, we would take very seriously the alarming possibility that anything that could cause something like this was bound to have other, likely profound, effects on human health and social life. 

Last year, an article in The Atlantic went viral after it decried a “sex recession” among young people. Young people are simply having less and less sex. The author, Kate Julian, noted that the phenomenon is not exclusive to the United States but is prevalent across the West—Sweden’s health minister called its declining sex rates (even Sweden is having less sex!) “a political problem,” in part because it risks negatively impacting the country’s fertility. 

Julian also noted that Japan has been a precursor, entering its sex recession earlier—and that it is also “among the world’s top producers and consumers of porn, and the originator of whole new porn genres” and “a global leader in the design of high-end sex dolls.” To her credit, she seriously looked into porn as a probable cause for the sex recession, although none of the voluminous subsequent commentary on the piece I can recall reading discussed this potential cause. 

Now, a conservative like me might think that young people having less sex might not be such a bad thing! And it is true that over the same period, pathologies such as teen pregnancies and teen STDs have declined. Except that whatever the causes, I think we can safely rule out a religious revival or a sudden upsurge of traditional values. Whatever we might believe men should do about their sexual urges, if massive, unprecedented numbers of young, healthy men aren’t having sexual urges at all, that is surely a sign that something is wrong with their health.

Warping the Brain

Perhaps young people aren’t having sex because the men can’t get it up. Or perhaps it’s because women don’t want to have sex with the men who can, but whose brains have been warped by porn.

Because porn does warp the brain. The basic mechanism of porn addiction, you’ll recall, is that when we watch porn, we get a jolt of dopamine, and when we do, we get a follow-up dose of DeltaFosB that rewires our brain to link sexual desire with porn—but not just to any porn. To the porn we watch. 

Remember the Coolidge Effect: the thing that causes a veritable flood of dopamine and makes online porn a “superstimulus” that breaks our brains, unlike Uncle Ted’s Playboy collection, is novelty. 

Like all addictions, online porn has diminishing returns. We need more. We need new. And the easiest way to get it—especially on Tube sites, which, like YouTube and Netflix, “helpfully” provide suggestions all around the video you’re watching, generated by algorithms programmed to keep viewers glued and coming back—is new genres. Just a click away. And there are infinitely many.

In 2014, researchers at the Max Planck Institute used fMRI to look at the brains of porn users. They found that more porn use correlated with less grey matter in the reward system, and less reward circuit activation while viewing sexual photos—in other words, porn users were desensitized. “We therefore assume that subjects with high pornography consumption require ever stronger stimuli to reach the same reward level,” the authors wrote.

Another study, this time from Cambridge University in 2015, also used fMRI, this time to compare the brains of sex addicts and healthy patients. As the accompanying press release put it, the researchers found that “when the sex addicts viewed the same sexual image repeatedly, compared to the healthy volunteers they experienced a greater decrease of activity in the region of the brain known as the dorsal anterior cingulate cortex, known to be involved in anticipating rewards and responding to new events. This is consistent with ‘habituation’, where the addict finds the same stimulus less and less rewarding.” 

But it’s not just sex addicts who show this behavior. When the healthy patients were repeatedly shown the same porn video, they got less and less aroused; but, “when they then view a new video, the level of interest and arousal goes back to the original level.” In other words, it doesn’t take much for the addiction mechanism to kick in, since we’re already genetically predisposed to seek out sexual stimuli.

The bottom line is that the addiction doesn’t just make us crave more, it makes us crave novelty. And what kind of novelty, specifically? Empirically, it’s not just any kind. In practice, what most triggers the Coolidge Effect is what produces surprise, or shock. In other words, like water flowing downhill, we are drawn to porn that is increasingly taboo—specifically, more violent and degrading.

The Disturbing Shock Drive of Porn

Recently, comedian Ryan Creamer became a viral online sensation after it surfaced that he had created a channel on PornHub, the world’s biggest “YouTube for porn” site, where he posted, as Buzzfeed aptly described it, “hilariously wholesome and uplifting videos.” Creamer’s G-rated videos invert online porn clichés, featuring him in his best impression of Ned Flanders, with titles like “I Hug You and Say I Had a Really Good Time Tonight” and “POV FOREHEAD KISS COMPILATION” (“POV” stands for “point of view,” or videos filmed from a character’s first-person perspective; compilations are a rising online porn genre, another data point showing widespread habituation: even a new video doesn’t have enough novelty, so we need quick-cut montages).

None of the commentary pointed out the obvious implication: his stunt captured people’s imagination precisely because almost all of PornHub—what its sophisticated algorithms know its viewers want—is not just pornographic in some abstract sense, but nasty, shocking, and degrading. 

One of Creamer’s videos is titled “I, Your Step Brother, Decline Your Advances but Am Flattered Nonetheless”; last year, Esquire reported that “incest is the fastest growing trend in porn.” (Tube sites ban videos that explicitly refer to incest, but they are still full of videos featuring “stepdads” and “stepmoms” and “half-brothers” that everyone understands to mean “dads,” “moms,” and “brothers.”)

Another rising popular genre has been so-called “interracial” porn, which nearly always means a specific type of interracial congress: black men and white women. The genre is inevitably based on the worst racial stereotypes and imagery. And interracial porn not only has been getting more popular, and more degrading to women, but more racist. As conservative writers who opposed Trump in 2016 found out from their Twitter mentions, a newly popular genre is “cuckolding,” which involves a white man watching his wife or girlfriend have sex with a black man (or several). When mainstream media outlets notice the phenomenon, it is taken as evidence of white Americans’ deep racism. No doubt buried racial attitudes must play a role, but consider the trendline; if hidden racism is the main cause, why should racist porn suddenly explode in popularity while most surveys say racial attitudes are either holding steady or slowly improving? If you keep in mind the sudden popularity of incest porn, the hypothesis that it is widespread desensitization due to addiction which is causing the rise becomes much more plausible. 

It’s worth pausing to note the denial-driven disconnect between what we talk about and what we all know to be happening. Earlier this year, the country went into a moral panic when it was discovered that the governor of Virginia had once, as a medical student, worn blackface as part of a costume; meanwhile, there is a massively popular and fast-growing genre of entertainment that makes minstrel shows look like a racial sensitivity seminar, and almost nobody talks about it.

Shock is what best triggers the Coolidge Effect, and taboo-breaking is shocking, by definition; it is a Pavlovian response to shock and surprise from our rat-like reward system. If we had a deep societal taboo against humping tables, table-humping porn suddenly would be exploding in popularity. Instead, we have deep societal taboos against incest, racism . . . and violence against women.

Intensifying the High

Kink dot com is one of the top brands in porn. The studio’s specialty is extreme fetishes related to BDSM. Its trajectory is telling. The site was founded all the way back in the dark ages of the internet, in 1997. Sado-masochism as a sexual fetish is as old as man, of course—the 2nd-century Roman poet Juvenal mocks it in his Satires, for example. But, as best as we can tell, like most fetishes it has only ever appealed to a small minority throughout human history. And indeed, Kink spent the better part of its first decade in existence humming along out of view, a little-known small business serving its niche. 

Then, sometime in the mid-to-late 2000s, the site exploded in popularity, to the point of becoming as close to a cultural phenomenon as a porn site can be. You can trace its sudden growth in popularity—and mainstream appeal. In 2007, the New York Times Magazine profiled the company. In 2009, it received its first mainstream adult industry award. In 2013, the Hollywood actor James Franco produced a documentary about the company.

That same year, the writer Emily Witt wrote a long, meditative first-person essay for the intellectual progressive magazine n+1 on modern sexuality. For her report, among other things, she attended a shoot for “Public Disgrace,” one of Kink’s “channels,” which features, as its tagline says, “women bound, stripped, and punished in public.” The shoots happen in public places like bars or shops that the company rents out for the occasion, and strangers off the street are invited to perform sexual acts on the “bound, stripped” actress.

Kink has expanded and expanded to match its sudden success, going from a handful of channels to, as of this writing, 78, and spawning an array of copycats (many even more extreme, naturally). While the company’s PR materials boast of a feminist, egalitarian, empowering view of sexuality, almost all of its actual content features men degrading women rather than the other way around.

Kink’s rise from niche to marquee just happens to coincide with the arrival of Tube sites in 2006, which are uniquely effective at triggering the Coolidge Effect and turning porn addicts into novelty-seeking machines. It’s important to note that, while an attraction to what you might call “light kink”—fluffy pink handcuffs, a rhinestone-bedazzled blindfold, that sort of thing—has been hovering around in our popular culture for decades, and therefore some version of this has been part of pornography for ages, Kink is the genuine article. It’s not just acting. Women are caned and whipped until they are bruised and red. Not only are the sex acts themselves extreme (you name it, it’s there), but scenes are scripted around the psychological and symbolic, not just physical, degradation of the woman. Fifty Shades of Grey is to Kink as a Hitchcock movie is to a snuff film.

When the films have a storyline, it can usually be summed up with one word: rape. Or two words: brutal rape. It’s one thing to be aroused by a sadomasochistic scene where the sub (as the term of art goes) is shown visibly enjoying the treatment; it’s quite another to be aroused by watching a woman scream in agony and despair as she is held down and violently raped. 

One series of Kink videos is based on the following concept: the pornstar is alone in a room with several men; the director explains to her (and we watch) that if she can leave the room, she gets cash; for each article of clothing she still has on at the end of the scene, she gets cash; for each sex act that one of the men gets to perform on her, he gets cash and she loses money. One has to grant them a devilish kind of cleverness: it lets them enact an actual violent rape with legal impunity. The woman really resists; the men really force themselves brutally on her. Of course, she “consented” to the whole thing, which, somehow, makes it legal. 

Kink is a revealing example because of its particular focus on degradation, and its sudden, inexplicable, overnight jump from a little-known niche site to one of the most popular media brands of any kind on the planet, right after Tube sites appeared. But the key phenomenon is that virtually all pornography, very much including the “vanilla stuff,” has grown more extreme, more violent, and specifically more misogynistic and degrading towards women. Oh, nonviolent pornography still exists, if you can find it. What used to be mainstream is now niche, and vice versa.

I want to carefully unpack this so that what I’m saying isn’t misunderstood. For whatever reasons, male fantasies around female reluctance, around power, coercion, and domination, are as old as life itself (as indeed are female fantasies on these themes). Genres of pornography, and sexual fantasy more broadly, that happen in the grey areas, even dark grey areas, of female consent to sex, have always been around and have always been popular. It’s therefore tempting to look at something like Kink, and the general rise in degrading porn, as simply just another manifestation of that age-old proclivity, and not some new thing. But this is just not true. 

Historically, sexual fantasies that involved some measure of coercion may have aroused many men, but those same men were disgusted by violent rape and brutal degradation. The point is not to “defend” the former or to deny that they represent something dark and condemnable in the human soul—of course they do. The point is simply to say that something has changed, seriously, dramatically, and seemingly overnight. 

We are told that people’s sexual proclivities are hard-wired from birth or perhaps from early childhood experiences, but science says they can and do change. In a famous experiment, researchers sprayed female rats—yes, rats again—with the odor of a dead rat body, which rats instinctively flee from, and introduced virgin male rats. The male rats mated with the females nonetheless—so far, so mammalian. But, crucially, when those same male rats were later placed in a cage with various toys, they preferred to play with the ones that smelled like death. The sexual stimulus had rewired their reward system. In a scientific survey of online porn users in Belgium, 49 percent “mentioned at least sometimes searching for sexual content or being involved in [online sexual activities] that were not previously interesting to them or that they considered disgusting.”

Once you are addicted to online porn, the thing that provides the biggest dopamine jolt is whatever is most shocking. And the reward cycle means you need a bigger dopamine boost every time—something newer, more shocking. And each time, DeltaFosB rewires your brain, creating and strengthening the Pavlovian mechanism by which you do become attracted to those shocking images, and in the process overwriting the neural pathways which link normal sex—you know, nonviolent, non-incestuous—to the reward center. 

Crucially, this overturns the prevailing narrative about porn’s impact on our sexuality. That narrative says the only problem with deviant porn is viewers coming to think “it’s normal,” and that, as long as they are educated that it is not, they can safely enjoy their fantasy without harming themselves or their partners. It would be better if it were so, but the evidence shows that this is dead wrong. Alcoholics don’t drink themselves to an early grave because they somehow haven’t been made aware of enough facts about the dangers of drinking—indeed, they know all too well, and the shame this causes is a classic trigger for more bingeing.

Porn works at the same fundamental level, the level of our primal, rat-like reward center, the part of our brain honed by millions of years of evolution to be the wellspring of our most powerful urges. Porn doesn’t change what we think, at least not directly; it changes what we crave.

Changing What We Crave

In 2007, two researchers set out to run an experiment, initially unrelated to porn, studying sexual arousal in men in general. They tried to induce the subjects’ arousal in a lab setting by showing them video porn, but ran into a (to them) shocking problem: half of the men, who were aged 29 on average, couldn’t get aroused. The horrified researchers eventually identified the problem: they were showing them old-fashioned porn—the researchers presumably were older and less internet-savvy than their subjects.

“Conversations with the subjects reinforced our idea that in some of them a high exposure to erotica seemed to have resulted in a lower responsivity to ‘vanilla sex’ erotica and an increased need for novelty and variation, in some cases combined with a need for very specific types of stimuli in order to get aroused,” they wrote.

Incredibly, porn can even affect our sexual orientation. A 2016 study found that “many men viewed sexually explicit material (SEM) content inconsistent with their stated sexual identity. It was not uncommon for heterosexual-identified men to report viewing SEM containing male same-sex behavior (20.7 percent) and for gay-identified men to report viewing heterosexual behavior in SEM (55.0 percent).” Meanwhile, in its “2018 Year in Review,” PornHub disclosed that “interest in ‘trans’ (aka transgender) porn saw significant gains in 2018, in particular with a 167 percent increase in searches by men and more than 200 percent with visitors over the age of 45 (becoming the fifth most searched terms by those aged 45 to 64).” 

When this phenomenon is discussed at all, the prevailing narrative is that these men are repressed and discover their “true” sexual orientation through porn—except that the men report that the attraction goes away when they quit online porn. 

This is astonishing. The point is not to try to start a moral panic about the internet turning men gay—the point is that it’s not turning them gay. 

But perhaps it’s turning at least some men into something else. Andrea Long Chu is an American transgender writer who writes with admirable honesty about her gender transition and experience. For example, Chu braved criticism from trans activists by writing in a New York Times essay about the links between her gender transition and chronic depression, and denying that her transition operation will make her happy. In a paper at an academic conference at Columbia, Chu asked: “Did sissy porn make me trans?” Sissy porn is a genre—again, once extremely obscure and now suddenly, inexplicably growing into the mainstream—where men dressed like women perform sex acts with men in stereotypically submissive, female roles. Sissy porn is closely related to the genre known as “forced feminization,” which is pretty much just what it sounds like. In a recent book, Chu essentially answers her own question: “Yes.”

It’s unclear—unknowable, perhaps—to what extent Chu’s experience matches up with the increasing rate of gender transitions, but even if her example is purely anecdotal, it should serve to underscore the point: porn rewires our brain at a fundamental level and changes what we crave. And that should alarm us regardless of what we believe about transgender issues.

Porn Also Affects Relationships 

Let’s pause and review: we’ve established that today’s porn is neurochemically addictive like a hard drug, and that this addiction is having a widespread and alarming impact on sexuality, from never-before-seen rates of erectile dysfunction to the growing popularity of extreme fetishes to (potentially) the “sex recession.” That’s surely bad. 

But, to play devil’s advocate, is it really that bad? 

Alcoholism or heroin addiction, say, will wreck not just someone’s sexuality—which they will—but their entire life and the lives of those around them. Directly and indirectly, they are responsible for countless deaths every year. It sounds like we should be concerned about porn, sure, but should we really hit the panic button? 

Well, one preliminary answer is that porn addiction affects our lives beyond just sexuality—which makes intuitive sense since, after all, sex touches all areas of our lives.

First, porn affects addicts’ views of women. The idea that porn is “just a fantasy”—that watching degrading porn doesn’t make one more likely to develop misogynistic or pathological sexual tendencies any more than watching a Jason Bourne movie means you’re likely to start punching and shooting people—may or may not have been true in the Playboy era, but it’s definitely not true now. 

A 2015 literature review looked at 22 studies from seven different countries and found a link between consumption of online pornography and sexual aggression.

An academic review of no fewer than 135 peer-reviewed studies found “consistent evidence” linking online porn addiction to, among other things, “greater support for sexist beliefs,” “adversarial sexist beliefs,” a “greater tolerance of sexual violence toward women,” as well as “a diminished view of women’s competence, morality, and humanity.” 

To repeat: a diminished view of women’s . . . morality, and humanity. What have we done?

Given all of that, from endemic ED to increased sexual fetishism and even misogyny, it should come as no surprise that porn addiction is having a negative impact on relationships. 

A 2017 meta-analysis of 50 studies, collectively including more than 50,000 participants from 10 countries, found a link between pornography consumption and “lower interpersonal satisfaction outcomes,” whether in cross-sectional surveys, longitudinal surveys, or laboratory experiments. 

Another study of nationally representative data found that porn use was a strong predictor of “significantly lower levels of marital quality”—the second strongest predictor of all the variables in the survey. The effect persisted after the authors controlled for confounding variables like dissatisfaction with sex life and marital decision-making, which suggests that porn use correlates with marital unhappiness not because spouses who become unhappy turn to porn, but because porn causes the unhappiness. 

Yet another study, using representative data from the General Social Survey, which polled thousands of American couples in waves from 2006 to 2014, found that “beginning pornography use between survey waves nearly doubled one’s likelihood of being divorced by the next survey period.” Most terrifying of all, the study found the group whose probability of divorce increased the most was couples who initially reported being “very happy” in their marriage and began using porn afterward. 

The rebound effect of porn addiction on girlfriends and wives is very real. Popular culture is adamant that a liberated, open-minded woman must be relaxed about her partner’s use of porn. On “Friends,” that Rosetta Stone of American culture, Chandler’s chronic masturbation during his relationship with Monica was a recurring gag, and each time the show’s writers made a point of showing us that Monica approved. In fact, despite the brainwashing, surveys show that large numbers of women object to their partner’s use of pornography within a committed relationship. Finding out that your partner uses porn is often experienced, if not as a form of betrayal, then at least as a form of rejection—probably made worse by the fact that she “knows” she “can’t” object, and also by the fact that (unlike in the “Friends” era) she also knows that porn almost certainly means violent, degrading, misogynistic stuff (or worse). 

The most obvious negative impact is on body image and self-esteem. A majority of women in one study described the discovery that their man uses porn as “traumatic”; they not only felt less desirable, they reported feelings of lower self-worth. Some women can experience symptoms of anxiety, depression, and even post-traumatic stress disorder.

A 2016 survey of men aged 18 to 29 found

the more pornography a man watches, the more likely he was to use it during sex, request particular pornographic sex acts of his partner, deliberately conjure images of pornography during sex to maintain arousal, and have concerns over his own sexual performance and body image. Further, higher pornography use was negatively associated with enjoying sexually intimate behaviors with a partner.

We can’t prove a direct causal link between porn addiction and the “sex recession,” but come on: even putting aside skyrocketing ED, given what porn addiction does to male sexuality, from the female perspective, sex with a male porn addict sounds like an experiment you don’t want to repeat—and at this point, it’s a fair bet that most young men are porn addicts.

Given all this, while we don’t have enough research yet to make a scientifically conclusive judgment, I strongly suspect a link between male (especially teen) porn use and the widely reported and sudden increase in depression and other neuropathologies among young women. Writing as a former teenage male, I will posit that even in the best of times most teenage males are not the best kinds of human beings, especially for teenage girls; I can scarcely imagine what it must be like to be a teenage girl when close to 100 percent (as we might safely assume) of the potential relationship pool is porn-addicted.

Nor does pornography only affect sexual and romantic relationships. Porn causes loneliness. In part this is true of all addiction, which typically causes powerful feelings of shame that make us want to avoid or even push away other people. Addiction also drives us to antisocial behavior: though I wasn’t able to find a study, there are many online testimonies from people who lost their jobs because they couldn’t stop themselves from visiting porn sites at work. 

According to a study by Ana Bridges, a University of Arkansas psychologist who focuses on porn’s impact on relationships, online porn users report “increased secrecy, less intimacy and also more depression.”

Porn Addiction Causes Brain Damage

Once we understand today’s porn, it makes intuitive sense that it would negatively affect relationships, given its impact on sexuality, views of women, and the impact of any addiction on social life and well-being generally. But what about its effects on the rest of human life? Again, porn is the new smoking—and what smoking does to your lungs, porn does to your brain. How could that not affect everything we do?

How does that work? Remember, compulsive porn use causes the release of the protein DeltaFosB, whose job is to rewire our brains. This is how, over time, addiction doesn’t just make someone crave more and more of something, but also insidiously turns him into a different person. 

Perhaps the most striking and far-reaching discovery in neuroscience over the past 20 years has been the idea of neuroplasticity. Scientists used to think of the brain as a kind of machine, like an extremely intricate clock or circuit board, whose structure is basically settled once and for all, at birth or at some time in early childhood. 

It turns out that our brain is much more complex and organic. It is constantly changing, constantly rewiring itself, constantly transforming. The various functions of our brain are performed by neural pathways, and a useful analogy is that these pathways are like muscles. Aristotle was right—you are what you repeatedly do. That is largely good news, but there is one downside: neuroplasticity is a competitive process. When you “work out” one part of your brain intensely, it will essentially steal resources from nearby areas of the brain to “pump itself up” if those are left dormant.

It’s easy enough to see how that works out when someone suffers from addiction. Every time you light up, or shoot up, or watch porn, that is like an intense “workout” for one set of neural “muscles”—which drains resources away from the rest of the brain. 

Specifically, the release of DeltaFosB that comes with porn use weakens our prefrontal cortex. The prefrontal cortex is everything the rat brain is not; it is because humans have a big prefrontal cortex that we have civilization. This is the thinking part of the brain, which calculates risk, controls impulses, allows us to project ourselves into the future and therefore plan, and handles abstract and rational thinking. In terms of Plato’s famous chariot allegory, which describes reason as a charioteer whose job is to lead two unruly horses, thymoeides, our temperament, and epithymetikon, our base instincts, the prefrontal cortex is the charioteer. 

Neuroimaging studies have shown that addicts develop “hypofrontality,” the technical term for an impaired prefrontal cortex. People with hypofrontality exhibit lower amounts of gray matter, abnormal white matter, and a reduced ability to process glucose (which is the brain’s fuel) in the prefrontal cortex. 

Hypofrontality manifests in a decline in what psychologists call executive function. As the name executive function suggests, this is a pretty important feature of our minds. Executive function includes our decision-making faculties, our ability to control impulses, to evaluate risk, reward, and danger. Yes, just that. Scientists don’t fully understand how addiction causes hypofrontality, but it makes intuitive sense that the two should be linked. Addiction is such a bane because even as our urges for the next hit get stronger, our capacity to control urges weakens. The horses get carried away even as the charioteer’s arms go weak. 

I have found close to 150 brain studies that find evidence of hypofrontality in internet addicts—which, it’s safe to assume, is nearly synonymous with internet porn addicts, at least for males—and more than a dozen that have found signs of hypofrontality in sex addicts or porn users. 

That’s right: porn addiction literally atrophies the most important part of our brain.

A 2016 study split current porn users into two groups: one whose members abstained from their favorite food for three weeks, and one whose members abstained from porn for three weeks. At the end of the three weeks, the participants who had kept using porn (abstaining only from food) were less able to delay gratification than those who had abstained from porn. Because this is a study with a randomly assigned control group, it’s solid evidence for a causal link (rather than just a correlation) between porn use and lower self-control. 

Here are some other cognitive problems that scientific studies have linked to porn use: decreased academic performance, decreased working memory performance, decreased decision-making ability, higher impulsivity and lower emotion regulation, higher risk aversion, lower altruism, higher rates of neurosis. These are all symptoms related to hypofrontality. 

Other studies have found links between porn and high stress, social anxiety, romantic attachment anxiety and avoidance, narcissism, depression, anxiety, aggressiveness, and poor self-esteem. These aren’t direct symptoms of hypofrontality, but it’s easy to see how someone with impaired executive function would be at greater risk of developing any number of those pathologies. The studies generally find that the more porn use, the greater these problems. 

So neuroplasticity means that porn addiction, by strengthening certain neural pathways in the brain, weakens others, especially those related to executive function. 

But there’s another alarming implication of neuroplasticity for porn addiction: while we now know that the brain remains far more plastic at any age than we previously thought, there is still no doubt that, all else being equal, the younger we are, the more plastic our brains. You can learn, say, a foreign language or a musical instrument at any age, but there is a level of skill you will only ever achieve if you start young. Furthermore, when certain pathways are solidified at a young age, they tend to stay that way: while it is still possible to change them later in life, it is much harder. 

The Impact of Porn on the Child Brain

This brings us to another enormous taboo related to porn: say whatever you will about adults consuming it, in theory we all agree that children shouldn’t be exposed to it—yet in reality, we all know just as well that they are. In prodigious amounts. Just as we know that the porn sites do absolutely nothing to prevent kids from consuming it. 

The statistics are terrifying. According to a 2013 Spanish study, “63 percent of boys and 30 percent of girls were exposed to online pornography during adolescence,” including “bondage, child pornography, and rape.” According to the British Journal of School Nursing, “children under 10 now account for 22 percent of online porn consumption under 18.”

A 2019 literature review found the following negative effects, drawing from more than 20 studies: “regressive attitudes towards women,” “sexual aggression,” “social maladjustment,” “sexual preoccupation,” and “compulsivity.” One study found “an increase in incidents of peer sex abuse among children and that the perpetrator commonly had been exposed to pornography in many of these incidents.” The review also found that “studies of girls’ exposure to pornography as children suggest that it has an impact on their constructs of self.” Among other negative effects, studies of teens more specifically found a “relationship between pornography exposure and . . . social isolation, misconduct, depression, suicidal ideation, and academic disengagement.” 

Furthermore, “children of both sexes who are exposed to pornography are more likely to believe that the acts they see, such as anal sex and group sex, are typical among their peers.”

It’s harder to show a direct causal link scientifically, but it still stands to reason that there should be a link between the porn explosion and the widely documented explosion in mental health problems among teenagers.

While the causes of what’s been called a mental health crisis among teenagers are hotly disputed, the actual facts are not: according to the National Survey on Drug Use and Health, an official government survey which looks at a very broad cross-section of Americans—over 600,000— “from 2009 to 2017, major depression among 20- to 21-year-olds more than doubled, rising from 7 percent to 15 percent. Depression surged 69 percent among 16- to 17-year-olds. Serious psychological distress, which includes feelings of anxiety and hopelessness, jumped 71 percent among 18- to 25-year-olds from 2008 to 2017. Twice as many 22- to 23-year-olds attempted suicide in 2017 compared with 2008, and 55 percent more had suicidal thoughts,” writes San Diego State University psychologist Jean Twenge. 

So the teenage mental health crisis began around 2009, right after smartphones and Tube sites changed the nature of porn. Again, not scientific proof of a causal link, but certainly suggestive.

The bottom line is this: given what we know porn does to the brain, and given that we know that the younger the brain the more plastic it is, it is a near certainty that whatever porn addiction does to adults, it’s going to do to minors—except much worse. This is something we must conclude simply from knowing about the basic facts of human neurobiology, even without taking into account any negative psychological effects of exposure of children to hardcore pornography. 

Might Porn Cause Societal Collapse?

I have tried to be as careful as possible and to lay out only carefully drawn scientific arguments. We can, and should, debate morality, but we should be clear about facts. And in a world where a million articles claim everything and its opposite on the basis of some “study,” I wanted to be as precise as possible about what we can know scientifically about porn with a high degree of certainty, versus things we can strongly suspect, albeit not prove. 

We know what porn does to the brain, because the medical science is solid. Because social science is much softer, we can’t know for certain what causal impacts porn has on society, if any. But once we realize that we have to be much more humble in this area, we can still make prudential judgments.

Remember the sex recession? It seems that Japan is a precursor in all kinds of recession: just as it went first into the zero interest rate economic environment that the rest of the rich world has been experiencing since 2008, and which looks more like a new permanent state with each passing day, Japan also entered its sex recession a decade before us. Japan also got broadband internet earlier than the rest of the world. Could it be that Japan is an example of what’s likely to happen to us if we don’t do something about porn addiction? 

Since Japan got broadband internet, the younger generations have gone through significant social changes. As The Atlantic’s Kate Julian wrote in her article on the sex recession: “In 2005, a third of Japanese single people ages 18 to 34 were virgins; by 2015, 43 percent of people in this age group were, and the share who said they did not intend to get married had risen, too. (Not that marriage was any guarantee of sexual frequency: A related survey found that 47 percent of married people hadn’t had sex in at least a month.)” 

In Japan, this new generation of sexless men—and the Japanese sex recession is caused by men’s lack of interest, to the vocal dismay of young Japanese women, if media reports are to be trusted—are known as soushoku danshi, literally “grass-eating men”—in a word, herbivores. The epithet was originally coined by a frustrated female columnist but, incredibly, the herbivores aren’t offended and most of them are happy to identify as such. 

Given Japan’s population decline, the herbivores, who have become a massive subculture, are a subject of national debate in Japan, Slate’s Alexandra Harney reports. And what seems to define the herbivores is not just that they have no interest in sex, it’s that they don’t seem to be interested in much of anything at all. 

They tend to live with their parents. After all, it’s hard to find a place to live when you don’t have a steady job, which herbivores say they don’t look for, because they’re not interested in a professional career. Not that they’re opting out of productive society to focus on, say, art, or activism, or some other form of creativity or counter-culture. Apparently, one of the few hobbies that seem to be popular among herbivores is . . . going on walks. To be fair, walking is an important part of digestion for ruminants. 

What herbivores do seem to be interested in is spending the vast majority of their time alone, on the internet. Herbivores who have a social life keep it restricted to a small circle of friends. While the Japanese used to be notorious for their national obsession with tourism, the herbivores don’t like to travel abroad. They have created a new market for yaoi, a Japanese genre of bodice-ripper-style romance portraying homoerotic relationships between men; while yaoi’s audience has traditionally been female, the male herbivores have embraced it.

Countless explanations are proffered for the herbivore phenomenon, from cultural to economic, and it makes intuitive sense that some of those factors would be at play. Nevertheless, I find it striking that everything we know about the herbivores matches what we know about online porn addiction, in particular reduced libido and overuse of the internet. We also know that Japan has growing markets for sex toys for men, but not for women, as well as for extreme and homoerotic pornography, which is consistent with a population that has been desensitized to normal sexual stimuli by online porn addiction. 

Beyond sexuality, the herbivores seem strikingly like a generation of men suffering from hypofrontality, the neurological impairment caused by addiction. It seems that their key problem is an inability to commit, whether to a career or a woman. Commitment requires abilities enabled by the prefrontal cortex, like self-mastery, correctly weighing risk and reward, and projecting oneself into the future. Becoming financially independent, visiting a foreign country, moving out of your parents’ apartment, going to parties, meeting new people, asking a girl out—what all these things have in common is that while young men generally want to do them, they can also be intimidating; and it is the executive function of the brain, located in the prefrontal cortex, that makes it possible to get over the hump of initial reluctance that comes from the lower parts of the brain. 

With Japan on the road to self-extinction in part as a result of its males’ lack of interest in sex or marriage, it’s hard not to think of Nietzsche’s parable of the Last Man, his nightmare scenario for the fate that would await Western civilization after the Death of God if it did not embrace the way of the Übermensch: the last man lives a life of comfort, has all his appetites satisfied, embraces conformity and rejects conflict, and seeks nothing more, incapable as he is of imagination, or initiative, or creativity, or originality, or risk-taking. The Last Man, in short, is man returned to something like an animal state, though not that of a carnivore. Nietzsche compares him to an insect, but herbivore fits quite well. In Nietzsche’s terrifying phrase, the Last Man believes he has discovered happiness. 

Again, it’s impossible to prove scientifically that the herbivore phenomenon is caused by widespread porn addiction. But one thing is certainly very suggestive: there’s no explanation for why, if the herbivore trend is caused by some broader cultural or socioeconomic trends, it should be such an overwhelmingly male phenomenon. Anyone? Anyone? Bueller?

Is Japan a harbinger of the future? Are we on the road to becoming a herbivore civilization? Or, to take another analogy, becoming like the helpless people on the spaceship in “WALL-E,” except we never got around to actually creating the AI and robots that enabled their pointless, ghastly lives of fake pleasure?

Perhaps it sounds hyperbolic. But what we do know is that vast numbers of people in our civilization are hooked on a drug that has profound effects on the brain, effects we mostly don’t understand, except that everything we do understand is negative and alarming. And we are just ten years into the process. If we don’t act, pretty soon the next generation will be a generation that largely got hooked on this brain-eating drug as children, whose brains are uniquely vulnerable. It seems perfectly reasonable and consistent with the evidence as we have it to be deeply alarmed. Indeed, what seems supremely irrational is our bizarre complacency about something which, at some level, we all know to be happening.

A Massive Experiment On Our Brains

Another way to approach the question of how to respond is to note that we—the entire advanced world, and soon the whole world, as the prices of smartphones and broadband in developing countries keep dropping—are running a massive, unprecedented experiment on our own brains. Scientists do understand a few things about the brain, but only a few. The human brain is by far the most complex thing in the known universe, and we are subjecting at least half of the human population to an unprecedented kind of drug. 

As I write this, the FDA is reportedly considering a complete ban on e-cigarettes. Imagine if, say, a popular health supplement were shown to, oh, increase the rate of ED among young men by some percentage, let alone by more than an order of magnitude, or to be as addictive as cocaine in large segments of the population. Surely some spotlight-hogging prosecutor would have the company’s owners doing a nationally televised perp walk before you could say “Four Loko”—unless, of course, he was himself getting high on the stuff and was too ashamed to take a public stand.

An analogy might be in order here: climate change. There are some things we know scientifically to be true: we know that greenhouse gases lead to higher temperatures, all else being equal; we know that humans are emitting more and more greenhouse gases; we know that temperatures are increasing; we know that greenhouse gases are increasing to unprecedented levels. 

We don’t know, scientifically, precisely, what that means for the future. Earth is much too complex an organism for us to be able to predict with high confidence what climate change will mean, specifically—indeed, the best justification for alarm is precisely the fact that we are in uncharted territory when it comes to levels of greenhouse gases and temperatures. This is why the U.N.’s Intergovernmental Panel on Climate Change, which represents the scientific consensus on climate change, provides not predictions of the future impact of climate change, but probability distributions (read them if you don’t believe me). 

On the basis of the current state of science, we have a preponderance of evidence leading to a rationally justified belief that never-before-seen levels of greenhouse gases and temperature increases create an unacceptable level of risk of negative outcomes, including catastrophic outcomes, so that some kind of collective action (putting aside angry debates on what kind of action) is justified to curb greenhouse gas emissions. The Earth is much too complex for us to understand it completely, and this is actually the best argument for why it’s reckless to pump it full of chemicals at unprecedented levels. After all, we don’t have an Earth 2. (And yes, paradoxically given conservatives’ reluctance to embrace ambitious action on climate change, this is an inherently conservative argument.)

You can see where I’m going: however precious Earth is, so are our brains; however complex Earth is, so too are our brains, the most complex objects in the known universe. I don’t see why the same logic doesn’t apply. 

The stakes are comparably high, the logic for action is the same, and yet these respective causes get wildly divergent levels of public attention and political capital. 

It took a long time for the evidence linking smoking to lung cancer and a whole host of other negative health outcomes to become incontrovertible. And it took a long time between that moment and the moment we as a society accepted that evidence and decided to act. This was in part due to legitimate scientific questions early on, in part due to the influence of greedy, monied interests, and in part because of misguided pseudo-libertarian rhetoric. But also in part because so many people were reluctant to admit that their beloved, pleasurable habit was in reality a destructive addiction—and they were all the more reluctant to admit it because they knew, deep down, that it was the truth. 

I still smoke. But, at least, I have stopped lying to myself about why I do it. It’s time we as a society stopped lying to ourselves about what has become the biggest threat to public health.

The Tortoise and the Hare of Modernity Reconsidered

Hares do not countenance irrational impediments such as “taboos.” Their response to the tortoises who deploy them is a mixture of loathing, hysteria, and contempt. But as a wise man put it, “The madman is not the man who has lost his reason. The madman is the man who has lost everything except his reason.”

Not to be overly paradoxical about it, but the names Donald Trump, Adam Schiff, and Jerry Nadler will not appear in this essay. Like you, I am weary of that shrill and unproductive static. Let us, then, take a brief holiday and consider a different sort of problem, a problem that stands behind—admittedly pretty far behind—that static I mentioned and which we might do well to think about. Let us, for lack of a better phrase, call it “Modernity and Its Discontents.”

No educated person in the English-speaking world can hear that phrase and fail to think of the memorable English title that James Strachey gave to Freud’s late masterpiece: Civilization and Its Discontents. Pressed to give a single word summary of what Freud concluded about those Unbehagen, those “discontents,” one could do worse than offer the brief imperative “No.” “Sad is eros, builder of cities,” W. H. Auden wrote, and that sadness, Freud thought, followed inevitably from the basic instinctual denials that made civilization possible. 

“Modernity,” it will be pointed out, is not quite, or perhaps I should say “not hardly,” coterminous with “civilization.” If pressed to give a one-word précis to describe the Unbehagen in der Modernität, I might venture to suggest that it centrally involves the diminishment, the attenuation, the abandonment of that imperative denial that Freud analyzed. 

Unfortunately, the loss or—more to the point—the active repudiation of “no” does not necessarily get you to any positive “yes.” 

That’s the idea, of course: that by kicking over the traces, by saying “no” to all those inherited constraints, habits, structures, customs, prejudices, and dispositions that made us who we are, we thereby emerge into a glorious sunlit upland in which we enjoy the cities but dispense with the sadness. 

The reality has been somewhat different. George Orwell gave dramatic expression to one set of differences when he noted that “For two hundred years we had sawed and sawed and sawed at the branch we were sitting on. And in the end, much more suddenly than anyone had foreseen, our efforts were rewarded, and down we came. But unfortunately there had been a little mistake. The thing at the bottom was not a bed of roses after all, it was a cesspool full of barbed wire.”

Now, Orwell was a gloomy chap, and it would certainly be possible to instance plenty of other less dire eventualities that await the civilizational carpentry he describes. Everyone reading this, I’d wager, has done time with his own sort of saw, and the vast majority of us find ourselves in situations far less horrible than the barbed wire bedizened pit Orwell imagined. 

And yet, and yet . . . It’s hard to deny that Orwell was on to something. At the center of modernity’s discontents, I believe, is a widespread sensation that we are precariously suspended over something minatory and unfathomable, combined with the nagging suspicion that it wasn’t always thus, that there was a time when one’s feet touched the ground and the landscape around us bore the familiar traces of human habitation. Why that is no longer the case, or, at least, can no longer be taken for granted as a cultural given, has been a matter for speculation at least since Matthew Arnold spoke of the melancholy, long withdrawing roar of the Sea of Faith on ebb tide.

It might be pointed out that Arnold’s image is potentially rather cheering. After all, a tide that ebbs is also presumably a tide that eventually floods, though it is not at all clear that we’ve had much evidence of that uplifting periodicity. 

Indeed, part of what makes the withdrawal Arnold evokes so melancholy is that it impresses most of us more as an irrefragable feature of modernity than a contingent option. It is not, if we are honest, something we can just dispense with. It is, for better or worse, probably for better and worse, part of who we moderns are. 

That said, I also believe that there are several features of our situation that are often presented as inevitable which in fact are matters of contention, deliberation, remediation, even, if I may employ an unfashionable term, choice. 

Anything like a full survey of this waterfront would take a book, or rather a shelf full of books. For now, I’d simply like to touch briefly and incompletely on what I take to be three negotiable elements of that dispensation we call modernity. 

The Velocity of Modernity

The first, most politically malleable element I call “velocity.” It is adumbrated by a slight variation on the title of this essay: “The Tortoise and the Hare Reconsidered.” I don’t have a lot to add to the wisdom of Aesop, namely, that the fastest way to your goal may be the slowest and most leisurely. 

Now, that is not an insight greatly appreciated by the modernness of modernity. On the contrary. Faster. Brighter. Sooner. Newer. More efficient. More innovative. More tomorrow and less yesterday. More hares, that’s to say, and fewer tortoises. That’s what is wanted. 

Velocity is a signal marker of modernity—a glory, perhaps, as well as a discontent, but certainly a leading characteristic. 

There are many sides to modernity’s love affair with velocity. Here I’d like to consider just one: its assault on deliberateness. Back in 2008, when what has come to be called the Great Recession was just beginning, Rahm Emanuel, then Barack Obama’s chief of staff, gleefully said, “You never let a serious crisis go to waste.” 

What he meant was that a crisis makes people anxious and therefore vulnerable, and that it is easier in periods of crisis to exploit that vulnerability and push through initiatives to enlarge government and meddle more. 

I’d like to urge caution about that impulse. In politics, I would say, deliberateness is an undervalued virtue, particularly in periods of crisis. Pace Rahm Emanuel, I’d say that you never want to let a serious crisis upset your judgment. As the British writer Daniel Hannan once observed, “Most disastrous policies have been introduced at times of emergency.” How often have you heard a politician or government bureaucrat tell you that “Doing nothing is not an option”? In fact, as Hannan rightly noted, “Doing nothing is always an option, and often it is the best option.” This was something that Calvin Coolidge acknowledged when he said to a busybody aide: “Don’t just do something; stand there!”

In other words, I’d like to put in a word for what Walter Bagehot celebrated as “slow government.” The American socialist and perpetual presidential candidate Norman Thomas defined socialism as “democracy in a hurry.” Socialism’s velocity, Thomas thought, was a major part of what recommended it. Bagehot disagreed. “The essence of civilization,” he wrote, “is dullness.”

In an ultimate analysis [Bagehot explained], it is only an elaborate invention . . . for abolishing the fierce passions, the unchastened enjoyments, the awakening dangers, the desperate conflicts, . . . the excitements of a barbarous age, and to substitute for them indoor pleasures, placid feelings, and rational amusements. That a grown man should be found to write reviews is in itself a striking fact. Suppose you asked Achilles to do such a thing, do you imagine he would consent?

Bagehot’s point was that, in an advanced civilization, deliberateness, circumspection, and adherence to process are virtues that save us from the myopia of impulsiveness, not to mention the rage of Achilles.

Bagehot was not, I hasten to add, advocating quietism or inaction. If the English had mastered the art of slow, deliberate government, that mastery did not hinder an energetic pursuit of their own interests. The achievement was moderation, yes, but it was what Bagehot called “animated moderation,” moderation chastened by deliberateness but underwritten by energy. “When we have a definite end in view,” Bagehot wrote, “we can act well enough. The campaigns of our soldiers are as energetic as any campaigns ever were; the speculations of our merchants have greater promptitude, greater audacity, greater vigor than any such speculations ever had before.” But all that action takes place in a framework of circumspection. It is the deliberate animation of the tortoise, not the frenzied gesticulations of the hare.

As an aside, let me mention that James Madison would have approved of Bagehot’s recommendation and taken the side of the tortoise. As Greg Weiner shows in Madison’s Metronome, the great Federalist author regarded the Constitution as an instrument for regulating the tempo of politics, a means of putting the brakes on those “fierce passions” of which Bagehot spoke. 

In this context, Weiner draws a fruitful distinction between “governing,” on the one hand, and “policymaking,” on the other. Governing, he observes, “calibrates the extent of policymaking to public needs,” while the ethos of policymaking assumes that the current state of affairs, whatever it is, is wanting. For the policymaker, Weiner notes, “The drive for perfection and ‘rational results’ . . . makes satisfaction a vice and change a constant need. Political leaders are thus judged by the volume of policymaking rather than outcomes.”

Weiner’s distinction between governing, which starts with an affirmation of present realities, and policymaking, which begins with an assumption of their inadequacy, brings us close to what the philosopher Michael Oakeshott analyzed as “rationalism” in politics. “The evanescence of imperfection,” Oakeshott observed, “may be said to be the first item of the creed of the Rationalist.” In Weiner’s terms, the Rationalist is on the side of policymaking, not governing; he is a hare, not a tortoise, an apostle of velocity in politics and constant innovation in life.

Living With Imperfection (or Needing the Dragon)

This brings me to the second negotiable element in the dispensation of modernity. I’m not sure that there is a single word to describe what I mean. It has something to do with the place of imperfection in the metabolism of human life, or, more precisely, it has something to do with our attitude towards imperfection and the extent to which we believe that our well-meaning interventions can eradicate it. The hares of the world look forward to all sorts of splendid futures when humanity has shuffled off the accumulated incapacities that have hitherto blighted life and forestalled utopia. 

The tortoises, on the other hand, wonder whether dispensing with our imperfections, assuming that were possible, would not also mean dispensing with the features that make us capable of whatever moral and political achievements we have in fact made. 

Oakeshott touched on the distinction I am trying to get at when he noted that while the Rationalist might acknowledge that there were problems that were intractable, “what he cannot imagine is politics which do not consist in solving problems, or a political problem to which there is no ‘rational’ solution.”

The late Kenneth Minogue, a past president of the Michael Oakeshott Society, had a marvelous image for the rationalist disposition that Oakeshott describes. It is, Minogue said, a bit like the story of St. George and the dragon. 

After many centuries of superstition, St. George, having donned the mantle of beneficent Rationality, finally appears somewhere about the 16th century. He slays the monsters of kingship and religious intolerance before moving on to such social evils as prison conditions, slavery, inherited privilege, patriarchy, and environmental insensitivity. But, unlike his legendary namesake, Minogue observed, “he didn’t know when to retire.”

The more he succeeded, the more he became bewitched with the thought of a world free of dragons, and the less capable he became of ever returning to private life. He needed his dragons. He could only live by fighting for causes—the people, the poor, the exploited, the colonially oppressed, the underprivileged, and the underdeveloped. As an aging warrior, he grew breathless in his pursuit of smaller and smaller dragons.

It should be noted that the smallness of the dragons pursued does nothing to mitigate the ferocity of St. George’s sallies or the formidable nature of the weapons he deploys. 

The example of John Stuart Mill is instructive in this context. One of the curious features of Mill’s thought was the assumption that the concerted application of reason to social problems would lead simultaneously to a greater unanimity of opinion on all important issues and to increased eccentricity, individuality, and more robust “experiments in living.” 

“As mankind improve,” Mill said, “the number of doctrines which are no longer disputed or doubted will be constantly on the increase; the well-being of mankind may almost be measured by the number and gravity of the truths which have reached the point of being uncontested. The cessation, on one question after another, of serious controversy is one of the necessary incidents of the consolidation of opinion.”

The process Mill envisions slyly undertakes the destruction of inherited custom and belief precisely in order to construct a bulwark of custom and belief that can be inherited. As Mill put it in his Autobiography,

I looked forward . . . to a future . . . [in which] convictions as to what is right and wrong, useful and pernicious, [will be] deeply engraven on the feelings by early education and general unanimity of sentiment, and so firmly grounded in reason and in the true exigencies of life, that they shall not, like all former and present creeds, religious, ethical, and political, require to be periodically thrown off and replaced by others.

So: a “unanimity of sentiment” is all well and good as long as it is grounded in the “true exigencies of life”—as defined, nota bene, by J. S. Mill. It’s nice work if you can get it, and boy did Mill get it!

Mill was a proselytizing rationalist on the pattern Oakeshott described in his famous essay and Minogue evoked with his fable about a modern St. George and the miniature dragons. 

Politics for Mill was essentially a matter of “solving problems.” His great rhetorical feat was to convince the world that this process was synonymous with the operation of reason itself. “We’re all socialists now,” said the future King Edward VII in 1895. He may have been a bit precipitate in that declaration, but it certainly seems that we’re all Millians now, at least with respect to our basic assumptions about the nature of politics and that evanescence of imperfection of which Oakeshott spoke. 

But just as we can decline to celebrate the cult of velocity that aligns modernity with the party of the hares, so we can question Mill’s identification of “rational” with his program of liberal calculation. The English historian Maurice Cowling was on to something when he observed that to argue with Mill, on Mill’s terms, was to concede defeat. “Rational,” Cowling pointed out, does not have to mean conclusions reached by critical self-examination, as Mill insisted. “Prejudice” is not, as Mill taught, necessarily synonymous with bigotry; it may, on the contrary, be used to describe commitments about which argument has been declined. But to decline argument is not in itself irrational.

Another day, it would be interesting to talk more about the word “prejudice” and ask what happened between the time that Edmund Burke could speak of that “just prejudice” which rendered a man’s virtue his habit and today when “prejudice” has been firmly enrolled in the index of reactionary vices. Rescuing the word “prejudice” from its seemingly unredeemable membership in the roll call of liberal obloquy would be a great victory for the conservative spirit. For now, however, I’d like to touch very briefly on what I take to be a central antinomy of that species of overbearing rationalism for which Mill was such a beguiling spokesman. 

The antinomy is this: the liberalism Mill advocated is built fundamentally on openness to other points of view, even to those points of view whose success would destroy liberalism. Tolerance of those points of view is a prescription for suicide. But intolerance betrays the fundamental premise of liberalism, i.e., openness. Rock, meet hard place. 

Mill aimed to achieve a society that is maximally tolerant. But he thereby gave maximum scope to the activities of those who have set themselves to achieve the maximally intolerant society. Maximum tolerance, it turns out, leads to maximum impotence. The refusal to criticize results in a moral paralysis. That paralysis is the secret poison at the heart of Mill’s liberalism. Mill’s “one very simple principle”—that coercive public opinion ought to be exercised only for self-protective purposes—was, as Mill’s great critic James Fitzjames Stephen noted, “a paradox so startling that it is almost impossible to argue against.”

Note the adverb. Stephen managed to fill about 300 pages arguing against it. It pains me to report that although he won the argument—as the philosopher David Stove put it, Stephen made “mincemeat” of Mill’s polemic in On Liberty—he nevertheless lost the battle, which is to say that the hares of Millian liberalism, with their addiction to moral velocity and a species of rationalism as arrogant as it is abstract, have triumphed in the hearts of established opinion. The attack on customary, conventional wisdom has become the new customary, conventional opinion. The prejudice against prejudice has become the new and unspoken prejudice, which the public everywhere embraces without quite acknowledging it. 

Can We Really Be Disinterested Spectators?

There are, however, cracks in the carapace of this rationalist consensus, cracks that can be worked at to reveal what I take to be a third negotiable element in Modernity’s Discontents. We glimpse this third element wherever Mill’s doctrine brushes against the moral or spiritual exigencies of human reality. In his book Utilitarianism, Mill writes that “as between his own happiness and that of others, justice requires [everyone] to be as strictly impartial as a disinterested and benevolent spectator.” 

But can that be? Or is it merely benevolent-sounding twaddle? If Mill were right about this, Stephen observed, “I can only say that nearly the whole of nearly every human creature is one continued course of injustice, for nearly everyone passes his life in providing the means of happiness for himself and those who are closely connected with him, leaving others all but entirely out of account.” And this, Stephen argues, is as it should be, not merely for prudential but for moral reasons:

The man who works from himself outwards, whose conduct is governed by ordinary motives, and who acts with a view to his own advantage and the advantage of those who are connected with himself in definite, assignable ways, produces in the ordinary course of things much more happiness to others . . . than a moral Don Quixote who is always liable to sacrifice himself and his neighbors.

As Stephen implies, there is an aroma of unreality, a quality of the fantastic, that rises from the anemic currents of the rationalism Mill advocates. 

There is also an unavoidable spiritual thinness. Mill prided himself on overcoming or superseding the religious and social prejudices of the past. The question is whether man can jettison the ground that made him what he is without at the same time jettisoning his humanity. They are the sorts of questions that form the title of Gauguin’s famous picture: “Where Do We Come From? What Are We? Where Are We Going?”