Great America

As America Burns, the Left’s Racial Resentment Is Exposed

For the most part, the subtext of the riots and those justifying them is as simple as this: the violence is actually the fault of white people, who are just getting a taste of what they have inflicted on others for centuries.

As America burns, millions of people are asking: why?

The proximate cause is known to everyone. No civilized person denies that the killing of George Floyd in Minneapolis was a horrific injustice. But despite this unanimity, America has turned into a powder keg.

The truth is, the “protests” aren’t about Floyd. They never really were. America is living through a violent insurrection that has been driven by racial resentment towards white people.

Whenever a racially charged incident like this occurs, the event very quickly ceases to be about the killing itself and becomes contextualized into a grand conspiracy narrative that makes America, as a whole, culpable in racial terrorism. It’s a story that has been told a million times: America, according to our moral betters, is a land of hooded Klansmen and cross-burners, in which white racists commit violence against blacks with impunity on a daily basis. Fundamentally, nothing has changed since Reconstruction, according to this neat story.

But you can tell, just by watching TV, that none of this is actually true. Would government officials in a racist country let black and white radicals burn cities to the ground, in the name of anti-racism? In a racist country, would Hollywood, and the mainstream media, and politicians and pundits on the Left and Right, indulge rioting and arson as a form of “expression?”

Nevertheless, the conspiracy narrative is the “official story,” and powerful people remind Americans over and over again to never forget it, in spite of all evidence. They’re saying it louder than ever, it seems.

As a matter of fact, America has never been more multiracial, more diverse, more politically correct, or less white, frankly, than today. America in 2020 considers bigotry the gravest of all crimes: Americans have never been more sensitive to the feelings of the “protected classes,” and those who lack the sincere tolerance to get along understand that there are ruinous consequences for failure to censor prejudiced thoughts, consequences that are borne unequally by “oppressor” and “victim.” 

Meanwhile, white supremacist groups have never been more marginal, irrelevant, or powerless. There has never been a time when “white supremacist” could be more ridiculous a description for America and its political system.

It is not white racism, but something else, that is ascendant in America today. Here’s Van Jones’ take on what the Floyd riots are really all about:

It’s not the racist white person who is in the Ku Klux Klan that we have to worry about. It’s the white liberal Hillary Clinton supporter walking her dog in Central Park who would tell you right now, you know, people like that, ‘I don’t see race, race is no big deal to me, I see us all as the same, I give to charities,’ but the minute she sees a black man who she does not respect or who she has a slight thought against, she weaponized race like she had been trained by the Aryan nation. A Klansmember could not have been better trained to pick up her phone and tell the police a black man, African-American man, come get him. (Emphasis added.)

Even the “most liberal well-intentioned white person has a virus in his or her brain that can be activated at an instant,” he said.

Assertions to the contrary notwithstanding, America today is a country that treats white Americans as though there is something inherently wrong with them. They are a problem. Like the majority of Americans of every group, most white Americans are not bigots. But in spite of their best efforts to be fair and tolerant and to follow the ever-evolving dictates of political correctness, they have been branded racists anyway, just for existing.

Of course racism is a problem, and some white Americans do commit racially motivated violence. But crimes inspired by racism are not limited to whites. Prejudice is human, and every person and group of people is naturally predisposed to feel most at home among others like them. 

For the most part, Americans can control those instincts and treat others decently. That’s part of what it means to be an American, after all. But sometimes, some do not. No race has a monopoly on racism or racial violence, however, and it would be wrong to blame every white person for the actions of a white man who kills a black person, just as it would be wrong to blame every black person when a black person kills a white person. As far as the Left and media are concerned, however, it is only the first kind of crime that matters.

Whites, and all those who are not part of a politically “protected” group, carry a presumption of guilt that renders trivial any adversity or violence they may face at the hands of someone in a protected group.

On the other hand, even pandemic shutdowns will be put on hold for the “protected” groups. As Leftist domestic terrorists are given free rein to destroy America’s cities, a general rule has been observed that the law cannot be enforced because the anger of a protected class takes precedence over the rule of law. “Law and order” is now racist; violent insurrection is now justice. This response can only be described as “the soft bigotry of low expectations” on steroids.

Is this so surprising, though, in a nation whose leading newspaper has adopted the mission of teaching the next generation of students that America was founded, not on freedom, but slavery and genocide?

The young black and white malcontents who are destroying America’s cities are driven by a powerful, collective resentment against white Americans that is born of a belief, spread by academics for decades, that white terrorism is at the heart of America’s identity and social structure. White Americans collectively are said to be responsible for some kind of ancient crime, and the debt that crime incurred has not yet been paid.

Those culpable for Floyd’s death are not just the cops in Minneapolis, then, but all cops and all white Americans, who are culpable for slavery also, and the decades of racial violence that followed the Civil War. All white Americans, we are ceaselessly told, knowingly or not, participate in a system of social repression from which they benefit, and are therefore enemies of justice.

For the most part, the subtext of the riots and those justifying them is as simple as this: the violence is actually the fault of white people, who are just getting a taste of what they have inflicted on others for centuries.

We Should’ve Killed the Deconstructionist Monster Decades Ago

Now that it’s grown into a million-headed creature whose tentacles reach into every corner of the United States, it’s too big to kill.

This monster should have been strangled in its cradle 40 years ago.

Those of us who were in college humanities and social science departments at the time witnessed its birth. I was a grad student in English when postmodern schools of literary criticism took root in my department. The deconstructionists had bought into an approach to literature—or to all “texts,” as they liked to put it—and in fact to reality itself, that seemed calculated ultimately to undermine any concept of objective truth and justice. The feminist critics, for example, based their entire enterprise on the firm belief that the history of humanity was one of non-stop oppression of women.

Meanwhile, much the same thing was happening in the other humanities and social studies departments. Many of the people who’d been college students in the 1960s were now, in the years around 1980, young assistant professors at the beginning of their academic careers. As students, between their marches and sit-ins and riots, they’d eagerly passed around copies of books by such ardent haters of the West as Antonio Gramsci, Paulo Freire, and Frantz Fanon.

From those books, they learned to see America as the very incarnation of evil and to admire the Chinese Cultural Revolution. In their dorms they hung posters of Mao Zedong and Che Guevara. Now they were spreading their poisonous ideologies in college classrooms, encouraging contempt for everything of value—civilization, freedom, high art, capitalism, and social order.

As I say, some of us witnessed the start of this takeover. I was just a powerless student. But the senior faculty had power. They despised this stuff. Aside from being potentially very destructive, it had nothing whatsoever to do with the serious study of—well—anything. On the other hand, the older professors believed in diversity of thought. So they let it happen.

And once they retired, the Marxists and anarchists, now in charge of the departments—tenured radicals, as Roger Kimball famously called them—began hiring others like themselves and nobody else, because that’s what radicals do. And before long, the humanities and social studies departments of many universities, especially the most prestigious of them, were little more than indoctrination centers for the far Left.

The graduates they turned out went into a variety of fields—from law to the news media, from the entertainment industry to government. The quintessential product of these far-Left factories was Barack Obama.

Yes, from the time he appeared on the scene, Obama was seductive. He cut an attractive, dignified figure. He was well-spoken. His self-presentation was cool, smooth. He looked and sounded like the kind of guy you’d want to be the first black president of the United States. 

But behind the façade, his intimate ties to people like Jeremiah Wright should have made it obvious what he was all about. Ditto a close reading of his autobiography. Yet millions who should have known better voted for him anyway, thinking that it was time for a black president and that, at the very least, his election would put an end, once and for all, to America’s racial divisions.

And so he was anointed. Far from healing racial wounds, however, he managed to pull off all the scabs. In his first speech outside the United States, he painted a picture of America that would have had Frantz Fanon nodding vigorously and served up a glorious image of the Islamic world that bore not the slightest resemblance to reality. He reached out a hand of camaraderie to the Castro regime and repeatedly kicked Israel, the only democracy in the Middle East, in the teeth.

Throughout his eight years in the White House, Obama pursued a dangerous agenda, showing contempt for American workers and America’s allies while courting our enemies. Millions failed to realize just how appalling he was, thanks to his allies in the mainstream media and Hollywood. They’d all been taught the same postmodern claptrap that Obama had been taught, and consequently felt everything he did and said as president was just peachy. And so they had his back.

When Hillary Clinton referred to Americans in flyover country as “deplorables,” maybe it was unfair of those Americans to focus all their ire on her, because Obama had made it clear over and over again that he felt exactly the same way about them. So did any number of big cultural and political names in New York, L.A., and Washington, who’d learned in college that the proper reaction, when one espied a patriotic, hard-working, law-abiding middle American, was utter contempt.

Which, of course, is why all those deplorables voted for Trump, the only GOP candidate in 2016 who dared to take on coastal-elite America-hatred. And after his election—given these elites’ toxic hatred of American values, and given that they’d tasted what it was like to have one of their own in the White House for eight years—it was only to be expected that Trump would spend his first term as president fighting to stay in office. Naturally, they made up bald-faced lies about him—they’d learned that duplicity in the cause of their ideology was a virtue.

And when the anti-Trump lies failed and the liars began to be exposed one by one—what to do, then, with an election only a few months away? What to do, when your decades-long quiet revolution is threatened? What to do, when your own candidate was a feeble, dithering shell of the corrupt, uninspiring mediocrity that he used to be?

Well, of course, the last remaining option, in that desperate moment, was violence. Vandalism. Fires. Mass destruction of property.

And then blaming it all on white nationalists. Or suggesting that it’s not yet clear who the real perpetrators are.

Make no mistake: the near-unanimity with which the mainstream media have tried to cover for the rioters and deny the truth is stunning. The New York Times on Sunday published a “news analysis” in which Neil MacFarquhar declared that Antifa doesn’t exist—an outrageous assertion contradicted by numerous Times news stories over the last few years. He also quoted Keith Ellison, the former congressman and current Minnesota attorney general, who said “nobody really knows” who’s behind the violent protests. MacFarquhar should have asked for the thoughts of Ellison’s son Jeremiah—who, on the same day, in the midst of all the mayhem, tweeted his support for Antifa.

Jeremiah Ellison wasn’t the only child of a prominent politician who made clear his loyalties in the matter. To no one’s surprise, the daughter of Rep. Ilhan Omar (D-Minn.) also tweeted her support for Antifa. On May 31, New York Mayor Bill de Blasio’s daughter, Chiara, was arrested at an Antifa riot in that city. And let’s not forget CNN’s Chris Cuomo, who is the brother of the governor of New York, and who has also defended Antifa from critics. All of which helps remind us that many of the politicians who are supposed to protect ordinary citizens from violence are, in fact, on the same side as the rioters who are out to destroy.

No, we should have strangled this monster in the cradle 40 years ago. Now that it’s grown into a million-headed creature whose tentacles reach into every corner of the United States, it’s too big to kill. 

No, Trump Should Not Rescue Cities Again

Not one drop of Marine blood should be shed on the streets of Minneapolis or Chicago or Los Angeles to save Democrats from themselves yet again.

When looters threatened to bring their mayhem and destruction from the city of Chicago to my suburb on Sunday, our town’s leadership acted quickly. Snow plows were used to block access to the major shopping mall and big-box outlets; police SUVs and undercover cars patrolled heavily traveled streets carrying cops in riot gear. Stores and restaurants, which had just reopened on Friday after a nearly three-month shutdown, closed again. Suspicious cars congregating in vacant parking lots adjacent to strip malls were hustled away.

Our police chief—Tim McCarthy, the Secret Service agent who took a bullet for Ronald Reagan in 1981 and saved the president’s life—could be heard on the police scanner calmly managing a multifaceted operation to keep people and property safe. While cities across the country, including the nation’s capital, continued to burn, my suburb of 58,000 residents, like many throughout the Chicagoland area, was unharmed.

Why? Because our law enforcement officers knew they had the support of our political leadership and the residents. Further, people were prepared to defend their businesses and families if necessary.

The Party of Lawlessness

But that is not the case in many urban areas, big and small, across the country. Police have been ordered to stand down; there was no plan or preparation as the thugs advanced. (The police chief of Raleigh, North Carolina, said she would “not put an officer in harm’s way to protect the property inside a building.”)

The same Democrats who just a few weeks ago threatened beach walkers, churchgoers, and other disobedient social distancers are giving a wink and a nod to arsonists, looters, and attempted murderers.

Democrats belong to the party of lawlessness—and by law, I mean rules that have been subjected to a recorded vote by elected officials, not CDC guidance on mask-wearing. Open borders, empty jails, voting rights for noncitizens, and uncollected debt are just a few features of their anarchist utopia. No wonder Democratic politicians only give lip service in condemning the looters; they share a mutual contempt for the ideals on which the country was founded and a common mission to fundamentally transform the country and all its institutions to yield to their America-hating point of view.

Why, then, should we be compelled to rescue them as violence continues? Why should we not watch in smug satisfaction as they are hoisted with their own petard? Of course, it tugs at the heartstrings to see shop owners in Minneapolis weep at their looted livelihood. Yes, innocent children are caught in the fray. But this is the culture Democrats not only created but want to put on steroids after the 2020 election.

The people who live in American cities—a mix of very wealthy whites and low-income minorities is the common demographic—elected far-left political leadership, and not by a slim margin. Jacob Frey, the mayor of Minneapolis, won in 2017 with 57 percent of the vote by defeating another Democrat. Philadelphia Mayor Jim Kenney won last year with more than 80 percent of the vote; ditto for Los Angeles Mayor Eric Garcetti. Republicans don’t even bother to run. This is the story in nearly every city in America.

To flip the script: you bought it, you broke it.

Invoke the Insurrection Act? 

Some folks on the Right, however, are urging the president to use his authority to quell inner-city chaos by deploying the military. 

“Anarchy, rioting, and looting needs to end tonight,” Senator Tom Cotton tweeted Monday morning. “If local law enforcement is overwhelmed and needs backup, let’s see how tough these Antifa terrorists are when they’re facing off with the 101st Airborne Division. We need to have zero tolerance for this destruction.”

Cotton isn’t alone. My friend Andrew McCarthy offers a weighty argument as to why Trump should act unilaterally and militarily to defend cities under siege. McCarthy makes a case, a strong one, that the Insurrection Act of 1807 empowers the president to send in the troops to protect the citizenry of any state, federalism be damned. 

“What has happened over the last few nights in major cities of the United States is unacceptable,” McCarthy writes. “It has gotten worse because the federal and state governments have failed to convey the signal that order will be maintained and the rule of law enforced. That must end. The president and governors must work together to restore order, including by deployment of the military where that is necessary.”

There’s no doubt Cotton and McCarthy make a solid legal point. And under normal, or even semi-normal conditions, that sort of intervention would be considered dire but necessary. But we are living in the Age of Trump, a time when all pretense of political civility, largely on the Left, is gone.

Simply put, it’s impossible to restore order with disorderly people. A just, properly-functioning nation requires some modicum of trust and respect on all sides—that is nonexistent today.

No method, no tactic, no law-breaking has been too extreme in service of destroying Donald Trump, and with him, his supporters. And all of it—illegal investigations, impeachment attempts, the use of spies and secret courts, relentless media harassment—has been egged on, with the full-throated consent of their like-minded constituents, by the very politicians now completely impotent, whether congenitally or purposely, in the face of crisis.

The Future Is Now

Luckily, one has to look no further than a few weeks ago for evidence of the Left’s mendacity. Governors such as Andrew Cuomo, after failing to prepare for New York City’s coronavirus surge, repeatedly blamed the president for their failures. While thousands of senior citizens died of the disease as a result of Cuomo’s fatal order to readmit COVID-19 patients into nursing homes, Cuomo has played the media-embraced foil to the president. Trump essentially bailed Cuomo and other blue-state leaders out of the mess they created, one they initially denied and then mishandled from day one, but the president nonetheless stands accused of killing 100,000 Americans, coronavirus’ alleged body count.

The rest of the country, sadly, has paid the price for the Democrats’ incompetence and refusal to act until it was too late—yet their constituents overwhelmingly support them. More than 80 percent of New Yorkers approve of how Cuomo managed the coronavirus emergency, according to a May poll; only 35 percent approved of the president. There is no winning with these folks.

Consequently, not one drop of Marine blood should be shed on the streets of Minneapolis or Chicago or Los Angeles to save Democrats from themselves yet again. (My sole exception is Washington, D.C.: The president last night imposed a curfew and deployed “thousands and thousands of heavily armed soldiers” to the nation’s capital to stop the chaos.)

If Trump orders the military to salvage what’s left of inner-city America, even if it provides a sweet victory over Antifa or bolsters his reelection, it gives Democrats cover. They can feign innocence, express outrage at the uninvited usurpation of their authority, deflect from their own ineptitude and cowardice, and blame Bad Orange Man one more time.

If the violence spreads outside of inner-cities, prompting suburban leaders to seek federal help, Trump should consent. But city officials, who aren’t requesting help, and the people who put them in power must live with what they have wrought. The future, as Democrats love to say, is now. We certainly wouldn’t want to get in the way.

Quelling Insurrection

If weak and feckless mayors and governors fail to take steps to suppress this insurrection, the federal government is obliged to act.

It’s time to call what is going on in American cities by its proper name: insurrection. Its goal is the destruction of republican government. Let us be clear: the death of George Floyd was horrendous. The image of a Minneapolis police officer kneeling on Floyd’s neck made me physically ill, and I have seen men die before. All the officers involved must be held accountable.

But what is going on now has transcended Floyd’s death. What started in Minneapolis and has spread to other cities is no longer a “protest.” Rioting, arson, and looting are not protests against injustice. It is no longer about race. It is no longer about civil rights. It has become insurrection, pure and simple.

The insurrectionists may use Floyd’s death as a fig leaf for their actions but their actions are an affront to the rule of law, upon which republican government rests. The First Amendment to the Constitution guarantees “the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.” It does not protect rioting, arson, and looting. It does not protect mob rule, which as Abraham Lincoln warned in his Lyceum Speech of 1838, is the very antithesis of republican government.

Section 4 of Article IV of the Constitution reads: “The United States shall guarantee to every state in this union a republican form of government, and shall protect each of them against invasion; and on application of the legislature, or of the executive (when the legislature cannot be convened) against domestic violence.” Today, domestic violence threatens the very foundations of republican government.

The United States is a federal republic, which means that the first line of defense against insurrection is the government of the various states. But if governors are unable or unwilling to quell domestic insurrection, the federal government has the responsibility to act.

In the wake of rioting and arson in Minneapolis that followed Floyd’s death, Mayor Jacob Frey failed to stop the violence. He apparently adopted the stance of Baltimore Mayor Stephanie Rawlings-Blake, who, during the violence that followed Freddie Gray’s death several years ago, instructed her city’s police officers to allow protestors to “express themselves” and to give “those who wished to destroy, space to do that as well.”

But as the violence continued, Minnesota Governor Tim Walz finally deployed the National Guard. This is the proper sequence in response to domestic disorder. If the local authorities are unable to restore order, the state provides its militia, in this case the National Guard. Only then does the federal government step in.

News outlets have reported that in response to President Trump’s request for military options, the Pentagon has alerted several military police units, which can be deployed in short order to back state and local forces. On Monday, the president vowed to mobilize federal resources in the event that state efforts failed to quell the ongoing violence. It has also been reported that the president has scolded governors for their inaction as violence has spread. Of course, Trump’s critics have taken umbrage, claiming that what he has done violates the American tradition, that it is anti-constitutional, anti-federal, an example of big-stick executive power, and an irresponsible threat to use the professional military as cops. This is not only nonsense, it is nonsense on stilts.

Although there are many very good reasons to limit the use of the regular U.S. military in domestic affairs, U.S. troops have been so employed since the beginning of the republic. Under Article II of the Constitution, the president, as commander-in-chief of the Army and the Navy, and of the militia (National Guard) when under federal control, has the authority to act against enemies, both foreign and domestic. In addition, in 1807, Congress explicitly authorized the U.S. Army to enforce domestic law by passing the Insurrection Act.

Although intended as a tool for suppressing rebellion when circumstances “make it impracticable to enforce the laws of the United States in any State or Territory by the ordinary course of judicial proceedings,” presidents have used this power on five occasions during the 1950s and ’60s to counter resistance to desegregation decrees in the South. It was also the basis for federal support to California during the Los Angeles riots of 1992, when elements of a U.S. Army division and a Marine division augmented the California National Guard. More recently, active-duty forces have deployed in response to Hurricane Katrina.

Nonetheless, the United States has adopted measures to limit the use of regular troops in domestic affairs, the most important of which was the Posse Comitatus Act (PCA) of 1878. Unfortunately, the PCA is widely misunderstood. The concept of the posse comitatus is an inheritance from Britain. It literally means the “power of the county.” Thus the first line of defense against disorder was made up of law-abiding citizens called together by the “shire-reeve” or sheriff.

When Congress passed the Insurrection Act, it created a problem for the Army. It meant that in practice, a federal marshal could tap a local Army detachment to support local law enforcement by designating it as a part of the posse comitatus. Accordingly, troops were often used in the antebellum period to enforce the fugitive slave laws and suppress domestic violence. The Fugitive Slave Act of 1850 permitted federal marshals to call on the posse comitatus to aid in returning a slave to his owner, and Attorney General Caleb Cushing issued an opinion that included the Army in the posse comitatus. Troops were also used to suppress domestic violence between pro- and anti-slavery factions in “Bloody Kansas.” Soldiers and Marines participated in the capture of John Brown at Harpers Ferry in 1859.

The PCA prohibited the use of the military to aid civil authorities in enforcing the law or suppressing civil disturbances unless expressly ordered to do so by the president. After the Civil War, the U.S. Army was involved in supporting the Reconstruction governments in the southern states. The legislation was introduced by southern Democrats in the House of Representatives who resented the employment of federal troops for this purpose. Some northern congressmen supported it because of the Army’s role in suppressing the railroad strike of 1877.

While the PCA is usually portrayed as the triumph of the Democratic Party in ending Reconstruction, the Army welcomed the legislation. The use of soldiers as a posse removed them from their own chain of command and placed them in the uncomfortable position of taking orders from local authorities who had an interest in the disputes that provoked the unrest in the first place. For instance, during the railroad strikes, soldiers took orders from state governors and even municipal officers. As a result, many officers came to believe that the involvement of the Army in domestic policing was corrupting the institution.

While I am on record opposing proposals to use the military in the “war” on drugs and the like, the fact is that President Trump, if he finds it necessary, has the constitutional authority to employ U.S. troops in response to domestic disorder. Indeed, he might look to the various Force Acts passed by Congress after the Civil War to suppress domestic terrorism by the Ku Klux Klan. If weak and feckless mayors and governors fail to take steps to suppress this insurrection, the federal government is obliged to act. Republican government itself is at stake.

Whatever Happened to Law and Order?

The contemporary obsession with racism says that nothing in America is good or worth conserving. That’s a recipe for national suicide.

Race riots are underway again, beginning in Minneapolis but now spreading to many major cities. These things always follow the same trajectory. First, there is a questionable incident involving law enforcement and a black man. Next come the ugly images, questions, protests, and agitators.

Then the protests turn violent. The logic of collective punishment is afoot.

Pretty soon every opportunistic criminal joins in: TVs walk out of stores, and buildings burn down. Soon, other crimes like rapes and murders happen. A diffident police response further encourages the rioters. Eventually, things either peter out or the National Guard is called in. 

It’s been this way since the Watts riots in the 1960s. Almost every major urban area has experienced race riots. After a hiatus, they returned with gusto in the Los Angeles riots of 1992. Then a string of them happened during the Obama Administration: Ferguson, Baltimore, and Milwaukee.

If you’ve seen one, you’ve seen them all. 

Is America Worthy of Conserving?

In their account of these riots, the Right and Left typically part ways, as they do on crime more generally. The Left likes to talk about justified anger, institutional racism, and poverty—the “root causes.” This has the effect of justifying the violence. 

People on the Right, by contrast, consider themselves to be part of the party of law and order. They think laws against theft and arson are good laws, without exceptions for “rage.” As individuals, they tend to have good relations with police, whom they see as protectors of themselves and their communities. While they may agree that a particular incident shows misbehavior, they tend to consider such incidents examples of “bad apples” rather than something systemic. 

While the Left likes to look at broad patterns involving the whole society—so-called institutionalized racism—people on the Right see a different and more salient pattern: criminals resisting arrest leading to predictable consequences. 

Tapping into anxiety about crime and disorder has long been good right-wing politics. Nixon famously invoked the need to restore law and order and won in both 1968 and 1972, after years of disorder, riots, and rising crime. 

This was not exclusively a Republican strategy. Mayor Richard Daley of Chicago, to the horror of the media, ordered his police to “shoot to maim” looters during the riots that followed Martin Luther King, Jr.’s assassination in 1968. He won re-election six times.

Is the Nation Legitimate?

Anxiety about crime does have a relation to race. While today the civil rights movement and the anti-Vietnam War movements are cast as gauzy moments of national unity, they were controversial at the time, not least because they were coincident with a massive, sustained rise in riots and other violent crime. Why exactly this was so is mostly a matter of speculation, but consider that both the civil rights and anti-war movements were critical of our society at its core. 

These were not just criticisms of particular laws or a misguided foreign policy, but a reimagination of America as an illegitimate regime, whose popular myths perpetuated that illegitimacy. As the civil rights movement and anti-war movements became intertwined, the dominant theme became that America itself was evil because of its foundational racism. 

That view is now so common as to be unremarkable. CNN’s Van Jones recently said Hillary Clinton-supporting white liberals were racist. The New York Times has given top billing to the revisionist view that America’s real founding happened not in 1776 but in 1619, when American slavery began.

If a society and its people are painted as fundamentally evil, it’s pretty hard for people to go on respecting their laws. “Collective guilt is one of the legacies of the 1960s that is still with us,” Thomas Sowell observed. “We are still seeing a guilt trip for slavery being laid on people who never owned a slave in their lives, and who would be repelled by the very idea of owning a slave.”

After the loss in Vietnam and the sustained critique of racism that began in the 1960s, American elites lost their nerve. They looked at the self-confident America of their youth with a jaundiced eye and no longer felt comfortable telling minorities that they needed to follow the same laws and conform to the same expectations as everyone else. They certainly didn’t want to be labeled racists, as they themselves had spent a lot of energy condemning what they deemed déclassé Southern whites, as well as working-class whites in urban areas. Softness on crime and hostility to law enforcement thus became a central defining principle of the liberal elite. 

This toxic viewpoint bled into other issues, keeping the nation divided in what should have been a time of national reconciliation. Republican leaders favored the death penalty and long criminal sentences, while the Left wanted a greater focus on rehabilitation and denounced tough sentencing as “mass incarceration.”

The running debate on whether America was basically good or basically evil became an enduring feature of the Right/Left political divide. The loss of confidence arising from the civil rights movement led to a full-blown crisis of legitimacy in other areas. American economic dominance became neo-colonialism. Foreign policy primacy was neo-imperialism. The Left said America was bad and deserved to be weaker, while the Right wanted the nation and its institutions to be strong and self-confident. The Right saw themselves as the stewards of the nation, whose institutions their ancestors built. 

Conservatism Inc. Concedes the Left’s Core Premise

For many years, conservative intellectuals tended to match the tone of their core constituency in this debate. National Review published many articles on crime in the 1970s and 1980s, when Ernest van den Haag used to wax poetic on the necessity of the death penalty.

Then the “movement” started to change. Many of those deemed too far to the Right were purged. To the extent National Review was the center of an intellectual movement, it became less intellectual. Mediocrities like Jonah Goldberg and David French rose to prominence, replacing men like Russell Kirk and James Burnham. 

The lion’s share of official conservatism’s efforts was spent defending policies that helped big businesses, even as these businesses off-shored jobs or supported policies that should have been deemed hostile to the free-market ethos of Conservatism, Inc.

The movement also began to run away from issues like immigration and affirmative action, even though both tend to be hot buttons for the average right-leaning voter. Through the prism of Conservatism, Inc., and its well-funded magazines and think tanks, the passions of their core constituencies were refracted into a slightly more pro-business, jingoistic version of mainstream liberalism. 

The one thing these self-appointed gatekeepers didn’t spend a lot of time or energy on was crime, particularly black crime, even though this is a latent anxiety behind ongoing “white flight” and the hunt for “safe communities” with “good schools.” Instead, anxieties about “urban problems” became a topic discussed in whispers among close friends and family. 

These concerns do not reveal some deep reservoir of irrational hate, but rather a rational response by ordinary people to verifiable reality. Instead of sympathizing with these legitimate worries, the official Right either ignored them or condemned them using the same shaming language as the leftist mainstream media. 

Now, they trip over themselves to condemn a Minneapolis cop, even before all the facts are known, including important ones, like the fact that the victim didn’t suffocate. Even if the knee-on-the-neck was bad tactics, whether it caused the death matters in classifying this as a murder rather than merely a firing offense.

Obviously, people aren’t angels. Cops make mistakes. Sometimes bad people have power and there is corruption. But this is different from saying the police generally are a problem, that police shootings of black men are a national epidemic, or that disproportionate black crime is not a larger issue for black and white communities alike. 

America and Its People Are Worthy of Defense

The obsession with racism misunderstands the country and its contemporary problems. It defines any friction between a black and white person as proof of racism, as if whites are not also sometimes victims. Neither does this obsession give any consideration to the fact that sometimes people have violent encounters for reasons having nothing to do with race. The racism critique lately has evolved into quasi-genocidal ideas like “white privilege” and “abolishing whiteness.” 

Race obsession also forces conservatives into silly roundabout arguments, such as the popular trope that Democrats are the Real Racists, as if this alone could render the Democrats unfit for leadership. So much for the Democrats’ hostility to the Constitution and the pro-life cause, or their support for affirmative action and gun control. The Right will always lose when the Left is permitted to define respectability. 

An authentic right-wing politics can succeed only when it appeals to reasonable middle-class anxieties on immigration, political correctness, jobs moving overseas, a general coarseness of life, and, of course, crime and disorder. 

This is how Nixon won. 

This is how Reagan won. 

And this probably had something to do with how Trump won. Trump’s tough talk and throwback mantra of “making America great again” came during Obama’s second term, after the country endured half a dozen high-profile race riots reminiscent of the 1960s. Sadly, the president has been muted and mealy-mouthed in the face of these recent riots in Minneapolis and elsewhere.

While riots have become common again, they are the product of choices made by our political leaders. These choices, including failing to back the police, happen when a majority is morally disarmed from imposing the same standards on everyone, whether white or black.

People on the Right are naturally inclined to be on the side of law and order. The Right’s core is in the middle class, people with jobs and families, who own property and businesses. They recognize the need for law to be backed by force, violence even. This flows from the premise that civilization and peace are not the natural state of men, but rather a delicately constructed edifice, dependent on myth and memory and social solidarity, as well as law and order. 

No conservative movement worthy of the name would countenance rioting in response to police brutality, even when that brutality is demonstrably proven. But Conservatism, Inc. gave away the game by accepting the Left’s warped moral compass, which says America is fundamentally illegitimate due to endemic racism. Its condemnation of the riots is ineffectual, because it stands alongside a more wide-ranging condemnation of America itself as racist and illegitimate. Until the Right finds a way out of this straitjacket, nothing can be conserved, because the contemporary obsession with racism declares that nothing in our nation is good or worth conserving.

Great America

The Los Angeles County Office of Education Loses Its Damn Mind

Is the risk of death so high that we must imprison our children?

The school-to-prison pipeline in Los Angeles County just got a whole lot shorter. In fact, for the coming school year, it will disappear entirely!

Now they will waste no time. The schools themselves will become prisons. That, at least, is the gist of a recently released document, “A Planning Framework for the 2020-2021 School Year” by Dr. Debra Duardo, superintendent of L.A. County Schools. There she lays out her COVID-19 guidance for the fall. It’s filled with educational jargon about “stakeholders” and “bargaining units,” brightly color-coded tables, and 44 pages of decision matrices, templates, and flow charts.

But the core point remains: Duardo’s “best practices” for schools recommend treating children like prisoners—assuming the schools reopen at all.

She calls for placing “seats 6 feet apart,” installing “floor markings to illustrate social/physical distancing,” and providing students with “lunch and/or meals in supervised, non-congregate settings” that “avoid sharing tables whenever possible.” 

Staggered recesses and different entry and exit points for all students and staff are a must. “Cloth face coverings” should be worn all day long, of course. Duardo imagines a typical classroom stripped of all furniture to allow for a maximum number of 16 students. 

In a particularly jarring line, Duardo explains why this must be done: “to limit people encounters.” The annihilation of basic human interaction and social life is a small price to pay to confront the coronavirus! 

This thinking forces children to live in perpetual fear under onerous mandates and regulations. It treats them as prisoners. Even walking from one place to another will not be done freely in schools under the “new normal” in the nation’s most populous county. 

Children should not live this way. Americans cannot allow bureaucrats to treat our children in such a despicable and unwarranted manner. 

These students need friendship and playtime. They need the ability to see each other, to smile and laugh. They need to be able to share books and toys. They absolutely do not need to be subjected to the unnecessary fear of neurotic adults with no sense of proportion or reality.

In Los Angeles County there have been some 2,000 COVID-19 deaths. That works out to roughly 1 in every 4,600 Angelenos having died this year from the illness. For reference, roughly 1 in every 140 Americans dies annually. Coronavirus, therefore, makes up only a minute percentage of overall deaths in Los Angeles County. And that’s among the whole population. When it comes to children, the coronavirus is virtually no threat at all.

In 2018, 186 American children died from seasonal influenza. As of May 28, there have been 90 coronavirus deaths in Americans aged 0 to 24. Even with an expanded age range, the number of coronavirus deaths among young Americans is still far lower than pediatric deaths from the annual flu! Put another way, according to the Centers for Disease Control, 15,050 Americans between the ages of 0 and 24 have died this year—which means this group is 167 times more likely to die from any cause other than COVID-19.

According to the California Department of Health, zero percent of the coronavirus deaths in L.A. County are attributable to children under the age of 18.

Zero percent.

The risk of death to children from this illness in the county isn’t just minuscule—it is nonexistent. 

None of the “health and safety” measures being put in place by Los Angeles County has anything to do with the health or safety of students or even teachers and staff. There is no conclusive evidence that placing desks six feet apart, wearing face masks, or establishing specified “ingress and egress points” will reduce the transmission of the coronavirus in the general population or among students. 

But even if they did, it still would not answer the question of whether we must sacrifice the spiritual, mental, and educational well-being of our children to this illness. Is the risk of death so high that we must imprison our children? 

Absolutely not. What Duardo and her staff want to do to our children is insane. 

Americans should not submit to their crazed whims. We certainly should not subject our children to them. 

First Principles

Virtue Signaling Theater at the Mouth of Hell

The hypocrite is not the person who says one thing and does another. He is the play-actor, the man or woman who puts on a show of righteousness, to fool himself first of all, because he is his own favorite audience.

“Woe to you, scribes and Pharisees, hypocrites! For you tithe mint and dill and cumin, and have neglected the weightier matters of the law, justice and mercy and faith; these you ought to have done, without neglecting the others. You blind guides, straining out a gnat and swallowing a camel!” 

—Matthew 23:23-24

One thing that strikes me as I consider the backwash of political commentary that characterizes our time is its innocence. I mean the word both intellectually and spiritually. The commentators seem never to have made a serious examination of conscience. They do not think of the words of Jesus above. Perhaps they have never heard them, or his stern warning, that the measure we give will be the measure we get. In their own estimation, they are perfect children, sure of the rightness of what they believe and of their moral stature.

When the boy from Kentucky last year stared blank-faced at the Native American badgering him, he was reviled by millions of such moral children, who were ready to string him up. When a young computer programmer made the rather obvious point that women might not be attracted to that kind of work in the same numbers as men, he too was reviled by millions, and he lost his job.

Earlier this week, a woman with a loose dog in Central Park, she behaving badly, was confronted by an African American bird watcher, he behaving badly. When he seemed to threaten her and her dog, saying, “I’m going to do something and you won’t like it,” making reference to a plan to offer the dog a treat he said he carried with him for just such encounters, she called the police, and she mentioned his race. For doing that, she has lost her job, and the dog, too, while she gave millions of moral children the opportunity to dance and hug themselves for how good they are.

Children, we are not good. Shall we take the Ten Commandments, one by one?

“I am the Lord thy God: thou shalt not have strange gods before me.” The commandment enjoins piety and forbids idolatry. Are our churches and synagogues full? Or are we a people who snicker at piety, while turning the word “idol” into a term of praise? “That is all well for you,” someone may say, “but I think that God is a fable.” Nay, think so still, said Mephistopheles to Faustus, till experience teach thee otherwise.

Every injunction of the moral law is a fable to those who are set upon ignoring or violating it. Yet deep within our hearts, we believe that justice is no mere word. If we hang a man for saying an unkind thing, while we ourselves dance along in sins a thousand times worse, where will the justice be? Do we not bring down condemnation upon ourselves, by the dreadful measure we mete out to our brothers?

Should I enumerate these sins? Try, children, to imagine a people not quite so blithe about profanity, neglect of parents and offspring, fornication and adultery and all manner of sexual uncleanness, breach of promise, lying and cheating, theft by quick hands and the strokes of a pen; bloodshed—and let us not spread the oil of self-pity over the murder of children in the womb; gossip, detraction, slander, placing the words and deeds of our brothers in the worst conceivable light; envy and covetousness; pride and wrath and spiritual torpor, the glutton’s and the lecher’s preoccupation with the body, and avarice, including the avarice of ambition.

Perhaps a law of inversion applies here. Jesus condemned the Pharisees for gagging upon gnats while they swallowed camels whole. We may think that if you are going to gag on a small thing like a gnat—the stray word that your brother inadvertently speaks because he is tired or under pressure or not thinking clearly—you will never be able to get to the camel. But that may not be true.

It may be that if you strain at gnats, you will swallow camels, and if you swallow camels, you will strain at gnats. It is as if we each were supplied with a certain fund of moral condemnation. The less we spend upon our own sins, which are the only sins we can punish immediately and with a clear conscience, the more will we spend upon other people’s sins, which we can easily make appear far worse than ours; sins we perhaps do not commit, for the simple reason that we happen not to enjoy them.

Our tastes run to others. Or the less of that fund we spend upon grave sins, those which deprive someone of life, as abortion does, or which corrupt the innocence of children, as pornography does, or which strike at the heart of the family, as divorce does, the more will we spend upon the pardonable sins that everyone is prone to, or even upon what is no worse than a breach of etiquette.

Dismember that child in the womb, doctor, but make sure you extend the correct pinky while you do so.

All this is mere theater for our own delectation, as the words of Jesus suggest. The hypocrite is not the person who says one thing and does another. He is the play-actor, the man or woman who puts on a show of righteousness, to fool himself first of all, because he is his own favorite audience. What is called “virtue signaling” is pure hypocrisy, pure play-acting, unfurling your banner of righteousness and dancing before it, dancing at the mouth of hell.

What we should say, always, urgently, is only the truth: that we are the worst of sinners, and that on account of us the world is as it is, that we are the cause of despair in others, and that if it were not for the grace of God, we should never see salvation.

Books & Culture

This essay is adapted from “United States of Socialism: Who’s Behind It. Why It’s Evil. How to Stop It,” by Dinesh D’Souza (All Points Books, 304 pages, $29.99)

Identity Socialism

Herbert Marcuse’s toxic legacy.

Socialism, a system for raising up the working class, has now largely abandoned the working class. A program for raising the condition of ordinary citizens and workers has turned into a coordinated effort to make those very citizens and workers feel unwelcome and demonized in their own country. Socialism in America today has turned black against white, female against male, homosexual and transsexual against heterosexual, and illegals against legal immigrants and American citizens.

The typical socialist today is not a union guy who wants higher wages; it is a transsexual eco-feminist who marches in Antifa and Black Lives Matter rallies and throws cement blocks at her political opponents. American socialism is concerned less with worker exploitation by the bourgeoisie and more with the race, gender and transgender grievances of identity politics. I call it identity socialism.

If Franklin D. Roosevelt were alive today, he would not recognize the modern Democratic Party he created. Nor would he recognize the progressivism and socialism that formed the ideological pillars of his party. For FDR, as for Marx, socialism was primarily a matter of class. It was the rich versus the poor. Its political base was the working class—specifically the white working class that to this day forms the majority of working-class people in America. While the socialist Left still employs the old rhetoric of class warfare, it seems something of a relic. Contemporary socialism is no longer rooted in class, and moreover its oldest allies—working-class white males—are now its villains and enemies.

If FDR had attended the 2016 primary debates among Democratic Party contenders, he would have heard Hillary Clinton jibe at Bernie, “If we broke up the big banks tomorrow, would that end racism?” Recently he would have encountered this outlandish tweet from Elizabeth Warren: “Thank you @BlackWomxnFor! Black trans and cis women, gender-nonconforming, and nonbinary people are the backbone of our democracy.”

FDR would probably have no idea what she was talking about. Who are these people and how could they be the “backbone of our democracy”? They certainly seem to be the backbone of the socialist Left. At a recent meeting of the Democratic Socialists, FDR would have encountered a strange menagerie of activists calling themselves eco-socialists, Afro-socialists, Islamosocialists, Chicano socialists, sanctuary socialists, #MeToo socialists, disability socialists, queer socialists, and transgender socialists.

Typical of the new type of socialist is Stacey Abrams, the Democrat who narrowly lost the governor’s race in Georgia. “My campaign,” she says, “championed reforms to eliminate police shootings of African Americans, protect the LBGTQ community against ersatz religious freedom legislation, expand Medicaid to save rural hospitals, and reaffirm that undocumented immigrants deserve legal protections.” Only one of these four planks—the one about saving rural hospitals—would be even remotely recognizable to FDR as part of the progressive agenda.

During this year’s Democratic primary, each candidate tried to play his or her diversity card. Pete Buttigieg was white and male, but hey, at least he was gay! Cory Booker, regrettably, was a man, but fortunately for him he was a black man. Julian Castro affirmed his Latino status despite his inability to speak Spanish. This put him a notch above Beto who, after all, only had the Mexican nickname. Elizabeth Warren had the woman card but she also wanted the Native American card—making her a “two-fer”—so she faked her Indian ancestry. Kamala Harris, the winner of this sweepstakes, is both female and a person of color.

The great irony, of course, is despite this identity parade the guy who got the nomination is the whitest male of them all, Joe Biden. Given his semi-comatose state, he almost qualifies as a “dead white male.” As if to remedy this predicament, Biden has pledged to nominate a woman as his running mate, and he will get extra points if he nominates a minority woman like Stacey Abrams. Identity socialism now defines the Democrats and carries even an old white geezer like Biden in its wake.

The implications of this go beyond party politics; they involve how the Left views the country itself. For FDR, America was an “imagined community.” I get this term from sociologist Benedict Anderson. Anderson points out that a nation is imagined because it is made up of people who have never met and don’t know each other. Yet nations seek to create a “deep horizontal comradeship” in which we identify with people we’ve never heard of. They are our “fellow citizens.”

This identification is critical because, without it, who would be willing to die for his or her country? Anderson points out that no one is willing to die for the Labor Party, the American Medical Association, or the United Way. Not only soldiers but even cops and firemen who risk their lives for “strangers” must have an imagined comradeship with those strangers. Lincoln understood this. Memorial Day was created immediately following the Civil War, and it was during that era that the American flag became a symbol of quasi-religious national devotion.

Even socialist redistribution within a country relies on some sense of solidarity among the citizens; otherwise why should my hard-earned money go to pay the medical expenses of someone I couldn’t care less about? In India I learned a proverb that may seem somewhat heartless, “The tears of strangers are only water.” The basic idea, however, is that we have an obligation to help only our own; if others have a problem we wish them well, but it’s their problem.

For FDR the New Deal was a patriotic project. He routinely defended his programs in terms of “the greater good of the greater number.” Moreover, he appealed to this same patriotic solidarity during World War II. Martin Luther King, Jr. also spoke in terms of restoring the “beloved community.” The basic idea here is that America is a good country, based on noble ideals. The political task is to fully integrate and assimilate everyone—blacks, women, immigrants—into that America.

Today’s socialist Left, however, wants an America that integrates the groups seen as previously excluded while excluding the group that was previously included. “If you are white, male, heterosexual, and religiously or socially conservative,” writes blogger Rod Dreher, “there’s no place for you” on the progressive Left. On the contrary, it should now be expected that in society “people like you are going to have to lose their jobs and influence.”

In other words, for identity socialists and for the Left more generally, blacks and Latinos are in, whites are out. Women are in, men are out. Gays, bisexuals, pansexuals and transsexuals, together with other, more exotic, types are in; heterosexuals are out. Illegals are in, native-born citizens are out. One may think this is all part of the politics of inclusion, but to think that is to get only half the picture. The point, for the Left, is not merely to include but also to exclude, to estrange their opponents from their native land.

Consider how normalcy has been defined in America. Since whites have been a clear majority, whiteness was the norm. Since the structure of society was, however loosely, patriarchal, maleness was also seen as normative. And of course the same applied to heterosexuality, since most people are heterosexual. For the socialist Left, it’s vital to overturn this hierarchy not by leveling the playing field but by creating an inverse hierarchy. Whiteness, maleness, and heterosexuality are now viewed as pathological, as forms of oppression. In this way, the Left by design seeks to demonize white male heterosexuals and thus to make a large body of Americans feel like aliens in their own country.

How did we get here? This, I believe, is the story of the 1960s, because that was when this great shift occurred. One man, whose name few people know, was the prophet of the change. He is the one who posed the big question: how do you get socialism when the people who are supposed to want it the most don’t want it? How do you create a proletariat when the original proletariat opts out? And where do you find the replacements? To answer these questions is to discover the roots of the socialist Left that defines and directs today’s Democratic Party.

Marcuse’s Marxist Conundrum

To understand identity socialism, we must go back several decades and meet the man who figured out how to bring its various strands together, Herbert Marcuse. A German philosopher partly of Jewish descent, Marcuse studied under the philosopher Heidegger before fleeing Germany as the Nazis rose to power. After stints at Columbia, Harvard, and Brandeis, Marcuse moved to California, where he joined the University of California, San Diego, and became the guru of the New Left in the sixties.

Marcuse influenced a whole generation of young radicals, from Weather Underground co-founder Bill Ayers to Yippie activist Abbie Hoffman to Tom Hayden, president of the activist group Students for a Democratic Society (SDS). Angela Davis, who later joined the Black Panthers and also ran for vice president on the Communist Party ticket, was a student of Marcuse and one of his protégées. It was Marcuse, Davis said, who “taught me that it is possible to be an academic, an activist, a scholar and a revolutionary.”

Marcuse egged on the activists of the 1960s to seize buildings and overthrow the hierarchy of the university, as a kind of first step to fomenting socialist revolution in America. Interestingly, it was Ronald Reagan—then governor of California—who got Marcuse fired. Still, Marcuse retained his celebrity and influence over the radicals of the time. He did not, of course, create the forces of identity socialism but he saw, perhaps earlier than anyone else, how they could form the basis for a new and viable socialism in America. That’s the socialism we are dealing with now.

To understand the problem Marcuse confronted, we have to go back to Marx. Marx saw himself as the prophet, not the instigator, of the advent of socialism. We think of Marx as some sort of activist, seeking to organize a workers’ revolution, but Marx emphasized from the outset that the socialist revolution would come inevitably; nothing had to be done to cause it. The Marxist view is nicely summed up by one of Marx’s German followers, Karl Kautsky, who wrote, “Our task is not to organize the revolution but to organize ourselves for the revolution; it is not to make the revolution, but to take advantage of it.”

But what happens when the working class is too secure and contented to revolt? Marx didn’t anticipate this; in fact, the absence of a single worker revolt of the kind Marx predicted, anywhere in the world, is a full and decisive refutation of “scientific” Marxism. In the early twentieth century, Marxists across the world—from Lenin to Mussolini—were fully aware of this problem. Fascism or national socialism represented one way to respond to it; Leninism represented another.

I’ll focus on Lenin, because his was the approach that influenced Marcuse and the New Left in the 1960s. Basically Lenin argued that the working class was never going to revolt; they might join trade unions, but that was about it. In Lenin’s diagnosis, workers could develop “trade union consciousness” but not “revolutionary consciousness.” So then what? In his famous work What Is To Be Done? Lenin insisted that the socialist revolution would not be done by the working class; it would have to be done for them.

In other words, a professional class of activists and fighters would be required to serve as a revolutionary vanguard. Lenin assembled a varied group of landless farmers, professional soldiers, activist intellectuals and attorneys, and criminals to collaborate with him in overthrowing the czar and introducing Bolshevik socialism to Russia. Although Lenin presented his approach as continuous with Marxism, it represented, as socialists around the world recognized, a radical break with and revision of Marxism.

Around the same time, in the early 1920s, the Italian Communist Antonio Gramsci made his own revision of socialist theory by introducing the concept of culture. “Hegemony” was Gramsci’s key concept. He insisted that the capitalists did not rule society solely on the basis of economic power. Rather, they ruled through “bourgeois values” that permeated the cultural, educational, and psychological realm of society. Economics, Gramsci insisted, is a subset of culture. Economics is shaped by culture no less than culture is shaped by the economic basis of society.

For Gramsci, socialist revolution under current conditions was impossible because the working class had internalized bourgeois values. The ordinary worker had no intention of toppling his employers; his aspiration was to become like them. Gramsci’s solution was for socialist activists to figure out a way to break this hegemony, and to establish a hegemony of their own. To do this they would have to take over the universities, the art world, and the culture more generally. In this way they could combat bourgeois culture “from within.”

Lenin and Gramsci provided Marcuse’s starting point. He agreed with both of them that the working class had become a conservative, counterrevolutionary force. But his greatest early influence was a third man, Heidegger. Marcuse read Heidegger’s magnum opus Being and Time and it inspired him so much he apprenticed himself to Heidegger, becoming first his student and then his faculty assistant at the University of Freiburg. Marcuse found in Heidegger a way to ground socialism in something more profound than better salaries and working conditions, in something that transcended Marx’s materialism itself.

The basic idea of Heidegger’s magnum opus Being and Time is that we are finite beings, “thrown,” as Heidegger puts it, into the world, with no knowledge of where we came from, what we are here for, or where we are going. We live in a present, yet we are constantly aware of multiple future possibilities, in which we must choose even though we can only know in retrospect whether we chose wisely and well. This radical uncertainty about our situation, Heidegger argued, produces in us anxiety—anxiety that is heightened by our knowledge of death. “Being,” in other words, is bracketed by “time.” Humans are perishable beings that for the time being are.

Yet how should we “be”? That, for Heidegger, was the big question. Not “what is it good to do?” but “how is it good to be?” Typically, we have no answer to this question; we are barely even aware of it as a question. We go through life like a twig in a current, steered by a tide of sociability and conformity. Thus we lose ourselves; we cease to be “authentic.” Authenticity, for Heidegger, means coming to terms with our mortality and living the only life we get on our own terms. We cannot rely on God to show us the way; we are alone in the world, and have to find a way for ourselves. Frank Sinatra’s song, “I did it my way,” expresses a distinct Heideggerian consciousness.

Marcuse eventually broke with Heidegger when he heard that Heidegger had both joined the Nazi Party and become an apologist for Hitler. Marcuse seems to have had no objection to Heidegger’s—or Hitler’s—national socialism, although being partly Jewish, he was naturally less enthusiastic about the accompanying anti-Semitism. Even so, Marcuse continued to draw from Heidegger’s philosophy to illuminate the political problems he was dealing with.

Essentially his problem was the same as the one Lenin faced: if the working class isn’t up for socialism, where to find a new proletariat to bring it about? Marcuse knew that modern industrialized countries like America couldn’t assemble the types of landless peasants and professional soldiers—the flotsam and jetsam of a backward feudal society—that Lenin relied on. So who could serve in the substitute proletariat that would be needed to agitate for socialism in America?

Marcuse looked around to identify which groups had a natural antipathy to capitalism. Marcuse knew he could count on the bohemian artists and intellectuals who had long hated industrial civilization, in part because they considered themselves superior to businessmen and shopkeepers. In Germany, this group distinguished “culture”—by which they meant art—from “civilization”—by which they meant industry—and they were decidedly on the side of culture. In fact, they used art and culture to rail against bourgeois capitalism.

These were the roots of bohemianism and the avant garde. “Bohemia,” wrote Henry Murger, “leads either to the Academy, the Hospital or the Morgue.” Elizabeth Wilson in her book Bohemians concurs. “Bohemia offered a refuge to psychological casualties too disturbed to undertake formal employment or conform to the rules of conventional society. It was a sanctuary for individuals who were so eccentric or suffered from such personal difficulties or outright psychological disorder that they could hardly have existed outside a psychiatric institution other than in Bohemia.”

These self-styled “outcasts” were natural recruits for what Marcuse termed the Great Refusal—the visceral repudiation of free market society. The problem, however, was that these bohemians were confined to small sectors of Western society: the Schwabing section of Munich, the Left Bank of Paris, Greenwich Village in New York, and a handful of university campuses. By themselves, they were scarcely enough to hold a demonstration, let alone make a revolution.

A New Proletariat

So Marcuse had to search further. He had to think of a way to take bohemian culture mainstream, to normalize the outcasts and to turn normal people into outcasts. He started with an unlikely group of proles: the young people of the 1960s. Here, finally, was a group that could make up a mass movement.

Yet what a group! Fortunately, Marx wasn’t around to see it; he would have burst out laughing. Abbie Hoffman? Jerry Rubin? Mario Savio? Joan Baez? Bob Dylan? How could people of this sorry stripe, these slack, spoiled products of postwar prosperity, these parodies of humanity, these horny slothful loafers completely divorced from real-world problems, and neurotically focused on themselves, their drugs and sex lives and mind-numbing music, serve as the shock troops of revolution?

Marcuse’s insight was Heideggerian: by teaching them a new way to be “authentic.” By “raising their consciousness.” The students were already somewhat alienated from the larger society. They lived in these socialist communes called universities. They took for granted their amenities. Ungrateful slugs that they were, they despised rather than cherished their parents for the sacrifices made on their behalf. They sought “something more,” a form of self-fulfillment that went beyond material fulfillment.

Here, Marcuse recognized, was the very raw material out of which socialism is made in a rich, successful society. Perhaps there was a way to instruct them in oppression, to convert their spiritual anomie into political discontent. Marcuse was confident that an activist group of professors could raise the consciousness of a whole generation of students so that they could feel subjectively oppressed even if there were no objective forces oppressing them. Then they would become activists to fight not someone else’s oppression, but their own.

Of course it would take some work to make selfish, navel-gazing students into socially conscious activists. But to Marcuse’s incredible good fortune, the sixties was the decade of the Vietnam War. Students were facing the prospect of being drafted. Thus they had selfish reasons to oppose the war. Yet this selfishness could be harnessed by teaching the students that they weren’t draft-dodging cowards; rather, they were noble resisters who were part of a global struggle for social justice. In this way bad conscience could itself be recruited on behalf of left-wing activism.

Marcuse portrayed Ho Chi Minh and the Vietcong as a kind of Third World proletariat, fighting to free itself from American hegemony. This represented a transposition of Marxist categories. The new working class were the Vietnamese “freedom fighters.” The evil capitalists were American soldiers serving on behalf of the American government. Marcuse’s genius was to tell leftist students in the 1960s that the Vietnamese “freedom fighters” could not succeed without them.

“Only the internal weakening of the superpower,” Marcuse wrote in An Essay on Liberation, “can finally stop the financing and equipping of suppression in the backward countries.” In his vision, the students were the “freedom fighters” within the belly of the capitalist beast. Together the revolutionaries at home and abroad would collaborate in the Great Refusal. They would jointly end the war and redeem both Vietnam and America. And what would this redemption look like? In Marcuse’s words, “Collective ownership, collective control and planning of the means of production and distribution.” In other words, classical socialism.

Okay, so now we got the young people. Who else? Marcuse looked around America for more prospective proles, and he found, in addition to the students, three groups ripe for the taking. The first was the Black Power movement, which was an adjunct to the civil rights movement. The beauty of this group, from Marcuse’s point of view, was that it would not have to be instructed in the art of grievance; blacks had grievances that dated back centuries.

Consequently, here was a group that could be mobilized against the status quo, and if the status quo could be identified with capitalism, here was a group that should be open to socialism. Through a kind of Marxist transposition, “blacks” would become the working class, “whites” the capitalist class. Race, in this analysis, takes the place of class. This is how we get Afro-socialism, and from here it is a short step to Latino socialism and every other type of ethnic socialism.

Another emerging source of disgruntlement was the feminists. Marcuse recognized that with effective consciousness-raising they too could be taught to see themselves as an oppressed proletariat. This of course would require another Marxist transposition: “women” would now be viewed as the working class and “men” the capitalist class; the class category would now be shifted to gender.

“The movement becomes radical,” Marcuse wrote, “to the degree to which it aims, not only at equality within the job and value structure of the established society…but rather at a change in the structure itself.” Marcuse’s target wasn’t just the patriarchy; it was the monogamous family. In Gramscian terms, Marcuse viewed the heterosexual family itself as an expression of bourgeois culture, so in his view the abolition of the family would help hasten the advent of socialism.

Marcuse didn’t write specifically about homosexuals or transgenders, but he was more than aware of exotic and outlandish forms of sexual behavior, and the logic of identity socialism can easily be extended to all these groups. Once again we need some creative Marxist transposition. Gays and transgenders become the newest proletariat, and heterosexuals—even black and female heterosexuals—become their oppressors.

We see here the roots of “intersectionality.” As the Left now holds, one form of oppression is good but two is better and three or more is best. The true exemplar of identity socialism is a black or brown male transitioning to be a woman with a Third World background who is trying illegally to get into this country because his—oops, her—own country has allegedly been wiped off the map by climate change.

These latest developments go beyond Marcuse. He didn’t know about intersectionality, but he did recognize the emerging environmental movement as an opportunity to restrict and regulate capitalism. The goal, he emphasized, was “to drive ecology to the point where it is no longer containable within the capitalist framework,” although he recognized that this “means first extending the drive within the capitalist framework.”

Marcuse also inverted Freud to advocate the liberation of eros. Freud had argued that primitive man is single-mindedly devoted to “the pleasure principle,” but as civilization advances, the pleasure principle must be subordinated to what Freud termed “the reality principle.” In other words, civilization is the product of the subordination of instinct to reason. Repression, Freud argued, is the necessary price we must pay for civilization.

Marcuse argued that at some point, however, civilization reaches a point where humans can go the other way. They can release the very natural instincts that have been suppressed for so long and subordinate the reality principle to the pleasure principle. This would involve a release of what Marcuse termed “polymorphous sexuality” and the “reactivation of all erotogenic zones.”

We are a short distance here from the whole range of bizarre contemporary preoccupations: unisexuality—people falling in love with themselves—group sexuality, pansexuality—people who do not confine their sexuality to their species—and people who attempt to have sex with trees.

Marcuse recognized that mobilizing all these groups—the students, the environmentalists, the blacks, the feminists, the gays—would take time and require a great deal of consciousness-raising or reeducation. He saw the university as the ideal venue for carrying out this project, which is why he devoted his own life to teaching and training a generation of socialist and left-wing activists. Over time, Marcuse believed, the university could produce a new type of culture, and that culture would then metastasize into the larger society to infect the media, the movies, even the lifestyle of the titans of the capitalist class itself.

Marcuse, in other words, foresaw an America in which bourgeois culture would be replaced by avant garde culture. He foresaw a society in which billionaires would support socialist schemes that took away a part of their wealth in exchange for social recognition conferred by cultural institutions dominated by the socialists. Bill Gates, Warren Buffett, and Mark Zuckerberg are three owlish geeks who were probably ridiculed in junior high school; they don’t seem to mind paying higher taxes if they can now hobnob with comedians, rock stars and Hollywood celebrities. Why only be rich when you can also be rich and cool?

Marcuse’s project—the takeover of the American university, to make it a tool of socialist indoctrination—did not succeed in his lifetime. In fact, as mentioned above, he got the boot when Governor Reagan pressured the regents of the university system not to renew Marcuse’s contract. In time, however, Marcuse succeeded as the activist generation of the 1960s gradually took over the elite universities. Today socialist indoctrination is the norm on the American campus, and Marcuse’s dream has been realized.

Marcuse is also the philosopher of Antifa. He argued, in a famous essay called “Repressive Tolerance,” that tolerance is not a norm or right that should be extended to all people. Yes, tolerance is good, but not when it comes to people who are intolerant. It is perfectly fine to be intolerant against them, to the point of disrupting them, shutting down their events, and even preventing them from speaking.

Marcuse didn’t use the term “hater,” but he invented the argument that it is legitimate to be hateful against haters. For Marcuse there were no limits to what could be done to discredit and ruin such people; he wanted the Left to defeat them “by any means necessary.” Marcuse even approved of certain forms of domestic terrorism, such as the Weather Underground bombing the Pentagon, on the grounds that the perpetrators were attempting to stop the greater violence that U.S. forces inflict on people in Vietnam and other countries.

Our world is quite different now from what it was in the 1960s, and yet there is so much that seems eerily familiar. When it comes to identity socialism, we are still living with Marcuse’s legacy.

Great America

If We Can Just Print Money, Why Are We Still Paying Taxes?

Now that the politically connected have gorged on public money, perhaps the people who really make this economy work can have a share of the next bonanza.

In the bowels of the Bureau of Engraving and Printing, a middle-aged bureaucrat with a clipboard and a hardhat pilots an electric cart past a row of gigantic printing machines as sheets of $100 bills blur through the rollers. He reaches a neatly arranged cube of $100 bills looking for a Post-it note he placed earlier that day. He locates the note, which reads: “Property of the Federal Reserve.”

Placing a checkmark on the top sheet of his clipboard, he removes the note and drives to a second, smaller cube of Treasury bills. The bureaucrat finds a second Post-it note, which reads “Property of the U.S. Treasury,” and swaps the two notes. The scene, which I describe metaphorically, will repeat many, many, many times as the United States pays out the trillions of dollars Congress authorized to spend in excess of tax revenue.

But those are just the constitutionally authorized expenditures. Beyond those outlays, the Federal Reserve embarked on a spending spree of $2.77 trillion in just 10 weeks. Add that to the approximately $3 trillion Congress appropriated, and we have nearly $6 trillion in printed money being added to the money supply. Is that a lot? Look at this chart, which shows an explosion in the money supply in just the last few weeks. Since late last year, the money supply has been on its way to doubling.

Common sense and history suggest that printing more money does not change the stock of goods and services that can be purchased with that money. With more dollars chasing relatively unchanged or diminished goods and services, prices must rise to balance the supply of money with the supply of things that can be bought with money.

Not so, say adherents to a new faith called “Modern Monetary Theory.” According to this theory, “Governments with a fiat currency system can and should print . . . as much money as they need to spend because they cannot go broke or be insolvent . . .”

Investopedia explains, “While supporters of the theory acknowledge that inflation is theoretically a possible outcome from such spending, they say it is highly unlikely, and can be fought with policy decisions in the future if required. They often cite the example of Japan, which has much higher public debt than the U.S.”

In the short run, the voodoo magic of MMT seems to be working. The price of gold, which sometimes acts as a proxy for gauging the stability of a currency’s value, has increased from just under $1,500 to around $1,700 an ounce in roughly the same period that the money supply has exploded.

But if the stimulus from 2008 is any guide, the new money will not be evenly distributed. Following the massive post-2008 stimulus of trillions of Fed dollars and government spending, the wealth disparity within the United States dramatically increased. All signs suggest that it’s happening again. A recent piece in Forbes shows that billionaires are getting even richer as a result of the COVID-19 panic. Unless you’re politically connected, you’re probably not near the front of the line for the bags of cash being tossed around right now.

One study found that the government loses 7 percent of spending to waste, fraud, and abuse. We’ve already seen early indicators that thieves have pounced on the money Congress appropriated for small businesses. Politically connected Planned Parenthood received $80 million in loans even though virtually no abortion operations were shuttered during the quarantine. Even the Chinese have made claims to taxpayer money intended for American businesses.

We could have done this differently.

In all of 2019, the federal government collected $3.46 trillion in tax revenue, far less than the spending spree of the last two months. Instead of printing all this money we could have said, “No taxes for the rest of 2020.” Instead of public money being carried off by parasitic fraudsters and politically connected zombie companies that will never again be profitable, we could have targeted financially viable jobs and businesses.

Only profitable businesses and gainfully employed workers pay taxes, so a universal tax holiday would have targeted the money to its best purpose. But we didn’t do that because that plan wouldn’t have scratched the political itch to hand out favors to connected friends and donors. Tax relief would have given a hand up to the only forces capable of leading us out of the massive recession we are now experiencing.

And, by the way, the Fed isn’t supposed to be spending money like this. The Constitution provides that, “No Money shall be drawn from the Treasury but in Consequence of Appropriations made by Law.” Washington, D.C. loves spending money without authorization from Congress because it shields elected leaders from political accountability. It’s highly undemocratic to let the money supply be hijacked by unaccountable interests.

So why are we paying taxes at all? Whether you believe in MMT or not, our government is following it. Instead of handing out bags of cash to thieves and politically connected interests (I apologize for the redundancy), we could be stimulating the economy in a meaningful and fraud-free way.

Now that the politically connected have gorged on public money, perhaps the people who really make this economy work can have a share of the next bonanza.

Books & Culture

This essay is adapted from “The Coming of Neo-Feudalism: A Warning to the Global Middle Class,” by Joel Kotkin (Encounter Books, 288 pages, $28.99)

A New Age of Feudalism for the Working Class?

If too many of the American working class lack any hope of improving their condition, we could face dangerous upheaval in the near future.

In the past, fears of job losses from automation were often overstated. Technological progress eliminated some jobs but created others, and often better-paying ones. In the early days of the high-tech revolution, many of the pioneering firms—such as Hewlett-Packard, Intel, and IBM—were widely praised for treating their lower-level workers as part of the company and deserving of opportunities for advancement, as well as benefits including health insurance and a pension.

The labor policies of the newer generation of tech giants tend to be vastly different. Firms like Tesla have been sued for failing to pay contract workers the legally mandated overtime rates, and for depriving them of meal and rest breaks. The Tesla plant has wages below the industry average, according to workers, and risk of injury higher than the industry average, notes a pro-labor nonprofit. Given that high housing prices keep them living far from the workplace, some workers sleep in the factory hallways or in their cars.

“Everything feels like the future but us,” complained one worker.

The largest tech employer today is Amazon, with 798,000 employees worldwide in 2019. Amazon tends to pay its workers less than rivals do. Many employees rely on government assistance, such as food stamps, to make ends meet. When the company announced it was adopting a minimum wage of $15 an hour, it also cut stock options and other benefits, largely wiping out the raises, at least for long-term employees.

The average Amazon worker in 2018 made less than $30,000 annually, about the same as the CEO made every 10 seconds.

Working conditions at Amazon are often less than optimal. Warehouse workers in Britain reportedly were urinating in bottles to avoid being accused of “time-wasting” for taking breaks. Amazon has also patented wristbands that track employee movements, described as a “labor-saving measure.” Those who can’t keep up the pace are written up and then fired, said one British worker. “They make it like the ‘Hunger Games.’ That’s what we actually call it.”

Apple manufactures virtually all its products abroad, mostly in China, although medical concerns and political factors might change that. In addition to its own employees there, the company relies on the labor of more than 700,000 workers—roughly 10 times its U.S. employment—to build Apple products at contractors like Foxconn. These workers suffer conditions that have led to illegal strikes and suicides; workers often claim they are treated no better than robots.

From Proletariat to Precariat

In the old working-class world, unions often set hours and benefits, but many low-status workers today are sinking into what has been described as the “precariat,” with limited control over their working hours and often living on barely subsistence wages.

One reason for this descent is a general shift away from relatively stable jobs in skill-dependent industries or in services like retail to such occupations as hotel housekeepers and home care aides. People in jobs of this kind have seen only meager wage gains, and they suffer from “income volatility” due to changing conditions of employment and a lack of long-term contracts.

This kind of volatility has become more common even in countries with fairly strong labor laws. In Canada, the number of people in temp jobs has been growing at more than triple the pace of permanent employment, since many workers who lose industrial jobs fail to find another full-time permanent position. The same patterns can be seen in traditionally labor-friendly European countries. From 20 to 30 percent of the working-age population in the EU15 and the United States, or up to 162 million individuals, are doing contract work. A similar trend shows up in developing countries such as Kenya, Nigeria, South Africa, Vietnam, Malaysia, and the Philippines.

Even in Japan, long known as a country of secure long-term employment, the trend is toward part-time, conditional work. Today, some 40 percent of the Japanese workforce are “irregular,” also known as “freeters,” and this group is growing fast while the number of full-time jobs is decreasing. The instability in employment is widely seen as one reason for the country’s ultra-low birth rate.

Many of today’s “precariat” work in the contingent “gig” economy, associated with firms such as Uber and Lyft. These companies and their progressive allies, including David Plouffe (who managed Barack Obama’s presidential campaign in 2008), like to speak of a “sharing” economy that is “democratizing capitalism” by returning control of the working day to the individual. They point to opportunities that the gig economy provides for people to make extra money using their own cars or homes. The corporate image of companies like Uber and Lyft features moonlighting drivers saving up cash for a family vacation or a fancy date while providing a convenient service for customers—the ultimate win-win.

Yet for most gig workers there’s not very much that is democratic or satisfying in it. Most are not like the middle-class driver in Uber ads, picking up some extra cash for luxuries. Instead, they depend on their “gigs” for a livelihood, often barely making ends meet. Almost two-thirds of American gig workers in their late 30s and 40s—the age range most associated with family formation—were struggling to pay their bills. Nearly half of gig workers in California live under the poverty line. One survey of gig workers in 75 countries including the United States found that most earned less than minimum wage, leading one observer to label them “the last of Marx’s oppressed proletarians.”

The reasons for their precarious situation are not hard to locate. Gig workers lack many basic protections that full-time workers might have, such as enforcement of civil rights laws. Workers without representation, or even set hours, do not have the necessary tools to protect their own position; they are essentially fungible, like day laborers anywhere. Robert Reich, former U.S. secretary of labor, has gone so far as to label the “sharing” economy a “share-the-scraps” economy. Rather than providing an “add on” to a middle-class life, gig work for many has turned out to be something closer to serfdom.

Cultural Erosion in the Working Class

The downward economic trajectory of the working class has been amplified by cultural decline. The traditional bulwarks of communities—religious institutions, extended family, neighborhood and social groups, trade unions—have weakened generally, but the consequences are most damaging for those with limited economic resources.

Social decay among the working class echoes what occurred in the first decades of the industrial revolution, when family and community structures and bonds of religion buckled and often broke. Rampant alcoholism spread “a pestilence of liquor across all of Europe,” wrote the Marxist historian E. J. Hobsbawm. In the mid-19th century, 40,000 prostitutes plied their trade in London. The physical condition of British workers was horrible: most were malnourished and suffered various job-related maladies. As late as 1917, only one-third of young males were considered to be in good health.

In America and elsewhere today, the working classes lag behind the affluent in family formation, academic test scores, and graduation rates. Marriages may be getting more stable in the upper classes, as the sociologist Stephanie Coontz has shown, but as many as one in three births in the nation occur outside matrimony. In some working-class neighborhoods, particularly those with a large proportion of ethnic minorities, four-fifths of all children are born to unmarried mothers. The rate of single parenting is the most significant predictor of social immobility across the United States and in Europe as well.

These social patterns parallel changes in economic trends. A detailed study in the United States published in 2017 shows that when towns and counties lose manufacturing jobs, fertility and marriage rates decrease, while out-of-wedlock births and the share of children living in single-parent homes increase. In addition, a variety of health problems—obesity, diabetes, and diseases of the heart, kidney, or liver—occur at much higher rates when family income is under $35,000 than when it is over $100,000. Between 2000 and 2015, the death rate increased for middle-aged white Americans with a low educational level. Anne Case and Angus Deaton say this trend owes primarily to “deaths of despair”: suicides as well as deaths related to alcohol and drugs, including opioids. In Europe likewise, a health crisis including drug addiction and drug-related deaths has emerged in old industrial areas, especially in Scotland.

In East Asia, traditionally known for strong family structures, the working class is showing signs of social erosion. Half of all South Korean households have experienced some form of family crisis, mostly involving debt, job loss, or issues relating to child or elder care, notes one recent study. Japan has a rising “misery index” of divorces, single motherhood, spousal and child abuse—all of which accelerate the country’s disastrous demographic decline and deepen class division.

An even greater social challenge may emerge in China, where some authorities are concerned about the effects of deteriorating family relations, particularly in care for aging parents. The government has started a campaign to promote the ideal of “filial piety,” a surprising revival of Confucian ideals by a state that previously attempted to eradicate them.

The problem of family breakdown is especially severe in the Chinese countryside. The flow of migrants into the cities in search of work has resulted in an estimated 60 million “left behind children” and nearly as many “left behind elderly.” The migrants themselves suffer from serious health problems, including venereal disease at rates far higher than the national norm, but the children left behind in rural villages face especially difficult challenges. Scott Rozelle, a professor at Stanford University, found that most of these children are sick or malnourished, and as many as two in three suffer from anemia, worms, or myopia. Rozelle predicts that more than half the left-behind toddlers are so cognitively delayed that their IQs will never exceed 90. This portends a future of something like the Gammas and Epsilons of Brave New World.

The Gentrification of the Left

In developed nations, as the middle classes are being proletarianized and the working classes fall further behind, the longstanding alliance between the intellectual Left and the working class is dissolving.

Already in the 1960s, New Left radicals such as C. Wright Mills and Ferdinand Lundberg disparaged the mental capacity of average Americans. Most of the population, according to Lundberg, were “quite misinformed, and readily susceptible to be worked upon, distracted.” The general acceptance of capitalism by the working class, as well as questions of race and culture, led many on the Left to seek a new coalition to carry the progressive banner. For its part, the working class has moved away from its traditional leftist affiliation not only in the United States but also across Europe and the United Kingdom.

“The more than 150-year-old alliance between the industrial working class and what one might call the intellectual-cultural Left is over,” notes Bo Rothstein, a Swedish political scientist. He suggests that a “political alliance between the intellectual left and the new entrepreneurial economy” could replace the old “class struggle” model and provide a way to “organize public services in a new and more democratic way.”

Across Europe, traditional parties of the Left now find their backing primarily among the wealthy, the highly educated, and government employees. Germany's Social Democrats, France's Socialists, and the British Labour and Australian Labor parties have been largely "gentrified," as has America's Democratic Party, despite the resurgence of "democratic socialism" as part of its ideology. They have shifted their emphasis away from their historic working-class base, toward people with college and graduate degrees.

Even more than disagreements over immigration and cultural values, differences in economic interests have driven a wedge between the established Left and the working class. The agenda promoted by the leftist clerisy and the corporate elite—on immigration, globalization, greenhouse gas emissions—does not threaten their own particular interests. But it often directly threatens the interests of working-class people, especially in resource-based industries, manufacturing, agriculture, and construction. Environmental policy in places like California and western Europe has tended to ignore the concerns of working-class families.

The continuing heavy use of coal, oil, and other fossil fuels—still increasing in countries like India and China—may present a danger to humanity’s future, but it has contributed greatly to wealth creation and the comfort of the working class since the 18th century. Plans for a drastic reduction in the use of carbon-based energy by 2050 would force middle-class Americans to be more like North Koreans in their energy consumption.

In Europe, green energy mandates have caused a spike in energy costs. As many as one in four Germans and over half of Greeks have had to spend 10 percent or more of their income on energy, and three-fourths of Greeks have cut other spending to pay their electricity bills, which is the economic definition of “energy poverty.” These mandates have far less impact on the wealthy.

In their zeal to combat climate change, the clerisy have taken aim at things like suburban homes, cars, and affordable airfare. The lifestyles of the middle and working classes are often criticized by the very rich, who will likely maintain their own luxuries even under a regime of “sustainability.” A former UK environment minister said that cheap airfare represents the “irresponsible face of capitalism.” Apparently the more expensive travel done by the wealthy, including trips by private jet to conferences on climate change, is not so irresponsible. New regulations and taxes on fuel imposed by France’s aggressively green government sparked the gilets jaunes uprising, as well as the previous bonnets rouges protests in Brittany.

Those in today’s intellectual Left are concerned about the planet and about international migrants, but not so much about their compatriots in the working class. The French philosopher Didier Eribon, a gay man who grew up in a struggling working-class family in provincial Reims, describes a deep-seated “class racism” in elite intellectual circles toward people like his family.

Working-class voters in France were joyful at the socialist victory in the 1981 election, but then found themselves supporting a government whose priorities turned out to be “neoliberalism,” multiculturalism, and modernization. One result is widespread cynicism toward the political establishment. Eribon recalls his socialistically inclined mother saying, “Right or Left, there’s no difference. They are all the same, and the same people always end up footing the bill.”


As the major left-leaning parties in high-income countries have become gentrified, the political orientation of working-class voters is realigning. Populist and nationalist parties in Sweden, Hungary, Spain, Poland, and Slovakia have done particularly well among younger voters. In fact, many of the right-wing nationalist parties are led by millennials. American millennials too are surprisingly attracted to right-wing populism. In November 2016, more white American millennials voted for Donald Trump than for Hillary Clinton. Their much-ballyhooed shift toward the Democratic Party has reversed, and now less than a majority identify as Democrats.

More broadly, a sense of betrayal among those being left behind by progress is leading to defections from mainstream parties of both Right and Left. Among the working classes and the young, there is a steady growth of far-Left opposition to the established liberal order, as well as strong support for the far Right. This increasing movement away from the center and toward the fringes is not an ideal formula for a stable democratic society.

As Tocqueville put it, we may be “sleeping on a volcano.”

Peasant Rebellions

Will the world’s working classes accept their continuing decline? We are already seeing what might be described as “peasant rebellions” against the globalist order that is being constructed by the oligarchs and their allies in the clerisy. In recent years, an insurrectionary spirit has surfaced in the Brexit vote, the rise of neonationalist parties in Europe and authoritarian populists in Brazil and the Philippines, and of course the election of Donald Trump.

At the core of these rebellions against the political mainstream lies the suspicion among the lower classes that the people who control their lives—whether corporate bosses or government officials—do not have their interests at heart. The slow-growth economy that emerged from the Great Recession benefited the financial elite and property speculators, but did little for the vast majority of people. Firms like Apple have profited from soaring stock prices and low-wage Chinese production while less capital-rich businesses have struggled.

These lopsided economic results have prompted attrition from the traditional mainstream political parties in many countries.

In multiparty democracies, a reaction against economic globalization and mass immigration, among other policies, has resulted in pronounced movement to the political fringes. One Harvard study found that anti-establishment populist parties across Europe expanded their share of the electorate from 10 percent in 1990 to 25 percent in 2016. At the same time, center-Left parties are losing ground to far-Left parties or candidates.

Is this only a prelude to a more serious kind of rebellion—one that could undermine democratic capitalism itself?

A Brief History of Peasant Rebellions

Admirers of medieval feudalism highlight the concept of mutual obligation between the classes. The upper clergy and the military aristocracy practiced a kind of noblesse oblige that provided a floor (albeit often insufficient) for the lower classes. But the obligations of the lower to the higher classes may have been no more voluntary than those binding the Cosa Nostra.

The medieval poor did not always accept their miserable situation quietly. Uprisings broke out as early as Charlemagne’s reign in the 9th century, and became more common in the later Middle Ages. Violent peasant armies actually bested aristocratic knights in the Low Countries in 1227, in Northern Germany in 1230, and in the Swiss Alps in 1315. The brutal 14th century brought a rash of peasant rebellions and urban insurrections. French peasants burned down manors of the wealthy in the Jacquerie of 1358, aiming to “destroy all the nobles and gentry in the world and there would be none any more.” After being routed by armies of nobility and gentry, the insurgents were subjected to a campaign of reprisal that cost an estimated 20,000 lives.

In England, a labor shortage following the great plague resulted in higher pay and more mobility for laborers, but Parliament and big landowners took measures to hold down wages and keep peasants on their estates. Then, a new poll tax sparked a large-scale uprising led by Wat Tyler in 1381. A radical priest named John Ball traveled up and down England stirring up peasants, and in a speech outside London he famously asked: "When Adam delved and Eve span, who was then the gentleman?" The rebels' demands included abolition of serfdom and feudal service, an end to market monopolies and other restrictions on buying and selling, and confiscation of clerical property.

Violent uprisings of peasants or urban poor also broke out in many other places, including Flanders, Florence, Lübeck, Paris, Transylvania, Croatia, Estonia, Galicia, and Sweden. But the biggest social upheaval before the French Revolution was the great Peasants’ Rebellion of 1525 in Germany. Among the demands presented in the “Twelve Articles of the Peasantry” were the abolition of serfdom, restrictions on feudal dues, the right to fish and hunt, and the right of peasants to choose their own priest. The rebels took inspiration from Martin Luther’s doctrine of a “priesthood of all believers,” but Luther himself became horrified by their violence. The rebellion was put down so savagely that it dissuaded further uprisings in Germany.

Only rarely did such rebellions prove successful, as the Swiss peasants' did. The ruling powers sometimes used treachery to quell uprisings, offering pardons that were eventually revoked. In 17th-century England, Cromwell's "respectable revolution" quashed the efforts of the Levellers to extend Parliament's war against the monarchy into a radical egalitarian reordering of society. Southern and western France endured frequent rural protests through much of the seventeenth century.

Peasant rebellions also occurred in other parts of the world, often with greater ferocity. Japan had numerous ikki or peasant uprisings, particularly in the fifteenth century; the consolidation of power under the shogun in 1600 finally put an end to the disturbances. There were numerous uprisings and revolutions in Mexico, but it was only in the early 20th century that the peones finally overturned the quasi-feudal regime left over from Spanish rule. They achieved significant land reform, but at the cost of well over 1 million lives.

In Russia, with its overwhelmingly rural society, peasant rebellions were commonplace by the 17th century. A revolt among Ural Cossacks under Emelian Pugachev threatened the czarist regime in 1773, during the reign of Catherine the Great. The rebellion failed, as did some 550 others, but in 1917 the peasants rose up to support Lenin’s seizure of power. When the Soviet regime began to confiscate land for collectivization, the property-loving muzhiks rebelled, only to be put down ruthlessly.

Arguably the most powerful peasant rebellion occurred in China, beginning in 1850. After failing civil service exams several times, Hung Hsiu-ch'uan read some Christian tracts and connected their message with hallucinations he had experienced. He devised his own religion, in which he was part of the Holy Trinity, but with doctrines based mainly on the Ten Commandments, and he preached it to destitute laborers. His Taiping Rebellion called for the overthrow of the Manchu Ch'ing dynasty, land reform, improving the status of women, tax reduction, eliminating bribery, and abolishing the opium trade. The rebellion was finally put down more than a decade later, with massive loss of life. Some of the Taiping program would later be adopted by Sun Yat-sen, who would overthrow the imperial regime, and then by Mao Tse-tung and the Communists.

The Revolt Against Mass Migration

The contemporary versions of peasant rebellions, particularly in Europe and the United States, are in large part a reaction against globalization and the mass influx of migrants from poor countries with very different cultures. The numbers of international migrants worldwide swelled from 173 million in 2000 to 258 million in 2017; of these, 78 million were living in Europe and 50 million in the United States.

Mass migration from poorer to wealthier countries seems all but unstoppable, given the great disparities between them. According to a Gates Foundation study, 22 percent of the people in sub-Saharan Africa live in extreme poverty, defined as subsisting on less than $1.90 a day. By 2050, the region will be home to 86 percent of the world’s poorest people, and about half that number will live in just two countries, Nigeria and the Democratic Republic of the Congo. For the extremely poor in such countries, who see little to no chance of improving their condition at home, a dangerous trek to Europe or some other wealthy place would seem worth the risk.

Many people in Europe have welcomed migrants from poorer countries, including former colonies. Political and cultural elites in particular have elevated cosmopolitanism and “diversity” above national identity and tradition. Tony Blair’s “Cool Britannia” was an effort to highlight cultural diversity as a central part of modern Britain’s identity. Herman Lebovics, in Bringing the Empire Back Home: France in the Global Age (2004), pondered how to redefine what it means to be French in a multicultural age.

When Germany’s chancellor, Angela Merkel, flung the doors wide open to a huge wave of refugees and migrants from the war-ravaged Middle East in 2015, many ordinary Germans were eager to show Gastfreundschaft, or hospitality, as were many people elsewhere in Europe. By the end of that year, nearly a million refugees had entered Germany alone, and the public welcome turned cold. Merkel’s decision came to be widely unpopular with Germans and the vast majority of Europeans.

A year after the rapid influx of refugees began, Pew Research found that 59 percent of Europeans thought immigrants were imposing a burden on their country, while only a third said that immigrants made their country a better place to live. Among Greeks, 63 percent said that immigrants made things worse, as did 53 percent of Italians. In 2018, Pew found 70 percent of Italians, almost 60 percent of Germans, half of Swedes, and 40 percent of French and British citizens wanting either fewer or no new immigrants; barely 10 percent wanted more.

In the years following Merkel’s decision to set out the welcome wagon, virtually all European countries—including such progressive ones as the Netherlands, France, Denmark, Norway, and Germany itself—have tightened their immigration controls. This has been done chiefly to counter the populist (and at times quasi-fascist) nativist movements growing in many countries: Hungary, Poland, Austria, France, the Netherlands, Sweden, Finland, Slovakia, and most importantly in Germany.

Much of the support for populist parties comes from the working class and lower-middle class, who are more exposed to the disruptions and dangers that the migrants have often brought, and are generally more burdened by the public expense of accommodating them. Even in Sweden, where the citizens have long prided themselves on tolerance, there is widespread anger about rising crime and an unprecedented level of social friction in a formerly homogeneous country.

Some of the anti-immigrant movements that have sprung up espouse racist views, but others are far less odious, being simply opposed to the globalizing policies of elites and their indifference to the concerns of average citizens. Some have found inspiration in the Middle Ages, such as the example of the Frankish king Charles Martel, who defeated Muslim invaders in the 8th century. Fans of Donald Trump presented images of him as a Crusader clad in chainmail with a cross embroidered on the front.

The conflict over immigration divides largely along class lines. There is a huge divergence between elite opinion, which generally favors mass immigration, and that of majorities in the working and middle classes. Emmanuel Macron, then France's economy minister and later its president, acknowledged this divergence in 2015 when he said, "The arrival of refugees is an economic opportunity. And too bad if [it] isn't popular."

If political elites in Europe regard open borders as good for the economy, corporate elites in the United States are eager to import skilled technicians and other workers, who typically accept lower wages. The tech oligarchs in particular like to hire from abroad: in Silicon Valley, roughly 40 percent of the tech workforce is made up of noncitizens. Steve Case, the former CEO of America Online, has suggested that immigrant entrepreneurs and workers could offset middle-class job losses from automation. Some conservative intellectuals have even thought that hardworking newcomers should replace the “lazy” elements of the working class. Some of the earliest opposition to the Trump Administration focused on his agenda of curtailing immigration.

Somewheres vs. Anywheres

Ironically, the people who most strongly favor open borders are welcoming large numbers of immigrants who do not share their own secular, progressive values. That is particularly true in Europe, where migrants and refugees from Muslim countries often hold very conservative or reactionary views on things such as homosexuality and women’s rights; many even support female genital mutilation. Some European politicians and other leaders, including the archbishop of Canterbury, have proposed that elements of Muslim sharia law, such as a prohibition of blasphemy, could be applied on top of existing national standards.

Gilles Kepel, one of France's leading Arabists, observes that Muslims coming to Europe tend to possess "a keen sense" of cultural identity rooted in religion, while the media and academia tend to promote the "erasing of identities," at least for the native population. Rather than defend their own values, Europeans and others in the West have been told by their leaders that "they must give up their principles and soul—it's the politics of fait accompli." This "erasing of identities" is not widely popular among the working and middle classes.

The British writer David Goodhart describes a cultural conflict between the cosmopolitan, postnational “anywheres” and the generally less educated but more rooted “somewheres.” If the media and most high-level government and business leaders in Europe have an “anywhere” perspective, people in less cosmopolitan precincts outside the capital cities tend to remain more strongly tied to national identities, local communities, religion and tradition. These divisions were particularly evident in the vote on Brexit and the Conservative sweep in 2019.

The “somewhere” sentiment has repeatedly been expressed in votes concerning the European Union. In addition to the Brexit referendum of 2016, French, Danish, and Dutch voters have opted against deeper or broader EU ties, preferring a stronger national “somewhere.” Less than 10 percent of EU residents identify themselves as Europeans first, and 51 percent favor a more powerful nation-state, while only 35 percent want power in Brussels to be increased.

As long as the political and economic elites ignore these preferences, populist rebellions against establishment parties will likely continue and could become more disruptive. Elite disdain for traditions of country, religion, and family tends to exacerbate class conflict around cultural identity. “Liberalism is stupid about culture,” observed Stuart Hall, a Jamaican-born Marxist sociologist.

In the United States, discontent with the globalist and open-borders agenda of the oligarchs and the upper clerisy resulted in strong working-class support for Donald Trump in 2016. He won two out of every five union voters and an absolute majority among white males. Like his European counterparts, Trump ran strongest in predominantly white, working-class and lower-middle-class areas—precisely the areas hardest hit by globalization. He appealed most to people who work with their hands, own small shops, or are employed in factories, the logistics industry and energy sector; those who repair and operate machines, drive trucks, and maintain our power grid. Among white voters at least, he did poorest with well-educated professionals.

To many voters, Trump was “a champion for forgotten millions.” When surveyed, these voters put a high priority on bringing back manufacturing jobs, protecting Social Security and Medicare, and getting conservatives on the Supreme Court—ahead of building a wall to keep out undocumented immigrants, who are widely seen as cutting into labor wages for American citizens. Even though he came from the business elite, Trump met almost universal opposition from the dominant classes. Instead, he won over voters who see big corporations as indifferent to the well-being of working people. Like some of the populist movements in Europe, the American populist Right has adopted many of the class-based talking points, although usually not the policies, associated with the pre-gentrified Left.

In the higher echelons of the clerisy, the response to the populist revolt has mostly been revulsion. "It's Time for the Elites to Rise Up Against the Ignorant Masses" was the title of an article by James Traub in Foreign Policy in the summer of 2016. A former New York Times writer, Traub asserted that the Brexit vote and the nomination of Donald Trump, among other developments, indicate that the "political schism of our time" is not between Left and Right, but "the sane vs. the mindless angry." Larry Summers, a former Obama Administration official, took a more astute view of the matter: "The willingness of people to be intimidated by experts into supporting cosmopolitan outcomes appears for the moment to have been exhausted."

Is There a Mass Insurrection in the Making?

In the late 1920s and early 1930s, the proletarianization of the middle class resulted in widespread support for Communism, Fascism, and National Socialism. Today, as in Europe before World War II, people on both right and left often blame financial institutions for their precarious situation. Anger at the financial services sector gave rise to the Occupy Wall Street movement in New York City and the many spinoff Occupy protests in 2011-12. Marching under the slogan “We are the 99 percent,” protesters around the world decried the heavy concentration of wealth in a few hands.

Alienation from the political mainstream today is resulting in strong support for far-Left parties and candidates among youth in various high-income countries. In France's presidential election of 2017, the former Trotskyite Jean-Luc Mélenchon won the under-24 vote, beating the more youthful Emmanuel Macron by almost two to one among that age group. In the United Kingdom, the Labour Party under the neo-Marxist Jeremy Corbyn in 2017 won more than 60 percent of the under-40 vote, while the Conservatives got just 23 percent. Corbyn won the youth vote similarly in 2019, even amidst a crushing electoral defeat. In Germany, the Green Party enjoys wide support among the young.

A movement toward hard-Left politics, particularly among the young, is also apparent in the United States, which historically has not been fertile ground for Marxism.

In the 2016 primaries, the openly socialist Bernie Sanders easily outpolled Hillary Clinton and Donald Trump combined among under-30 voters. He also did very well among young people and Latinos in the early 2020 primaries, even as other elements of the Democratic Party rejected him decisively. Support for socialism, long anathema in America, has gained currency in the new generation. A poll conducted by the Victims of Communism Memorial Foundation in 2016 found that 44 percent of American millennials favored socialism while 14 percent chose fascism or Communism. By 2024, millennials will be the country's biggest voting bloc by far.

The core doctrines of Marxism are providing inspiration for labor unrest in China today, particularly among the younger generation of migrants to the cities. Activists often find themselves prosecuted for threatening “the social order.” Communist officials have been put in the awkward position of cracking down on Marxist study groups at universities, whose working-class advocacy conflicts with the policies of the nominally socialist government.

Democratic capitalist societies need to offer the prospect of a brighter future for the majority. Without this belief, more demands for a populist strongman or a radical redistribution of wealth seem inevitable. A form of “oligarchic socialism,” with subsidies or stipends for working people, might stave off destitution while allowing the wealthiest to maintain their dominance. But the issue boils down to whether people—not just those with elite credentials and skills—actually matter in a technological age.

Wendell Berry, the Kentucky-based poet and novelist, observed that the “great question” hovering over society is “what are people for?” By putting an “absolute premium on labor-saving measures,” we may be creating more dependence on the state while undermining the dignity of those who want to do useful work.

The future of the working class should concern us all. If too many lack any hope of improving their condition, we could face dangerous upheaval in the near future.

Great America

How COVID-19 Plagues Religious Freedom

This is a brave new world for the uncontrolled and inexplicably self-confident forces of belligerent American secularism.

As the coronavirus crisis unfolds and the 2016 election and post-electoral scandals ooze into the open, God is affronted and false gods disintegrate. The discussion over the opening of churches is generally presented as a public health issue, coupled with a First Amendment freedom of religion argument. But, in many cases, it is an outright assault on the practice of religion generally.

Compared to other advanced Western democracies, the United States is a country that practices religion. But the world of the media, academia, and the conventional wisdom embedded in the contemporary ethos of America's governing elites is, estimating very roughly, one-quarter religious communicants or sympathizers, one-quarter agnostic, one-quarter atheist, and one-quarter anti-theist. All of these groups, of course, are entitled to have and to express their opinions—but they are not entitled to impose their opinions on others.

What Is “Essential”?

At the beginning of the coronavirus national shutdown in March, an argument was made that as religious congregations are frequently quite large and generally the congregants are physically close together, there was a legitimate public health claim to suspend services in all houses of worship. This was a legally vulnerable endeavor from the beginning because of the First Amendment prohibition on infringement of free exercise of religion, and because the criterion for essential services was arbitrarily decided by secular leaders. 

Liquor stores and gun stores were deemed to be essential even though houses of worship, judged to be essential by a large part of the population, were not so deemed by officials. Practically all of the religious denominations gamely went along in the time of crisis and offered virtual services over the internet.

But as data accumulated revealing that the risk of this virus was minimal to all but those with challenged immune systems, especially the elderly, it was clear that religious congregations, composed almost entirely of relatively purposeful and responsible people, would only pose a hazard to elderly congregants, and that these could normally be relied upon to be prudent about their own health and in any case to take responsibility for their own actions. 

There was thus no public health reason to attempt to continue the ban on religious services after it became clear that such measures were unnecessarily restrictive. The United States reached this point of public awareness by the middle of April and what has happened since has been harassment of religious practice by secular authorities masquerading as defenders of public health.

Our Anti-Theist Elites

It is impossible to be precise about the motives of individual governors and mayors diligently hiding behind masks of concern for public health. But it safely may be assumed that in some cases—such as that of the egregious Democratic governor of Virginia, Ralph Northam, who advocates infanticide and a semi-permanent lockdown—a dislike for any acknowledgment of the possible existence of a divine intelligence or any spiritual forces active in human life partially informs the authoritarian regime he has inflicted upon his fellow Virginians.

Though most of the influential founders of the institutions of American government were not diligent adherents to particular religious sects and had a somewhat “Age of Reason” perspective, they were concerned in the First Amendment to assure that one religious group did not oppress other religious groups. This was the burden of President Washington’s famous addresses to the Roman Catholics of Maryland and to the congregation of the Newport synagogue, that whatever oppressions they or their ancestors faced in the old world would not be replicated in the United States of America. 

In general, the United States has been faithful to Washington's pledge, and it would be an exaggeration to imply that it has departed from it in the last couple of months. But what American society is facing now was an unforeseen problem: today the anti-theists, a coalition of materialists, pagans, and both world-weary and militant cynics, are tentatively seeking to suppress the practice or indulgence of any religion. The United States has been a tolerant rather than a pious country, and this is what is under threat.

Jefferson was a deist of very diluted faith in anything other than the wonders of the world. Franklin was more or less of a good-humored agnostic. Though his funeral train in 1790 was followed by every clergyman in Philadelphia, he did not have a religious funeral. Even the much-celebrated but rather desiccated Justice Oliver Wendell Holmes apologetically stated that he was a Unitarian because in the Boston of his youth, “You had to be something religiously, and Unitarian was the least you could be.” 

Of course, Jefferson, Franklin, and Holmes did not try to prevent those who wished to practice a religion from doing so. This is a brave new world for the uncontrolled and inexplicably self-confident forces of belligerent American secularism.

The Gods That Failed

There is no acceptable argument for seeking to prevent people from practicing a religion. It is an impulse that man has had since his earliest days, and is prompted by an awareness of the inability of the human mind to grasp the notions of how things began and what their physical and temporal extent might be, and how miraculous events and spiritual insights can be assimilated into any concept of cosmic order. 

There is the further complication that once any notion of supernatural, otherworldly, spiritual, or miraculous forces is dispensed with, a vacuum is created which is filled at best by vacuous imposters but more often and more infamously by people elevating themselves to the stature of gods and celebrating themselves in impressive but often repulsive pagan festivals. This tradition can be traced from the mists of antiquity to giants of classical history such as Alexander the Great and Julius and Augustus Caesar, who raised themselves to the status of gods. Their festivals were replicated by Robespierre, Hitler, and Stalin, but modern anti-theistic self-elevated deities have naturally magnified the oppressions of the ancient world with the application of what Winston Churchill described, in reference to the Third Reich, as tyranny "made more sinister, and perhaps more protracted, by the lights of perverted science."

Even an absurd state governor like Northam is not in any sense reminiscent of any of the individuals just mentioned. But if the militant atheists of the American media-academic complex who have practically taken over the Democratic Party are any more overt in their ambitions, they will be confronted by the great majority of Americans who either practice a religion or acknowledge that people have a right to do so. 

Those who aggressively despise religion in America are going to have to resign themselves to its imperishability or erupt from their fetid closet and acknowledge what their true aims are, and take their chances with the pluralistic system that they have deviously attempted to manipulate.

As this subplot unfolds, America’s most recent secular demigod, Barack Obama, is every day exposed as an empty legend crumbling into sawdust. He is not the idealist hailed in 2008 seeking American brotherhood and a Socratic state with less materialism and greater intellectual elevation. He is a failed and corrupt president whose principal achievements were a thoroughly inadequate reform to the healthcare system, insane and unconstitutional commitment to ecological fantasies, and an almost catastrophic attempt to appease America’s enemies, particularly the demented theocracy of Iran. 

As false gods fail, the believers in a real God persevere.

First Principles

Americans Deserve Open Debate About Big Tech and Free Speech

The tech industry and its advocates may not think this debate should happen, but lawmakers certainly do.

President Trump on Thursday signed an executive order regarding Section 230 of the Communications Decency Act, the provision of the law that shields tech companies from liability for censoring content on their platforms.

This comes on the heels of Twitter deciding it would “fact-check” one of the president’s tweets about voting by mail. Chinese propaganda and outright falsehoods from the World Health Organization, meanwhile, remain unmolested by Twitter, self-appointed guardian of truth that it is.

I’ve been writing and speaking about this question for a while, most recently in Newsweek, because it has stirred internecine conflict on the Right between individuals who think social media companies should remain free from policy intervention (ignoring, of course, that they thrive as a result of Section 230, itself a government policy) and those, like me, who believe that these corporations have accumulated a troubling amount of power over our lives, data, behavior, and the free market.

What the Debate Is Really About

This dispute was on display recently, when my Newsweek piece was countered as a “right-wing attack on Sec. 230” by Patrick Hedger, a research fellow at the Competitive Enterprise Institute, a libertarian think tank.

Hedger’s rebuttal parses my opinion piece on the technical merits of how I’ve described the liability shield given to the tech industry by Congress. But Hedger fails to address the key points of my argument: the societal and speech ramifications of private corporations acting to censor protest content and, more broadly, the fact that the tech industry needs more accountability in exchange for the government protection it receives.

The argument advanced by proponents of Section 230 as it relates to the First Amendment grows louder as concerns about tech continue to grow. The short version generally distills to this: these companies have First Amendment rights to remove whatever content they want. Go pound sand, you ignorant fool. (If you think I’m exaggerating about that last bit, see this rant from Mike Masnick of the tech industry blog, TechDirt.)

It’s a straw man for the conversation many of us are actually trying to have, which is one about the consequences of the growing power of the tech industry and whether or not our federal policy toward the industry should be reformed as a result. It is important, not just as it relates to Section 230, but also as it relates to Big Tech and individual liberty, data privacy, and market access.

In other words, the conversation I am attempting to have is not one that goes back and forth about the merits of policy minutiae as it is currently written. It’s about what should be done. And a rebuttal to that requires a counterargument, rather than a repeated exegesis about how Section 230—a statute whose broad interpretation has been stretched “outlandishly”—is currently interpreted.

And when it comes to Section 230 specifically, it’s a conversation worth having. The provision, snuck in the back door of the Communications Decency Act of 1996, was never actually debated by the Congress that passed it—in part because the concept of “being online” was still so nascent and social media did not exist in its current form. Its application since then largely has been determined by aggressive litigation from the tech industry, not by public debate.

Congress Has Taken Notice

But, regardless of original intent, it’s a conversation worth having because so many members of Congress are interested in having it. Republican Senators Ted Cruz (R-Texas), Marsha Blackburn (R-Tenn.), John Kennedy (R-La.), Joni Ernst (R-Iowa), Kevin Cramer (R-N.D.), Josh Hawley (R-Mo.), Lindsey Graham (R-S.C.), and Marco Rubio (R-Fla.), along with Representatives Louie Gohmert (R-Texas) and Paul Gosar (R-Ariz.), have all expressed interest in reexamining Section 230. Reps. Ken Buck (R-Colo.), Jody Hice (R-Ga.), and Senator Tom Cotton (R-Ark.) have all raised other concerns about Big Tech.

Libertarian Rep. Thomas Massie (R-Ky.) has thrown up a flare about “big business working very hard for big government” in apparent speculation over what Apple and Google may do with the contact-tracing technology they have developed.

The tech industry and its advocates may not think this debate should happen, but lawmakers certainly do.

So, too, do Americans across the political spectrum, 77 percent of whom told Gallup that they think Big Tech has too much power. An example I used to highlight this was Facebook’s removal of anti-lockdown protest content: content that was not illegal, but was removed on the basis of nonbinding state government advisories.

“So what?” replies Hedger. Facebook also removes content that tells you to eat Tide pods, even though that’s not technically illegal. And that’s a good thing.

As a legal matter, the two may be the same, but as a practical matter, conflating the two disregards how Americans prize their right to assemble. Facebook has the right to remove whatever content it wants but, at least in the minds of most people, suppressing the ability of certain users to organize otherwise constitutionally protected activities is subjectively quite different from removing content suggesting one poison oneself with detergent.

There’s a reason members of Congress weighed in about removing protest content but shrugged off the Tide pods. One type of content, regardless of its relative public health wisdom, hews very closely to a sacrosanct right in America. The other is just dumb.

In a similar way, these companies, somehow now relegated to the role of arbiter of debate in the public square, are making subjective determinations about what political content is true or just “misleading,” and labeling or banning it as such. (Brendan Carr, a commissioner at the FCC, has suggested the novel approach of just letting Americans decide for themselves.)

False Dichotomies

Hedger goes on to discuss Section 230 with the binary framing that requires maintaining the law as it is, lest the internet descend into a smutty chaos of porn and extremism. Because if Facebook cannot remove protest organization content, it then also cannot remove terrorist beheading videos. 

This is a false dichotomy. Mostly because few would suggest Section 230 be eliminated outright. Those who have argued for reform recognize that it is a point of leverage to compel more transparent behavior from the tech companies; a rhetorical point, rather than a literal one.

What I am suggesting, as many others have, is that there are reasonable steps Congress can take to generate both more accountability and less centralization from tech, while still maintaining the moderation that everyone deems important.

Other countries, for instance, don’t have a facsimile of a Section 230 policy but still manage to have a free and functional internet. And Congress has already amended Section 230 once, to make websites more accountable for the sex trafficking content that flourishes online. Last time I checked, the internet was still working. Albeit, with less sex trafficking. Most of us think that’s a good thing.

The conversations around Big Tech tend to take on the flavor of a national theology, and tech innovators the status of demigods.

A more realistic conversation would recognize Section 230 for what it is: a congressionally authorized subsidy for the tech industry (Eric Goldman, whom Hedger cites in his own defense, acknowledges as much while arguing that the subsidy should be kept). It would also recognize the tech industry for what it is: a collection of massive companies with unprecedented amounts of power, now being investigated by the federal government and 50 state attorneys general as monopolies.

And like other industry subsidies, Section 230 deserves reevaluation and debate as part of a larger conversation about the growing power of the tech industry over behavior, speech, individual data, privacy, market access—and even elections.

Section 230 reform may not be the silver bullet to any of these concerns, but it’s certainly a part. And while Hedger may not acknowledge that any of the above concerns are real, increasingly, a plurality of Americans do. And so does Congress.


What Do Progressives Want? It Depends on Which Ones You’re Talking About

Today’s progressives diverge from the originals in their professed interest in democracy.

Progressives are a chameleon-like group, having morphed from Republicans, to Bull Mooses (or is it Meese?) to Democrats. Now is a good time to evaluate what progressives want.

Today, they are more like the cuckoo than the male of the moose species. Having taken over the Democratic Party nest, the Progs pushed out its traditional inhabitants, the offspring of FDR: the tribunes of labor and the small farmers the party once nurtured.

If we examine the record, we find on some issues today’s progressives hold fast to the precepts of the original movement, while on others, not so much. 

Let’s start with the similarities.

The original progressives, essentially, were an anti-(small-d) democratic movement. They believed modern industrial society had grown too complex to be ruled by the rabble, aka representative democracy. To their thinking, science had advanced to the point that technocrats schooled in the latest management techniques could run the country more efficiently than anyone else.

Frederick Winslow Taylor, a contemporary of the original progressives, captures the central conceit of the movement. Taylor was the father of time-and-motion studies, a leader of the Efficiency Movement and the original management consultant. He would go into the industrial workplace and, with stopwatch in hand, break each job into its component tasks and measure them to the split second with the goal of finding the “one best way” to do it. Managers would employ “methods based on a scientific study” in Taylor’s world.

Taylor’s technique, dubbed scientific management, was celebrated by progressives, taught at Harvard in 1908 and embraced by a University of Chicago professor named James McKinsey, founder of McKinsey & Company consultants.

The quest to bring scientific management to all aspects of society spawned the urban renewal movement. It’s important to remember that Robert Moses, the man who demolished New York City in order to save it, as Robert Caro has exhaustively documented, came out of the Progressive movement and exemplifies all its prejudices and conceits.

Moses saw the human-scale, organic neighborhoods of New York as filthy and backward, ripe for replacement with large “efficient” standardized apartment blocks. This Progressive instinct for centralization, consolidation, and efficiency could be seen in the growth of supermarket chain stores that began displacing mom-and-pop grocers and butchers in the 1920s and ’30s. The “reformers”—in league with the chain stores—argued the small shops were filthy, backward, and inefficient.

The same way Taylor sought scientific efficiency in the workplace, Margaret Sanger, another progressive luminary, advocated a more scientific approach to procreation in order to improve the human race. Vladimir Lenin, who brought “scientific socialism” to Russia, was another fan of Taylor’s scientific management.

Labor unions loathed Taylor (the feeling was mutual), accusing him of squeezing every ounce of sweat from workers. Today’s employees have the same fear and loathing when McKinsey & Co., Taylor’s heir, shows up to axe their jobs in the name of improving efficiency.

Progressives have kept their faith in the cult of the expert—paging Dr. Fauci—and scientific management. Whatever nonsense they perpetrate, whether on public health or climate trends, they buttress it with increasingly shrill invocations of “science.” Governor Cuomo proudly announces he has McKinsey & Company advising him on the state’s pandemic management. He should be glad they failed so dramatically or progressive New Yorkers might be asking why we need an elected government at all—or even deep state bureaucrats—when we have McKinsey experts.

Now, as then, progressives aren’t particularly keen on labor unions. They pay lip service to “working people,” but the left wing of the Democratic Party is fixated on illegal immigration, LGBT issues, and climate change—concerns of the professional and managerial classes, rather than the working class.

But today’s progressives diverge from the originals in their professed interest in democracy.

The progressives default to experts on every issue of governance, even demanding a multi-pronged, multi-phase, expert-conceived plan for every facet of every business’ operations before allowing anything to reopen. But they have dispensed with the experts and expert discussion when it comes to how to vote in November. They’ve already figured it out—it’s by mail.

Today’s progressives say anyone skeptical about turning the mailbox into the ballot box just wants to suppress the vote. 

And that’s the turnabout, for the original progressives were eager to suppress the vote.

In the name of “good government” reform, those progressives purged the voter rolls, tightened voting requirements, and disenfranchised millions. The original progressives eliminated measures, such as universal automatic registration, that today’s progressives want to reinstate.

The good government reformers of the early 20th century were horrified by the political machines stuffing ballot boxes with the votes of illiterate immigrants and farmers.

Virginia progressive Carter Glass (he of the Glass-Steagall Act, which separated commercial from investment banking) summed up their sentiment: “Nothing can be more dangerous to our republican institutions than to see hordes of ignorant and worthless men marching to the polls and putting in their ballots against the intelligence and worth of the land.”

In the 19th century, voters either registered in person once and for life or government registrars compiled lists of eligible voters and registered them automatically. “Good government” progressives replaced that with periodic in-person registration in order to deter repeat and out-of-state voters—the very fraud today’s progressives say doesn’t happen.

Political parties used to give voters ballots with the party’s candidates—and only those candidates—for voters to cast. Progressives replaced the party ballot with the government-issued secret ballot we know today, featuring multiple candidates for different offices. They did this not to offer voters more choices, but to discourage the illiterate from voting.  

Today, the progressives say they want to expand the franchise even though their affinity toward government-by-expert renders elected offices null.

One suspects if they felt they could legalize pre-marked ballots, they would.

But for now, they will settle for mailing a ballot to every name on the poorly maintained voter rolls, recruiting an army of helpers, like the newly hired contact tracers, to assist the less-motivated, illiterate, confused, deceased, or non-existent recipients in filling out the ballots, then collecting and returning them.

If you have a problem with that plan, you’re not one of today’s progressives.

You’re one of the originals.  

Great America

Elon Musk is the Greatest American Industrialist of the 21st Century

Shaming Elon Musk is easy, but it isn’t accurate. It’s based on half-baked libertarian theories that don’t work in the real world.

It has been fashionable to criticize Elon Musk as lacking the qualities of a true entrepreneur, or not being a genuine free market capitalist. His primary transgression: his companies have taken advantage of government subsidies.

Before considering whether or not these criticisms are fair or justified, or even terribly relevant, it might be a good idea to examine Musk’s body of work. Because so far, 20 years in, this 48-year-old immigrant from South Africa is arguably the greatest American industrialist of the 21st century.

Musk’s early work, back in the 1990s, focused on software and online financial services, including PayPal. The sale of his stakes in these companies made Musk wealthy, but what he’s done since then is what secures his place in history.

Tesla, Musk’s best-known affiliation, has brought electric cars into the mainstream. It’s easy to forget the risk Tesla’s founders, Martin Eberhard and Marc Tarpenning, endured back in 2003 when they first bundled laptop batteries into a storage package capable of powering an electric sports car. Recognizing the potential, Musk invested millions in the company and eventually took over as CEO.

Today, 17 years later, Tesla is valued at $151 billion. In 2019 Tesla reported sales of $26 billion and an operating cash flow of $2.6 billion. Its “levered free cash flow” in 2019 (surplus cash after paying interest on debt) was $1.6 billion. According to investors, for whom economic data is paramount, Tesla is the most valuable American car maker of all time.

Elon Musk has accelerated global adoption of all-electric vehicles by making them increasingly affordable and earning unprecedented consumer satisfaction. Critics of what Tesla has accomplished are invited to drive one. But Tesla’s accomplishments go beyond just manufacturing popular electric vehicles.

In Nevada, Tesla has built the largest battery manufacturing plant in the world. This single factory now produces half the global output of electric car batteries. The Tesla Gigafactory also produces the “Powerwall,” a stationary battery that allows homeowners to store surplus electricity.

There are a lot of reasons to remain skeptical regarding renewable energy. But massive investment in battery technology by companies like Tesla has paid off. Solar-battery power plants are now able to deliver continuous electricity at a wholesale price of four cents per kilowatt-hour, and make a profit.

Tesla has also developed a network of charging stations around the nation. If you aren’t driving a Tesla, you might not realize how ubiquitous charging stations have become. They are typically installed in the outer reaches of shopping center parking lots, and a Tesla driver can see with one tap on the vehicle’s control screen not only where the nearest charging stations are located, but also how many slots are vacant.

All-electric cars are not yet for everyone. But with recharge times down to 30 minutes, and range topping 300 miles, they are looking better every year. They require far less maintenance than gas-powered vehicles, and are becoming more competitive on price every day. Tesla is not just building cars, it is fundamentally transforming our transportation infrastructure.

With all the attention Tesla gets, it’s easy to forget about SpaceX. But Musk’s accomplishments with this company are even more impressive. Founded in 2002 by Musk, SpaceX is the “first private company to launch and return a spacecraft from Earth orbit and the first to dock a spacecraft with the International Space Station.” The engineering innovations pioneered by SpaceX are revolutionary, including fully reusable rockets. Vertical landings of booster rockets, after many failed attempts, are now becoming routine.

Before SpaceX came on the scene, in the late 20th century, the Space Shuttle could deliver payload into low Earth orbit at a cost of over $25,000 per kilogram. For a while, the early SpaceX boosters competed with United Launch Alliance’s mature Atlas V booster, with costs dropping below $10,000 per kilogram on these unmanned systems. But in 2017 SpaceX pulled ahead, way ahead, with the Falcon 9 booster profitably delivering cargo into space at a cost of under $2,000 per kilogram, and in 2020 the Falcon Heavy has brought the price under $1,000 per kilogram.

In just a few years, and compared to the best NASA could do, SpaceX has dropped the price of getting into space by an order of magnitude. And in a few days, American astronauts are going to blast into outer space on an American rocket, built by SpaceX, for the first time since the Shuttle was retired. How is this not historic?

Musk’s projects extend well beyond electric cars and electric batteries, or paving the way to the colonization of the solar system. His Boring Company aspires to revolutionize tunneling technology by achieving the benchmarks stated on its FAQ page: “(1) Triple the power output of the tunnel boring machine’s cutting unit, (2) Continuously tunnel instead of alternating between boring and installing supporting walls, (3) Automate the tunnel boring machine, eliminating most human operators, (4) Go electric, and (5) Engage in tunneling R&D.” And why not? If you can innovate above the earth, you can innovate beneath the earth.

In describing the Boring Company, Musk said, “the construction industry is one of the only sectors in our economy that has not improved its productivity in the last 50 years.” He’s right. The world needs more innovators who are not only able to envision how new technologies can coalesce to transform the world, but who also have the guts to do something with their ideas.

When people criticize Elon Musk, what are they trying to prove? Do they think that his companies aren’t part of a modern industrial revolution that rivals the great breakthroughs ushered in during the great age of steel and steam, or during this ongoing digital revolution? Do they think the railroads that opened up a continent weren’t subsidized? Do they think the internet, providing the backbone of a communications revolution, was not subsidized?

More to the point, does anyone think that if the total value of the subsidies awarded SpaceX were instead invested in NASA, it would still be possible to launch a payload into space for under $1,000 per kilogram?

Another reason Musk attracts criticism is his eccentric personality. Examples abound. Enigmatic tweets. Selling flamethrowers. Smoking pot (after California legalized it) during an interview. Naming his sixth son X Æ A-Xii! Fair enough. But what about other great American industrialists, equally creative, equally driven, equally eccentric? What about Thomas Edison, J. Paul Getty, Henry Ford, or Howard Hughes? Heck, what about Steve Jobs?

Maybe being eccentric is just a part of being brilliant, driven, creative, and willing to take extraordinary risks. Shall we shame all these great, and very eccentric Americans? If so, Musk’s critics may get in line behind every nihilistic Luddite and socialist pack animal determined to undermine everything and everyone that made America great.

Which brings us to a final criticism of Musk, that he is a “socialist.” Evidence for this is thin. It is primarily based on the idea that if you accept government subsidies, you are not a true capitalist. But Musk expressed his version of socialism very accurately in one of his tweets, writing “By the way, I am actually a socialist. Just not the kind that shifts resources from most productive to least productive, pretending to do good, while actually causing harm. True socialism seeks greatest good for all.”

While this is a tweet guaranteed to make libertarian heads explode, it appeals to common sense. Government, by definition, is to some degree socialist. The only thing separating a mixed-capitalist economy from a full-blown socialist economy is the degree to which the government controls the economy. The middle of Musk’s sentence is controversial, but not because it’s too socialist. It is because it exposes the uncomfortable choice that governments have to make. Shall they yield to the populist demands of demagogic Democrats, and spend government revenue on the “least productive, pretending to do good, while actually causing harm,” or shall government revenue instead be invested in public/private partnerships that secure technological preeminence and economic security, benefiting everyone?

A libertarian would emphatically argue neither, and this reflects an absurd naïveté for several reasons. First, other nations have no compunction about exporting subsidized products, thus making it impossible for American manufacturers to compete. When this happens in critical industries, from steel to pharmaceuticals, eventually our nation loses its independence. That’s reason enough to subsidize strategic industries. But there are more.

When libertarians argue against government spending in all sectors, they get strong support from the Left to stop spending on industry and infrastructure. This splits the Right, which then lacks the strength to prevent spending shifting to welfare entitlements at the expense of spending on industry and infrastructure. When the anti-socialist politicians are divided, the socialists win.

Shaming Elon Musk is easy, but it isn’t accurate. It’s based on half-baked libertarian theories that don’t work in the real world. As for accusing Elon Musk of not being a “conservative,” what does that even mean? “Conservatives” stood by for decades as American business exported jobs and imported unskilled laborers, killing jobs and wages. Anyone concerned about America’s future in this grim world should be utterly indifferent about being called a “conservative.”

Ultimately, how history judges Elon Musk may come down to forces beyond his control. What is going to happen between the United States and China? If there is a new cold war, how will Musk manage his overseas investments, his supply chain, his factories in Berlin and Beijing? Like many industrialists in the 21st century, he may soon face difficult decisions. But to date, Elon Musk has played a vital role in maintaining American industrial leadership. He deserves better than cheap shots.

Great America

The Real COVID Numbers Should Factor Out Nursing Homes

The oversight of not separating statistics about long-term care facilities from the totals has had enormous economic consequences.

The New York Times headline screams: “U.S. Deaths Near 100,000, Incalculable Loss.” It should have read “Nursing Home Deaths Near 50,000, Unfathomable Negligence.” Both headlines are true, but only the imagined one is about a loss that could have been mitigated by proper policies and with well-understood protective measures. We know far less about what could have been done differently to avoid the fate of the other 50,000 or so people who succumbed to COVID-19, caused by the SARS-CoV-2 virus. So, the Times was alerting us to nothing useful.

It is becoming widely known that governors in several states, most notably the Democratic governor of New York, Andrew Cuomo, mandated that nursing homes accept residents being released from the hospital after being diagnosed with COVID-19. Combined, the top seven states (New Jersey, New York, Massachusetts, Pennsylvania, Michigan, Illinois, and Connecticut) account for half the deaths in long-term care facilities (LTCFs), which in turn account for nearly half of the total U.S. COVID-19 deaths.

Historians and public health experts in the future will try to sort out to what extent the governors’ policies were responsible. It is clear that the resident population in LTCFs has a much higher incidence of comorbidities (conditions making them more susceptible to disease and death) and is far older than the general population (“general population” being an apt prison allusion these days). LTCFs are also generally ill-equipped to handle contagious diseases, and in far too many cases, are not maintained or sanitized as they should be. We will leave that debate to others for now.

But the refashioned headline does imply that there are really two statistical universes when it comes to COVID-19 mortality tracking: LTCFs and everywhere else. That means the relevant number for the average U.S. resident is the 50,000 or so deaths outside of the LTCF universe.

Examining that universe leads to different implications than would be reached using the gross number alluded to by the Times. As of May 22, the American Heart Association reports 94,708 deaths from COVID-19. Looking deeper into the data by state, and combining it with data about nursing home fatalities by state, allows us, by subtraction, to examine the impact of the virus on the general population. (For the 11 states not reporting on nursing homes, we applied the average of the others.)

It turns out that 34 states plus D.C. and Puerto Rico each had fewer than 500 fatalities from the virus over the course of three months; of those, 17 states have reported fewer than 100 deaths. In those 36 jurisdictions, it is unlikely that the healthcare system was ever threatened (although in some rural settings distance to care is a challenge, and Puerto Rico may be chronically underserved). 

It is also doubtful that the more severe restrictions imposed, especially lockdowns, were necessary given the small number of serious cases. The basic distancing, group limitations, hygiene, and masking protocols arguably would have been totally sufficient. Businesses without tightly packed workers could have continued with minor accommodations.

Then there is a middle tier of 10 states with deaths ranging between 500 and 1,600. Only the “top” six states had mortality over 2,000, with New York far ahead at nearly 25,000. It should have been obvious early on, even with only the initial fraction of cases and deaths, that the more draconian public health measures should be implemented first in the six or 16 most affected states. Those measures easily could have been held off in the remaining locations pending actual results.

This data was available in a timely fashion, so resorting to a “one-size-fits-all” approach was inappropriate. But the political pressure grew from those unwilling to accept the “flatten the curve” imperative as being sufficient and who believed that preventing every possible death was worth any price in economic harm. Also, the experts seem to have gotten sidetracked believing the novelty of this virus extended to methods of transmission, which now does not appear to be the case. 

What is novel, in fact, is the long incubation period and the ability of pre-symptomatic and asymptomatic individuals to infect others. In a sense, the experts panicked and advised almost universal draconian measures, while inexplicably overlooking the value of masks to prevent infected individuals from spreading the virus before they knew they had it.

The current fashion is to loudly criticize the people attempting to manage this pandemic. As expected, much of this is based on pure opinion and outright speculation. There is usually an underlying assumption that, had other people been involved from the outset, the decisions and results would have been different and better, even without the benefit of hindsight or the knowledge gained over the course of the pandemic. That, of course, is impossible to know.

Any objective observer (the species seems to be endangered) must acknowledge that the scientific knowledge was rapidly evolving, and the public reporting was contradictory and often wrong. This then stoked a fear factor that had political consequences and strongly influenced policymakers. For all the reasons above, there is no compelling basis to argue that another set of people would have reacted in a significantly different way.

The oversight of not separating statistics about LTCFs from the totals, however, has had enormous economic consequences. Limiting restrictions (at least until actual experience could provide a clearer road map) in places with low general population death rates (not skewed by LTCF mortality) would have made far more sense and caused far less pain. Lack of scientific knowledge is not a viable explanation for this shortcoming. 

Sadly, the press is unlikely even to recognize the underlying nuances of the data and we can expect more of the Times’ kind of shock reporting rather than the careful analysis once embraced by career journalists.

Greatness Agenda

Let America Work Again

It shouldn’t take planes falling out of the sky or a pandemic to convince American policymakers of the need not only to make great things in America but also to afford Americans the opportunity to make them.

It was Monday morning, March 10, 2019, when Ethiopian Airlines Flight 302 lurched away from the gate, rolled to a sprint, and peeled its wheels off the runway for the last time. Aboard were 157 souls, including eight Americans and one veteran doing missionary work while on vacation.

Six minutes after takeoff, Flight 302 plunged back to earth, trailing white smoke across the sky until reaching its terminus near Bishoftu, Ethiopia. All aboard perished when the Boeing 737 Max 8 aircraft screamed into the ground at nearly 700 miles per hour, leaving a massive crater with wreckage driven up to 30 feet deep into the soil.

That tragedy put visa worker programs in the spotlight after a report in Bloomberg revealed Boeing had cut costs by outsourcing 737 Max production to low-paid foreign subcontractors, coders, and software engineers.

Mark Rabin, a former Boeing software engineer, said the company’s decision to outsource “was controversial because it was far less efficient than Boeing engineers just writing the code.” It often was the case, he said, that “it took many rounds going back and forth because the code was not done correctly.” 

Boeing denies outsourcing played any role in the faulty flight-control software that forced the plane into a fatal nosedive. 

Though the flames that consumed Flight 302 have long since flickered out, the economic crisis in the wake of the coronavirus pandemic has reignited the debate about outsourcing and visa worker programs. 

Over the weekend, 30 college student organizations sent a letter to President Donald Trump: “Mr. President, in addition to ending the Bush-Obama [Optional Practical Training] program, we strongly urge you to suspend the H-1B program.” 

These college students are correct to be concerned and they have an interest in paying attention to what the administration does next. These programs now pose an acute threat to their job prospects and the wages of our best and brightest, especially those in science, technology, engineering, and mathematics. As things stand, not only will the class of 2020 enter the worst job market since the Great Depression, it will compete with a government-subsidized labor force of H-1B and OPT workers as well.

Optional Practical Training is a component of the F-1 visa program, which enables foreign nationals to study as full-time students in the United States. As a condition for entry, U.S. Citizenship and Immigration Services asks F-1 students to maintain a residence abroad that they “have no intention of giving up.” In other words, applicants must show their intention to return to their country of origin at the end of their study. But OPT offers a loophole out of this requirement.

F-1 visa holders can apply for 12 months of “temporary” employment through OPT, including those who have graduated and should be on their way home. At the end of that period, foreign nationals in STEM may apply for a 24-month extension.

A tax break of as much as $12,000 incentivizes American businesses to hire OPT workers over equally or even more qualified young Americans. Unlike the H-1B, which requires employers to pay workers a wage that corresponds to the occupation and the region where they will be employed, there are no wage requirements for OPT hires.

Many F-1 visa holders who work the full 36 months with OPT will overstay their visas and slip into illegal status. Not to worry, though, because they can pursue a green card by transferring to a dual intent visa, such as the H-1B.

With a few twists and turns, then, foreign nationals who entered the United States as temporary students join the ranks of H-1B visa workers displacing Americans in STEM occupations. 

As with OPT, there are underhanded incentives not to hire American, such as loopholes in the program’s wage rules that make it easy for employers to underpay H-1B workers compared to Americans. 

The Economic Policy Institute earlier this month found 60 percent of H-1B positions certified by the U.S. Department of Labor are assigned wage levels well below the local median wage for the occupation. This sad fact is an open secret among H-1B employers.

“I know from my experience as a tech CEO that H-1Bs are cheaper than domestic hires,” said Vivek Wadhwa, an advocate for expanding visa worker programs. “Technically, these workers are supposed to be paid a ‘prevailing wage’ but this mechanism is riddled with loopholes.” 

Employers who share Wadhwa’s fondness for outsourcing commonly force Americans to train their lower-paid H-1B replacements as a condition of severance pay. As a result, millions of STEM degree holders in the United States have given up seeking employment in their fields because firms have incentives either not to hire them or not to retain them in favor of cheaper visa workers. This feeds the narrative that America has a severe STEM worker shortage, a claim exaggerated to benefit those who profit from outsourcing.

Critical worker shortages would be reflected in substantially higher pay, but wages in many STEM occupations—such as computer science and information technology—remain stagnant. The Wall Street Journal acknowledged a direct connection between the H-1B program and “lower wages and employment for American tech workers” in 2017. 

Even if there are fewer workers than jobs, the result is a healthy, tight labor market that encourages employers either to make themselves more attractive by offering better wages or invest in labor-saving technologies. The alternative is the perpetuation of programs that provide companies with cheap labor at the cost of American jobs.

In the aftermath of Flight 302, it came to light that Boeing primarily used the H-1B program to outsource. But OPT hires also may have played a role. Rabin, who was involved in software testing for the 737 Max, recalled offices across from Seattle’s Boeing Field, where “recent college graduates employed by the Indian software developer HCL Technologies Ltd. occupied several rows of desks” working on code for Boeing. These “temporary workers” made as little as $9 an hour to develop and test software, compared to the $41 to $46 hourly wages of a regular hire on a 40-hour workweek.

It shouldn’t take planes falling out of the sky or a pandemic to convince American policymakers of the need not only to make great things in America but also to afford Americans the opportunity to make them. President Trump, as students wrote last week, “can make this right by ending the OPT program and suspending the H-1B program.”

Great America

Life Is Risky

Continuing these measures going forward is unconscionable. We are depressed, demoralized, put upon, and now a poorer nation on account of our elites’ panic about the coronavirus.

Perhaps the most unserious response to the coronavirus pandemic has been the facile assertion that lockdowns, the destruction of the economy, and the suppression of our historic freedoms are all justified if they “save just one life.” As Joe Biden put it on Twitter, “I’ve said it before, and I’ll say it again: No one is expendable. No life is worth losing to add one more point to the Dow.”

While every person is unique and has an immortal soul, we do not do anything and everything to save lives from all hazards, nor should we. Adults know that there are no easy solutions to most problems, and real life consists of tradeoffs.  

No One Really Believes “Just One Life” Nonsense

First, we do not save every life because doing so, however noble our intentions, would be incredibly destructive. Protecting life—in other words, paying for safety—takes away from resources that make life worth living. Does everyone in the “just one life” chorus drive a Volvo? And, for that matter, do those who do also wear a crash helmet while driving or riding in this car? Of course not. Most drive a less safe car so they can afford to go on vacation or get a haircut or buy a new iPhone.  

Doing everything possible to save “just one life” would bring life to a halt. Driving a car, for example, is always risky. But we still drive. Similarly, all human interaction has some small chance of spreading a fatal disease, but we still value time with friends and family. Before coronavirus, there was the flu. It kills tens of thousands of Americans every year, especially in vulnerable populations. But no one seriously contemplated a national shutdown to counter that yearly tragedy. 

Second, trying to save lives from particular threats may increase the risks from others. Not every investment in safety has the same return, and a range of risks must be balanced. Even accepting the dubious moral premise that saving “just one life” is always worth any expense, doing so might use up resources that could be more effectively deployed against other risks. 

We tend to overinvest in certain risks because they’re frightful or merely salient. Many people are worried about coronavirus right now, but do they worry much about the 83,000 who die of diabetes and over 40,000 who die of suicide every year? How about the 67,000 dead from drug overdoses? Or the increases in all of these numbers coming from the shutdowns?

Ideally, we would spend money rationally to maximize the saving of lives and improvement of life overall. But in practice this is all haphazard and distorted by cognitive biases of various kinds.

Finally, we all voluntarily risk death in order to pursue other things of value. This goes beyond thrill-seekers who jump out of planes. All of us do this in more prosaic ways all the time. We take risks in getting behind the wheel, getting on an airplane, choosing what to eat, having a cocktail, living in a city, working on the job, or going for a swim. 

We just celebrated Memorial Day. Surely, our military victories were not secured by a monomaniacal concern for “saving just one life.” Victory came first.  

We all have an intuitive sense about these things, and individuals vary in their appetites for risk. But we know that the mere extension of life is not the point of life, and we each risk death in some ways every single day.   

We Must Value Life Finitely In Order to Make Rational Public Policy

When making policies for others, including designing systems or compensating victims of negligence, we must account for the economic value of life more precisely. This is not undertaken because we are callous about the value of life. It is undertaken precisely because we do value life. In order to demonstrate that in legal terms, however, we have to reduce this value to a number, a dollar amount.

Economists, public health experts, and juries have spent a lot of time pondering the question of the value of a life. They have come up with some sensible answers. While no one would forfeit his life for any amount of money—as you can imagine, the marginal value of money declines precipitously at that point—people are willing to be compensated for a 1 in 1,000 or 1 in 10,000 risk to life. 

These risk premiums provide good data on how much economic value people place on their own lives. This is why an Alaskan fisherman or a coal miner is paid more than, say, a cashier or a marketing assistant. A $1,000 premium is about the high end of what workers command for a 1 in 10,000 on-the-job fatality risk, and from such premiums we can extrapolate how people value statistical lives. Thus, various studies have concluded that the economic value of a statistical life is between $5 million and $10 million (simple math: $1,000 multiplied by 10,000 is $10 million).
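The extrapolation is simple enough to check for oneself. Here is a minimal sketch of the arithmetic, using the illustrative $1,000 premium and 1-in-10,000 risk from the text (the figures are the article’s round numbers, not real wage data):

```python
# Back-of-the-envelope value of a statistical life (VSL),
# using the illustrative figures from the text above.
risk_premium = 1_000        # extra pay accepted for the added job risk, in dollars
fatality_risk = 1 / 10_000  # 1-in-10,000 on-the-job chance of death

# If 10,000 workers each accept $1,000 for a 1-in-10,000 risk, one
# statistical death has effectively been "priced" at the total premium paid.
vsl = risk_premium / fatality_risk
print(f"Implied value of a statistical life: ${vsl:,.0f}")
```

A $500 premium for the same risk would yield the $5 million low end of the range cited above.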

Obviously, merely stating this conclusion sounds a bit callous. But without reducing statistical lives to an economic figure, other choices also denominated in numbers—choices about medical treatments, the placement of traffic lights, airbags, or vaccines—cannot be made in a rational way. Doctors and other decision-makers could not know if it’s worthwhile to spend $100 per patient or $1,000 or $1 million to do one more test, one more surgery, or one more multi-trillion-dollar quarantine effort. 

The National Lockdown’s Cost Far Exceeds Its Benefit

Setting aside for a moment that many of those whose deaths were blamed on the coronavirus may have died of something else or may have had few years left to live in any case, and that the lockdowns may not have stopped even a single coronavirus case: what if we assume that these measures cut the fatalities in half? What if, in their absence, 200,000 would have died rather than 100,000? Would they be cost-justified?

In the last few months, at least twenty million Americans have lost their jobs. If we assume they made an average of $50,000 per year and are out of work on average for at least a year—how long this will be is anyone’s guess—this is an economic loss of at least $1 trillion. Added to this cost, government deficit spending to forestall an economic freefall has already been in excess of $2 trillion. The loss in paper wealth in the form of retirement accounts is another $2-4 trillion. A total loss in excess of $5 trillion has been incurred to save lives from the coronavirus.

Even assuming lockdowns worked and prevented 100,000 deaths—an extremely generous set of assumptions—the economic benefit would be about $1 trillion assuming the high-end $10 million statistical life value above. 

In other words, even under the best-case scenario and best-case assumptions, the costs of the government’s actions here are at least 5 times higher than the benefit. If these measures saved 10,000 lives, they are 50 times higher. If no lives were saved, it was all cost and no benefit. 
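The comparison in the last few paragraphs can be laid out explicitly. This sketch uses only the article’s own stated assumptions; none of the figures are real measurements:

```python
# Lockdown cost-benefit arithmetic, per the article's assumptions (all dollars).
lost_wages  = 20_000_000 * 50_000   # 20M jobs lost at an average $50,000/year each
deficit     = 2_000_000_000_000     # emergency deficit spending so far
wealth_loss = 2_000_000_000_000     # low end of the $2-4T paper-wealth hit
total_cost  = lost_wages + deficit + wealth_loss

vsl         = 10_000_000            # high-end value of a statistical life
lives_saved = 100_000               # generous best-case assumption
benefit     = lives_saved * vsl

print(f"Cost:    ${total_cost:,}")  # $5,000,000,000,000
print(f"Benefit: ${benefit:,}")     # $1,000,000,000,000
print(f"Cost is {total_cost / benefit:.0f}x the benefit")
```

Swapping in 10,000 lives saved instead of 100,000 yields the 50x figure; zero lives saved makes the ratio undefined, or as the text puts it, all cost and no benefit.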

When things have a cost, it pays to be hard-headed. Government-mandated social distancing, lockdowns, and other efforts to prevent the spread of the coronavirus only make sense based on fantastic and now demonstrably false assumptions about the coronavirus’ mortality and contagiousness (so-called R0). 

It appears no one in charge ever bothered to do some back-of-the-envelope math on whether these measures were worth it. With the benefit of hindsight, we can now conclude that this has been a man-made disaster, the consequences of which far exceed the benefits.

Continuing these measures going forward is unconscionable. We are depressed, demoralized, put upon, and now a poorer nation on account of our elites’ panic about the coronavirus.  The “data-driven” fanatics turn out to simply be following the whims of the slippery Dr. Fauci. 

He and everyone in charge would benefit from some basic math: five is greater than one. And $5 trillion in economic damage is an absolute disaster equal to the loss of 500,000 lives.

Great America

We Didn’t Close America in 1957

When all this is over, one of the key problems to resolve will be: how to deal with viruses and save lives without doing unacceptably massive damage to our whole society.

I was born in 1957.

That fall, a deadly flu was tearing through America.

It ended up killing some 116,000 Americans—in a country that, with 170 million people, had about half the population of today’s America.

Even using inflated numbers, about 95,000 have died so far from the coronavirus, by comparison.
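A quick per-capita sketch makes the comparison concrete. It uses only the round figures cited here (116,000 flu deaths in a population of 170 million, versus roughly 95,000 COVID-19 deaths in a population about twice that size), not census data:

```python
# Per-capita death rates for the two outbreaks, from the article's figures.
deaths_1957, pop_1957 = 116_000, 170_000_000
deaths_2020, pop_2020 = 95_000, 2 * 170_000_000  # "about half the population of today's America"

rate_1957 = deaths_1957 / pop_1957 * 100_000     # deaths per 100,000 people
rate_2020 = deaths_2020 / pop_2020 * 100_000

print(f"1957-58 Asian flu: {rate_1957:.0f} deaths per 100,000")
print(f"COVID-19 so far:   {rate_2020:.0f} deaths per 100,000")
```

On these numbers, the 1957-58 flu killed at more than twice the per-capita rate of the coronavirus to date, which is the point of the comparison.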

Clark Whelton, now 80, wrote about his experience with the disease in a March 13, 2020 piece in City Journal.

“I spent a week in my college infirmary with a case of the H2N2 virus, known at the time by the politically incorrect name of ‘Asian flu,’” Whelton wrote.

“My fever spiked to 105, and I was sicker than I’d ever been. The infirmary quickly filled with other cases, though some ailing students toughed it out in their dorm rooms with aspirin and orange juice.”

Whelton recalled something else about the disease.

“The college itself did not close, and the surrounding town did not impose restrictions on public gatherings. The day that I was discharged from the infirmary, I played in an intercollegiate soccer game, which drew a big crowd,” he wrote.

The flu hit and killed all sorts of people, including children and pregnant women. COVID-19, by comparison, has most viciously targeted the elderly and people with serious health conditions.

But America in 1957 did not shut down. The National Football League played its games. Elvis Presley kept on wiggling his hips in concerts and TV shows. The economy kept functioning, albeit with a greater number of people than usual calling in sick.

The country seemed far more concerned about the Soviet Union’s launch of the first artificial satellite in space, Sputnik 1, on October 4. Soviet domination of space could have enslaved the world.

As Rhode Island doctor and author Andrew Bostom notes, the federal Centers for Disease Control now say that the infection fatality ratio of COVID-19 is 0.27 percent, almost equal to the 0.26 percent of the 1957-1958 Asian flu.

Why did COVID-19 seem so much more terrifying, initiating lockdowns that have devastated America’s small businesses?

Well, the “science” was initially wrong. Epidemiological models that were used by governments all over the world, including those of the United States, grossly exaggerated the death toll. Politicians insisted we had to shut down to “flatten the curve” of infections, lest the hospitals be overrun with patients. No one could be certain what was happening.

China’s evil behavior also scared people. Its communist government hid from the world important information about the virus, which may have escaped from one of its laboratories outside Wuhan. The brutal regime shut down domestic travel from Wuhan—epicenter of the disease—to protect itself while permitting international travel. Effectively, China seeded the virus around the world, doing enormous economic damage to its geopolitical rivals and sapping the power of free societies.

There may be something else to explain the difference.

Americans were different in 1957-1958. Many had experienced two world wars, the Great Depression, and a Cold War that threatened to plunge the world into a nuclear holocaust. They understood that life inherently carries risks and that no government, however big, can shelter everyone from harm. Indeed, totalitarian governments in the 20th century had killed people by the tens of millions.

In addition, Americans were familiar with far scarier diseases—notably, polio, which, until a vaccine was developed by Jonas Salk in 1955, crippled and killed children in far greater numbers than the flu.

The flu vaccine developed for 1957-1958 largely failed. The Asian flu arose, spiked quickly, and went away.

Unlike the coronavirus lockdowns, it did not leave behind devastated small businesses, massive unemployment, and a shattered economy. Even as the lockdowns ease, some states today are still effectively making it a crime to run certain businesses.

Some said during the Vietnam War that we had to destroy the village in order to save the village. When all this is over, one of the key problems to resolve will be: how to deal with viruses and save lives without doing unacceptably massive damage to our whole society.

Great America

Reflections on Memorial Day 2020

This day permits us to enlarge the individual soldier’s view, to give meaning to the sacrifice that was accepted of some but offered by all, not only to acknowledge and remember the sacrifice, but to validate it.

Today we mark the 152nd anniversary of the first official observation of the holiday we now call Memorial Day, as established by General John A. Logan’s “General Order No. 11” of the Grand Army of the Republic dated May 5, 1868. The order reads, in part: “The 30th day of May 1868 is designated for the purpose of strewing with flowers and otherwise decorating the graves of comrades who died in defense of their country during the late rebellion, and whose bodies lie in almost every city, village and hamlet churchyard in the land.”

Logan’s order ratified a practice that was already widespread, both in the North and the South, in the years immediately following the Civil War. It also reflected ancient practice, for example Pericles’ funeral oration honoring the Athenian dead as recounted by Thucydides.

We should remember the true meaning of this day. Alas, for too many Americans, Memorial Day has come to mean nothing more than another three-day weekend, albeit the one on which the beaches open, signifying the beginning of summer. Unfortunately, the tendency to see the holiday as merely an opportunity to attend a weekend cook-out obscures even the vestiges of what the day was meant to observe: a solemn time, serving both as catharsis for those who fought and survived, and to ensure that those who follow will not forget the sacrifice of those who died that the American Republic and the principles that sustain it, might live. Some examples might help us to understand what this really means.

On July 2nd, 1863, Major General Dan Sickles, commanding III Corps of the Army of the Potomac, held the Union left along Cemetery Ridge south of Gettysburg, Pennsylvania. Dissatisfied with his position, he made an unauthorized movement to higher ground along the Emmitsburg Pike to his front. In so doing, he created a gap between his corps and Major General Winfield Scott Hancock’s II Corps on his right. Before the mistake could be rectified, Sickles’ two under-strength divisions were struck by General James Longstreet’s veteran I Corps of Lee’s Confederate Army of Northern Virginia in an attack that ultimately threatened the entire Union position on Cemetery Ridge.

At the height of the fighting, a fresh Alabama brigade of 1,500 men, pursuing the shattered remnants of Sickles corps, was on the verge of penetrating the Union defenses on Cemetery Ridge. Union commanders including Hancock rushed reinforcements forward to plug the gap, but at a critical juncture, the only available troops were eight companies—262 men—of the 1st Minnesota Volunteers. Pointing to the Alabamans’ battle flags, Hancock shouted to the regiment’s colonel, “Do you see those colors? Take them.”

As the 1st Minnesota’s colonel later related, “Every man realized in an instant what that order meant—death or wounds to us all; the sacrifice of the regiment to gain a few minutes time and save the position, and probably the battlefield—and every man saw and accepted the necessity for the sacrifice.”

The Minnesotans did not capture the colors of the Alabama brigade, but the shock of their attack broke the Confederates’ momentum and bought critical time—at the cost of 215 killed and wounded, including the colonel and all but three of his officers. The position was held, but in short order, the 1st Minnesota ceased to exist, suffering a casualty rate of 82 percent, the highest of the war for any Union regiment in a single engagement.

Memorial Day is also about the sacrifices of other units, for example, the 54th Massachusetts, a regiment of black soldiers whose exploits were portrayed in the movie Glory. The 54th’s assault, in the face of hopeless odds, against Battery Wagner, which dominated the approaches to Charleston Harbor, cost the regiment over half its number and proved beyond the shadow of a doubt that black soldiers were the equal of white soldiers in both bravery and determination. In addition, the 54th Massachusetts is directly linked to Memorial Day: it took part in the first celebration of its predecessor, Decoration Day, on May 1, 1865.

Americans continued to distinguish themselves in World War I at the Second Battle of the Marne and in the Meuse-Argonne, the largest military operation in U.S. history. In World War II, we have Utah Beach and Bastogne, as well as Tarawa, Iwo Jima, and Okinawa. In Korea, the Chosin Reservoir. In all cases, what Admiral Chester Nimitz said of the Marines on Iwo Jima was generally true: “uncommon valor was a common virtue.”

But Memorial Day is also about individuals we may have known. It is about a contemporary of my father, who himself fought and was wounded in the Pacific during World War II. Marine Sgt. John Basilone was awarded the Medal of Honor for his actions on Guadalcanal. Though he was not obligated to do so, he insisted on returning to combat and was killed on the first day of the struggle for Iwo Jima.

Memorial Day is also about Marine Corps Corporal Larry Boyer, a member of the platoon that I led in Vietnam from September 1968 until May 1969. The men of that platoon would all have preferred to be somewhere other than the Republic of Vietnam’s northern Quang Tri Province, but they were doing their duty as it was understood at the time. In those days, men built their lives around their military obligation, and if a war happened on their watch, fighting was part of the obligation.

But Corporal Boyer went far beyond the call of duty. At a time when college enrollment was a sure way to avoid military service and a tour in Vietnam, Corporal Boyer, despite excellent grades, quit, enlisted in the Marines, and volunteered to go to Vietnam as an infantryman. Because of his high aptitude test scores, the Marine Corps sent him to communications-electronics school instead. But Corporal Boyer kept “requesting mast,” insisting that he had joined the Marines to fight in Vietnam. He got his wish, and on May 29, 1969, while serving as one of my squad leaders, he gave the “last full measure of devotion” to his country and comrades.

Of course we have recent additions to the roster of those who fell in America’s wars. When I first delivered this address in 1999, we were still celebrating the “end of history,” the triumph of liberal democracy across the globe. Then, of course, 9/11 destroyed that pretty illusion and in the wars that followed, the best America has to offer again willingly placed themselves on the altar of the nation in Afghanistan and Iraq. From Mesopotamia to the Afghan mountains, the “uncommon valor” of Americans in combat has continued to be “a common virtue.”

What leads men to act as the soldiers of the 1st Minnesota, the 54th Massachusetts, John Basilone, Larry Boyer, and the countless others who have shared their sacrifice? Since the Vietnam War, too many of our countrymen have concluded that those who have died in battle are “victims.” How else are we to understand the Vietnam War Memorial—“The Wall”—a structure that evokes not respect for the honored dead, but pity for those whose names appear on the wall and relief on the part of those who, for whatever reason, did not serve?

While most Americans in general, and veterans in particular, reject this characterization, there is a tendency these days also to reject the polar opposite: that these men died for “a cause.” Many cite the observation of Glen Gray in his book, The Warriors: Reflections on Men in Battle: “Numberless soldiers have died, more or less willingly, not for country or honor or religious faith or for any other abstract good, but because they realized that by fleeing their posts and rescuing themselves, they would expose their companions to greater danger. Such loyalty to the group is the essence of fighting morale.”

It is my own experience that Gray is right about what men think about in the heat of combat: the impact of our actions on our comrades always looms large in our minds. As Oliver Wendell Holmes observed in his Memorial Day address of 1884, “In the great democracy of self-devotion private and general stand side by side.” 

But the tendency of the individual soldier to focus on the particulars of combat makes Memorial Day all the more important, for this day permits us to enlarge the individual soldier’s view, to give meaning to the sacrifice that was accepted of some but offered by all, not only to acknowledge and remember the sacrifice, but to validate it.

In the history of the world, many good soldiers have died bravely and honorably for bad or unjust causes. Americans are fortunate in that we have been given a way of avoiding this situation by linking the sacrifice of our soldiers to the meaning of the nation. At the dedication of the cemetery at Gettysburg four months after the battle, President Abraham Lincoln fleshed out the understanding of what he called in his First Inaugural Address, the “mystic chords of memory, stretching from every battle-field, and patriot grave, to every living heart and hearthstone, all over this broad land…”

Like Pericles’ funeral oration, Lincoln’s Gettysburg Address gives universal meaning to the particular deaths that occurred on that hallowed ground, thus allowing us to understand Memorial Day in the light of the Fourth of July, to comprehend the honorable end of the soldiers in the light of the glorious beginning and purpose of the nation. The deaths of the soldiers at Gettysburg, of those who died during the Civil War as a whole and indeed, of those who have fallen in all the wars of America, are validated by reference to the nation and its founding principles as articulated in the Declaration of Independence.

Some might claim that to emphasize the “mystic chords of memory” linking Memorial Day and Independence Day is to glorify war and especially to trivialize individual loss and the end of youth and joy. For instance, Larry Boyer was an only son. How can the loved ones of a fallen soldier ever recover from such a loss? I corresponded with Cpl. Boyer’s mother for some time after his death. Her inconsolable pain and grief put me in mind of Rudyard Kipling’s poem, Epitaphs of the War, verse IV, “An Only Son:” “I have slain none but my mother, She (Blessing her slayer) died of grief for me.” Kipling, too, lost his only son in World War I.

But as Holmes said in 1884, 

grief is not the end of all. I seem to hear the funeral march become a paean. I see beyond the forest the moving banners of a hidden column. Our dead brothers still live for us, and bid us think of life, not death—of life to which in their youth they lent the passion and joy of the spring. As I listen, the great chorus of life and joy begins again, and amid the awful orchestra of seen and unseen powers and destinies of good and evil our trumpets sound once more a note of daring, hope and will.

Linking Memorial Day and Independence Day in the way Lincoln did enables us to recognize that while some of those who died in America’s wars were not as brave as others and indeed, some were not brave at all, each and every one was far more a hero than a victim. And it also allows us forever to apply Lincoln’s encomium not only to the dead of the 1st Minnesota and the rest who died on the ground at Gettysburg that Lincoln came to consecrate, or even to John Basilone, Larry Boyer, and the countless soldiers, sailors, airmen, and Marines who have died in all of America’s wars, but also to the living, that a nation dedicated to the principles of liberty and equality might “not perish from the earth.”


The Black Swan of 2020

When a majority of Americans finally realizes what was really happening in 2016 and 2017, do you really think they’ll give Joe Biden power again?

Most of us are familiar in some way with the term “black swan,” an unexpected event with major implications, and one which, in hindsight, often makes total sense. We should probably be talking more about the potential black swan of the 2020 elections, and I don’t mean the coronavirus.

I mean the reaction of the American people watching this summer as more evidence becomes public of senior Obama Administration officials conspiring and using the powerful tools of federal law enforcement and the surveillance state to spy on their successor, President Trump, and on members of his campaign and administration.

It is entirely possible that some of these former Obama administration officials and FBI agents will be charged with crimes. And it is likely that through these disclosures of political dirty tricks—exceeding even Nixon’s Watergate—the electorate will realize that the Democratic nominee for president is eyeball-deep in this gross abuse of power. 

Over the last few weeks, more and more evidence has been released—mostly by the Office of the Director of National Intelligence—revealing Joe Biden’s involvement in these activities. Biden took part in the West Wing meetings concerning President Trump’s national security advisor, Michael Flynn; in the FBI plot to get Flynn at any cost, despite the absence of any evidence of criminal activity; and in the illegal leaking of classified information about Flynn to sustain a falsely predicated investigation. When confronted with the facts, Biden lied about all of it. More information is very likely coming regarding Biden, and others, and their roles in #Obamagate.

In the next few weeks, it would not surprise me to see news reports of people being brought before the grand jury that U.S. Attorney John Durham has convened. Nor will I be surprised when legal counsel for those witnesses leak to the press that their clients are cooperating with Durham. The question is: who is already cooperating? Take, for instance, FBI Agent Joseph Pientka, who apparently falsified FBI 302s—the Bureau’s standard post-interview reports—and withheld exculpatory evidence. It’s not really a question of whether he is going to jail. That’s pretty much a given. The question is: for how long? The answer depends on what evidence he offers Durham about who was actually involved and who directed him to violate the law.

I’ve never been a big fan of speculating absent the facts, but what gives me some confidence that I’m barking up the right tree is knowing that John Durham has documents to challenge witness assertions and “refresh memories.” We can bet he has cooperating witnesses (last fall, I told One America News that FBI general counsel James Baker likely is one of them). You know that the Pientkas and Bakers of the world aren’t going to take the fall all by themselves; heck, for that matter, throw in Lisa Page, whom I suspect is being far more cooperative than people suggest.

That’s all to say I’m pretty sure someone higher up the food chain at the FBI is going to be in Durham’s crosshairs; it’s only a question of who and how many of them. If the evidence supports bringing charges, Durham will hammer them. Durham, unlike the majority of his federal prosecutor colleagues, has a history of holding accountable federal law enforcement agents and lawyers who break the law. He has a record of protecting the civil rights of witnesses and defendants to ensure that his prosecutions never suffer the ignominy that was just handed out to the agents and prosecutors who went after Michael Flynn. 

Remember, Durham can bring charges himself. People should operate on the assumption that Bill Barr and John Durham must get this done by November 2020 to be safe. If things are going to be set right at the Justice Department, the FBI, and the Intelligence Community, they have a firm but narrow window of time in which to achieve it, just in case Trump is not reelected.

That’s why I’m convinced we’ll be seeing more and more breaking news regarding Obamagate and abuse of power. If I’m right, we may even see a late June/early July press conference with Barr and Durham in which they announce the first charges against those former Obama-era officials who were involved in the coup against Donald Trump. 

On the political front, the mainstream media’s attempts to tamp all of this down aren’t working. Nonpolitical people are already talking about the Flynn case in Florida and California; the story has broken through even the COVID-19 coverage into the greater national discussion. Imagine this summer, when even more Americans understand what the Obama Administration attempted. Then they’re going to judge Obama’s No. 2, his self-styled wingman, Joe Biden, and wonder what else the senile old man in the Delaware basement knew.

I always tell people that a deep sense of fairness permeates Americans and is one of the hallmarks of our culture of liberty. Another hallmark of the American people is their common sense. A friend of mine always cautions those in the media and the political class never to underestimate the collective wisdom of the American people. When a majority of Americans finally realizes what was going down in 2016 and 2017, how illegal it was, and what a gross abuse of power it was, their sense of fairness, common sense, and wisdom will come into play.

After all that, do you really think they’re going to give Joe Biden power again? I don’t think so.