
The Lessons of the Versailles Treaty

The Treaty of Versailles was signed in Versailles, France, on June 28, 1919. Neither the winners nor the losers of World War I were happy with the formal conclusion to the bloodbath.

The traditional criticism of the treaty is that the victorious French and British democracies did not listen to the pleas of leniency from progressive American President Woodrow Wilson. Instead, they added insult to the German injury by blaming Germany for starting the war. The final treaty demanded German reparations for war losses. It also forced Germany to cede territory to its victorious neighbors.

The harsh terms of the treaty purportedly embittered and impoverished the Germans. The indignation over Versailles supposedly explained why Germany eventually voted into power the firebrand Nazi Adolf Hitler, sowing the seeds of World War II.

But a century later, how true is the traditional explanation of the Versailles Treaty?

In comparison to other treaties of the times, the Versailles accord was actually mild—especially by past German standards.

After the 1870-1871 Franco-Prussian war, a newly unified and victorious Germany occupied France, forced the French to pay reparations and annexed the rich Alsace-Lorraine borderlands.

Berlin’s harsh 1914 plans for Western Europe at the onset of World War I—the so-called Septemberprogramm—called for the annexation of the northern French coast. The Germans planned to absorb all of Belgium and demand payment of billions of marks to pay off the entire German war debt.

In 1918, just months before the end of the war, Germany imposed on a defeated Russia a draconian settlement. The Germans seized 50 times more territory and 10 times more population from Russia than Germany itself would later lose at Versailles.

So, under the terms of the Versailles Treaty, the winning democracies were far more lenient with Germany than Germany itself had been with most of its defeated enemies.

No one denied that Germany had started the war by invading Belgium and France. Germany never met the Versailles requirement of paying fully for the damage it caused in France and Belgium. It either defaulted on payments or inflated its currency, handing over reparations in increasingly worthless marks.

Versailles certainly failed to keep the peace. Yet the problem was not that the treaty was too harsh, but that it was flawed from the start and never adequately enforced.

The Versailles Treaty was signed months after the armistice of November 1918, rather than after an utter collapse of the German Imperial Army. The exhausted Allies made the mistake of not demanding the unconditional surrender of the defeated German aggressor.

That error created the later German myth that its spent army was never really vanquished, but had merely given up the offensive in enemy territory. Exhausted German soldiers abroad were supposedly “stabbed in the back” by Jews, Communists, and traitors to the rear.

The Allied victors combined the worst of both worlds. They had humiliated a defeated enemy with mostly empty condemnations while failing to enforce measures that would have prevented the rise of another aggressive Germany.

England, France, and America had not been willing to occupy Germany and Austria to enforce the demands of Versailles. Worse, by the time the victors and the defeated met in Versailles, thousands of Allied troops had already demobilized and returned home.

The result was that Versailles did not ensure the end of “the war to end all wars.”

As the embittered Marshal Ferdinand Foch of France, supreme commander of the Allied forces, presciently concluded of the Versailles settlement: “This is not peace. It is an armistice for 20 years.”

Foch was right.

Twenty years after the 1919 settlement, the German army invaded Poland to start World War II, which would cost the world roughly four times as many lives as World War I.

Mindful of the Treaty of Versailles, the victorious Allies of 1945 did not repeat the mistakes of 1919. They demanded an unconditional surrender from the defeated Nazi regime.

The Western Allies then occupied, divided and imposed democracy upon Germany. Troops stayed, helped to rebuild the country and then made it an ally.

In terms of harshness, the Yalta and Potsdam accords of 1945 were far tougher on the Germans than Versailles—and far more successful in keeping the peace.

A century after its signing this summer, the failure of Versailles remains a tragic lesson about the eternal rules of war and human nature itself.

(C) 2019 TRIBUNE CONTENT AGENCY, LLC.

Photo credit: SSPL/Getty Images


If We Could Put a Man on the Moon . . .

It’s been 50 years since we first landed on the moon. It’s been 46 years, seven months, and four days since we last departed from there.

When President Kennedy first announced the goal of landing on the moon, it was a literal “moon shot.” The announcement came less than three weeks after Alan Shepard became the first American to reach space during a 15-minute suborbital flight—we had yet to even put a man in orbit.

President Kennedy’s goal would require NASA to learn to put a manned spacecraft into Earth orbit and return it safely, conduct rendezvous and spacewalks, perform trans-lunar injections, achieve lunar soft-landings, and bring vehicles back from the moon.

It would require the development of rockets bigger than any built before, sophisticated suits to protect astronauts from the harshness of space and sustain their lives, and innovative technology and software to control precisely the complex and exacting navigational requirements.

The attempt was literally unprecedented.

But we did it.

Despite the massive technical and scientific challenges, it took America just eight years, one month, and 26 days to fulfill Kennedy’s promise. NASA’s budget for the duration was just $3 billion shy of the $40 billion that Kennedy had called on the country to pledge for the moon shot.

Imagine that—a government agency completing a project under budget and ahead of schedule.

The program was not without opposition. Two years after Kennedy’s announcement, former President Eisenhower stated, “anybody who would spend $40 billion in a race to the moon for national prestige is nuts.” In early 1969, mere months before the historic landing, a poll found that only 39 percent of Americans were in favor of the Apollo project. Among the reasons that 49 percent opposed the program: “God never intended us to go to space.”

The Apollo program and its supporters plowed straight through the opposition. They had a goal and they’d be damned if a little bit of negative opinion would stand in their way. It was something that America had to do and something that America would be proud of doing.

And they ended up being right. By now most, if not all, opposition has faded with the years. Even the most hardened cosmopolitan globalists, anarchistic libertarians, and identitarian separatists must get a small kick of nationalistic pride when they remember that we are the only country to ever land people on the moon.

But that pride has become inextricably mixed with nostalgia. It is no longer pride for what America is. It is pride for what it was.

We don’t do real moon shots anymore.

What Happened to Us?
President Obama’s Cancer Moonshot received $1.8 billion to be spent over seven years—just under 0.01 percent of the yearly federal budget, compared to the nearly 5 percent we spent yearly on NASA during the height of the Apollo program.

We aren’t even willing to make substantial outlays of time, effort, or money to deal with the substantial, concrete issues we have.

When $8.6 billion—just under 0.2 percent of the federal budget—is too high a price to pay to secure our borders and some start arguing that we should just give up because illegal immigrants will enter the country anyway, we know that we’ve lost our resolve.

But this downward trend has been with us for a while, in spite of temporary reversals.

President Carter was not entirely wrong when, in 1979, he said that America was facing a “crisis of confidence.” And he was not wrong to point to the moon landing, then just 10 years past, as a symbol of America’s strength. Nor was he wrong when he called America’s people, values, and confidence the country’s greatest resources, saying that we would have to renew all three lest we fall to “fragmentation and self-interest” and turn to “worship” of “self-indulgence and consumption.”

Unfortunately, Carter had the charisma of a damp mop cloth and inspired about as much confidence as Lehman Brothers in late 2008.

But the malaise that the nation felt in the late 1970s was tame compared to what was to come. The United States, much like the Apollo program, may have been a victim of its own success.

After the emotional fervor surrounding Apollo 11, the subsequent missions drew far less interest—for many, the moon (if not space) was conquered and all that was left were the technical details that were best left to the scientists. The nation lost interest.

Similarly, many believed after the end of the Cold War that the major ideological struggles of human history had been resolved and anything left was best left to the technocrats. History effectively was over. And with no more ideological battles to wage, Americans felt increasingly entitled to kick their legs up and indulge in the material signs of our prosperity. Having won the fight, why would we risk the spoils on anything as intangible as an abstract goal?

As Francis Fukuyama argued, the “struggle for recognition, the willingness to risk one’s life for a purely abstract goal, the worldwide ideological struggle that called forth daring, courage, imagination, and idealism” was being replaced by “economic calculation, the endless solving of technical problems, environmental concerns, and the satisfaction of sophisticated consumer demands.”

He even mused that the “very prospect of centuries of boredom at the end of history will serve to get history started once again.”

But it was never clear that history actually ended.

The Crisis of Our Elites
It is far more plausible that America’s so-called elite—enraptured by the cornucopia of cheap money to be gained from globalization—wanted to rationalize their complete abandonment of their neighbors and fellow citizens. They bought indulgences for their residual guilt by throwing money at supposedly oppressed groups and prostrating themselves on the altar of wokeness.

But now, the body politic is waking up, shaking off the false consciousness of political correctness and leftist cultural hegemony, and starting to see the reality of our current predicament. And America’s so-called elite, which once believed that they lead the country, are learning that they were the tail wagging the dog and that populism can be a bitch.

The sweet lullaby of the end of history and the fairytale of the “developed nation” have given way to the harsh reality that the United States has been falling behind economically, culturally, technologically, and spiritually and that our current national security and our future freedom hang in the balance.

Everyone knows that we are in tremendous debt. Last year alone, we paid more than $500 billion just to cover the interest on the debt. But this—no matter what the libertarians and deficit hawks tell you—is not fatal by itself. There are far more important threats we face that few in the media want to stress. Perhaps sustained attention to these issues would raise the obvious question of why we have done so little to ameliorate them and why the experts have been loath even to acknowledge their existence.

We face a formidable threat in China—a country we have systematically underestimated and treated the way a parent might treat a petulant teenager. A slap on the wrist will not stop their systematic theft of intellectual property, manipulation of currency dynamics, and exploitation of our trade policies.

But until President Trump’s election, economists and technocrats, enthralled by the prospect of “the endless solving of technical problems” of bureaucratic trade negotiation and the perpetual paychecks such tasks could produce, did not dare rock the boat lest the price of some crappy and lead-ridden toy from China jump 20 cents. The risk and the paperwork weren’t worth it.

Where Do Their Loyalties Lie?
We face uniquely powerful and fundamentally un-American tech companies that are intent on silencing opinions with which they do not agree. Companies that have worked with foreign governments to create censored search engines, only stopping after intense public scrutiny in the United States. And companies that have faced increasing scrutiny for sharing sensitive technology for potential military applications with foreign entities.

These companies have smartly paid off most of the main institutions in Washington, D.C. and have hidden behind a wall of insufferable libertarians and free-marketers who would rather see conservatives trashed and censored by the big tech companies than cross the sacrosanct principle of the invisible hand—these are the same people who seem to forget that “Ma Bell” was broken up by the Reagan Administration.

We face a broken education system with rising tuition and diminishing value. Americans have taken out over $1.5 trillion in student loan debt, with the federal government owning nearly 92 percent of it. The government seems intent on repeating the same mistakes it made with Fannie Mae and Freddie Mac ahead of the Great Recession. At least those loans had some underlying collateral that could be recovered. Good luck monetizing that 20-something’s bisexual Native-American pottery degree.

Even our most elite colleges are breaking. At Yale, I saw some of the greatest minds of my generation destroyed by madness, spending weeks arguing over whether Latin@ or Latinx was the less-gendered term to replace the masculonormative Latino. The classics are casually tossed aside with scorn—what could we possibly learn from Aristotle? He didn’t even have Snapchat!

The elite educational institutions are cynically cashing in on their brands and churning out many mediocre students who cannot reason themselves out of a paper bag and are far more interested in following the beaten moneyed path than having an independent thought.

How Do We Dig Out of This Hole?
We have recently seen progress on all three of these challenges. Trump has held a firm negotiating stance with China, in spite of all of the hand-wringing and lamentations in the press. Republicans in Congress have started appreciating the threat of the large tech companies and have begun to grapple with their Heritage Foundation and Cato Institute talking points to find some way they can address the actual problems in front of them while still getting money from the Kochs. And we’ve seen increased skepticism of higher education and some substantive attempts at reform.

Good. Let’s keep fighting.

But even these problems pale in comparison with the fundamental upheaval our entire world has seen over the past century. An upheaval that few are willing to acknowledge.

Technology, increasing social volatility, and an Enlightenment-inspired skepticism of tradition and the past are changing how humans live at a pace we haven’t seen before. And though we have been debating the increasing pace of modern life for a long time, there’s no doubt that information technology and our immersive devices have produced a quickening. It remains to be determined whether the benefits outweigh the costs—and this determination largely falls to us.

Our Societal Challenge
The rise of birth control, the sexual revolution, and the various waves of feminism have all fundamentally shaken society. Women have played an increasingly prominent role in the professional sphere and a diminished role in the domestic sphere. The dominant culture pressures young women to have high-performing careers and shames stay-at-home mothers. In spite of a slight reversal in recent years, the share of stay-at-home mothers has fallen dramatically over the past 50 years.

Mobile phones and other interactive devices with screens and access to the internet, all fairly recent inventions, are now ubiquitous. Social media, less than 20 years old, has become an important fixture in our day-to-day social interactions. The average American adult now spends more than 11 hours per day engaging with media on their screens. Many parents now give screens to young children to keep them entertained and to help with education. This is a profound change in the way that people interact with each other.

All sorts of behaviors and orientations, once stigmatized, have been normalized and gained widespread prominence in popular culture. Homosexuality and transgenderism are now widely supported by most in the mainstream. Dissenting voices that criticize the normalization of such orientations are typically punished harshly and socially ostracized. Recreational drug use and frequent premarital sex have become commonplace and are regularly depicted in media with many technological tools facilitating both.

These are not necessarily entirely bad things. Nor are they unquestionably good things. But our inability to speak openly about the changes, or to have a free exchange of opinions about them, is, undoubtedly, an evil. Such profound changes are certain to have positive and negative side effects—if we are only allowed to speak of the positives, the negative effects will fester and metastasize.

Our decades-long inability to have open conversations and debates about these trends is in part a byproduct of the belief of many that we have reached the end of history and liberalism has won the ideological fight.

Of course, many in the mainstream orthodoxy have claimed liberalism for themselves and have constructed highly convenient definitions for the ideology. Nevertheless, if the ideology won and any further fighting is merely between those states and individuals “still in history” and those already at the end of history, what self-respecting pseudointellectual wouldn’t want to stand squarely at the end of history?

A Shallow End to History?
And so, eager not to be left behind, the “woke” among us accept whatever manifestation of liberalism is fed to them by the academics in their ivory towers and view any dissent with scorn. “Educate yourself,” they sneer as they clutch their copies of The Atlantic, The Nation, and The New York Times—the scriptures of their expert oracles in the Church of Wokeness.

Not content with deconstructing the present and distracting us with petty stupidity that pushes us ever closer to another civil war, they have started deconstructing the past, most recently telling us why the Apollo program was sexist and misguided.

All of this is rich, coming from people who have never landed themselves on the moon and would likely have to call AAA to change a flat.

“We choose to go to the moon in this decade, and do the other things, not because they are easy, but because they are hard. Because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win, and the others too.”

This language would be considered racist, ableist, chauvinistic, and imperialistic by many in the same party that nominated its author, John F. Kennedy, nearly 60 years ago.

During the 2016 election, many people asked when America had been great. They pointed to the countless sins of the past and smeared our entire history from top to bottom. But history is not that simple. It is not a simple fable of good versus evil with wooden two-dimensional characters.

America was great when it helped win World War II. It was great when it landed a man on the moon. It was great when it built the Interstate Highway System. America was great when it had confidence in itself and didn’t spend its time mired in remorseful, brooding nostalgia, cataloging all of its wrongs and agonizing over missed opportunities.

America needs hustle. It needs spunk. It needs another goal to tackle. And it needs the heart to want to win.

Content created by the Center for American Greatness, Inc. is available without charge to any eligible news publisher that can provide a significant audience. For licensing opportunities for our original content, please contact licensing@centerforamericangreatness.com.

Photo Credit: Heritage Space/Heritage Images/Getty Images


When ‘The Right Stuff’ Goes Wrong

In 1962, President John F. Kennedy committed the United States to putting an American on the moon and returning him safely to Earth by the end of the decade.

Just to be clear, nothing like this had ever been attempted. Americans, though, were uniquely suited to the task, Kennedy said: “We choose to go to the moon . . . and do the other things, not because they are easy, but because they are hard.”

In Kennedy’s day, it was understood the American he had in mind for this dangerous mission would be a man (or, as it turned out, men) expected to perform at the highest level.

In his speech Kennedy emphasized “hard,” as in something requiring great effort, now a word heard mostly in male-enhancement commercials. Sadly, the bedroom may be the one place these days where men’s performance gets any kind of public mention—and that’s to sell pharmaceuticals. 

American men live in a very different country from the one Apollo 11 came from on July 20, 1969. That was the day when Neil Armstrong and Buzz Aldrin walked on the moon while Michael Collins circled above in the lunar command module, hoping to take the three of them home. 

How different? Read The Right Stuff by Tom Wolfe. The book is a celebration of old-fashioned manhood in all its rocket-powered glory. In other words, most of the astronauts Wolfe wrote about would never pass a human resources screening.

Their qualifications were tested, but not by having them fill out forms. These guys, many of them former combat pilots, had already proven they had balls. That’s part of what Wolfe meant by the book title. 

In the ultrasensitive work environment of today, the mere mention of similar male attributes would get you fired. On the flight of Apollo 11, they were among the things that mattered most.

Which explains why on this historic occasion there won’t be any mainstream media salutes to “the right stuff,” as Wolfe conceived it. Putting aside the Playboy lifestyle enjoyed by some astronauts, the idea that three white men, relying solely on know-how and pre-toxic masculinity, got from the earth to the moon and back might alarm certain people. Then there are “the optics.” In addition to being all male and active duty or ex-military, the Apollo 11 crew was not ethnically diverse, culturally inclusive, or gender fluid. 

Years later, there were rumors in some parts of the world that Armstrong had converted to Islam while taking his famous moonwalk. All officially denied by the U.S. State Department in 1983.

Speaking of spiritual matters, the National Aeronautics and Space Administration (NASA), which manages the U.S. space program, used to allow crews to mix science and religion. During the 1968 Christmas Eve flight of Apollo 8, astronauts broadcast the first live television pictures of an earthrise as they read a passage from Genesis. Atheists sued.

Armstrong and Aldrin have never been rebuked like Christopher Columbus and other well-known explorers, largely because they never encountered and/or enslaved any indigenous peoples on their 21-hour moon visit. But they did leave an American flag behind. And that was a problem.

“First Man,” the 2018 Neil Armstrong biopic, received generally favorable reviews, except from some conservatives, who complained the film omitted an important patriotic element by not depicting the planting of a U.S. flag on the lunar surface. 

Producers were probably concerned about the effect on ticket sales in countries that hate America, or maybe ticket sales to moviegoers in this country who hate America.

Just as Nike was concerned recently when it halted the sale of a new shoe decorated with a miniature version of the original U.S. flag. (On the advice of a washed-up football player who’s made a new career for himself trashing the country’s most cherished symbols.)

Given how much has changed, it’s not hard to imagine what America’s first mission to the moon would be like if it happened today. 

The spacecraft would have to be bigger to accommodate a larger, more diverse crew, including at least one unskilled illegal immigrant. 

Leading the mission would be a commander of color, with crew members chosen by NASA and a select panel of race, ethnicity, and gender identity consultants. 

In-flight meals would feature dehydrated multicultural entrees and a special vegan menu. Tang would also be served. 

The landing would be televised and show the mission commander climbing down a ladder to set foot on the moon, followed by the non-binary co-commander who would read the following statement:

“That’s one small step for they. One giant leap for them.”

Then, as the phone rang, signaling a call from the White House, xe would say, “If that’s Donald Trump, we’re not answering.” 

It makes you glad the real thing happened 50 years ago.

Photo credit: Corbis via Getty Images


The War Over America’s Past Is Really About Its Future

The summer season has ripped off the thin scab that covered an American wound, revealing a festering disagreement about the nature and origins of the United States.

The San Francisco Board of Education recently voted to paint over, and thus destroy, a 1,600-square-foot mural of George Washington’s life in San Francisco’s George Washington High School.

Victor Arnautoff, a communist Russian-American artist and Stanford University art professor, had painted “Life of Washington” in 1936, commissioned by the New Deal’s Works Progress Administration. A community task force appointed by the school district had recommended that the board address student and parent objections to the 83-year-old mural, which some viewed as racist for its depiction of black slaves and Native Americans.

Nike pitchman and former NFL quarterback Colin Kaepernick recently objected to the company’s release of a special Fourth of July sneaker emblazoned with a 13-star Betsy Ross flag. The terrified Nike immediately pulled the shoe off the market.

The New York Times opinion team issued a Fourth of July video about “the myth of America as the greatest nation on earth.” The Times’ journalists conceded that the United States is “just OK.”

During a recent speech to students at a Minnesota high school, Rep. Ilhan Omar (D-Minn.) offered a scathing appraisal of her adopted country, which she depicted as a disappointment whose racism and inequality did not meet her expectations as an idealistic refugee. Omar’s family had fled war-torn Somalia and spent four years in a Kenyan refugee camp before reaching Minnesota, where Omar received a subsidized education and ended up a congresswoman.

The U.S. Women’s National Soccer Team won the World Cup earlier this month. Team stalwart Megan Rapinoe refused to put her hand over her heart during the playing of the national anthem, boasted that she would never visit the “f—ing White House” and, with others, nonchalantly let the American flag fall to the ground during the victory celebration.

The city council in St. Louis Park, a suburb of Minneapolis, voted to stop reciting the Pledge of Allegiance before its meetings on the rationale that it wished not to offend a “diverse community.”

The list of these public pushbacks at traditional American patriotic customs and rituals could be multiplied. They follow the recent frequent toppling of statues of 19th-century American figures, many of them from the South, and the renaming of streets and buildings to blot out mention of famous men and women from the past now deemed illiberal enemies of the people.

Such theater is the street version of what candidates in the Democratic presidential primary have been saying for months. They want to disband border enforcement, issue blanket amnesties, demand reparations for descendants of slaves, issue formal apologies to groups perceived to be the subjects of discrimination, and rail against American unfairness, inequality, and a racist and sexist past.

In their radical progressive view—shared by billionaires from Silicon Valley, recent immigrants and the new Democratic Party—America was flawed, perhaps fatally, at its origins. Things have not gotten much better in the country’s subsequent 243 years, nor will they get any better—at least not until America as we know it is dismantled and replaced by a new nation predicated on race, class and gender identity-politics agendas.

In this view, an “OK” America is no better than other countries. As Barack Obama once bluntly put it, America is only exceptional in relative terms, given that citizens of Greece and the United Kingdom believe their own countries are just as exceptional. In other words, there is no absolute standard to judge a nation’s excellence.

About half the country disagrees. It insists that America’s sins, past and present, are those of mankind. But only in America were human failings constantly critiqued and addressed.

America does not have to be perfect to be good. As the world’s wealthiest democracy, it certainly has given people from all over the world greater security and affluence than any other nation in history—with the largest economy, largest military, greatest energy production and most top-ranked universities in the world.

America alone kept the postwar peace and still preserves free and safe global communications, travel and commerce.

The traditionalists see American history as a unique effort to overcome human weakness, bias and sin. That effort is unmatched by other cultures and nations, and explains why millions of foreign nationals swarm into the United States, both legally and illegally.

These arguments over our past are really over the present—and especially the future.

If progressives and socialists can at last convince the American public that their country was always hopelessly flawed, they can gain power to remake it based on their own interests. These elites see Americans not as unique individuals but as race, class and gender collectives, with shared grievances from the past that must be paid out in the present and the future.

We’ve seen something like this fight before, in 1861—and it didn’t end well.

Photo Credit: Getty Images

(C) 2019 TRIBUNE CONTENT AGENCY, LLC.


Music in the 80s Sucked

In an excellent essay making the case that pop music was at its zenith in 1984, Julie Kelly writes that the era represented a patriotic swoon. She might also have mentioned that the same year, Lee Greenwood released “God Bless the U.S.A.,” the most patriotic song of the year, though it was not, strictly speaking, a pop song.

The year 1984 may have been the high point of pop and rock, but that is not saying much. The entire decade is more notable for the musical malaise it created. As a music director for radio stations during that decade, I should know. It was during the ’80s that radio stations began to tighten their playlists, all to the happy applause of corporate music execs. The rapid creativity of 1970s radio stations died, to be replaced by preplanned and survey-tested radio formats. The most significant of these formats came from the radio syndication company Drake-Chenault, which supplied stations with the songs they would play.

No longer were program and music directors left to their own knowledge and gut as to what made a hit. They deferred to the “experts.” It was a disaster. The same songs were played and replayed to the point of monotony. Music, and then radio, began to lose their audiences, and the music created for just this purpose suffered. In a significant way, it all began to sound the same.

The 1980s represented the creeping destruction of musical creativity. The few shining moments in this decade were achieved by those acts allowed by their corporate producers to test the boundaries of acceptable on-air material—Michael Jackson’s “Thriller” falls into this category. 

Most of the music in the 1980s, however, to put it colloquially, sucked.

It is remarkable that the music execs and radio gods decided to clamp down on creativity at the moment they did. It was only 10 years earlier, in 1974, that a small band signed with about as independent a record label (London) as one could get at the time and packed a stadium in Austin, Texas, with 80,000 of their closest friends. Try doing that without major-label backing. ZZ Top did it, though, and they were immensely popular even before their hit song “Tush” and their signing with Warner Brothers. But in the 1970s, as now, the market craved something original, even if it was not audience-tested and approved. It worked.

When the record labels merged and clamped down on musical talent, they froze out the bands that would have carried their creative market into the next decade. Those who wanted to remain a signed act were forced into the company playlist with company producers and company songwriters. Many bands before the explosion of the internet and independent labels were sadly never to find the broad fame and marketability they deserved, because music executives really did not have the expertise they thought they had.

Case in point was a West Coast band called the Crazy 8s. They packed whatever venue they played in the 1980s. When I was a music director at a radio station, I pleaded with many label reps to sign the band. Every time, they told me, “We’d love to, but we do not know how to categorize them.” The Crazy 8s never were signed to a major label, but they inspired their standing-room-only crowds to go wild simply because they were not a cookie-cutter band and they offered a unique sound that resonated. They also had the added benefit of being a talented act. At one concert, I remember, the college-age crowd nearly destroyed the venue upon hearing the immensely popular “Johnny Q”—a rip on mainstream media before it was cool.

The pressure the industry put on artists in the ’80s led to the musical explosion we are now witnessing. Suffocated by the music industry’s grip on what was acceptable, bands started to strike out on their own. The best songs of the ’80s were not created in that decade, but long after. As one astute student told me one day, “Interpol is the ’80s done right.” To that you can add Bloc Party.

It was not just pop that stank in the nostrils of the musicians and smart disc jockeys of the day. Country also suffered from the same stagnation. The slow-rolling creation of an entirely new genre (alt-country) that came out of the Byrds (via Gram Parsons and Chris Hillman) would not reach its breakout moment until the 1990s. This is a legacy even the Beatles do not have. The Byrds were the most influential band in American music for what they unleashed and created, but it took time because of resistance from the major labels that wanted to kill music not created in their hot-house, market-tested image. The tight grip of elite music producers and writers caused Robbie Fulks to pen this irreverent tune to corporate execs. But he was not the only one who did so.

The indie and alt-country movements were born out of the putrefaction of a decade. When people like Jack White of the White Stripes lent their support not only to recreating the minimalist sound but also to independent record companies that put the power of music back in the hands of the creators and the fans, new radio stations under this new influence began to fill the void, and thousands of fans left the preplanned and predictable sounds of the major labels.

The 1980s stunted music’s growth. That is why Greta Van Fleet is so popular, and sounds like a band we’ve heard before. They are experimenting with a sound from an age the record companies killed off. Indie saved rock-n-roll and saved us from the 1980s and ’90s. We are all better off for it.

Photo Credit: Paul Natkin/Getty Images


Up Against the Wall with Washington

Why didn’t Sir Isaac Newton, the famed English mathematician, physicist, and astronomer, invent the automobile? 

Newton was a clever guy. He came up with calculus, and all it took was an apple dropping onto his noggin to give him his epiphany about gravity. He plotted the movement of planets and comets on paper using a quill pen, not even a ballpoint. Why couldn’t he think up an internal combustion engine and have mankind tooling around in snazzy convertibles by, say, 1673? He lived until 1727, so he would have had time to invent seat belts, air bags and, just for fun, tail fins. 

But, he didn’t. He was absolutely ignorant about automobiles. There are guys who flunked high school auto shop who know more about cars than he ever did. So, should we expunge all mention of his genius from our history books?

No. Newton was doing the best he could with what he had. And we accept that. A lot of other stuff had to be invented before anyone could go cruising Main Street on a Saturday night.

Everyone seems fine with scientific progress taking incremental steps over time. We may be amused by the brick-like cell phones that the early adopters hauled out with a flourish to envious eyes just a few decades ago, but we aren’t angry with their creators. Not even a Betamax or an Edsel offends our sensibilities. We are aware that technology can take wrong turns. 

We also accept that technical progress can be slow. In the early 20th century, Buck Rogers-type science fiction suggested that we’d soon have robots to walk Fido and atomic rockets would zip us to Mars for family holidays, but Fido still needs a human to go walkies on frosty December mornings and there aren’t any Martian hotels boasting rooms with scenic views of that planet’s ruddy mountains. 

We may be disappointed, but we don’t get mad. We understand that improving the world can be difficult and slow. 

Unfortunately, some don’t extend this understanding to more human forms of progress. They expect the world instantly to transform into the utopia of their desires and declare anyone who isn’t up to date on the latest iteration of the scheme an evil monster. This attitude is a particular characteristic of today’s progressives, who wield political correctness like a flaming sword. Not satisfied with hacking away at living opponents, they seek to destroy enemies they find in history who failed to change the world centuries ago in a way that meets their approval today. 

That’s what is happening at San Francisco’s George Washington High School. Recently, the San Francisco Board of Education voted unanimously to spend an estimated $600,000 not on books or classroom improvements, but on destroying murals painted in the 1930s on the walls of that high school.

Titled “The Life of George Washington,” the murals were painted by Victor Arnautoff, a Russian-born artist and Communist who, though no fan of Washington or of America in general (he would spend the last years of his life in the USSR, in fact), was willing to take a check for the work. Indeed, Uncle Sam’s Works Progress Administration, an effort by the Roosevelt Administration to employ artists during the Great Depression, paid artists like Arnautoff to decorate public buildings, which they did more or less according to their own ideas.

Arnautoff’s mural shows scenes from Washington’s life. The school board claims they are racist because a dead Native American is shown in one part and slaves are depicted doing menial work in other parts. The images are mild. The corpse shows no wounds, and the slaves aren’t being abused as they too often were in reality. 

Arnautoff was illustrating the negative side of our first president’s life while also showing his accomplishments. Washington had waged war on Native Americans and was a plantation owner who used slave labor. Ironically, rather than accepting this nuanced depiction (offered by a Communist, no less!), the woke board decided that the images were too traumatic for the school’s students to see. That students had seen them for about eight decades without much complaint had no effect on the board’s decision.

While American high school students are more and more ill-educated, it is unlikely that any are unaware that Native Americans were harshly displaced when Europeans arrived in the New World or that Africans were enslaved here. Indeed, in the classrooms of George Washington High School you can be certain that lectures have been delivered on these topics, homework assigned from textbooks containing this information, probably illustrated with images similar to or more lurid than the murals, and tests administered with poor grades dispensed to those who didn’t absorb what they had been taught.

Will these teaching materials be similarly censored or the lessons abandoned as racist and traumatic? Of course not. It’s obvious that the intent of the destruction isn’t to shield students from harm. It is to reduce the status of Washington by further associating him with the racism their lesson plans surely already lay at his feet. The San Francisco School Board is very progressive, and an important part of the progressive effort to transform America is branding Old America as requiring transformation. To that purpose, heroes like Washington must be shown to be defective, which renders anything they created, or anything honoring them, defective as well.

Washington is considered very flawed by progressive standards. During the Revolution, he fought Native Americans. That many Native Americans had sided with the British, and were therefore fair game like any redcoat, escapes scrutiny.

Also escaping mention is that, though Washington did make war on Native Americans as president, he far preferred to make peace. He entertained multiple Native American delegations and forged treaties that were beneficial to both sides. The Creek, in particular, did well from his efforts. Unfortunately, Washington’s approach—avoiding war, purchasing lands, negotiating treaties, and making efforts to assimilate Native Americans into America’s growing nation—didn’t continue after he left office.

Washington’s relationship with slavery was less advanced. His will provided for his slaves to be freed, but that was less of a sacrifice than freeing them during his lifetime would have been. Still, it is likely they appreciated the gesture, and we should remember that slavery was considered an inevitable if unfortunate part of the world in which Washington lived. Indeed, it was common around the world and had been throughout history. It would take a horrific civil war to end it in the United States.

To suggest that Washington, or anyone else, could have ended slavery earlier without tearing apart the infant country is foolish. Lincoln and a massive Union army barely managed to free the slaves and hold the Union together in the 1860s.

This brings us back to Newton and the incremental nature of progress. He once said of his accomplishments, “If I have seen further it is by standing on the shoulders of giants.” Newton used mathematics that can be traced back to clay tablets in ancient Mesopotamia. Newton added his own great advancements, and others have followed. 

The same is true of progress in freedom. America’s Founders built our nation upon work that can be traced to Roman law and England’s Magna Carta. Washington, like Newton in science, made great contributions that built on earlier efforts to increase recognition of human dignity and freedom. First, he won the nearly impossible fight against Britain that allowed America to form the first modern republic. Then, he refused to be made a king. He became a wise president and left office after two terms in a peaceful transfer of power that many nations would come to envy. 

With respect to racial relations, Washington did not take America all the way to where it is today or to where some might wish it to be, but any honest thinker must admit that his were heroic accomplishments that were fundamental to building a country that could arrive there. Without them America and all the good its ideals and actions have accomplished would not have happened. Like Newton, we should be grateful that we have had the shoulders of giants such as Washington to stand upon.

Right now, new forms of tyranny that would astound George Orwell are clamping down on great masses of humanity and freedom is in jeopardy as never before. We are witnessing the creation of computerized totalitarian states where Big Brother augmented by artificial intelligence seeks to monitor every moment of their drone citizens’ lives. 

Even in nations that have long enjoyed freedom, forces are at work seeking to curtail rights that were once thought fundamental and secure. If the San Francisco School Board is truly interested in creating a better world, they should be defending what Washington and our other national heroes helped incrementally to build and asking how those advancements might protect against this burgeoning turn toward tyranny, instead of obsessively picking at past failings.

People are imperfect and nations are imperfect, but virtue should be respected and not scorned as inadequate to the smug prejudices of those who sit in freedom and safety on a school board two and a half centuries away from any danger of being hanged by George III.

Photo Credit: Saul Loeb/AFP/Getty Images


Insolent Leftists Would Lock Us Up in Hamlet’s World

When I was a boy, everyone said the epitome of Shakespeare is Hamlet’s soliloquy. The soliloquy, the one from Act III, the one that poses the question: “To be, or not to be.”

Those words, like the first four notes of Beethoven’s Fifth Symphony, are instantly recognizable, a pocket summary of an entire cultural tradition. The phrases that follow, one after another, dot our language: “What dreams may come,” “this mortal coil,” “the insolence of office,” “the undiscovered country,” etc., etc., etc.

You could hardly escape Hamlet’s soliloquy. In movies, it got the classic treatment by Olivier, was taken up by a melancholy Doc Holliday in “My Darling Clementine,” was played for laughs by Jack Benny in (what else?) “To Be or Not to Be,” and (later on) given an explosive reading by Schwarzenegger in “Last Action Hero.” And that’s just one slice of pop culture. Wherever it turns up, and however it is rendered, the soliloquy is said to be the Bard’s most penetrating insight into the human condition.

I couldn’t see it. What was so hard about the question, “To be or not to be”? Doesn’t everybody want to be? Why all this chin-tugging?

What I didn’t understand—what I never suspected—was that Hamlet was speaking of the shared experience of humanity. Through the centuries, people have been in his shoes. They have had to choose between subservience and struggle—to choose whether to live on their knees or die on their feet. In my sheltered 1950s world, I had no idea of that.

Was my sheltered world normal? Not really. Life has a way of teaching bitter lessons to those who’ve been sitting pretty, sometimes bit by bit and sometimes all at once.

On 9/11, for example, a cloud of flames, dust and death billowed over lower Manhattan, the Pentagon, and a Pennsylvania pasture. All of us were getting it, right in the face. In Tennessee, someone asked a law professor and budding “Instapundit” named Glenn Reynolds when life in America would ever get back to normal.

“This is normal,” he replied. Then Reynolds wrote: “For most of human history, wondering when somebody from another tribe was going to try to kill you was the standard activity. In much of the world, it still is. … It’s only in comparatively strong and wealthy Western nations that we can pretend that safety is normal.”

These dour thoughts were brought to mind by the recent dust-up among the Democrats over crosstown school busing. That policy, which aimed at integrating widely separated urban schools, was a small matter compared with 9/11, but in the 1970s, it was pretty traumatic for a lot of people. In South Boston, a decision in 1974 to swap busloads of high-schoolers with students from a largely black school a few miles away in Roxbury sparked months of furious protests. (A scene from one such protest is this story’s featured image: A woman wearing a “Stop Forced Busing” button seeks safety as mounted police disperse the crowd. She and the others had gathered at South Boston High School after a black student stabbed a white student there.)

The rioting reached a climax more than a year later, when white students picketing City Hall attacked a black lawyer who happened to cross their path. The confrontation produced an image that shocked a nation accustomed to thinking of Boston as a citadel of liberal civility, not like those dens of Southern yahoos in Little Rock and Selma.

Race riots in the heart of liberaldom were explained away by pointing at “Southie,” the home of Boston’s Irish working class. No one had yet coined the word “deplorables,” but that is what liberals were thinking. What they weren’t thinking about—what they had already forgotten, if indeed they ever had even noticed—was that less than a year earlier, a white woman living in Roxbury had been burned alive by a gang of black teenagers a few blocks from her home. Dying in a hospital bed, she told police her attackers said they didn’t want any whites in their neighborhood. And Southie’s kids were supposed to be bused there?

That woman, Evelyn Wagler, is not a household name, like, say, Emmett Till. But she, no less than he, bore witness to the savagery that can erupt when people from hostile tribes collide. Those who promote tribalism in America—which is what “identity politics” boils down to—should take heed.

I advise Joe Biden’s rivals to cut him some slack, then. So what if he opposed busing in the 1970s? The policy wasn’t popular then, and it wouldn’t be loved now.

But set aside the business about busing, tribalism and identity politics. Such things are not what bothered Hamlet. The people who were giving him fits were not only Danes like him, they were his own flesh and blood. What is worse, they were his superiors in rank. His uncle Claudius, who had murdered his father and married his mother, was now his king.

What to do when your ruler is a murderer? Knuckle under, or fight and die?

Adolf Hitler established his credentials as a murderous ruler quite early in his reign over Germany. On the night of June 30 into July 1, 1934, he had several thousand Nazi rivals, personal enemies, and assorted inconvenient individuals shot, without trial, in what became known as the Night of the Long Knives. The firing squads had to be relieved from time to time because of the mental stress it was putting on the soldiers. Winston Churchill recounts the events dispassionately, in the first volume of his World War II memoirs, and then comments:

This massacre, however explicable by the hideous forces at work, showed that the new Master of Germany would stop at nothing, and that conditions in Germany bore no resemblance to those of a civilized country. A Dictatorship based upon terror and reeking with blood had confronted the world.

It was bad enough to have to fight such a monstrosity from the outside. How much worse was it to be inside, among Hitler’s subjects, or, even worse, among those whose military duty involved carrying out his crimes!

One of the many World War II survivors interviewed for the British documentary series “The World at War” was Christabel Bielenberg, an Englishwoman who had married a German and consequently spent the war in Germany. She told the interviewer several heartbreaking stories, sometimes literally wringing her hands. (An example starts at 35:40 in this link.) Among them was this one:

Near the end of the war, I had to travel from Berlin to the Black Forest, and I happened to travel in the same [train] carriage as an SS man. . . . He explained to me that he was on his way to the front now and all he wanted was to get killed. . . .

He told me that in Poland, he had belonged to one of the commandos that were called the extermination commandos, and on one particular occasion when the Jews were standing around in a semicircle, with the half-dug graves behind them, that the machine guns had been set up, and out of the ranks of the Jews that were standing there, [a rabbi] had come towards him . . . and said, “God is watching what you do.”

And he said, “We shot him down before he returned to the semicircle.” Another [among the Jews], a little boy … had asked him, “Am I standing straight enough?” And, he told me, these things he could never forget and that he only, as I say, now wanted to die.

That soldier’s dilemma was dramatized in “A Time to Love and a Time to Die,” a 1958 American movie made by German-born director Douglas Sirk from Erich Maria Remarque’s World War II novel. The film tells of German private Ernst Graeber’s experiences on the Russian front and during a bittersweet home furlough. In its early Russian scenes, it features a brief appearance by the boyish actor Jim Hutton in the role of Hirschland, a new recruit who can’t believe he’s being ordered to kill unarmed civilians. Film critic Glenn Kenny describes Hirschland’s reaction:

“What if I shoot over their heads?” he desperately asks Ernst. “We’ve all tried that,” Ernst says, himself almost as weary as death. “We only had to do it again. It’s like . . . executing them twice.” The Germans then make light of their task, trying to get drunk on the “good Russian vodka” they’ve pilfered from their victims, only to end up squabbling like hens and destroying their loot. And Hirschland blows his brains out.

Shall we state the obvious? No American today is facing anything like such an extremity. Those who prattle about Donald Trump being another Hitler are in no actual danger of being snatched by the Gestapo in the night, any more than they were under George W. “Chimp/Hitler” Bush.

It’s a fact of modern life, however, that controversial public figures often receive death threats. And, though it shames us to have to say so, the basement-dwelling loons who emit such threats do include among them some Trump supporters as well as many of his foes. So it’s not as if the Trump haters have nothing at all to worry about.

Even so, when prominent Trump critics go on book tours, land lucrative endorsement deals, or deliver profanity-laced tirades at glitzy, televised celebrity affairs—all while congratulating themselves for having the “courage” to “speak truth to power” in “fascist AmeriKKKa”—one must wonder where self-flattery ends and lunacy begins.

While the classical picture of a lunatic is someone who thinks he’s Napoleon, a lot of people today seem to think they’re Hamlet. They see Trump as a usurper like Claudius, and themselves as being deprived, as Hamlet was, of their birthright—which evidently is to occupy positions of power from which to rule over us.

And, like Hamlet, they are spurring themselves toward a violent conclusion. We’ve already seen bits of it, in the murderous attack on a congressional GOP baseball practice and the brutal assaults by Antifa thugs on any conservative who ventures within their reach.

How can we minimize the harm such people do? By minimizing their influence. By isolating them, cutting the support out from under them, and thus depriving them of the power to bully others the way they do now.

Let’s reflect that while America is far from becoming anything like Hitler’s Germany, some of the evils of which Hamlet complained have appeared, and indeed have been with us for a very long time.

Hamlet spoke of “the law’s delay.” Americans get a lot of that. Evelyn Wagler, for example, suffered a fate as dire as that of any of Hitler’s victims. You’d think the law would lose no time in avenging her. You’d think wrong. No one was ever identified, let alone arrested, prosecuted, convicted, or executed, in connection with her murder. Yet with “the insolence of office,” the powers that be were soon sending Boston’s finest out on horseback to chase protesters away from Southie’s high school, which had become the scene of further violence. The unhappy woman pictured at the top of this article was living in Hamlet’s world.

No one wants to live in Hamlet’s world, and that includes the very people the Left presumes to speak for. As I wrote more than six years ago:

Many among liberalism’s special clientele—poor people, minorities, feminists, gays, unions, the homeless, etc.—are sick to death of gangs and violence and would love to see that all swept away.

So let’s focus on the violence. Let’s focus on crime and punishment.

Let us fight fire with fire. We should set about hanging murderers, to the point that death for murder becomes the rule rather than the extremely rare exception. And we should keep on locking up the murderers’ lesser fellows for as long as necessary, until violent crime once again is as rare as it was when I was a boy—indeed, even rarer.

Liberals may fret and fuss about "legalized murder" in the first case, and "the new Jim Crow" in the second. Ordinary people, like the ones quoted in account after account, will know better. Fight the Left on this battleground, and more and more of those ordinary people will walk away from their leftist leaders, until that fine day when the leaders turn around and, sick with impotent fury, realize no one is following them any more.

Crime should be crushed for the victims’ sake, of course, regardless of any partisan considerations. But in the coming electoral showdown, those considerations are not to be ignored. Have you got it in for liberal Democrats? Then take it out on their pets, the hoodlums who have been shedding our blood these many years. There’s no surer way of bringing about the liberals’ downfall, and securing everyone else’s safety.

While public safety may not be normal in historical terms, it’s the goal we should strive to achieve. Liberals stand in the way, as they have for decades. Today they enjoy an added sense of heroism in “resisting” Orange Man, but such vainglory is nothing new to them. Why should we indulge them any longer? Time’s up! On the issue of crime and punishment, liberals are overdue for a comeuppance, so let’s bring it on.

The more the Left tries to preserve the law’s delay, the sooner it will come to rue its insolence of office.

Photo Credit: Ted Dully/The Boston Globe via Getty Images

America • Declaration of Independence • History • Post • The Constitution

Memo to Kaepernick: Read More Frederick Douglass

Many observers were quick to correct Colin Kaepernick’s recent selective quoting from Frederick Douglass’s speech, “What to the Slave is the Fourth of July?” They were right to do so. Misrepresenting anyone’s words in the manner that Kaepernick did breaks one of the first rules of good writing.

In spite of his error, thanks are due as well to him for bringing attention to a very fine speech that all Americans should read. Another of Douglass’s speeches that I urge Mr. Kaepernick and others to read addresses the great document that stands next to the Declaration of Independence: the United States Constitution.

Douglass, born into slavery, escaped and later purchased his freedom with the help of others who raised funds. He eventually moved to Rochester, New York, and worked to end slavery by helping people reach freedom on the Underground Railroad, supporting anti-slavery political parties, and publishing his own antislavery newspaper, The North Star. It was at the invitation of the Rochester Ladies' Anti-Slavery Society that he appeared on July 5, 1852, to deliver the Independence Day speech. The circumstances for his speech on the Constitution were very different. The title of that speech is in the form of a question: "The Constitution of the United States: Is It Pro-Slavery or Anti-Slavery?"

After Douglass's escape from slavery, he worked with the Anti-Slavery Society founded by William Lloyd Garrison. The American Anti-Slavery Convention convened in 1833 in Philadelphia to address the enslavement of one-sixth of the American people. The delegates looked back 57 years to 1776 and acknowledged the effort to deliver America from a foreign yoke, stating that the Temple of Freedom was founded on the principles of the Declaration—that all men are created equal and that they are endowed by the Creator with certain inalienable rights. They contrasted their own reliance on spiritual means and on working through God with the efforts of the Founders, who were forced to wage war and marshal arms. They also believed that the Constitution was a pro-slavery document.

Douglass, however, eventually split from Garrison over his interpretation of the Constitution and the use of politics and force to end slavery.

Douglass delivered the Constitution speech in Glasgow, Scotland in 1860. He began by drawing out a contrast between the American government and the American Constitution, which is always worth doing. “They are distinct in character as is a ship and a compass. The one may point right and the other steer wrong. A chart is one thing, the course of the vessel is another. The Constitution may be right, the Government is wrong.”

The issue then was not whether slavery existed at the time of the Founding, but whether the Constitution guarantees to one class a right to enslave or hold as property people of another class, and whether the union should be dissolved over disagreement about the question. The Garrisonians held that the Constitution did contain such guarantees and that it should be dissolved as a "compact with the devil." In addition, they refused to vote or hold office in what they understood to be a corrupt system. Douglass stated his position to the contrary: "I, on the other hand, deny that the Constitution guarantees the right to hold property in man, and believe that the way to abolish slavery in America is to vote such men into power as will use their powers for the abolition of slavery."

The Constitution, Douglass explained, was ratified by the people, and it is only they who can alter, amend, or add to it. He took issue with those who looked away from the text, and he dismissed commentaries and creeds written by those who wished to give the text a different meaning or who searched for secret motives or dishonest intentions in those who wrote it.

He gave examples of those who misrepresented the language of the Constitution and corrected them by giving a faithful reading of the words and an interpretation consistent with the historical evidence before him. He also reminded his listeners that the preamble begins with “We, the people of these United States” and “not we the white people, not even we the citizens, not we the privileged class, not we the high, not we the low, but we the people; not we the horses, sheep, and swine, and wheel-barrows, but we the people, we the human inhabitants; and, if Negroes are people, they are included in the benefits for which the Constitution of America was ordained and established.”

Douglass did not excuse those Americans who had given the Constitution a slaveholding interpretation, but dissolution of the union, for him, was not a remedy. He openly rejected Garrison's call for no union with slaveholders, holding instead that all Americans have a duty to return the plundered rights of black people.

Douglass had previously spoken in Glasgow in 1849 when he held different views from the ones that he advanced in his 1860 speech. He readily admitted to the positions that he held previously. “When I escaped from slavery, and was introduced to the Garrisonians, I adopted very many of their opinions, and defended them just as long as I deemed them true,” he said. “I was young, had read but little, and naturally took some things on trust. Subsequent experience and reading have led me to examine for myself. This had brought me to other conclusions.”

We should heed Douglass's direction to read and examine for ourselves and not hesitate to reevaluate our beliefs and opinions. Perhaps, in light of his misrepresentation of Douglass, Colin Kaepernick will do the same. The Constitution speech is a good follow-up for further study, but the best place to begin is with Douglass's autobiography, which gives the full measure of the man.

Photo Credit: Salwan Georges/The Washington Post via Getty Images

America • Center for American Greatness • Democrats • History • Identity Politics • Post • The Left

Kaepernick, in Nike’s Big House

Colin Kaepernick is woke. He wears pig socks to protest cops, who he reckons are modern-day slave catchers, and even cosplays as Angela Davis. He has deemed America's first standard, sewn by the hand of Betsy Ross, an icon of slavery—that Ross was a Quaker, and therefore an abolitionist, matters not. And he does all of this under the auspices of his thoroughly white paymasters at Nike.

All nine of Nike’s executives are as white as the driven snow, and Kaepernick is only too happy to do the woke shuffle for them.

Kaepernick's liberation from "AmeriKKKa" has led him merely into a more complete bondage, with the corporate sector dominated by white liberals as his master. Kaepernick, in other words, has gone up from the fields and entered the cool halls of the "Big House."

The main plantation house—sometimes called the “Big House,” or “Great House Farm”—was the building or compound occupied by antebellum slave owners. Slaves, not knowing any other kind of life, considered it an honor and a privilege to find themselves in the luxurious presence of their white masters, out of the sun and further from the lash. “The slave who was selected to sleep in the ‘big house,’” Booker T. Washington recalled, “. . . was considered to have the place of honour.”

“Few privileges were esteemed higher, by the slaves of the out-farms, than that of being selected to do errands at the Great House Farm,” Frederick Douglass similarly observed. “It was associated in their minds with greatness. A representative could not be prouder of his election to a seat in the American Congress, than a slave on one of the out-farms would be of his election to do errands at the Great House Farm.”

Of course, bondage did not end at the big house; it simply became more intimate (in some cases quite intimate), and therefore more total, just as it has with Kaepernick, whose level of "wokeness" has merely tightened the fetters of oppression over his mind.

The has-been football player has so thoroughly soaked his persona in the salty tears of victimhood that without slavery and without the white people he so despises (even as they cut his exorbitant paychecks) Kaepernick would be precisely what he is beneath his afro and raised fist: a professional failure.

Nike figures that the prospect of a majority-nonwhite—excuse me, "diverse"—United States to come means that becoming the un-American brand is good for business in the long term.

I wish that I could say that Nike is wrong. But given that the Republican Party has failed to stop, and has even facilitated, the mass immigration of foreigners who care about as much for the Betsy Ross flag as Kaepernick does, Nike is probably not far off the mark. America is already growing more unpatriotic, and it will likely only continue to do so as it becomes more "diverse."

Whereas some Americans see dark clouds of crisis looming on the horizon, Nike’s executives, good capitalists that they are, see an emerging market in anti-Americanism; and they have, accordingly, updated old formulas and modes of enslavement.

Frederick Douglass was a penetrating observer who saw through the designs of America's original slavemasters. It was customary in his time for owners to grant their slaves a reprieve from hard labor between Christmas and New Year's Day—but this quasi-"liberation" came with a twist.

Slaves were actively discouraged from spending their holiday liberty doing anything industrious or edifying for themselves. They were pushed instead by their owners toward “sports and merriments as playing ball, wrestling, running foot-races, fiddling, dancing, and drinking whisky.”

On the surface, wrote Douglass, it was “a custom established by the benevolence of the slaveholders.” But Douglass realized that it was, in fact, “the result of selfishness, and one of the grossest frauds committed upon the down-trodden slave”; the object of which was “to disgust their slaves with freedom, by plunging them into the lowest depths of dissipation.”

The scheme was simple enough. Fill a slave's cup with as much liquor as he could take—and then give him more, thus searing into his mind a connection between freedom and vice. "Thus, when the slave asks for virtuous freedom, the cunning slaveholder, knowing his ignorance," wrote Douglass, "cheats him with a dose of vicious dissipation, artfully labelled with the name of liberty."

Kaepernick has drunk the poison of the diversity doctrine that insists America is vile, just as slaves were led to believe about freedom; and, certainly, there are some who genuinely believe as much.

But Nike, like so many other corporations, simply does not care about the truth involved in historical grievances. Moralistic stances in the marketplace are merely the mask that modern capitalism wears to veil its only true concern: profit. White liberal CEOs fan the flames of woke culture, pouring more poison into the cups of the Kaepernicks of this world, and place them behind the shop window with a sign around their neck that reads: “Black Lives Matter.”

This is indeed an old formula. Booker T. Washington, like Douglass, took notice; in his case, of how Yankee industrialists used freed blacks against Southern whites. “In many cases it seemed to me that the ignorance of my race was being used as a tool with which to help white men into office,” wrote Washington, “and that there was an element in the North which wanted to punish the Southern white men by forcing the Negro into positions over the heads of the Southern whites.”

“I felt that the Negro would be the one to suffer for this in the end. Besides, the general political agitation drew the attention of our people away from the more fundamental matters of perfecting themselves in the industries at their doors and in securing property." In other words, cynical white political types, instead of encouraging black citizens to become better and more informed citizens, helped to inflame their passions so that they would choose the most irresponsible and ineffective among them to serve in Congress and elected office, as a way to get revenge on their political enemies in the South. Washington's words, like Douglass's, ring true today as well. Colin Kaepernick has been a political football for cynical white liberals playing high-stakes identity politics for going on two years.

The fusion of woke culture with modern capitalism has become like a cotton gin, separating the fibers of our society from the hearts and minds of the people, because doing so is more profitable than not. And like the original cotton gin, this machine in service of woke culture is sure to guarantee our servitude and suppress our liberty. Kaepernick might think that he's got it made, that he has achieved liberation, but he has merely moved his bags into the big house, where his white paymasters are only too happy to watch him dance.

Content created by the Center for American Greatness, Inc. is available without charge to any eligible news publisher that can provide a significant audience. For licensing opportunities for our original content, please contact licensing@centerforamericangreatness.com.

Photo Credit: Paul Marotta/Getty Images

First Principles • History • Law and Order • Post • The Courts

Liberalism, Originalism, and the Constitution

No one would mistake the Supreme Court's liberal justices for adherents to the concept of "originalism," or the belief that one should consider—first and foremost—the Founders' intent when ruling on constitutional issues. And yet their opinions in the Maryland "Peace Cross" case suggest that they, at least implicitly, support the idea.

In American Legion v. American Humanist Association, the court upheld Maryland's Bladensburg Cross against the claim that its presence on public property violates the establishment clause of the First Amendment. Writing in dissent, no less a figure than Justice Ruth Bader Ginsburg—the Notorious RBG and hero of the American Left—turned to the thought of the Founders in order to find the meaning of the establishment clause.

According to Ginsburg, the establishment clause intends to create a “wall of separation between Church and State.” Like generations of liberal judges before her, Ginsburg here relies on Thomas Jefferson’s understanding of the purpose of the establishment clause, expressed in his famous letter to the Danbury Baptists. Ginsburg and her predecessors additionally lean on James Madison’s arguments in the Memorial and Remonstrance, written against a proposed system of state support for religion in Virginia.

As it happens, this is a kind of originalism. It’s just that it’s a sloppy and tendentious form of originalism.

The original meaning that conscientious judges should seek is the meaning as it was understood by the public at large when the Constitution was ratified, and not the personal political views of selected Founders who, we should be reminded, were not sovereigns. Nevertheless, it is worth noting that no less a liberal giant than Ginsburg is willing to use at least some form of originalism in an effort to find the meaning of the Constitution.

Nor is this an isolated case. In the previous decade, when the court grappled with the meaning of the Second Amendment in District of Columbia v. Heller (2008), both the conservative and liberal justices turned to the Founding generation to understand the meaning of the “right to keep and bear arms.” Justice Antonin Scalia and the conservative majority found that the weight of the historical evidence supported an individual right to keep and bear arms, while the liberal dissenters disputed this conclusion.

Notably, however, the “living Constitution”—which some liberal commentators treat as the common sense alternative to the much-derided originalist line of inquiry—made no appearance in Heller. Justice John Paul Stevens and the liberal dissenters never suggested the meaning of the Second Amendment should be interpreted in light of today’s values. They mounted no attack on originalism itself as a mode of interpretation. Instead they countered with their own originalist investigation, looking at the same evidence and holding that it supported the view that the Second Amendment aimed merely to protect the state militia.

As well they should. Constitutional originalism provides a nonpolitical standard for judges, one that permits them to think beyond their own policy preferences. Its liberal detractors may claim that it is just a clever disguise for their own political judging (“Originalism is a scam,” according to one recent ThinkProgress headline), but their argument is a weak one.

Anyone who studies the early history of the American Republic can see that originalism is not some novel invention of modern conservatives but a long-established and venerable approach to constitutional interpretation. In his celebrated opinions for the Supreme Court, John Marshall—the “Great Chief Justice”—sought the original meaning of the constitutional provisions on which he was called to rule. Certainly Marshall never suggested—unlike the modern purveyors of the “living Constitution”—that the meaning of constitutional provisions could change over time or might be imbued with new meaning by the jurists of the present generation.

James Madison—the “father of the Constitution”—expressly endorsed originalism as a method of constitutional interpretation. In an 1824 letter to Henry Lee, Madison held that in seeking a “just construction” of the Constitution we must turn to “the sense in which the Constitution was accepted and ratified by the nation.”

“In that sense alone,” Madison added, “it is the legitimate Constitution. And if that be not the guide in expounding it, there can be no security for a consistent and stable, more than for a faithful exercise of its powers.”

Small wonder, then, that even liberal justices sometimes draw on originalist traditions. In light of that fact, liberal pundits ought to be candid enough to admit that originalism is not some cynical conservative expedient but a legitimate method of constitutional interpretation.

Perhaps they can go a step further and ponder the following question: If originalism is good enough for some areas of constitutional inquiry, why isn’t it good enough for all of them?

Photo Credit: Stock Montage/Getty Images

America • Americanism • Declaration of Independence • History • Post • self-government

Calvin Coolidge: ‘If All Men Are Created Equal, That Is Final’


The following is an excerpt from Calvin Coolidge’s (lengthy) speech in Philadelphia on July 5, 1926, marking the 150th anniversary of the Declaration of Independence. 

We meet to celebrate the birthday of America. The coming of a new life always excites our interest. Although we know in the case of the individual that it has been an infinite repetition reaching back beyond our vision, that only makes it the more wonderful. But how our interest and wonder increase when we behold the miracle of the birth of a new nation. It is to pay our tribute of reverence and respect to those who participated in such a mighty event that we annually observe the fourth day of July.

Whatever may have been the impression created by the news which went out from this city on that summer day in 1776, there can be no doubt as to the estimate which is now placed upon it. At the end of 150 years the four corners of the earth unite in coming to Philadelphia as to a holy shrine in grateful acknowledgment of a service so great, which a few inspired men here rendered to humanity, that it is still the preeminent support of free government throughout the world.

Although a century and a half measured in comparison with the length of human experience is but a short time, yet measured in the life of governments and nations it ranks as a very respectable period. Certainly enough time has elapsed to demonstrate with a great deal of thoroughness the value of our institutions and their dependability as rules for the regulation of human conduct and the advancement of civilization. They have been in existence long enough to become very well seasoned. They have met, and met successfully, the test of experience . . .

. . . About the Declaration there is a finality that is exceedingly restful. It is often asserted that the world has made a great deal of progress since 1776, that we have had new thoughts and new experiences which have given us a great advance over the people of that day, and that we may therefore very well discard their conclusions for something more modern. But that reasoning can not be applied to this great charter. If all men are created equal, that is final. If they are endowed with inalienable rights, that is final. If governments derive their just powers from the consent of the governed, that is final. No advance, no progress can be made beyond these propositions. If anyone wishes to deny their truth or their soundness, the only direction in which he can proceed historically is not forward, but backward toward the time when there was no equality, no rights of the individual, no rule of the people. Those who wish to proceed in that direction can not lay claim to progress. They are reactionary. Their ideas are not more modern, but more ancient, than those of the Revolutionary fathers.

In the development of its institutions America can fairly claim that it has remained true to the principles which were declared 150 years ago. In all the essentials we have achieved an equality which was never possessed by any other people. Even in the less important matter of material possessions we have secured a wider and wider distribution of wealth. The rights of the individual are held sacred and protected by constitutional guaranties, which even the Government itself is bound not to violate. If there is any one thing among us that is established beyond question, it is self-government—the right of the people to rule. If there is any failure in respect to any of these principles, it is because there is a failure on the part of individuals to observe them. We hold that the duly authorized expression of the will of the people has a divine sanction. But even in that we come back to the theory of John Wise that “Democracy is Christ’s government.” The ultimate sanction of law rests on the righteous authority of the Almighty.

On an occasion like this a great temptation exists to present evidence of the practical success of our form of democratic republic at home and the ever-broadening acceptance it is securing abroad. Although these things are well known, their frequent consideration is an encouragement and an inspiration. But it is not results and effects so much as sources and causes that I believe it is even more necessary constantly to contemplate. Ours is a government of the people. It represents their will. Its officers may sometimes go astray, but that is not a reason for criticizing the principles of our institutions. The real heart of the American Government depends upon the heart of the people. It is from that source that we must look for all genuine reform. It is to that cause that we must ascribe all our results.

It was in the contemplation of these truths that the fathers made their declaration and adopted their Constitution. It was to establish a free government, which must not be permitted to degenerate into the unrestrained authority of a mere majority or the unbridled weight of a mere influential few. They undertook to balance these interests against each other and provide the three separate independent branches, the executive, the legislative, and the judicial departments of the Government, with checks against each other in order that neither one might encroach upon the other. These are our guaranties of liberty. As a result of these methods enterprise has been duly protected from confiscation, the people have been free from oppression, and there has been an ever-broadening and deepening of the humanities of life.

Under a system of popular government there will always be those who will seek for political preferment by clamoring for reform. While there is very little of this which is not sincere, there is a large portion that is not well informed. In my opinion very little of just criticism can attach to the theories and principles of our institutions. There is far more danger of harm than there is hope of good in any radical changes. We do need a better understanding and comprehension of them and a better knowledge of the foundations of government in general. Our forefathers came to certain conclusions and decided upon certain courses of action which have been a great blessing to the world. Before we can understand their conclusions we must go back and review the course which they followed. We must think the thoughts which they thought. Their intellectual life centered around the meeting-house. They were intent upon religious worship. While there were always among them men of deep learning, and later those who had comparatively large possessions, the mind of the people was not so much engrossed in how much they knew, or how much they had, as in how they were going to live. While scantily provided with other literature, there was a wide acquaintance with the Scriptures. Over a period as great as that which measures the existence of our independence they were subject to this discipline not only in their religious life and educational training, but also in their political thought. They were a people who came under the influence of a great spiritual development and acquired a great moral power.

No other theory is adequate to explain or comprehend the Declaration of Independence. It is the product of the spiritual insight of the people. We live in an age of science and of abounding accumulation of material things. These did not create our Declaration. Our Declaration created them. The things of the spirit come first. Unless we cling to that, all our material prosperity, overwhelming though it may appear, will turn to a barren sceptre in our grasp.

If we are to maintain the great heritage which has been bequeathed to us, we must be like-minded as the fathers who created it. We must not sink into a pagan materialism. We must cultivate the reverence which they had for the things that are holy. We must follow the spiritual and moral leadership which they showed. We must keep replenished, that they may glow with a more compelling flame, the altar fires before which they worshiped.

Photo credit: History Archive/Universal Images Group via Getty Images


America • Americanism • First Principles • History • Post • The Declaration

The Lessons of the Declaration of Independence

The colonists’ quest for independence from the British in 1776 began with a goal: “to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them.” Declaring independence meant that Americans were no longer subordinate to the British monarch nor subject to the British Parliament, but were equal, free, and independent.

The reasons for this dramatic action demanded explanation, as they noted: "a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation." Therefore, the Declaration includes two parts. There are, on the one hand, facts about the conduct and behavior of the British toward the colonists. On the other, there are immutable truths about the human condition. These truths have guided Americans in their quest to live and to build a just and worthy nation ever since they were penned.

The colonists recognized a universal standard independent of man-made governments and institutions, invoking the authority of the laws of nature and of nature’s God. The coupling of God and Nature was comprehensive. It included that which is human and of this world and that which is created by God and universal.

In his 1837 speech commemorating the Declaration of Independence, John Quincy Adams coupled the laws of nature with the dictates of justice and proclaimed: “In the annals of the human race, then, for the first time, did one People announce themselves as a member of that great community of the powers of the earth, acknowledging the obligations and claiming the rights of the Laws of Nature and of Nature’s God. The earth was made to bring forth in one day! A Nation was born at once!”

The new citizens of the United States of America were not just severing ties with their colonial British past, nor were they simply forming a new government. The Declaration of Independence articulated truths that had never before been used as the foundation of any actual government. "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness."

These rights are comprehensive in the lives of men and women, though it is important to note that the document says "among these" rights, which suggests that the list is not exhaustive. That such rights exist is a matter of self-evident truth—that is, the truth of the thing is contained within the definition of the thing. To acknowledge that there is such a thing as "man," created by God and not himself a god, is to acknowledge his equality to every other creature that can be called "man"—and that goes for females, too. Man, as in mankind, means that in their essential dignity and nature as beings created by God, all human beings are by nature created equal.

The mere assertion that a sovereign people have inalienable rights is not sufficient, however, because government is always required to secure those rights. A right can exist without people acknowledging it. For rights to be actualized, action is required, and the authors of the Declaration were very clear about what kind of action they meant.

The relationship between the citizens and the government is made clear by Adams: “by the affirmation that the principal natural rights of mankind are unalienable, it placed them beyond the reach of organized human power.” This reinforces the idea that government is meant to secure rights rather than grant them, and also the notion that the people themselves, rather than the government, are sovereign.

As Adams continues in his explanation of the text of the Declaration, “by affirming that governments are instituted to secure them, and may and ought to be abolished if they become destructive of those ends, they made all government subordinate to the moral supremacy of the People.” The rights recognized in the Declaration are inherent. The moral supremacy of the people gives a standard by which to judge: it is not force or violence, it is a moral foundation that can be discerned by reason and used as a basis for forming judgments.

How best to understand further the rights and the powers of government to secure our rights as stated in the Declaration? Thomas Jefferson, one of the principal drafters of the Declaration of Independence, expounded upon the limitations of the legislature in his Virginia Bill for Religious Freedom. Drafted in 1779, the bill includes language that echoes the natural rights listed in the Declaration: "And though we well know that this assembly elected by the people for the ordinary purposes of legislation only, have no power to restrain the acts of succeeding assemblies, constituted with powers equal to our own, and that therefore to declare this act to be irrevocable would be of no effect in law; yet we are free to declare, and do declare, that the rights hereby asserted are of the natural rights of mankind, and that if any act shall be hereafter passed to repeal the present, or to narrow its operation, such act will be an infringement of natural right" (emphasis added). This theme is advanced as well by Frederick Douglass and Calvin Coolidge, who invoke the word "final" to describe the sentiment.

Douglass in his 1852 speech “What to the Slave Is the Fourth of July?” described those who drafted the Declaration as preferring revolution over peaceful submission to bondage.

They were quiet men; but they did not shrink from agitating against oppression. They showed forbearance; but that they knew its limits. They believed in order; but not in the order of tyranny. With them, nothing was ‘settled’ that was not right. With them, justice, liberty and humanity were ‘final;’ not slavery and oppression. You may well cherish the memory of such men. They were great in their day and generation. Their solid manhood stands out the more as we contrast it with these degenerate times.

Coolidge, in his speech on the 150th anniversary of the Declaration of Independence, also looked upon the Declaration as having a finality: “If they are endowed with inalienable rights, that is final. If governments derive their just powers from the consent of the governed, that is final. No advance, no progress can be made beyond these propositions.”

The finality that Jefferson, Douglass, and Coolidge invoke doesn’t mean that America is static or without the possibility of development or improvement. Instead, it affirms that the principles of the Declaration of Independence are ideals and markers by which to measure all conduct thereafter. The citizens have a responsibility to themselves and to fellow citizens to act in a manner consistent with the foundational principles. One need look no further than Douglass and his reference to “degenerate times.”

The times of which Douglass spoke were among the darkest in the history of the United States. The Declaration recognized that all men are created equal and that they possessed inalienable rights of life, liberty, and the pursuit of happiness, but slavery was still present in the southern states. Douglass condemned the recently passed Fugitive Slave Law, which required that slaves in free states be returned to their bondage. He was a freed slave who was active in the abolitionist movement and intentionally gave his “What to the Slave Is the Fourth of July?” speech on July 5. He pointedly included in his remarks about the Fourth of July, “it is the birthday of your National Independence.”

Douglass posed questions to his audience: “Fellow-citizens, pardon me, allow me to ask, why am I called upon to speak here to-day? What have I, or those I represent, to do with your national independence? Are the great principles of political freedom and of natural justice, embodied in that Declaration of Independence, extended to us? and am I, therefore, called upon to bring our humble offering to the national altar, and to confess the benefits and express devout gratitude for the blessings resulting from your independence to us?” He continues, “What, to the American slave, is your 4th of July? I answer: a day that reveals to him, more than all other days in the year, the gross injustice and cruelty to which he is the constant victim.”

Douglass was blunt. He questioned how the country could declare the self-evident truths of the Declaration, but “you hold securely, in a bondage which, according to your own Thomas Jefferson, ‘is worse than ages of that which your fathers rose in rebellion to oppose,’ a seventh part of the inhabitants of your country.” Not only were those in the south who held slaves denying them fundamental rights, they were also masters of them in direct contradiction to the principles of the Declaration.

Douglass spoke those words in 1852, but he ended his speech on a note of hope: “Allow me to say, in conclusion, notwithstanding the dark picture I have this day presented of the state of the nation, I do not despair of this country. There are forces in operation, which must inevitably work the downfall of slavery. ‘The arm of the Lord is not shortened,’ and the doom of slavery is certain. I, therefore, leave off where I began, with hope. While drawing encouragement from the Declaration of Independence, the great principles it contains, and the genius of American Institutions, my spirit is also cheered by the obvious tendencies of the age.”

Lincoln, with respect to the principle of equality in the Declaration, said in his speech responding to the Supreme Court’s 1857 decision in Dred Scott:

They meant to set up a standard maxim for free society, which should be familiar to all, and revered by all; constantly looked to, constantly labored for, and even though never perfectly attained, constantly approximated, and thereby constantly spreading and deepening its influence, and augmenting the happiness and value of life to all people of all colors everywhere. The assertion that ‘all men are created equal’ was of no practical use in effecting our separation from Great Britain; and it was placed in the Declaration, not for that, but for future use.

The eternal and universal principles used to craft an argument for severing ties with the British and laying the foundation for a new nation were successful in 1776, but their relevance did not end once America became an independent nation, as Lincoln recognized. The principles used as justification and explanation speak to all, as much today as they did 243 years ago. The Declaration is a measure that serves as a reference point, regardless of the time or of a particular set of circumstances. That we are created equal is not dependent upon our being American, or male or female, or our economic condition. It is a universal truth.

That we have inalienable rights to life, liberty, and the pursuit of happiness is not dependent upon the government that is in power. That we have the right to alter our government if it becomes destructive of these rights is as true today as it was in 1776. That government is intended to secure our rights and derives its just powers from the consent of the governed reminds us that the foundation of the sovereignty of the people requires participation and vigilance. Consent is given on an ongoing basis and the judgment of whether the government is securing the rights of American citizens or thwarting them is a constant exercise. The words and themes of the Declaration have remained at the heart of American discourse because they provide a guide that is timeless.

There are great differences between the likes of John Quincy Adams, Frederick Douglass, Abraham Lincoln, and Calvin Coolidge, yet they all read the Declaration because it spoke to them: to the America in which they lived and to the America they wanted it to become.

The Declaration of Independence speaks to us today because it initiates dialogue between the ages, with the ideas and arguments of citizens past and with those living today. When we are forced to ponder all of the assertions and truths in the Declaration, we must engage in a dialogue with our fellow citizens. To quote from another great work from the period of the Declaration, Federalist 1, we are asked to determine whether we can establish good government from “reflection and choice” or are doomed to accept that we must submit to “accident and force.”

The Declaration provides the parameters of the discussion that leads to reflection and choice. As America becomes more fragmented—the current word is "tribalistic"—as monuments to historical figures are toppled, and as questions are asked about whether America's past should be honored or erased, or whether one should stand or take a knee during the National Anthem, such parameters and the principles of the Declaration become all the more necessary. The annual remembrance of its ideals and recitation of its words, along with deliberation and reflection upon their meaning and applicability, must go beyond the Fourth of July.

Editor’s Note: This essay is based upon a speech delivered at the St. John’s College Graduate Institute, Santa Fe, New Mexico, “Does the Declaration of Independence Still Speak to Us Today?”

Photo Credit: Universal History Archive/Getty Images

America • Donald Trump • History • Immigration • Post

Hadrian’s Wisdom: Why America Needs a Wall

Nearly 1,900 years ago, Roman Emperor Hadrian built a great wall across Britain. It was 73 miles long, and divided the Roman province of Britannia from the realms of the “painted men” to the north. Why?

The northerners were not particularly dangerous. Even if they were, they could quite easily climb the wall—or row around it! Perhaps more importantly, Queen Boudicca’s rebellion proved that Britannia’s greatest threat was its native Celtic population. And what exactly was Hadrian protecting? Britain’s north was an empty expanse of pastures and forests. Who cares if tattooed peasants herded their sheep in the hills? Rather than wasting precious Denarii on a wall, Hadrian should have fortified Roman cities and spent the rest on bread and circuses!

Hadrian's Wall has stood since its completion in 128 A.D. It witnessed the fall of Rome, the Saxon invasions, and the Norman Conquest. It survived King John's antics, Oliver Cromwell's violence, and the Kaiser's wrath. It outlasted Bath's bathhouses and the Globe Theater, and it will certainly outlive the Shard and its ostentatious glass cousins at Canary Wharf. Men have lived and men have died. The wall remains.

This is why Hadrian built his wall. And this is why Trump must build his. A wall is not just brick and mortar, nor steel and concrete—it is an enduring symbol of a people’s commitment to their sovereignty, their brethren, and their future.

Romulus’ Legacy
Tradition has it that Romulus, the first king of Rome, drove his ox-plow around the city on April 21, 753 B.C. The resulting furrow demarcated the spiritual and legal boundary between Rome and “not Rome”—Rome existed only within this sacred boundary, the pomerium; everything beyond was simply territory. This meant that Roman law applied only to those citizens within the pomerium. Anyone beyond the boundary, be they citizen, slave, or foreigner, was subject to martial law. In this way, Roman sovereignty was linked to the land itself.

This was a radical departure from the Greek understanding of sovereignty. In ancient Greece, the main unit of political organization was the polis, which is often translated as city-state. The translation is misleading. A polis was not a city in the modern sense—it was not synonymous with a particular place or collection of buildings. For example, it was not uncommon for a polis simply to relocate in times of conflict—even Athens temporarily relocated to the Island of Aegina during the Persian Wars (499-449 B.C.). This is in stark contrast with Rome, which refused to evacuate in the face of the Carthaginian invasion under the famed Hannibal. Athens could move. Rome could not.

Likewise, a polis was not a state, as it existed independently of any particular government or code of laws. Consider how Greek poleis routinely changed from oligarchies to democracies and back again throughout the Peloponnesian War (431-404 B.C.)—but nevertheless retained a continuity of identity. The polis existed independent of its constitution. Compare this to Rome, which formally changed governmental systems twice in some 1,200 years (once, if you ask Octavian), and both regime changes were accompanied by an orgy of calumny followed by an ordeal of introspection and reinvention. Roman sovereignty was embodied in the law, and thus to replace the law was to replace Rome itself.

Greeks made laws, laws made Romans.

These two conceptions of sovereignty (and by implication, identity) help explain why Rome, rather than Athens, united the Western World.

To the Strongest . . .
The conquests of Alexander the Great (d. 323 B.C.) were a brilliant flash in the bleak march of history. Within just 13 years, Alexander had subjugated everything between the ancient Nile and the mysterious Indus. His was the greatest empire the world had ever seen. So electric was Alexander’s spirit and so impressive were his deeds, that he was deified in Egypt as the son of Amon Ra—of Zeus himself! He was a living Heracles. Yet Alexander’s empire crumbled like sand upon his death. Meanwhile, the empire’s successor states were fragile, held together only by spilt blood and sacrificed treasure. What went wrong?

Although Macedonian by birth, Alexander was raised Greek. His father, King Philip II, went to great pains to Hellenize his people by competing in the Olympic Games, avenging the Delphic oracle in the Third Sacred War, and hiring the famed polymath Aristotle to educate his son. As such, it is not surprising that Alexander's conception of sovereignty was quintessentially Greek: what held his empire together was not the rule of law, nor his disparate peoples' loyalty to their homelands, but lineage. Consider Alexander's two most famous attempts to consolidate his conquests.

First was Alexander's apotheosis in Egypt. After occupying the Nile, Alexander crossed the Great Sand Sea to supplicate the Oracle of Amon Ra at the Siwa Oasis. There Alexander was deified, and so entered the Egyptian pantheon. No longer was he a conquering king—he was an Egyptian god, like the pharaohs before him. Alexander made this detour because he assumed that sovereignty flowed through blood (in this case, divine blood), and thus the best way to secure Egypt was to become Egyptian by (re)birth.

Second were the Susa Weddings. Alexander's attempt to consolidate his rule in Persia culminated in a mass marriage: he wed an Achaemenid princess and forced 10,000 of his men to take Persian wives, mostly against their will. Again, this highlights Alexander's emphasis on common ancestry as integral to sovereignty—Greeks could not become Persians, and Persians could not become Greeks, without commingling their blood. Lineage was the essence of Greek sovereignty.

Tying sovereignty to the people worked exceptionally well as long as the Greeks lived in relatively small, isolated, and homogenous communities—there is a reason King Leonidas’ 300 Spartans were able and willing to stand against Xerxes’ diverse myriads—but this paradigm failed in the Hellenistic kingdoms. Why?

The vast majority of Hellenistic subjects were not Greek and had no way of becoming Greek, so why should they care what happened to their Greek overlords? Why should they die for Antigonus the One-Eyed rather than Seleucus the Victor? After all, “they’re all Greek to me!” Most Hellenistic subjects lacked skin in the game, and this largely explains why Alexander’s empire fractured upon his death, why the successor states were constantly fragmenting, and why the Hellenistic world ultimately succumbed to Rome.

Caracalla’s Wisdom
We can sum up Rome’s expansion by paraphrasing Julius Caesar: they came, they saw, they conquered. What made Rome unique, however, was not its ability to conquer new lands, but its ability to assimilate new people, to turn barbarians into Romans.

The process of assimilation began early in Roman history with the formation of the Latin League, which granted Roman allies various privileges over the neighboring Etruscans. In doing so, Rome gave its allies a stake in its success—it put its skin in their game. This culminated with Rome granting varying degrees of citizenship to its allies following the Second Latin War (340-338 B.C.). Rome turned non-Romans into Romans.

The expansion of citizenship was Rome’s salvation: when King Pyrrhus of Epirus invaded Italy in 280 B.C. he could not overcome Rome’s manpower. At the time, half of Rome’s army consisted of allied peoples. There is little doubt that Pyrrhus would have been victorious had Rome stood alone. The same fate awaited the Carthaginian general Hannibal during the Second Punic War (218-201 B.C.). Although Hannibal killed or captured 30,000 Romans at the Battle of the Trebia River, 80,000 at the Battle of Cannae, and 20,000 at the Battle of Lake Trasimene, Rome eventually prevailed. Why?

Rome was not just the City of Rome. Rome was (most of) Italy, and Italy could sustain the losses. Rome continued to grow, granting citizenship to all peninsular Italians following the Social War (91-88 B.C.), and to people in Iberia and Gaul at Julius Caesar’s behest. Finally, Emperor Caracalla granted citizenship to all free men within the Roman Empire in 212 A.D. Rome owed its success to the fact that it was the first state in history to combine the scale of a Hellenistic Empire with the cohesion of a Greek polis. Rome was both Pericles’ Athens and the Empire of Alexander.

In short, Rome was the world’s first country—a land of laws inhabited by loyal citizens. It was not until Rome devolved into multifarious ethno-religious tribes that she finally fell.

Another Brick in the Wall
This brings me back to Hadrian, Trump, and walls.

As we have seen, Hadrian’s Wall was not built to keep people out like the Great Wall of China, nor was it built to keep people in like the Berlin Wall. And frankly, it protected little of economic value and nothing of strategic importance. This raises the question: why would Hadrian build his big, beautiful wall at the furthest-flung, least significant corner of the Roman Empire? Why not spend the resources bolstering the fortifications along the Rhine or the Danube? Why not wall the Anatolian passes to stymie the marauding Parthians?

The answer is that Hadrian’s Wall was not just a wall—it was an indelible symbol, a metaphor made manifest: it was Rome’s new pomerium, civilization’s sacred boundary wrought in stone. Behind the wall was Rome, a land of laws upheld by loyal citizens, beyond was hostile territory. Hadrian’s Wall needed to be built at the very edge of the Empire to show everyone—be they Roman or barbarian—that Rome was not just Rome. Rome was Britannia. Rome was Africa. Rome was Asia. And Rome would defend to the death every grain of sand and every blade of grass behind Hadrian’s Wall. Why? Because each was a part of Rome, and Rome was sacred.

America’s Founders envisioned our nation as a land of laws upheld by loyal citizens—America was a new Rome. And like Rome, one of America’s greatest strengths was in its exceptional ability to turn non-Americans into Americans, to create unity out of discord, and from this unity, strength. This is no longer happening, and this is why President Trump must build a wall.

In addition to keeping migrants out, America needs a wall to show everyone—be they American or foreign—that America is not merely a collection of competing individuals and tribes. It is not a whipping-boy for the military-industrial complex. It is not a plaything for globalists. And it is not a goody-bag for Third-World migrants to pillage at their pleasure. Instead, America is a land of laws inhabited by loyal citizens who will defend to the death every grain of sand and every blade of grass behind the wall. Why? Because each is a part of America, and America is sacred. Many Americans—and most of the world—have forgotten this.

We need to remind them. Build the wall.

Photo Credit: Herika Martinez/AFP/Getty Images


America • Center for American Greatness • History • political philosophy • Post • Religion and Society • The Constitution • The Culture

The American Founding’s High-Minded Purposes

James Madison is justly celebrated for his frequently stated opinion that “all power in just and free Government is derived from compact.” But Madison’s view is not endorsed by all purported champions of the founders. A recent article, “Our Unwritten Constitution: Orestes Brownson and the Foundation of American Liberty,” published as part of the Real Clear Policy series on the American Project and co-authored by Richard M. Reinsch II and the late Peter Augustine Lawler, argues that Madison is utterly mistaken in his claim. In fact, the authors claim that reliance on “Lockean contract theory” produced a constitution that was “devised solely in the interest of the rights of individuals” and was “based on the unrealistic abstraction of unrelated autonomous individuals.”

Lawler and Reinsch claim that autonomous individuals—that is, human beings abstracted from real life—cannot provide the appropriate material for political life. They are not “parents, creatures, [or] even citizens. Lockean thought, thus, isn’t political enough to be the foundation of government, and it isn’t relational enough to articulate properly the limits of governments or the roles of family and organized religion.”

Reinsch and Lawler rely heavily on Orestes Brownson's criticism of Locke's influence on the American Founding. They describe Brownson, accurately if a bit oddly, as "a 19th century New England intellectual associated with the transcendentalist movement who converted to Roman Catholicism" and vouch for his assertion that "the equality of human persons is a fact. But it is a fact that entered the world through Christian revelation and was later affirmed as self-evident by philosophers." The authors maintain that, according to Brownson, the self-evidence of human equality as it appears in the Declaration of Independence "is undermined" by its "pure Lockean dimension . . . where individual sovereignty becomes the foundation of government. Every man, Locke says, has property in his own person, and for Brownson that assertion of absolute self-ownership is, in effect, 'political atheism.'"

Brownson, however, vigorously resists the idea of self-ownership: “man is never absolutely his own, but always and everywhere belongs to his Creator; it is clear that no government originating in humanity alone can be a legitimate government. Every such government is founded on the assumption that man is God, which is a great mistake—is, in fact, the fundamental sophism which underlies every error and sin.”

Our authors endorse Brownson’s criticism of the notion that the just powers of government derive from the consent of the governed or that sovereignty ultimately resides in the people. To say that the people are sovereign is “implicit atheism” because “[s]ocial contract thought lacks an external standard higher than man’s will that could limit, shape, and condition it. The highest being is man, who would self-create government by consent . . .” This is the universe of “self-sovereignty or political atheism” that Hobbes, Locke, and Rousseau occupied and which the authors of the Declaration of Independence obediently followed.

The authors of the Declaration, of course, appealed to the “Laws of Nature and of Nature’s God” as their authority. Were they simply disguising with high-sounding rhetoric the fact that they relied on no higher authority—that despite their rhetoric they were “political atheists”? It is true the Declaration is the quintessential statement of social compact theory, but isn’t it also clear that its entire argument rests on the acknowledgment of a Creator and an intelligible Creation?

Reinsch and Lawler are wrong to assert that compact is only about the protection of rights and does not involve obligations. In a social compact, every right entails a reciprocal obligation. Every member of the compact who joins for the equal protection of his equal rights has the duty to protect the equal rights of fellow citizens—even the right of revolution is a reciprocal duty belonging to all citizens. Anyone who is unwilling or unable to perform the duties attendant upon membership in a community based on social compact is ineligible to become a member.

Our authors apparently did not notice the closing statement of the signers of the Declaration of Independence: “we mutually pledge to each other our Lives, our Fortunes and our sacred Honor.” The signers are willing to sacrifice life and property—both of which are natural rights—to preserve their honor. They believed that honor or justice was of higher rank than the natural right to life or property. Clearly, the signers of the Declaration ranked the goods of the soul (honor, justice) higher than the goods of the body (life, property). (For Hobbes, of course, honor is not any part of the human good. It is utterly impossible to imagine him ever pledging his “sacred honor” to any cause.) But Reinsch and Lawler maintain throughout that the Lockean authors of the Declaration and the Constitution sought only to provide protection for the natural rights of autonomous individuals or, as they described it on one occasion, “to provide protection against violent death and to secure property rights.” As we have just demonstrated, however, they are mistaken. In ranking honor above life, the authors of the Declaration demonstrated they were not Hobbesians, willing to sacrifice everything to the “fear of violent death.”

In addition, the Declaration never claims that the principal end or purpose of government is the protection of natural rights; it is rather the “safety and happiness of the people”—what one prominent political philosopher described as the alpha and omega of political life as depicted by Aristotle. Our authors make the significant, but frequent, error of those who insist that the American founding was radically modern, simply ignoring the obvious Aristotelian elements incorporated in the framers’ handiwork.

Bound by the Law of Nature
The authors of The Federalist accepted the Declaration of Independence as the authoritative source of the Constitution’s authority. Madison in The Federalist insisted that the proposed Constitution must be “strictly republican” because no other form of government could be “reconcilable with the genius of the people of America; with the fundamental principles of the Revolution; or with the honorable determination which animates every votary of freedom to rest all our political experiments on the capacity of mankind for self-government.”

The “genius of the people” refers to the habits, manners, customs, history, traditions, and religion of Americans. Contrary to our authors, the social compact founders were well aware of the necessity of including these factors in their constitutional deliberations. No one can read The Federalist or, for that matter, the writings of the Anti-Federalists, without coming to that realization.

The second and central factor that requires “strictly republican government” is adherence to “the fundamental principles of the Revolution,” i.e., the principles of the Declaration. The third reason is that strictly republican government requires self-government; and that means rule by the consent of the governed, a principle squarely based on social compact.

In following Brownson, Reinsch and Lawler may have followed a false prophet. Brownson’s account of Locke is seriously defective because he seemed to be unaware of the unique theological-political problem that Locke faced. Our authors seem to have followed him through the gates of error.

The wars of religion were still a fresh memory to Locke and other political philosophers of his era. Nor were they merely a distant memory to the American founders. In the classical world, the laws of particular cities were always supported by their gods. Obedience to the gods and obedience to the laws were one and the same. As soon as there was a universal God for all cities, however, political obligation became problematic. In the Christian world, conflicts between obligations to God and obligations to civil authority became inevitable, and in cases of conflict, the first obligation of Christians was to God or ecclesiastical authority. This reveals the apolitical character of Christianity. As the apostle Paul wrote to the Philippians, “our government is in heaven.”

The universalism of Christianity, of course, makes an appeal to particular gods as the ground or foundation of the laws of a particular regime impossible. Some ground for political obligation—for politics—independent of Christian theology had to be found if political life was to be free from the continuous strife engendered by the theological disputes that arose within Christianity. The late Harry Jaffa probably understood this theological-political predicament better than anyone when he argued:

Christianity had established within the souls of men the idea of a direct, personal, trans-political relationship between the individual and his God. But this relationship did not determine what the laws were to be, or the precise character of the obligation owed to those laws. The idea of the state of nature—the idea of a non-political state governed by moral law—corresponded to the relationship which every Christian had with every other Christian as he considered himself prior to and apart from his membership in a particular civil society. Just as every Christian was under the moral law, without being a member of civil society, so every human being was under the moral law of the state of nature, prior to entering a particular civil society by way of the social contract.

It is clear in Locke that everyone is bound by the law of nature—the moral law—in the state of nature. Thus, Jaffa argues, the social contract, by creating particular political communities, reestablishes the idea of man as by nature a political animal, an idea that was absent from the apolitical universe of Christianity. It provided a ground for political obligation, based in reason and consent, that was also absent in Christianity. Far from the “political atheism” described by Brownson, Locke restored man’s political nature based on higher law, the laws of nature—and he did it on Aristotelian grounds!

Good Theology and Good Government
Of course, Locke spoke most often in terms of individual rights, something that Brownson deplored as leading to the radically autonomous individuals who assumed, he falsely believed, the sovereignty of God. Brownson misunderstood Locke, but he must surely have understood that the origin of the idea of individual rights was in Christian theology itself. In Christian theology, man’s relationship to God is personal; thus the political relationship must also be “personal,” that is, based on individual rights. Locke understood that the principles of natural right must be able to accommodate the regnant theology. Rights must belong to individuals; that was good theology—and it was good government.

Aristotle says that the principles of human nature are universal, but for human nature to flourish, for human potential to become actual, it must do so in particular human communities—in the polis. For Christians, the highest aspirations are in the life to come, and political life in this world is merely a preparation for the next. Paul cautioned the Colossians to “mind the things above, not the things on earth.” From this point of view, man is by “nature” apolitical. Social compact reaffirms man’s political nature by establishing particular political communities where this-worldly aspirations are the proper objects of political life. At the same time, man’s universal nature is affirmed by the law of nature that is the standard and measure by which particular communities are judged. While reasserting man’s political nature, social compact at the same time retains its compatibility with the City of God because natural law is understood to be, in Locke’s terms, “the Will of God” or reason, which is “the voice of God.”

The Declaration is also Aristotelian in recognizing universal human nature (“all men are created equal”) while also recognizing that implementing that equality, by securing the “safety and happiness” of the people, requires the creation of a “separate and equal” nation. Only in a separate and equal nation—a sovereign nation—can the privileges and immunities of citizenship be guaranteed and the habits, manners, and virtues suitable for republican citizenship be inculcated.

No doubt Reinsch and Lawler will complain that this social compact is hardly Aristotelian because it is a human construct, an act of pure human will, whereas Aristotle maintained that man is by nature a political animal. For Aristotle, of course, the polis does not grow spontaneously—it is not the result of natural growth; rather, it had to be “constituted” by human art, and the one who first “constituted” the polis, Aristotle says, is the cause of the “greatest of goods.” The polis exists by nature because, while it is last in the order of time, it is first in the order of final causality. All associations—male and female, the family, the tribe, the village—are incomplete, and their incompleteness points to the polis as a final cause. And the final cause is nature. Aristotle’s polis thus seems to be no less the result of artifice than social compact. In other words, Aristotle’s polis—no less than America—had to be founded by human art. Had Aristotle faced the same theological-political situation that Locke faced, I believe he would have agreed that social compact was the only possible ground for establishing political life on the foundations of nature or natural law.

Brownson and our authors are particularly exercised by Locke’s “doctrine” of self-ownership. They believe it to be the most destructive of all Locke’s subversive teachings. On their view, men always belong to the Creator; they can never belong to themselves. But what is the sovereignty of the individual presupposed by social compact “but the assumption that man is God”? Let’s see.

In the sixth paragraph of the Second Treatise, Locke spells out the obligations that men have in the state of nature. It is quite remarkable that in a book famous for its advocacy of rights, we hear first about the obligations that everyone has to the law of nature:

The State of Nature has a Law of Nature to govern it, which obliges every one: And Reason, which is that Law, teaches all Mankind, who will but consult it, that being all equal and independent, no one ought to harm another in his Life, Health, Liberty, or Possessions. For Men being all the Workmanship of one Omnipotent, and infinitely wise Maker; All the Servants of one Sovereign Master, sent into the world by his order and about his business, they are his Property, whose Workmanship they are, made to last during his, not one another’s Pleasure.

Men are thus the property of “one Omnipotent, and infinitely wise Maker.” This act of creation—the “workmanship of God”—makes each man equally the property of God, and each being the property of God, no one can be the property of anyone else. Thus each is “equal and independent” with respect to every other human being, which can only mean that “every Man has a Property in his own person” in his relations with every other human being, but is responsible to God in fulfilling his obligations to the law of nature—those obligations that God has imposed for the preservation of His workmanship. According to Locke in the First Treatise, God made man and planted in him a desire for self-preservation so that “so curious and wonderful a piece of Workmanship” should not perish. And according to Locke in the Second Treatise, God has set the individual free and made him “master of himself, and Proprietor of his own Person” so that he might go about fulfilling his obligations to the laws of nature, which he describes as the “Will of God” in the service of preserving God’s workmanship, not only of individuals but of mankind.

Liberty Is the Law of God and Nature
This is hardly the portrait, drawn by Brownson and endorsed by Reinsch and Lawler, of radically autonomous individuals who seek to supplant the authority of God; it is the authentic Locke, available to anyone who is willing to read him with a modicum of care. The American Founders read Locke as enlightened statesmen, gleaning political wisdom from his superior understanding of the theological-political problem. It was the absence of theological-political disputes that made the success of the American Founding possible—a rare time in history when such a providential dispensation favored political founding—a dispensation prepared in large measure by Locke.

Madison was right: compact is the ground of all just and free government, and the theologians at the time of the founding agreed.

I will discuss here only one widely circulated sermon that was typical of the many sermons that relied on compact to reconcile questions of theology and politics. The Reverend John Tucker delivered “An Election Sermon” in Boston in 1771 that was profoundly influenced by Locke. “Civil and ecclesiastical societies are, in some essential points, different,” Tucker declaimed. “Our rights, as men, and our rights, as Christians, are not, in all respects, the same.” It cannot be denied that God’s

Subjects stand in some special relation and are under some peculiar subjection to him, distinct from their relation to and connection with civil societies, yet we justly conclude, that as this divine polity, with its sacred maxims, proceeded from the wise and benevolent Author of our being, none of its injunctions can be inconsistent with that love of liberty he himself has implanted in us, nor interfere with the laws and government of human societies, whose constitution is consistent with the rights of men.

Tucker exhibited a common view among New England clergy: the constitution of the “divine polity” cannot be in conflict with any civil government “whose constitution is consistent with the rights of men” and the “love of liberty” that God implanted in human nature. According to Tucker, the proper constitution of civil government begins with the reflection that

All men are naturally in a state of freedom, and have an equal claim to liberty. No one, by nature, not by any special grant from the great Lord of all, has any authority over another. All right therefore in any to rule over others, must originate from those they rule over, and be granted by them. Hence, all government, consistent with that natural freedom, to which all have an equal claim, is founded in compact, or agreement between the parties;—between Rulers and their Subjects, and can be no otherwise. Because Rulers, receiving their authority originally and solely from the people, can be rightfully possessed of no more, than these have consented to, and conveyed to them.

Thus compact seems to be the key to reconciling divine polity and civil polity. Tucker began the sermon with the invocation that “the great and wise Author of our being, has so formed us, that the love of liberty is natural.” Liberty is the law of God and nature. The laws of divine polity are prescribed in the Gospel; those of civil polity are derived from social compact. What connects divine polity and civil polity is the liberty that God created as the essential part of man’s nature. Social compact is the reasonable exercise of that freedom in the formation of civil society. Thus it seems that the theological-political problem—the problem of potentially conflicting obligations between divine polity and civil polity—is solved by Tucker, at least on the moral and political level, on the basis of social compact, which provides the only rightful basis for government because it is the only origin of government consistent with natural liberty.

In fashioning his account of the social compact, Tucker readily acknowledges the influence of “the great and judicious Mr. Locke,” extensively quoting and citing “Locke on Civil Government.” I think it fair to say that “America’s philosopher” dominated the pulpit no less than he dominated legislative halls and constitutional conventions. Thus a remarkable providence seemed to have guided the American founding in the form of a dispensation from the theological-political disputes that would have rendered impossible any attempt to establish constitutional government.

To argue that the American Founders fell prey to Locke’s radical individualism when they relied on social compact reasoning is simply perverse and a mischaracterization of the Founders’ (and Locke’s) understanding. The Founders did not read Locke as a radical modern. They were unaware of—or ignored—the philosophic dispute between ancients and moderns. As statesmen, they were interested in the history of politics and were free to choose the most salutary and beneficial practical solutions. Their reading of Locke traced the ideas of natural law directly back to Aristotle. They were mostly unaware of the latter-day discovery of Locke’s esoteric writing that provided insights into the radical core of his thought. Locke’s exoteric writings provided an entirely salutary political teaching that was adopted—and adapted—by the Founders.

The Founders’ decision to follow Locke on social compact—“the principles of the Revolution”—meant that the end of government was the “safety and happiness” of the American people, an Aristotelian conception that helped to insulate the founding from the storms of modernity that were threatening Europe. It provided America with a more comprehensive and elevated purpose than simply avoiding “violent death” and “protecting property,” the Hobbesian purposes assigned by Reinsch and Lawler.


Photo Credit: The Print Collector/Print Collector/Getty Images

America • Center for American Greatness • Europe • History • military • Post • The Culture

My Father’s D-Day Memories

D-Day is more than a remembrance of America’s great victory in the Battle of Normandy. It is a celebration of the Greatest Generation and the lessons they have to teach us.

Like Jews repeating the story of the Passover every year for 3,000 years, we must recall the story of this generation’s great deeds, or we will lose some idea of who we are, why we are here, and what we are capable of achieving. Indeed, if we don’t remember what our fathers knew, we will lose our country.

My beloved father, who passed away two years ago at 98 years old, was a typical member of the Greatest Generation. Phil Schultz was eternally optimistic, fearless, hard-working, a responsible family man and provider, and patriotic to his core. He achieved the American Dream, not through selfishness or callousness but rather through family loyalty, taking care of those closest to him, and believing in himself. It was the same ability to pull together and have confidence in victory that gave our country the stamina to win World War II, and later let my Dad realize his personal dream of being a professional cameraman.

If only the Millennials and Generation Z could share in his life experiences and wisdom for just a moment, their world would be transformed.

A Quintessentially American Story
Here are the roots of my Dad’s optimism.  He was born in a small house with a dirt floor in the Jewish Pale of Settlement in the Soviet Union. His father escaped the Communists, made his way to America, and after several years, had earned enough to bring the family to join him.

My Dad was 9 years old. He excelled in public school and won a place in the Bronx High School of Science, but had to drop out during the Depression to help his family. He never finished school. He did serve in the Civilian Conservation Corps in Oregon as a firefighter and a logger. Back home, he was a self-taught photographer with his gang of Jewish friends in the Bronx, taking girlie pictures and selling them to cheap magazines for a few dollars.

When America entered World War II, my father, armed with his portfolio of photos, signed up immediately.  He was assigned to be a combat photographer with the Army Signal Corps.

Phil Schultz with his camera. (Photo courtesy of the author.)

He soon shipped out to England, to prepare for the Allied invasion of Northern Europe. He was with the 165th Signal Photo Company, 29th Infantry Division. This was the “Band of Brothers” division that took Omaha Beach, the lead troops in the invasion that began on June 6, 1944.

Being a combat photographer meant he served on the front lines of World War II from Omaha Beach to the liberation of Paris, including the Battle of the Bulge, as well as the battle to take the Remagen Bridge that led into Germany and ultimately Berlin.

His films of the action are in the Library of Congress. During the war, they were edited by the Army and shown as newsreels in cinemas across America. Remember, this was before TV, and the images captured by soldiers like my father were how Americans at home could follow the war. It was important in mobilizing the entire country to sacrifice, to work hard for the war effort and to win.

The amazing thing is that I have “home photos” of it all, which I found years later when Dad had to move to assisted living and I was closing down my parents’ apartment. There was a small box from a Rolleiflex camera he had found in a cave in Germany during the war, and it was crammed full of high-quality Leica contact sheets of still photos he and his buddies had taken mostly between the battles.

The Museum of the City of New York held an exhibition of Phil Schultz’s photographs. (Courtesy of the author.)

Here are his few personal photos from Normandy, June 1944, with commentary in his own words.

“At Last, We Are Going After Hitler”
My experiences in World War II, I would say, started way before Pearl Harbor, because I was always extremely anti-fascist, and I knew somewhere along the way we would have to fight, and fight everything that was happening before it. So after Pearl Harbor, I went to volunteer in the Army, even before most of my friends.

As a soldier, we didn’t know when the D-Day invasion was going to come, but there was a feeling—there was such a build-up of American forces . . . all of a sudden, you are almost elbow to elbow with American forces on this island (Great Britain). They were coming over by the boatload and it was more and more of a build-up. Then one day, one day we said, “Alright, pack everything, you have to get on the trucks.”

We got on the trucks in a convoy and we went this way and that way. The roads were dark, and all the signs had been taken down, in case of a German invasion.

I still get a chill, remembering. As far as I could see along the country roads, piles of munitions. The people came out in the dark and watched. They lined the roads. It was so emotional. We didn’t talk in the trucks and it was very emotional. The only talking was maybe, “You got a cigarette?” The emotion. They knew what was happening.

We went to Torquay, which was where we got on the boat. We weren’t gung ho. No, we were scared, because we weren’t experienced. We didn’t know what to [expect]—we hadn’t been under fire. War was movies.

I remember feeling, at last, we are going after Hitler. I was happy because this would open up the second front and end the war and end Hitler. I wasn’t happy, “ha-ha happy,” but it was a very emotional period. We all knew. I and a couple of other guys I was close with said, “Oh boy, this is it.”

On the truck that night we were told where we were going. We are going to Normandy. They gave us maps, told us where we were going, what our objective was, where we were going to land exactly on the beach, every yard was marked off on the map. They knew where Phil Schultz was going to land, the only thing missing was my name. We were supposed to land about 2 p.m. on D-Day.

The map soldiers were given before the Normandy invasion. (Courtesy of the author.)

We got on the boats that night and fell asleep. I was on a small boat with artillery. The next morning, we first saw where we were. We were not close to shore. We were surrounded by an armada of tens of thousands of ships. We just couldn’t believe it.

Part of the armada headed for the Normandy coast. Photo by Phil Schultz. (Courtesy of the author.)

Then there was a reading on the ship of Eisenhower’s proclamation and order of the day. What we were supposed to do, to invade this and that.

Before we went off to the invasion, about a month or two before, when I was assigned to London, I spent a lot of time with Robert Capa (the older brother of a best friend from the Bronx). He was a war correspondent and he knew he was going to go in with the very first wave. I was supposed to land at 1 or 2. But what happened was that we heard all sorts of rumors that things didn’t go well on the beach. We didn’t go in when we were supposed to and I started to notice small speed boats bringing wounded back to certain ships, and some wounded were brought back to my ship.

I asked permission to go to the beach, because they were going to pick up wounded. I had to promise I wouldn’t go onto the beach. The officer said, “You aren’t landing yet and you can’t land without your unit, so only if you come back.” I did go on the beach and we brought back wounded.

Now the beach— [there] was what you call beach master, this was a Navy guy, the beach master was in charge of this much beach and the boat. And he was standing there with all the artillery. There was a designation for the ships to stop and come and go.

There was a beach master. Fortunately, there was no shelling when I got there. They invaded or started invading about 5-5:30 and when I got there it was 10:30-11:00. And the beach was practically empty because everyone on the beach was laying down, and they were up against the hedgerows where they could not break through yet. So, I got some pictures of the wounded being put on and I went back to my ship and we didn’t land until late, late that day.

The first thing that happened to me when we got on the beach, it was quiet already, I bumped into a United Press correspondent who I knew from London, because of Robert Capa, and the first thing he said was, “Capa is missing.”

Omaha Beach. Photo by Phil Schultz. (Courtesy of the author.)

I said, “Oh, God!” and the first thing that came to my mind was what am I going to tell Julia, his mother. I was very close to his family, he was like a big brother to me. For three days, I really worried.

I went into Sainte-Mère-Église, which we just captured earlier that day. It was in the movie “The Longest Day,” where the paratrooper got stuck on the steeple. That was the village. It was right on the waterfront practically, just to the right of us. For three days I worried whether Capa . . . I didn’t know it, but before I even got to the beach he got his pictures and he was back in London. He wouldn’t trust his pictures to anybody. He got back on the boat and went back on one of the ships and got himself back to London to the labs to print his pictures.

Robert Capa’s famous photo of the Omaha landing. Photo by Pierre Andrieu. (AFP/Getty Images)

His darkroom assistant was so excited that the negatives had been rushed back before the battle was over that he melted most of them in his haste; only a few images survived.

Playing poker in a Normandy barn. One soldier filled his helmet with what he thought was water in a barrel, and let out a yell—“whiskey!” It was Calvados, Normandy’s famous apple brandy. Photo by Phil Schultz. (Courtesy of the author.)

From then on things are kind of blurry. We fought for weeks in the hedgerows.

In the battle for Saint-Lô, we were under such heavy artillery fire I wouldn’t—couldn’t—I was afraid to stand up. I was crawling in a tank rut and knew I’d be hit any minute. You felt like every shell was coming straight at you. There was a French farmer’s body in the trench and I crawled right over it. All of sudden I hear, “You can’t get pictures that way soldier.”

I looked up. It was General Cota. I’d been assigned to take photos of him in England, when he went to visit Lady Astor. (He was played by Robert Mitchum in “The Longest Day.”) He was walking along under fire. But he got hit—shrapnel in the shoulder. An hour later, I was taking pictures of him getting a medal.

General Bradley’s aggressive thrust allowed Allied troops to reach Mont Saint-Michel quickly. Photo by Phil Schultz. (Courtesy of the author.)

By August, the road was open to Paris. We stopped for the 2nd French. Eisenhower thought the 2nd French Armored Division attached to the American Army should have the honor of marching in and taking Paris. So we were off now to Paris and every photographer in the Army, no matter where they were, attached themselves to the 2nd Army Division. We were advancing 20, 30, 40 miles a day and there was nothing to hold us back and the only thing in front of us was Paris.

I hooked up with Capa again, and we came to the town of Rambouillet and there was Hemingway with his own private army of free French, marching them up and down.

We go into Paris. It is so unbelievable what the scene was – right in the middle of French soldiers. They were screw ups because I remember that night before we were driving into Paris, they were driving with all their headlights on. You don’t do this! It’s still war, you’ll get killed.

On both sides of the street, French lined the streets and French tanks lined up like a convoy firing point blank down towards the Place de la Concorde, because there was still some resistance there. I am behind one of the tanks and getting pictures of people cheering and the tanks firing.

Paris, after the shooting had stopped. Photo by Phil Schultz. (Courtesy of the author.)

And I know, experienced already, that if a tank is firing, someone is going to shoot back. I get my pictures and I leave, go around the corner. And my officer, he went to the spot where I was and got killed. You get that streetwise—battlewise. You are there, you do a job, and get out.

Then I went into the Place de la Concorde and there were thousands of people there already, and someone started to throw fire again, and that is when this 300-pound woman grabs me and sits on me, lying on top of me trying to get up. After that, there was no more fire.

That was the liberation of Paris.

Paris was so beautiful. The French people were beautiful, the whole world was beautiful, the weather it was fine, and we were beating the bastards and we were winning the war and we were alive and it was beautiful.

Phil Schultz relaxing by the Seine after the liberation of Paris. (Courtesy of the author.)

After the War

It was a long, hard war, with much death and many close brushes with death or capture. It wasn’t something my father talked about, except for the funny bits, like finding the Calvados or the fat lady in the Place de la Concorde. He came home with a boundless font of optimism and gratitude and love of America.

The post-war boom was not something that fell into the soldiers’ laps—their hard work and struggles to survive continued. New York City had a tight post-war economy and a father-son-dominated photographers’ union that would not let in new members. For several years, his dream of working as a cinematographer was foiled by the union and by anti-Semitism in New York’s advertising industry.

Those were just two more real-life challenges you accepted as reality and met, without whining and without building a life-long grievance. The important part was winning, not that life presented a fight.

At times, he could barely put food on the table for his family. My father, after he married, gave my mother credit for urging him to believe in himself and not give up on his dream career. Eventually, he got that dream job and became a pioneer in early TV commercials, making many of the famous commercials Baby Boomers grew up with.

In the last few years of his life, my Dad’s conversations became short and repetitive, but they were quintessentially Greatest Generation to the end: “Your Daddy’s fine. I have no major problems and no minor problems. I try not to let anything get me down. I look on the bright side of life.”

“Just roll with the punches,” he would say. “Don’t let the bastards get you down.”

His last words to us were: “I’m tough. That’s my hobby. Just keep going to the end. I’m going to jump for joy.”


Photo credit: Universal History Archive/UIG via Getty Images

America • History • Post • The Culture

Way of the Sixgun: An American Martial Art

With the culture around us apparently in decline, it’s worth celebrating uniquely American traditions, especially growing ones. And few activities could be more American than the marksmanship sport of Old West-style shooting.

The increasingly popular shooting matches of the National Congress of Old West Shooters and the Single Action Shooting Society are perhaps more than a healthy sport. Old West-style shooting could be considered the quintessential “American Martial Art.”

Consider the characteristics of the best-known Asian martial arts. They descend from practical fighting systems and can still be used for self-defense. Most hobbyists, however, practice them for refinement of character and, to be honest, the fun of daydreams fulfilled in simulated combat. Historic or quasi-historic costumes are worn, both for practice and competition. While an impression of great antiquity is part of Eastern martial art lore, many of the most popular date back only to the 19th century or are even more recent. (Judo was developed in the 1880s and Aikido was formally named in 1942.)

Many of the martial arts developed sport forms to practice skills safely. Also, it’s undeniable that action movies featuring the Asian martial arts have had a huge impact on how they’re viewed, and taught.

In Old West-style shooting, practiced at clubs around the United States, all of these conditions apply. A combat activity native to our culture and important in our folklore, a set of fighting skills with a rich tradition and history, has been formalized into a martial art.

Not that these shooters necessarily take themselves quite so seriously. Old West shooters are fanatical about safety, but often lighthearted about almost everything else.

For instance, while the Single Action Shooting Society requires members to dress to suit the sport, they may choose historical impressions, or a “B-Western” look. Shooting aliases often include wordplay, and a day’s shooting scenario (every scenario is different!) is frequently inspired by a beloved movie scene and may begin with the shooter uttering a designated line (“Telegram for Mongo!”).

Don’t let those details mislead you, however. Old West shooters cultivate very impressive skills.

A three-weapon routine is common at such events, alternating among shotgun (double-barrel or pump), lever-action carbine, and classic “sixguns.” Engaging steel targets with the correct weapons in a designated order constitutes the “kata” of the art.

Just as Asian martial arts keep obsolete weapons like nunchaku or tonfa on the market, the Cowboy Action scene preserves a market for 19th-century firearm designs and the ammunition to suit. None of those weapons are semiautomatic; none ought to offend the sensibilities of those gun-grabbers whose complaint is “modern military weapons.” Masters of the proper sixgun skills can, nevertheless, approach the Western novel cliché of “emptying the cylinder in one continuous roar.”

We usually think of the martial arts as “empty hand” combatives, but at least one of the most famous—kendo—originally relied on a specialized, high-quality steel weapon, just as Old West shooting does.

It’s worth mentioning: just as the Japanese sword arts have an adjunct “fast draw” division, in techniques referred to as “iai-jutsu,” the American art has a fast-draw specialty. Fast-draw, though, is not connected with the Western-style range events, and would get you ejected from them. In their own martial hobby, fast-draw artists, with wax bullets, promote a tradition which is a legacy of Hollywood’s Westerns rather than the American frontier; single-action fast draw was refined for showmanship in the movies. Not nearly as popular as the three-gun Old West live fire sports, “fast draw” nonetheless has devotees around the world. Each seeks his inner cowboy as avidly as any teenage boy wishes to be a ninja.

The quest is the same—and need not be a foolish one. Beneath the fancy trappings, either can be the cultivation of mental calm and a capacity for sudden, effective action.

When our forefathers deliberated over the form of the new American society in the late 1700s, there was very little (if any) disagreement about what our citizens ought to be like under pressure. We Americans were to be brave and formidable. Our Second Amendment was not just about an ability to grab private weapons in an emergency; you can’t raise an effective militia from a population that doesn’t invest time and pride in developing martial skills.

The Founders enjoyed such a culture and perpetuated it. The resulting social climate gave us Annie Oakley and Audie Murphy in real life, and in our imaginations, the cowboy ideal so many of our heroes of the 20th century shared.

What better way, then, to promote American greatness than to encourage this terrific all-American activity? The six-gun-toting cowboy cliché, which the Left despises, represents the best in the American spirit. The more Americans take up the traditional tools of our heroes, the better. So “support your local gunfighter”—or better yet, fill yer own hand.

Photo Credit: Matthew Hatcher/SOPA Images/LightRocket via Getty Images

America • History • Hollywood • Post • The Culture

Hollywood’s Happy Hoodlum Makes Murder Routine

The crook is in a parking lot trying to break into a car when he sees a lady with shopping bags approaching her vehicle. Pretending to be a friendly stranger, he insists on helping her load the bags into her trunk. She tells him that since she didn’t ask for his help, he won’t be getting a tip.

He grins and says, “Oh, that’s OK, ma’am. I’ll just take your car.”

That’s a “laugh” line from the 1998 crime film, “Out of Sight.” Presumably, few people in the chuckling audience were ever victims of a carjacking. But the really odd thing is that in the film’s plot, the grinning carjacker is one of the good guys.

Most of the crooks in “Out of Sight” are either mean as snakes or dumb as posts (or both), and in the end, right does prevail, more or less. So the movie is hardly the worst example of glorified crime ever to come out of Hollywood. But it’s long past time for Tinseltown to be getting over its love affair with criminals.

That affair has been going on at least since James Cagney gave his girlfriend a face full of grapefruit in “The Public Enemy” (1931), but it reached a crescendo in the 1960s and ’70s—which, it so happens, is when real-life American crime posted its steepest increases in the modern era.

In those days, a vogue for attractive, successful screen criminals enlisted Hollywood stars including Paul Newman and Robert Redford (partners in crime in “The Sting” and “Butch Cassidy and the Sundance Kid”), Steve McQueen (a gentleman thief in “The Thomas Crown Affair”), Warren Beatty (a charming bank robber in “Bonnie and Clyde,” a charming pimp in “McCabe & Mrs. Miller”), and Rock Hudson (a dashing serial killer—yes, a dashing serial killer—in “Pretty Maids All in a Row”).

It included “blaxploitation” films (“Superfly,” with its drug-pusher hero) and had an international dimension. (In the European caper film “Topkapi,” it’s perfectly all right to be a jewel thief, but unforgivable to be a “schmo.”) The celluloid crime wave subverted even the rock-ribbed rectitude of John Wayne (who organizes a violent gold robbery in “The War Wagon”).

These movies usually ignored the canons of ’30s gangster epics and ’40s film noir, in which the lawbreaking protagonist’s unhappy fate is sealed when he takes his first wrong step. In many of them, the crooks get away scot-free; in others, their violent last stand is depicted as heroic rather than pathetic.

Many of the films went out of their way to portray policemen as ugly, corrupt, or coldly evil. In at least one (“Lawman,” with Burt Lancaster), the peace officer holds center stage as a bloody sociopath. The image common to virtually all of them is that of the happy hoodlum, the gutsy, daring rogue.

Most of the liberal media’s film critics purred happily at this “cute crook” genre. (There were honorable exceptions: New Yorker critic Pauline Kael, for example, tore up “Butch Cassidy” as supercilious and morally deranged.) Tellingly, those same critics were livid in 1974 about “Death Wish,” with its strong depiction of the agonies suffered by crime victims and their survivors. But critical acclaim and commercial success aren’t the only reasons crime on screen was running wild. First, it had to get permission.

No cute-crook picture would have been allowed under the old Motion Picture Production Code, which decreed that “the sympathy of the audience should never be thrown to the side of crime, wrongdoing, evil or sin.” Nor could the gruesome slasher genre have flourished under the code, which required also that “the technique of murder must be presented in a way that will not inspire imitation.” In the 1960s, the code was abandoned. The cinematic crime wave was the result.

A closer look at two hit films, 23 years apart, may serve to illustrate how greatly American culture changed in one generation. I happened to see them one after the other on television one night, and the contrast was striking.

The Late Show was “Shane,” produced and directed in 1953 by George Stevens, a veteran both of Hollywood and of the war in Europe. Set in the late 19th century, “Shane” tells the story of a former gunfighter, weary of violence, who reluctantly defends a group of Wyoming homesteaders against a cattle baron’s efforts to force them off their land. When the cattleman’s bullying and bloodless half measures are thwarted, he sends for a gunfighter of his own, one with no scruples about murder.

This villain (played by a young and very scary Jack Palance) promptly goads one of the farmers into a duel in which the sodbuster is hopelessly outmatched. The gunman dispatches his victim with a sadistic smile, and the farmers are given to understand that the same fate awaits them all if they don’t clear out. In the absence of state law enforcement, it falls to Shane to bring retribution down on both the killer and his employer.

Next on the tube was “The Outlaw Josey Wales,” a 1976 western starring Clint Eastwood. In its opening scene, Civil War guerrillas massacre a Missouri man’s family, turning him from a peaceful farmer into a remorseless killer. The film is little more than an arrangement of set-pieces in which the hero guns down a series of contemptible minor characters.

What struck me then was that Eastwood was playing his part in the very same manner as the gunslinging bad guy in “Shane.” He had the same soft, hissing voice, the same cold air of menace. The only difference was that whereas Palance’s gunman smiled at his victims while shooting them, Eastwood’s character would grimace, shoot, and then spit tobacco juice on the corpse. And this was how the “good guy,” the audience’s role model, behaved!

Equally stark was the contrast in what the two films showed of a gunfight’s aftermath. In “Josey Wales,” it’s just “bang! bang!” and on to the next showdown. But in “Shane,” when the sodbuster dies, we go to his funeral, we listen to the hymns sung and prayers said over his grave, we hear his widow’s sobs. We even watch his dog whining and scratching at his coffin as it’s lowered into the ground.

Moreover, George Stevens’ film shows the humanity of all its characters, inviting the audience’s sympathy for everyone involved. When the hero metes out justice at the end, the mood is not triumph but sorrow. His motivation throughout is never spite; it’s a sense of shared obligation, of duty accepted and fulfilled for the sake of others.

“Shane” is very much an icon of its era, reflecting a generation’s gratitude to all the reluctant warriors, the loved ones who accepted the grim and bloody challenge of defeating the Axis in World War II.

“Josey Wales” is equally a sign of its times. But go back far enough, and you’ll find lots of Hollywood films with nobler themes.

Many, of course, laud duty and honor and have no killings at all: classics like “It’s a Wonderful Life” and lesser-known gems like “The Strawberry Blonde” and “Third Man on the Mountain.” Others show criminal violence while teaching us earnestly to hate it: “The Killers,” “The Asphalt Jungle,” “The Naked City,” “On the Waterfront,” “West Side Story,” “The Man Who Shot Liberty Valance,” “Murder, Inc.”

More recent movies are more problematic, mainly because of their harsher depictions of bloodshed; yet many of them retain moral clarity about the human cost of crime: “In Cold Blood,” “Bullitt,” “Hang ’Em High,” “The Onion Field,” “The Black Marble,” “Fort Apache, the Bronx,” “An Eye for an Eye,” “Fargo.”

Francis Ford Coppola’s “Godfather” trilogy might be counted among that bunch, as its tale is tragic and its depiction of mob violence is unflinchingly grim. Yet it seems more a glorification of crime than a repudiation of it. In Coppola’s eyes, the Corleones conduct themselves with dignity, and their victims—from the Hollywood horse’s ass who finds a horse’s head in his bed to the rival mob bosses who fall like tenpins before the Godfather’s righteous wrath—all have it coming to them. Real-life wrongdoers ranging from Mafia don Joseph Bonanno to Iraqi despot Saddam Hussein have fallen in love with this image of the gangster as hero.

The Martin Scorsese gangster films, such as “Goodfellas” and “Casino,” paint a much different picture. They could even be called the anti-Godfathers, full of vicious characters whose disputes are petty and stupid and whose respect for the people around them is tenuous to non-existent. Scorsese’s gangsters are more pathetic than heroic. They are apt to trample the rights and snuff out the life of anyone, fellow hoodlum or hapless innocent, who crosses them in any way at all, and when they finally do meet their end, they themselves are the ones who “have it coming.”

All the same, for every serious and worthwhile crime drama, you have a dozen moral monstrosities: slasher films from “Friday the 13th” to “Scream,” cute-crook movies from “Bonnie and Clyde” to “Natural Born Killers,” mass murder as catharsis in “Carrie” and “The Matrix.”

I stopped paying attention to Hollywood’s output around the turn of the century, but I doubt very much that the past two decades have seen any improvement in that regard. This little mash-up of recent movie mayhem suggests the party is far from over.

Under the old code, Hollywood had a care for what effect its products might have on the more impressionable members of the audience. Not so in the post-code era. And what do you suppose the result has been?

Let’s leave the cinema for a moment and read a line of dialogue from reality: “Murder is not weak and slow-witted, murder is gutsy and daring.”

Those words were written in 1997 by a 16-year-old boy just before he butchered his mother, then shot two of his classmates to death and wounded seven others at a high school in Pearl, Mississippi. His was the first in a series of schoolhouse massacres that, as the riots and assassinations were to the ’60s, have become a signature of our times.

Can a line of responsibility be drawn from the entertainment industry to all that mayhem? Not entirely—though in some cases, where the little morons have consciously aped some atrocity they’ve seen on the big screen, you can just about do that. (“Natural Born Killers” and “The Matrix” were especially fruitful that way.) But there’s enough of a connection to have a lot of us seeking some way to confront those who promote death-worshiping movies, song lyrics and video games.

With regard to motion pictures, however, just how kind and gentle do we want them? Can a diet of “Mary Poppins” appeal to a youth who wants, above all, to be seen as “gutsy and daring”? We often hear about the huge number of on-screen homicides a boy has witnessed by the time he reaches his teens. But children have been raised on tales of blood ever since Achilles slew Hector and David slew Goliath, and indeed long before that.

By way of illustration, here is a bedtime story as told by Barry Lyndon to his son Bryan:

We crept up on their fort, and I jumped over the wall first. My fellows jumped after me. Oh, you should have seen the look on the Frenchmen’s faces when 23 rampaging he-devils, sword and pistol, cut and thrust, pell-mell came tumbling into their fort. In three minutes, we left as many artillerymen’s heads as there were cannonballs. Later that day we were visited by our noble Prince Henry. “Who is the man who has done this?” I stepped forward. “How many heads was it,” says he, “that you cut off?” “Nineteen,” says I, “besides wounding several.” Well, when he heard it, I’ll be blessed if he didn’t burst into tears. “Noble, noble fellow,” he said. “Here is 19 golden guineas for you, one for each head that you cut off.” Now what do you think of that?

“Were you allowed to keep the heads?” asks Bryan. “No, the heads always become the property of the King.” “Will you tell me another story?” “I’ll tell you one tomorrow.” “Will you play cards with me tomorrow?” “Of course I will. Now go to sleep.”

Barry’s tale is given a heart-breaking reprise when young Bryan lies dying.

It may be hard to recall today, but in America the words “shoot-em-up” used to have a happy meaning. It referred to Western B-list pictures starring good-hearted, clean-minded cowboys like Roy Rogers and Gene Autry. Young boys would come home from Saturday matinees, take their toy pistols out to the back yard, and holler “Bang! Bang! You’re dead!” at each other to their heart’s content—and no one had anything to fear. It was all in fun, with no malice at all.

What’s changed? Is it simply a difference in quantity, too much time in front of the boob tube, too many “first-person shooter” video games? Many experts say so. But isn’t it obvious that there has also been a change in quality, in the moral context in which deadly conflict is presented?

A mother who lost her child in one of the early school massacres said something that reinforces the point. With the atrocity at Columbine High School renewing her own grief, Suzanne Wilson of Jonesboro, Arkansas, argued that kids shouldn’t be kept unaware of violence.

“Let the children go to funerals,” she said. “Let them see what happens after the shots are fired. Let’s show them the empty bedroom. Let them know that death is final.”

Wilson was speaking of real life, of course—of real funerals like her daughter Brittheny’s. But her words made me think of the mother’s bereavement in “The Naked City” and of the sodbuster’s funeral in “Shane.”

“The Outlaw Josey Wales,” with its serial-killer hero, was released amid a crime wave unequaled in our history, a 50-year disaster that Hollywood’s movie mayhem both reflected and incited. Mercifully, this crime tsunami has receded from its 1991 crest, yielding to tough lock-’em-up policies, proactive “stop and frisk” policing, a resurgent if increasingly proscribed reliance on the death penalty, and stubbornly law-and-order social attitudes.

Yet each of those anti-crime factors has itself accumulated a burden of complaints and countervailing efforts that threaten to move further improvement beyond our reach and may even put whatever improvement we’ve already achieved at risk.

Meanwhile, as shown by the continuing craze for massacres in schools, churches, nightclubs, concert venues and other public places—to say nothing of the relentless daily toll taken by routine murders in our streets, parks and homes—we remain a long way yet from normalcy.

And the happy hoodlum continues to be a Hollywood staple.

Photo Credit: John Springer Collection/CORBIS/Corbis via Getty Images

America • Education • History • Post • The Culture

‘Hope’ and History

Clear, accurate, and inspiring, Wilfred McClay’s Land of Hope: An Invitation to the Great American Story is a welcome antidote for agenda-driven history textbooks that paint the United States as an illegitimate nation born of evil.

A review of Land of Hope: An Invitation to the Great American Story, by Wilfred M. McClay (Encounter Books, 504 pages, $34.99)

What is the purpose of history? Is it merely a record of facts—of dates and kings, wars and voyages? Or is it something more?

Evaluating a history textbook must begin with knowing what history is.

A nation’s history is more than just a list of facts to memorize. It weaves the facts into an intellectual and emotional tapestry that tells us who we are, what our lives are about, and what kind of people we should aspire to be. It should be:

  • Informative: Helping us understand the past by telling us what happened, when, and why.
  • Enlightening: Helping us understand the present by comparing it to the past.
  • Inspiring: Helping us develop moral character by learning stories of past heroism and villainy.
  • Supportive: Helping our countries flourish by legitimizing the social order.

In his History of Rome, the ancient Roman writer Livy explained those four goals in a way that eerily foreshadowed America’s current predicament:

My wish is that each reader will pay closest attention to how men lived, what their moral principles were, under what leaders and by what measures our empire was won; then how, as discipline broke down bit by bit, morality at first foundered, subsided in ever-greater collapse and toppled headlong in ruin—until the advent of our own age, in which we can endure neither our vices nor the remedies needed to cure them.

An honest account of the facts is essential, but it’s not enough. To survive, any country must believe that it is good (even if imperfect) and that it deserves to survive. Truthful and inspiring historical stories about the country’s origin, leaders, and ideals provide that foundation. Conversely, stories that are biased and negative tend to undermine the foundation.

Any history book must balance those goals against each other. Some books are unabashedly patriotic, such as Our Island Story in Great Britain and A Patriot’s History of the United States in America. Others are very negatively biased, such as Howard Zinn’s bestselling and influential A People’s History of the United States, which depicts the United States as an unrelenting criminal enterprise of genocide, racism, and exploitation.

McClay’s new textbook Land of Hope, on the other hand, strikes the right balance. It is optimistic without being jingoistic, acknowledging America’s mistakes without reading like a brief for the prosecution. It celebrates America’s achievements, but not uncritically: “celebration and criticism are not necessarily enemies.” And its goals are explicit:

To help us learn . . . the things we must know to become informed, self-aware, and dedicated citizens of the United States of America, capable of understanding and appreciating the nation in which we find ourselves, of carrying out our duties as citizens, including protecting and defending what is best in its institutions and ideals.

The most popular competing textbooks are Jill Lepore’s These Truths and James Fraser’s By the People. McClay’s book lacks the former’s globalist glibness and the latter’s dizzying overload of textbook-y features. But how does Land of Hope fare by the criteria of good history?

It Is Informative
Land of Hope gives an accurate account of America’s history that is undistorted by the selective emphasis and omission found in other textbooks. One key piece of evidence comes in McClay’s description of the U.S. Constitution, which:

. . . is not, for the most part, a document filled with soaring rhetoric and high-sounding principles. Instead, it is a somewhat dry and functional document laying out a complex system of boundaries, markers, and rules of engagement, careful divisions of function and power that provide the means by which conflicts that are endemic and inevitable to us, and to all human societies, can be both expressed and contained; tamed; rendered harmless, even beneficial. Unlike the Declaration of Independence, the Constitution’s spirit is undeclared, unspoken; it would be revealed not through words but through actions.

Implicit in McClay’s description is that the United States was influenced but not formed by Enlightenment rationalism. The Founders had studied the history of failed republics to learn what worked and what didn’t. And they were the heirs of a British legal and social tradition from which they learned that well-informed pragmatism was wiser than well-intentioned rhetoric.

Napoleon Bonaparte dismissed England as “a nation of shopkeepers,” preoccupied with the practical issues of life instead of lofty ideals. Napoleon was wrong, and the British defeated him. The lofty ideals that led to the horror of the French Revolution had largely been avoided in America, thanks to a Constitution designed for practical issues. McClay highlights that fact.

It Is Enlightening
Learning about our history reveals that many current quandaries are neither new nor unique. President Trump’s alternating use of provocation and conciliation seems strange until we learn that earlier presidents (like many world leaders) used the same strategy. McClay describes how Abraham Lincoln followed a similar path:

His initial thinking began to emerge more clearly in his eloquent First Inaugural Address on March 4, 1861. Its tone was, in the main, highly conciliatory. The South, he insisted, had nothing to fear from him . . . But secession was another matter. Lincoln was crystal clear about that: it would not be tolerated.

Compare that to President Trump’s inaugural address on January 20, 2017:

We, the citizens of America, are now joined in a great national effort to rebuild our country . . . Every four years, we gather on these steps to carry out the orderly and peaceful transfer of power, and we are grateful to President Obama and First Lady Michelle Obama for their gracious aid throughout this transition. They have been magnificent.

That was the conciliation. “We” are joined in a national effort. The Obamas “have been magnificent.” And then comes the crystal clear:

Today’s ceremony, however, has very special meaning. Because today . . . we are transferring power from Washington, D.C. and giving it back to you, the American people.

Apart from the tweeting, almost any of that could have been said in 1861 just as easily as in 2017. By 2017, Washington had virtually seceded from the United States, and it was time for it to come back into the fold.

It Is Inspiring
Land of Hope is short on emotionally stirring tales, but the reason is obvious: it’s a textbook, not The Children’s Book of American Heroes. It says nothing about George Washington chopping down a cherry tree, Paul Bunyan creating the Grand Canyon, or Davy Crockett catching a bullet in his teeth (that was only done by actor Fess Parker in the movie version).

Instead, it tells factual stories about people who achieved great things. Quietly, humbly, and often without fanfare, they shaped our national character. Land of Hope portrays them not as saints or fanciful superheroes, but as prudent and courageous Americans trying to do their best.

One of the first of those Americans would have won the approval of the Greek philosopher Plato, who wrote that the only people who could be trusted with power were those who didn’t want it. George Washington, who led the American colonies to vanquish the mighty British army, became America’s first president. But he didn’t want the job:

Nearing the age of sixty, after enduring two grinding decades of war and politics in which he always found himself thrust into a central role in determining the direction of the country, he wanted nothing so much as to be free of those burdens . . . [but] if the task before the country was a great experiment on behalf of all humanity . . . how could he refuse to do his duty?

The only omission with which I disagreed was the story of Nathan Hale, an American soldier captured in 1776 by the British and executed as a spy. His final words, “I only regret that I have but one life to lose for my country,” were echoed almost 200 years later when newly inaugurated President John F. Kennedy called on Americans to “ask not what your country can do for you; ask what you can do for your country.”

It Is Supportive
The final criterion of good national history is that it help our country flourish by legitimizing the social order. We don’t usually think of history as doing that, but its importance is evident when we consider books that do the opposite.

Take, for example, how Jill Lepore’s book portrays the United States and its origin. After noting correctly that “a nation is a people who share a common ancestry,” she claims “the fiction that [America’s] people shared a common ancestry was absurd on its face; they came from all over”—a statement that is technically true but highly misleading, since the vast majority were British. Then comes the indictment:

The nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds . . .  Against conquest, slaughter, and slavery came the urgent and abiding question, by what right?

I don’t doubt that Lepore is being honest about how she sees America. If she wasn’t, she wouldn’t have an endowed chair as a history professor at Harvard. But her view leads only to the question of whether America should be destroyed now or later.

Land of Hope presents our country’s history in an affirmative way that is more than just “technically true.” Alexander Hamilton identified the stakes in Federalist 1. America is a great experiment to decide:

. . . whether societies of men are really capable of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.

Amid the tumult and hysteria of 2019, it’s tempting to say that the decision has yet to be made. But the American record, checkered like that of all great nations, shows the answer to Hamilton’s question is a qualified “yes, we can.” Perfection exists only in Heaven. If the United States has sometimes fallen short of its heritage and its ideals, it has more often shown itself as a worthy heir and sturdy practitioner of both.

Land of Hope stands squarely in that American tradition.

Photo Credit: Getty Images

America • Book Reviews • History • Post

‘That We Here Highly Resolve . . . ’

A review of Sacred Duty: A Soldier’s Tour at Arlington National Cemetery, by Tom Cotton (William Morrow, 320 pages, $28.99)

Part memoir and part history, U.S. Senator Tom Cotton’s Sacred Duty recounts in vivid detail the stories of the men and women who make up Arlington’s military detachment—“The Old Guard.” Those military units comprise America’s oldest regiment (established in 1784), which includes the Revolutionary-garbed Fife and Drum Corps and Continental Color Guard, the skilled rifle-handling Drill Team, and—the elite of the elite—the sentinels at the Tomb of the Unknown Soldier, who have stood guard at their post for every minute of every day since 1937.

Mostly, however, the Old Guard comprises the highly trained platoons who honor our military veterans through dignified funerals conducted with deep respect and exacting attention to military precision—including “full honors” for senior officers and Medal of Honor recipients.

Cotton, a Republican senator from Arkansas, presided over more than 400 such funerals during his time in the Old Guard, commemorating, as he describes it, “our nation’s fallen, its warriors, their families, and our common heritage of freedom for which they sacrificed.” After graduating from Harvard Law School, he enlisted in the Army in 2005. To the surprise of his recruiting officer, he declined to serve in the JAG Corps and instead requested command of an infantry platoon. In between combat tours in Iraq and then Afghanistan (where he earned the Bronze Star), Captain Cotton served at Arlington in 2007 and 2008. Though the book includes several of Cotton’s personal recollections, it is virtually free of autobiography and self-promotion—a rare modesty for a U.S. senator.

I visited the cemetery—for perhaps the fifth or sixth time—a few days ago, to connect with some of what I had read in the book and to get in the right frame of mind to write about it. As usual, one could see couples or small groups walking slowly along the paths. Occasionally one noticed someone who had come to visit a specific gravesite. But most people, including the large groups of students typical at this time of year, congregated near the Tomb of the Unknown Soldier.

The tomb is a large and impressive structure—and the changing of the guard is stirring to witness. Moreover, for those who don’t have a relative or loved one buried at Arlington, the tomb provides a connection that can appeal to every American.

“We venerate the Unknowns,” Cotton writes, “not merely as representatives of the unknown dead from four wars, but as heroes who embody the courage and sacrifice of all our war dead, from Lexington and Concord to Iraq and Afghanistan.”

I think there is another reason as well: people need something to gravitate toward. The cemetery, certainly on foot, seems vast—walking among the hundreds of thousands of individual headstones can easily become overwhelming. One of the virtues of Sacred Duty is that it helps you get your bearings, to place the enormousness, the solemnity, and the ritual into perspective.

Cotton wisely avoids ambitious attempts at poetry; to capture the full meaning of Arlington would require a Shakespeare. Instead, he allows the historical facts, and especially the individual stories of both the living and the dead, to speak for themselves. From Private William Christman, the first soldier buried at Arlington in 1864, to Sergeant Jeff Dickerson—who took his emotional “last walk” as a tomb sentinel in March of 2018—the book is full of names, memories, and personal details that bring the heroes and guardians of Arlington into focus.

Sacrifice is a theme that naturally runs through the book, and one which has special importance in America. (“Now I know why your soldiers fight so hard,” one visiting foreign dignitary told the tomb guards. “You take better care of your dead than we do our living.”)

Since time immemorial, soldiers have fought for home and hearth. That is no less true of America’s warriors. But what it means to defend this home—what it means to be an American soldier—carries an added significance. Abraham Lincoln, in his eulogy of the great Senator Henry Clay, said, “He loved his country partly because it was his own country, but mostly because it was a free country.”

Living in a free country takes work. It even takes sacrifice—although from most of us the sacrifice is a far lesser one than that paid by those buried at Arlington. Nevertheless, there are virtues necessary for free government that cannot be ignored. Otherwise, self-government simply cannot work.

“Our founding principles are noble and just,” Cotton writes. “Our ancestors fought for those principles, and we ought to be ready to fight for them, too.” Citizens must fight for them as well, but in a different way.

Republican citizenship means the willingness to subordinate our own narrow self-interest and policy preferences for the sake of the common good. This is the essence of deliberative politics, where figurative battles take the place of literal ones. Politics, though a pale imitation of martial courage and the ultimate sacrifice, is in one sense incomparably easier than the physical and spiritual exertions needed in war.

But in another sense, it may be more challenging because it requires a permanent re-orientation of the soul. Republican citizenship has to be practiced as a way of life, so that toleration, compromise, and mutual respect become civic habits, and ultimately the foundation of civic friendship.

Our republic is in danger of losing all that has been won—at such immense cost—by those who gave everything, because, in our strident and punitive political climate, we seem no longer to know how to practice those essential virtues. Without them, the larger sacrifices won’t be able to save us. In the end, it is citizens who must be the guardians of “government of the people, by the people, for the people.”

Without diminishing the need for Spartan virtue—the warrior’s courage and strength—we must remember why Americans fight. War, after all, is for the sake of peace. America’s soldiers fight enemies abroad so that all Americans may remain friends at home.

Tom Cotton’s fine book shows us the nobility of those who sacrificed everything, and thus reminds us how small—and yet how necessary—are the sacrifices we must all make as citizens.


Photo Credit: Alex Wong/Getty Images

 

America • History • Post • The Culture • the family

Our Modern ‘Satyricon’

Sometime around A.D. 60, in the age of Emperor Nero, a Roman court insider named Gaius Petronius wrote a satirical Latin novel, “The Satyricon,” about moral corruption in Imperial Rome. The novel’s general landscape was Rome’s transition from an agrarian republic to a globalized multicultural superpower.

The novel survives only in a series of extended fragments. But there are enough chapters for critics to agree that the high-living Petronius, nicknamed the “Judge of Elegance,” was a brilliant cynic. He often mocked the cultural consequences of the sudden and disruptive influx of money and strangers from elsewhere in the Mediterranean region into a once-traditional Roman society.

The novel plots the wandering odyssey of three lazy, overeducated and mostly underemployed single young Greeks: Encolpius, Ascyltos and Giton. They aimlessly mosey around southern Italy. They panhandle and mooch off the nouveau riche. They mock traditional Roman customs. The three and their friends live it up amid the culinary, cultural and sexual excesses in the age of Nero.

Certain themes in “The Satyricon” are timeless and still resonate today.

The abrupt transition from a society of rural homesteaders to one of metropolitan coastal hubs had created two Romes. One world was a sophisticated and cosmopolitan network of traders, schemers, investors, academics and deep-state imperial cronies. Their seaside corridors were not so much Roman as Mediterranean. And they saw themselves more as “citizens of the world” than as mere Roman citizens.

In the novel, vast, unprecedented wealth had produced license. On-the-make urbanites suck up to and flatter the childless rich in hopes of being left estates rather than earning their own money.

The rich in turn exploit the young sexually and emotionally by offering them false hopes of landing an inheritance.

Petronius seems to mock the very world in which he indulged.

His novel’s accepted norms are pornography, gratuitous violence, sexual promiscuity, transgenderism, delayed marriage, childlessness, fear of aging, homelessness, social climbing, ostentatious materialism, prolonged adolescence, and scamming and conning in lieu of working.

The characters are fixated on expensive fashion, exotic foods and pretentious name-dropping. They are the lucky inheritors of a dynamic Roman infrastructure that had globalized three continents. Rome had incorporated the shores of the Mediterranean under uniform law, science and institutions—all held together by Roman bureaucracy and the overwhelming power of the legions, many of them populated by non-Romans.

Never in the history of civilization had a generation become so wealthy and leisured, so eager to gratify every conceivable appetite—and yet so bored and unhappy.

But there was also a second Rome in the shadows. Occasionally the hipster antiheroes of the novel bump into old-fashioned rustics, shopkeepers and legionaries. They are what we might now call the ridiculed “deplorables” and “clingers.”

Even Petronius suggests that these rougher sorts built and maintained the vast Roman Empire. They are caricatured as bumpkins and yet admired as simple, sturdy folk without the pretensions and decadence of the novel’s urban drones.

Petronius is too skilled a satirist to paint a black-and-white picture of good old traditional Romans versus their corrupt urban successors. His point is subtler.

Globalization had enriched and united non-Romans into a world culture. That was an admirable feat. But such homogenization also attenuated the very customs, traditions and values that had led to such astounding Roman success in the first place.

The multiculturalism, urbanism and cosmopolitanism of “The Satyricon” reflected an exciting Roman mishmash of diverse languages, habits and lifestyles drawn from northern and western Europe, Asia and Africa.

But the new empire also diluted a noble and unique Roman agrarianism. It eroded nationalism and patriotism. The empire’s wealth, size and lack of cohesion ultimately diminished Roman unity, as well as traditional marriage, child-bearing and autonomy.

Education likewise was seen as a mixed blessing. In the novel, wide reading ensures erudition and sophistication, and helps science supplant superstition. But education also has its downsides: students become idle, pretentious loafers. Professors are no different from loud pedants. Writers are trite and boring. Elite pundits sound like gasbags.

Petronius seems to imply that whatever the Rome of his time was, it was likely not sustainable—but would at least be quite exciting in its splendid decline.

Petronius also argues that with too much rapid material progress comes moral regress. His final warning might be especially troubling for the current generation of Western Europeans and Americans. Even as we brag of globalizing the world and enriching the West materially and culturally, we are losing our soul in the process.

Getting married, raising families, staying in one place, still working with our hands and postponing gratification may be seen as boring and out of date. But nearly 2,000 years later, all of that is what still keeps civilization alive.

Photo Credit: Universal History Archive/UIG via Getty Images

(C) 2019 TRIBUNE CONTENT AGENCY, LLC.