DRI-186 for week of 5-10-15: How Can the Framework of Economics Help Us Assign Responsibility for War Crimes in World War II?

An Access Advertising EconBrief:

How Can the Framework of Economics Help Us Assign Responsibility for War Crimes in World War II?

The previous EconBrief explains how the classical theory of voluntary exchange and the moral concept of individual responsibility mutually reinforce each other. The mutually beneficial character of voluntary exchange allows individuals to assume responsibility for their own actions in a free society. Individual responsibility permits voluntary exchange to function without the necessity of, say, review of each transaction by a neutral third party to ensure fairness. The role of government in a voluntary society is minimal – to enforce contracts and prevent coercion.

Recently, the issue of responsibility for war crimes committed during World War II has been raised by various independent events. In Germany, a 93-year-old man is standing trial as an accessory to war crimes committed while he worked at the Auschwitz concentration camp during World War II. His presence in the camp is known, but his actual role and behavior are disputed. Should the prosecution have to prove he actually committed crimes, or would his participation as (say) a guard be enough to warrant his conviction as a war criminal?

A recent column in The Wall Street Journal by Bret Stephens (“From Buchenwald to Europe,” 05/05/2015) observes that many people in Germany were victims of Nazism, not Nazis – including many non-Jews. How should this affect Germany’s national policies today on European union, immigration and attitude toward systematic anti-Semitism and misogyny practiced by Muslim immigrants? “It isn’t easy, or ultimately wise, [for Germany] to live life in a state of perpetual atonement,” Mr. Stephens thinks.

Japan’s Prime Minister Shinzo Abe has publicly marveled about the transformation in relations between Japan and America, two countries that became deadly rivals in the late 1930s and waged total war in the 1940s, culminating in mankind’s only nuclear attack. Today we are two of the planet’s closest trading partners. Abe clearly wants to enlist the cooperation of the U.S. in Japan’s efforts to re-arm against the imminent threat of mainland China’s sabre-rattling territorial ambitions. But Abe has also made disturbing noises in domestic politics, worshipping at the shrine of Japan’s war dead and speaking equivocally about Japan’s aggressive invasion of its Asian neighbors in the 1930s. These speeches are a rough Japanese analogue to Holocaust denial.

In deciding what to make of these events, our analytical anchor is once again the economic logic of individual responsibility arising in a context of voluntary exchange.

The Flawed Notion of National Responsibility for War Crimes

In his Wall Street Journal piece, Bret Stephens depicts “the drama of postwar Germany” as its “effort to bury the Nazi corpse,” which “haunts Germany at every turn.” This phrasing is troubling. It implies that Germany’s residents bear a collective burden for sins committed long before most of them were even born.

Not surprisingly, this burden hasn’t just been heavy – it has been unshakeable. “Should Germany’s wartime sins be expiated by subsidizing the spendthrift habits of corrupt Greek governments? Should fear of being accused of xenophobia require Germans to turn a blind eye to Jew-hatred and violent misogyny when the source is Germany’s Muslim minority?” These questions, posed rhetorically by Mr. Stephens, should be placed in the pantheon of pointlessness with queries about the angel-carrying capacity of pinheads.

Even before World War II ended, many people realized that the Axis powers would have to be called to account for their sins. Members of the German and Japanese governments and military had committed acts that plumbed new depths of depravity. Civilization had institutions and standards for judging and punishing the familiar forms of crime, but the scope and magnitude of Axis atrocities persuaded the Allies to hold separate war-crimes tribunals for Germany and Japan. And the defendants at every trial were individual human beings, not collective entities called “Germany” or “Japan.”

To be sure, there were arguments – some of them almost as bitter as the fighting that preceded the trials – about which individuals should be tried. At least some of the disagreement probably reflected disappointment that the most deserving defendants (Hitler, Himmler, Goebbels et al) had cheated the hangman by committing suicide beforehand. But nobody ever entertained the possibility of putting either nation on trial. In the first place, it would have been a practical impossibility, and without an actual trial the proceedings would have been a travesty of justice. Even beyond that, though, the greater travesty would have been to suggest that the entirety of either nation had been at fault for acts such as the murder of millions of Jews by the Nazis.

We need look no further than Stephens’ own article to substantiate this. He relates the story of his father-in-law, Hermann, who celebrated his 11th birthday on VE-Day, May 8th, 1945. He was the namesake of his father, a doctor who died in a German prison camp, where he had been imprisoned for the “crime” of xenophilia – showing friendly feelings toward foreign workers. The elder Hermann had apparently treated inhabitants of forced-labor camps and spoken of the likelihood of an ultimate Russian victory over Germany. Not only was he not committing atrocities, he was trying to compensate for their effects – and he was killed for his pains. Were we supposed to prosecute his 11-year-old son? What madness that would have been! As Stephens put it, “what was a 10-year-old boy, whose father had died at Nazi hands, supposed to atone for?”

History tells us that Germany also harbored its own resistance movement, which worked behind the scenes to oppose Fascism in general and the war in particular. In fact, the Academy Award for Best Actor in 1943 went not to Humphrey Bogart, star of Best Picture winner Casablanca, but instead to Paul Lukas, who played a German who risked his life fighting the Nazis in the movie Watch on the Rhine. The Freiburg School, a German free-market school of economists formed before the war, openly opposed Fascist economic policies even during World War II. Their prestige was such that the Nazis did not dare kill them, instead preferring to suppress their views and prevent their professional advancement. Then there was the sizable number of Germans who did not join the Nazi Party and were not politically active.

Hold every contemporary German criminally accountable for the actions of Hitler, Goebbels, Hess, Goering, Mengele and the rest? Unthinkable. In which case, how can we even contemplate asking today’s Germans, who had no part in the war crimes, weren’t even alive when they were committed and couldn’t have prevented them even if inclined to try, to “atone” for them?

The longer we think about the notion of contemporary national guilt for war crimes, the more we wonder how such a crazy idea ever wandered into our heads in the first place. Actually, we shouldn’t wonder too long about that. The notion of national, or collective, guilt came from the same source as most of the crazy ideas extant.

It came from the intellectual left wing.

The Origin of “Social Wholes”

There is no more painstaking and difficult pastime than tracing the intellectual pedigree of ideas. The modern concept of the “social whole,” or national collective, seems traceable to the French philosopher Claude Henri de Rouvroy, Comte de Saint-Simon (hereinafter Saint-Simon). Saint-Simon is rightfully considered the father of Utopian Socialism. Born an aristocrat in 1760, he lived three lives – the first as a French soldier who fought for America in the Revolution, the second as a financial speculator who made and lost several fortunes, the third as an intellectual dilettante whose personal writings attracted the attention of young intellectuals and made him the focus of a cult.

Around age 40, Saint-Simon decided to focus his energies on intellectual pursuits. He was influenced by the intellectual ferment within France’s Ecole polytechnique, where the sciences of mathematics, chemistry, physics and physiology turned out distinguished specialists such as Lavoisier, Lagrange and Laplace. Unfortunately, Saint-Simon himself was able to appreciate genius but not to emulate it. Even worse, he was unable to grasp any distinction between the natural sciences and social sciences such as economics. In 1803, he wrote a pamphlet in which he proposed to attract funds by subscription for a “Council of Newton,” composed of twenty of the world’s most distinguished men of science, to be elected by the subscribers. They would be deemed “the representatives of God on earth,” thus displacing the Pope and other divinely ordained religious authorities, but with additional powers to direct the secular affairs of the world. According to Saint-Simon, these men deserved this authority because their competence in science would enable them to consciously order human affairs more satisfactorily than heretofore. Saint-Simon had received this plan in a revelation from God.

“All men will work; they will regard themselves as laborers attached to one workshop whose efforts will be directed to guide human intelligence according to my divine foresight [emphasis added]. The Supreme Council of Newton will direct their works… Anybody who does not obey their orders will be treated … as a quadruped.” Here we have the beginnings of the collective concept: all workers work for a single factory, under one central administration and one boss.

We can draw a direct line between this 1803 publication of Saint-Simon and the 20th century left-wing “Soviet of engineers” proposed by institutional economist Thorstein Veblen, the techno-socialism of J. K. Galbraith and the “keep the machines running” philosophy of Clarence Ayres. “Put government in the hands of technical specialists and give them absolute authority” has been the rallying cry of the progressive left wing since the 19th century.

Saint-Simon cultivated a salon of devotees who propagated his ideas after his death in 1825. These included most notably Auguste Comte, the founder of the “science” of sociology, which purports to aggregate all the sciences into one collective science of humanity. Comte inherited Saint-Simon’s disregard for individual liberty, referring contemptuously to “the anti-social dogma of the ‘liberty of individual conscience.'” It is no coincidence that socialism, which had its beginnings with Saint-Simon and his salon, eventually morphed into Nazism, which destroyed individual conscience so completely as to produce the Holocaust. That transformation from socialism to Nazism was described by Nobel laureate F. A. Hayek in The Road to Serfdom.

Today, the political left is committed to the concept of the collective. Its political constituencies are conceived in collective form: “blacks,” “women,” “labor,” “farmers,” “the poor.” Each of these blocs is represented by an attribute that blots out all trace of individuality: skin color, gender, economic class (or occupation), income. The collective concept implies automatic allegiance, unthinking solidarity. This is convenient for political purposes, since any pause for thought before voting might expose the uncomfortable truth that the left has no coherent policy program or set of ideas. The left traffics exclusively in generalities that attach themselves to social wholes like pilot fish to sharks: “the 1%,” “the 99%,” “Wall St. vs. Main St.,” “people, not profit,” “the good of the country as a whole.” This is the parlor language of socialism. The left finds it vastly preferable to nitty-gritty discussion of the reality of socialism, which is so grim that it couldn’t even be broached on college campuses without first issuing trigger warnings to sensitive students.

The left-wing rhetoric of the collective has special relevance to the question of war crimes. Actual war crimes are committed by individual human beings. Human beings live discrete, finite lives. But a collective is not bound by such limitations. For example, consider the business concept of a corporation. Every single human being whose efforts comprise the workings of the corporation will eventually die, but the corporation itself is – in principle – eternal. Thus, it is a collective entity that corresponds to left-wing notions because it acts as if animated by a single will and purpose. And the left constantly laments the obvious fact that the U.S. does not and cannot act with this singular unanimity of purpose. For decades, left-wing intellectuals such as Arthur Schlesinger and John Kenneth Galbraith have looked back with nostalgia at World War II because the U.S. united around the single goal of winning the war and subordinated all other considerations to it.

The Rhetorical Convenience of Collective Guilt

Given its collective bent, we would expect to find the left in the forefront of the “collective guilt” school of thought on the issue of war crimes. And we do. For the left, “the country” is one single organic unity that never dies. When “it” makes a ghastly error, “it” bears the responsibility and guilt until “it” does something to expiate the sin. That explains why Americans have been figuratively horsewhipped for generations about the “national shame” and “original sin” of slavery. It is now 153 years after the Emancipation Proclamation and 150 years since the end of the Civil War, when a half-million Americans died to prevent slaveholding states from seceding from the Union. The U.S. Constitution was amended specifically to grant black Americans rights previously denied them following the Civil War. Yet “we” – that is, the collective entity of “the country” on which left-wing logic rests – have not yet expunged this legacy of slavery from “our” moral rap sheet. Exactly how the slate should be wiped clean is never clearly outlined – if it were, then the left wing would lose its rhetorical half-Nelson on the public debate over race – but each succeeding generation must carry this burden on its shoulders in a race-reversed reprise of the song “Ol’ Man River” from the musical Show Boat. “Tote that barge, lift that bale” refers in this case not to cotton but to the moral burden of being responsible for things that happened a century or more before our birth.

If this burden can be made heavy enough, it can motivate support for legislation like forced school busing, affirmative action and even racial reparations. Thus, the collective concept is a potentially powerful one. As Bret Stephens observes, it is now being pressed into service to prod Germany into bailing out Greeks, whose status as international deadbeats is proverbial. Exactly how were Greeks victimized by Germans? Were they somehow uniquely tyrannized by the Nazis – more so than, say, the Jews who later emigrated to Israel? No, Germany’s Nazism of seventy or eighty years ago is merely a handy pig bladder with which to beat today’s Germans over the head to extract blackmail money for the latest left-wing cause du jour. Since the money must come from the German government, German taxpayers must fork it over. A justification must be found for blackmailing German taxpayers. The concept of collective guilt is the ideal lever for separating Germans from their cash. Every single German is part of the collective; therefore, every single German is guilty. Voila!

The Falsity of Social Wholes

In The Counter-Revolution of Science (1952), Nobel laureate F.A. Hayek meticulously traced the pedigree of social wholes back to their roots. He sketched the life and intellectual career of Saint-Simon and his disciple Auguste Comte. Hayek then carefully exposed the fallacies behind the holistic method and explained why the unit of analysis in the social sciences must be the individual human being.

Holistic concepts like “the country” are abstract concepts that have no concrete referent because they are not part of the data of experience for any individual. Nobody ever interacts directly with “the country,” nor does “the country” ever interact directly with any other “country.” The only meaning possible for “the country” is the sum of all the individual human beings that comprise it, and the only possible theoretical validity for social wholes generally arises when they are legitimately constructed from their individual component parts. Indeed, Hayek views one role for social scientists as the application of this “compositive” method of partial aggregation as a means of deriving theories of human interaction.

The starting point, though, must be the individual – and theory can proceed only as far as individual plans and actions can be summed to produce valid aggregates. The left-wing historical modus operandi has reversed this procedure, beginning with one or more postulated wholes and deriving results, sometimes drawing conclusions about individual behavior but more often subsuming individuals completely within a faceless mass.

An example may serve to clarify the difference in the two approaches. The individualist approach, common to classical and neoclassical economics, is at home with the multifarious differences in gender, race, income, taste, preferences, culture and historical background that typify the human race. There is only one assumed common denominator among people – they act purposefully to achieve their ends. (For purposes of simplicity, those ends are termed “happiness.”) Then economic theory proceeds to show how the price system tends to coordinate the plans and behavior of people despite the innumerable differences that otherwise characterize them.

In contrast, the aggregative or holistic theory begins with certain arbitrarily chosen aggregates – such as “blacks.” It assumes that skin color is the defining characteristic of members of this aggregate; that is, skin color determines both the actions of the people within the aggregate and the actions of non-members toward those in the aggregate. The theory derived from this approach is correct if, and only if, this assumption holds. The equivalent logic holds true of other aggregates like “women,” “labor,” et al, with respect to the defining characteristic of each. Since this basic assumption is transparently false to the facts, holistic theories – beginning with Saint-Simonian socialism, continuing with Marxism, syndicalism and the theories of Fourier, the Fabian socialists, Lenin, Sombart, Trotsky, and the various modern socialists and Keynesians – have had to make numerous ad hoc excuses for the “deviationism” practiced by some members of each aggregate and for the failure of each theory.

The Hans Lipschis Case

Is it proper in principle that Hans Lipschis, a former employee of Auschwitz and now ninety-three years old, be repatriated to Germany from the U.S. and tried as an accessory to the murder of 300,000 inmates of the notorious World War II death camp? Yes. The postwar tribunals, notably at Nuremberg, reaffirmed the principle that “following orders” of duly constituted authority is not a license to aid and abet murder.

Lipschis’s defense is that he was a cook, not a camp guard. But a relatively new legal theory, used to convict another elderly war-crimes defendant, John Demjanjuk, is that the only purpose of camps like Auschwitz was to inflict death upon inmates. Thus, the defendant’s presence at the camp as an employee is sufficient to provide proof of guilt. Is this theory valid? No. A cook’s actions benefitted the inmates; a guard’s actions harmed them. If guards refused to serve, the camps could not have functioned. But if cooks refused to serve, the inmates would have died of starvation.

Verdicts such as that in the Demjanjuk case were undoubtedly born of the extreme frustration felt by prosecutors and men like Simon Wiesenthal and other Nazi hunters. It is almost beyond human endurance to have lived through World War II and then be forced to watch justice be cheated time after time after time. First the leading Nazis escaped or committed suicide. Then some of them were recruited to aid Western governments. Then some were sheltered by governments in South America and the Middle East. Over time, attrition eventually overtook figures such as Josef Mengele. Occasionally, an Adolf Eichmann was brought to justice – but even he had to be kidnapped by Israeli secret agents before he could be prosecuted. Now the job of legally proving actual criminal acts committed by minor functionaries fifty, sixty or seventy years after the fact becomes too difficult. So we cannot be surprised when desperate prosecutors substitute legal fancies for the ordinary rules of evidence.

Nevertheless, if the prosecution cannot prove that Lipschis committed actual crimes, then he must be acquitted. This has nothing to do with his age or the time lapse between the acts and the trial. Any other decision is a de facto application of the bogus principle of collective guilt.

Shinzo Abe and Guilt for Japanese Aggression in World War II

Japanese Prime Minister Abe is a classic politician. Like the Roman god Janus, he wears two faces, one when speaking abroad to foreign audiences and another when seeking reelection by domestic voters. His answers to questions about whether he was repudiating the stance taken by a previous Prime Minister in 1996 – that Japan was indeed guilty of aggression for which the Japanese government formally apologized – were delicately termed “equivocal” by the U.S. magazine U.S. News and World Report. That is a euphemism meaning that Abe was lying by indirection, a political tactic used by politicians the world over. He wanted his answer to be interpreted one way by Japanese voters without having to defend that interpretation to the foreign press.

Abe’s behavior was shameful. But that has absolutely nothing to do with the question of Japanese guilt for war crimes committed during and prior to World War II. That guilt was borne by specific individual Japanese and established by the Tokyo war-crimes tribunal. Indeed, one government spokesman eventually admitted this in just those words, albeit grudgingly, after Abe’s comments had attracted worldwide attention and criticism.

The implication is that Japanese today bear no “collective guilt” for the war crimes committed by previous Japanese. (It would be wrong to use the phrase “by their ancestors,” since presumably few Japanese today are related by blood to the war criminals of seventy or eighty years ago.) The mere coincidence of common nationality does not constitute common ancestry except in the broad cultural sense, which is meaningless when discussing moral guilt. Are we really supposed to believe, for example, that the surviving relatives of Jesse James or Billy the Kid should carry around a weighty burden of guilt for the crimes of their forebear? In a world where the lesson of the Hatfields and McCoys remains unlearned in certain precincts, this presumption seems too ridiculous for words.

Similarly, the fact that Japanese leaders in the 1920s, 30s and 40s were aggressively militaristic does not deny Japanese today the right to self-defense against a blatantly aggressive Chinese military establishment.

Much is made of Abe’s unwillingness to acknowledge the “comfort women” – women from Korea, China and other Asian nations who were held captive as prostitutes by Japanese troops. Expecting politicians to behave as historians is futile. If Japanese war criminals remain at large, apprehend and indict them. If new facts are unearthed about the comfort women or other elements of Japanese war crimes, publish them. But using these acts as a club against contemporary Japanese leaders is both wrong and counterproductive.

Besides, it’s not as if no other ammunition was available against Abe. He has followed Keynesian fiscal policies and monetary policies of quantitative easing since becoming prime minister. These may not be crimes against humanity, but they are crimes against human reason.

Macro vs. Micro

Academic economics today is divided between macroeconomics and microeconomics. The “national economy” is the supposed realm of macroeconomics, the study of economic aggregates. But as we have just shown, it is the logic of individual responsibility that actually bears on the issue of war crimes committed by the nations of Germany and Japan – because the crimes were committed by individuals, not by “nations.”

One of the most valuable lessons taught by classical economic theory is that the unit of analysis is the individual – in economics or moral philosophy.

DRI-135 for week of 1-4-15: Flexible Wages and Prices: Economic Shock Absorbers

An Access Advertising EconBrief:

Flexible Wages and Prices: Economic Shock Absorbers

At the same time that free markets are becoming an endangered species in our daily lives, they enjoy a lively literary existence. The latest stimulating exercise in free-market thought is The Forgotten Depression: 1921 – The Crash That Cured Itself. The author is James Grant, well-known in financial circles as editor/publisher of “Grant’s Interest Rate Observer.” For over thirty years, Grant has cast a skeptical eye on the monetary manipulations of governments and central banks. Now he casts his gimlet gaze backward on economic history. The result is electrifying.

The Recession/Depression of 1920-1921

The U.S. recession of 1920-1921 is familiar to students of business cycles and few others. It was a legacy of World War I. Back then, governments tended to finance wars through money creation. Invariably this led to inflation. In the U.S., the last days of the war and its immediate aftermath were boom times. As usual – when the boom was the artifact of money creation – the boom went bust.

Grant recounts the bust in harrowing detail. In 1921, industrial production fell by 31.6%, a staggering datum when we recall that the U.S. was becoming the world’s leading manufacturer. (The President’s Conference on Unemployment reported in 1929 that 1921 was the only year after 1899 in which industrial production had declined.) Gross national product (today we would cite gross domestic product; neither statistic was actually calculated at that time) fell about 24% between 1920 and 1921 in nominal dollars, or 9% when account is taken of price changes. (Grant compares this to the figures for the “Great Recession” of 2007-2009, which were 2.4% and 4.3%, respectively.) Corporate profits nosedived commensurately. Stocks plummeted; the Dow Jones Industrial Average fell by 46.6% between the cyclical peak of November, 1919 and trough of August, 1921. According to Grant, “the U.S. suffered the steepest plunge in wholesale prices in its history (not even eclipsed by the Great Depression),” over 36% within 12 months. Unemployment rose dramatically to a level of some 4,270,000 in 1921 – and included even the President of General Motors, Billy Durant. (As the price of GM’s shares fell, he augmented his already-sizable shareholdings by buying on margin – ending up flat broke and out of a job.) Although the Department of Labor did not calculate an “unemployment rate” at that time, Grant estimates the nonfarm labor force at 27,989,000, which would have made the simplest measure of the unemployment rate 15.3%. (That count of the unemployed would undoubtedly have included labor-force dropouts and part-time workers who preferred full-time employment.)
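
A back-of-the-envelope check of that last figure, using Grant’s own numbers: 4,270,000 unemployed divided by a nonfarm labor force of 27,989,000 yields 0.1526 – roughly the 15.3% cited.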

A telling indicator of the dark mood enveloping the nation was passage of the Quota Act, the first step on the road to systematic federal limitation of foreign immigration into the U.S. The quota was fixed at 3% of foreign nationals present in each of the 48 states as of 1910. That year evidently reflected nostalgia for pre-war conditions, since the then-popular agricultural agitation for farm-price “parity” likewise sought to peg farm prices to their levels of that same year.

During the Great Recession and the accompanying financial panic of 2008, we had global warming and the tsunamis in Japan and Indonesia to distract us. In 1920-1921, Prohibition had already shut down the legal liquor business, shuttering bars and nightclubs. A worldwide flu pandemic had killed hundreds of thousands of Americans. The Black Sox had thrown the 1919 World Series at the behest of gamblers.

The foregoing seems to make a strong prima facie case that the recession of 1920 turned into the depression of 1921. That was the judgment of the general public and contemporary commentators. Herbert Hoover, Secretary of Commerce under Republican President Warren G. Harding, who followed wartime President Woodrow Wilson in 1920, compiled many of the statistics Grant cites while chairman of the President’s Conference on Unemployment. He concurred with that judgment. So did the founder of the study of business cycles, the famous institutional economist Wesley C. Mitchell, who influenced colleagues as various and eminent as Thorstein Veblen, Milton Friedman, F. A. Hayek and John Kenneth Galbraith. Mitchell referred to “…the boom of 1919, the crisis of 1920 and the depression of 1921 [that] followed the patterns of earlier cycles.”

By today’s lights, the stage was set for a gigantic wave of federal-government intervention, a gargantuan stimulus program. Failing that, economists would have us believe, the economy would sink like a stone into a pit of economic depression from which it would likely never emerge.

What actually happened in 1921, however, was entirely different.

The Depression That Didn’t Materialize

We may well wonder what might have happened if the Democrats had retained control of the White House and Congress. Woodrow Wilson and his advisors (notably his personal secretary, Joseph Tumulty) had greatly advanced the project of big government begun by Progressive Republicans Theodore Roosevelt and William Howard Taft. During World War I, the Wilson administration seized control of the railroads, the telephone companies and the telegraph companies. It imposed wage and price controls. The spirit of the Wilson administration’s efforts is best characterized by the statement of the Chief Price Controller of the War Industries Board, Robert Brookings: “I would rather pay a dollar a pound for [gun]powder for the United States in a state of war if there was no profit in it than pay the DuPont Company 50 cents a pound if they had 10 cents profit in it.” Of course, Mr. Brookings was not actually himself buying the gunpowder; the government was merely representing the taxpayers (of whom Mr. Brookings was presumably one). And its attitude toward taxpayers was displayed by the administration’s transformation of an income tax initiated at insignificant levels in 1913 into one with a top marginal rate of 77% (!!) on incomes exceeding $1 million.

But Wilson’s obsession with the League of Nations and his Fourteen Points for international governance had not only ruined his health, it had ruined his party’s standing with the electorate. In 1920, Republican Warren G. Harding was elected President. (The Republicans had already gained substantial Congressional majorities in the off-year elections of 1918.) Except for Hoover, the Harding circle of advisors was composed largely of policy skeptics – people who felt there was nothing to be done in the face of an economic downturn but wait it out. After all, the U.S. had endured exactly this same phenomenon of economic boom, financial panic and economic bust before – in 1812, 1818, 1825, 1837, 1847, 1857, 1873, 1884, 1890, 1893, 1903, 1907, 1910 and 1913. The U.S. economy had not remained mired in depression; it had emerged from all these recessions – or, in the case of 1873, a depression. If the 19th-century system of free markets were to be faulted, it would not be for failure to lift itself out of recession or depression, but for repeatedly re-entering the cycle of boom and bust.

The Federal Reserve did not flood the economy with liquidity, peg interest rates at artificially low levels or institute a “zero interest-rate policy.” On the contrary, the rules of the gold-standard “game” called for the Federal Reserve to raise interest rates to stem the inflation that still raged in the aftermath of World War I; had it not done so, a gold outflow might theoretically have drained the U.S. dry. The Fed did just that, and interest rates hovered around 8% for the duration. Deliberate deficit spending as an economic corrective would have been viewed as madness. As Grant put it, “laissez faire had its last hurrah in 1921.”

What was the result?

In the various individual industries, prices, wages and output fell like a stone. Auto production fell by 23%. General Motors, as previously noted, was particularly hard hit. Its sales fell from 52,000 vehicles per month to 13,000, and then to 6,150, in the space of seven months. Some $85 million in inventory was eventually written off in losses.

Hourly manufacturing wages fell by 22%. Average disposable income in agriculture, which comprised just under 20% of the economy, fell by over 55%. Bankruptcies overall tripled to nearly 20,000 over the two years ending in 1921. In Kansas City, MO, a haberdashery shop run by Harry Truman and Eddie Jacobson held out through 1920 before finally folding in 1921. The resulting personal bankruptcy and debt plagued the partners for years. Truman evaded it by taking a job as judge of the Jackson County Court, where his salary was secure against liens. But his bank accounts were periodically raided by bill collectors for years until 1935, when he was able to buy up the remaining debt at a devalued price.

In late 1920, Ford Motor Co. cut the price of its Model T by 25%. GM at first resisted price cuts but eventually followed suit. Farmers, who as individuals had no control over the price of their products, had little choice but to cut costs and increase productivity – increasing output was an individual’s only way to increase income. When all or most farmers succeeded, this produced lower prices. How much lower? Grant: “In the second half of [1920], the average price of 10 leading crops fell by 57 percent.” But how much more food can humans eat; how many more clothes can they wear? Since the price- and income-elasticities of demand for agricultural goods were less than one, this meant that agricultural revenue and incomes fell.
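
To see the arithmetic behind that last claim, take purely illustrative numbers (ours, not Grant’s). Suppose the price elasticity of demand for farm products is 0.5. Then the 57% fall in crop prices coaxes out only about a 28.5% increase in the quantity purchased, and revenue – price times quantity – falls to roughly 0.43 × 1.285 ≈ 0.55 of its former level. A 57% collapse in prices thus translates into something like a 45% collapse in farm revenue.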

As noted by Wesley Mitchell, the U.S. slump was not unique but rather part of a global depression that began as a series of commodity-price crashes in Japan, the U.K., France, Italy, Germany, India, Canada, Sweden, the Netherlands and Australia. It encompassed commodities including pig iron, beef, hemlock, Portland cement, bricks, coal, crude oil and cotton.

Banks that had speculative commodity positions were caught short. Among these was the largest bank in the U.S., National City Bank, which had loaned extensively to finance the sugar industry in Cuba. Sugar prices were brought down in the commodity crash and brought the bank down with them. That is, the bank would have failed had it not received sweetheart loans from the Federal Reserve.

Today, the crash of prices would be called “deflation.” So it was called then and with much more precision. Today, deflation can mean anything from the kind of nosediving general price level seen in 1920-1921 to relatively stable prices to mild inflation – in short, any general level of prices that does not rise fast enough to suit a commentator.

But there was apparently general acknowledgment that deflation was occurring in the depression of 1921. Yet few people apart from economists found that ominous. And for good reason: after some 18 months of panic, recession and depression, the U.S. economy recovered. Just as it had done 14 times previously.


It didn’t merely recover. It roared back to life. President Harding died suddenly in 1923, but under President Coolidge the U.S. economy experienced the “Roaring 20s.” This was an economic boom fueled by low tax rates and high productivity, the likes of which would not be seen again until the 1980s. It was characterized by innovation and investment. Unfortunately, in the latter stages, the Federal Reserve forgot the lessons of 1921 and increased the money supply to “keep the price level stable” and prevent deflation in the face of the wave of innovation and productivity increases. This helped to usher in the Great Depression, along with numerous policy errors by the Hoover and Roosevelt administrations.

Economists like Keynes, Irving Fisher and Gustav Cassel were dumbfounded. They had expected deflation to flatten the U.S. economy like a pancake, increasing the real value of debts owed by debtor classes and discouraging consumers from spending in the expectation that prices would fall in the future. Not.

There was no economic stimulus. No TARP, no ZIRP, no QE. No wartime controls. No meddlesome regulation a la Theodore Roosevelt, Taft and Wilson. The Harding administration and the Fed left the economy alone to readjust and – mirabile dictu – it readjusted. In spite of the massive deflation or, much more likely, because of it.

The (Forgotten) Classical Theory of Flexible Wages and Prices

James Grant wants us to believe that this outcome was no accident. The book jacket for The Forgotten Depression bills it as “a free-market rejoinder to Bush’s and Obama’s Keynesian stimulus applied to the 2007-9 recession,” which “proposes ‘less is more’ with respect to federal intervention.”

His argument is almost entirely empirical and very heavily oriented to the 1920-1921 depression. That is deliberate; he cites the 14 previous cyclical contractions but focuses on this one for obvious reasons. It was the last time that free markets were given the opportunity to cure a depression; both Herbert Hoover and Franklin Roosevelt supervised heavy, continual interference with markets from 1929 through 1941. And we have much better data on the 1920-21 episode than on, say, the 1873 depression.

Readers may wonder, though, whether there is underlying logical support for the result achieved by the deflation of 1921. Can the chorus of economists advocating stimulative policy today really be wrong?

Prior to 1936, the policy chorus was even louder. Amazing as it now seems, it advocated the stance taken by Harding et al. Classical economists propounded the theory of flexible wages and prices as an antidote to recession and depression. And, without stating it in rigorous fashion, that is the theory that Grant is following in his book.

Using the language of modern macroeconomics, the problem posed by cyclical downturns is unemployment due to a sudden decline in aggregate (effective) demand for goods and services. The decline in aggregate demand causes declines in demand for all or most goods; the decline in demand for goods causes declines in demand for all or most types of labor. As a first approximation, this produces surpluses of goods and labor. The surplus of labor is defined as unemployment.

The classical economists pointed out that, while the shock of a decline in aggregate demand could cause temporary dislocations such as unsold goods and unemployment, this was not a permanent condition. Flexible wages and prices could, like the shock absorbers on an automobile, absorb the shock of the decline in aggregate demand and return the economy to stability.

Any surplus creates an incentive for sellers to lower price and buyers to increase purchases. As long as the surplus persists, the downward pressure on price will remain. And as the price (or wage) falls toward the new market-clearing point, the amount produced and sold (or the amount of labor offered and purchased) will increase once more.
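
A stylized numerical sketch (our own illustration, not drawn from Grant) shows the mechanism. Suppose demand in some market is Qd = 100 – 2P and supply is Qs = 20 + 2P, so the market initially clears at a price of 20 and a quantity of 60. Now let demand fall to Qd = 80 – 2P. At the old price of 20, buyers take only 40 units while sellers offer 60 – a surplus of 20. As the price falls to the new market-clearing level of 15, purchases rise from 40 back up to 50: the lower price has absorbed part of the shock.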

Flexibility of wages and prices is really a two-part process. Part one works to clear the surpluses created by the initial decline in aggregate demand. In labor markets, this serves to preserve the incomes of workers who remain willing to work at the now-lower market wage. If they were unemployed, they would have no wage, but working at a lower wage gives them a lower nominal income than before. That is only part of this initial process, though. Prices in product markets are decreasing alongside the declining wages. In principle, fully flexible prices and wages would mean that even though the nominal incomes of workers would decline, their real incomes would be restored by the decline of all prices in equal proportion. If your wage falls by (say) 20%, declines in all prices by 20% should leave you able to purchase the same quantities of goods and services as before.
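
A quick arithmetic check of that claim, using illustrative numbers only: suppose a worker’s wage falls from $100 a day to $80 – a 20% cut – while the general price level also falls 20%, from an index of 1.00 to 0.80. The worker’s real wage is then $80 ÷ 0.80 = $100 in original purchasing power – exactly what it was before the deflation. Nominal income is lower; real income is not.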

The emphasis on real magnitudes rather than nominal magnitudes gives rise to the name given to the second part of this process. It is called the real-balance effect. It was named by the classical economist A. C. Pigou and refined by later macroeconomist Don Patinkin.

When John Maynard Keynes wrote his General Theory of Employment, Interest and Money in 1936, he attacked classical economists by attacking the concepts of flexible wages and prices. First, he attacked their feasibility. Then, he attacked their desirability.

Flexible wages were not observed in reality because workers would not consent to downward revisions in wages, Keynes maintained. Did Keynes really believe that workers preferred to be unemployed and earn zero wages at a relatively high market wage rather than work and earn a lower market wage? Well, he said that workers oriented their thinking toward the nominal wage rather than the real wage and thus did not perceive that they had regained their former position with lower prices and a lower wage. (This became known as the fallacy of money illusion.) His followers spent decades trying to explain what he really meant or revising his words or simply ignoring his actual words. (It should be noted, however, that Keynes was English, and trade unions exerted vastly greater influence on prevailing wage levels in England than they did in the U.S. for at least the first three-quarters of the 20th century. This may well have biased Keynes’ thinking.)

Keynes also decried the assumption of flexible prices for various reasons, some of which continue to sway economists today. The upshot is that macroeconomics has lost touch with the principles of price flexibility. Even though Keynes’ criticisms of the classical economists and the price system were discredited in strict theory, they were accepted de facto by macroeconomists because it was felt that flexible wages and prices would take too long to work, while macroeconomic policy could be formulated and deployed relatively quickly. Why make people undergo the misery of unemployment and insolvency when we can relieve their anxiety quickly and compassionately by passing laws drafted by macroeconomists on the President’s Council of Economic Advisors?

Let’s Compare

Thanks to James Grant, we now have an empirical basis for comparison between policy regimes. In 1920-1921, the old-fashioned classical medicine of deflation, flexible wages and prices and the real-balance effect took 18 months to turn a panic, recession and depression into a rip-roaring recovery that lasted 8 years.

Fast forward to December, 2007. The recession has begun. Unfortunately, it is not detected until September, 2008, when the financial panic begins. The stimulus package is not passed until February, 2009 – barely in time for the official end of the recession in June, 2009. Whoops – unemployment is still around 10% and remains stubbornly high until 2013. Moreover, it only declines because Americans have left the labor force in numbers not seen for over thirty years. The recovery, such as it is, is so anemic as to hardly merit the name – and it is now over 7 years since the onset of recession in December, 2007.


It is no good complaining that the stimulus package was not large enough, because we are comparing it with a case in which the authorities did nothing – or rather, did nothing stimulative, since their interest-rate increase should properly be termed contractionary. That is exactly what macroeconomists call it when referring to Federal Reserve policy in the 1930s, during the Great Depression, when they blame Fed policy and high interest rates for prolonging the Depression. Shouldn’t they instead be blaming the continual series of government interventions by the Fed and the federal government under Herbert Hoover and Franklin Roosevelt? And we didn’t even count the stimulus package introduced by the Bush administration, which came and went without making a ripple in terms of economic effect.

Economists Are Lousy Accident Investigators 

For nearly a century, the economics profession has accused free markets of possessing faulty shock absorbers; namely, inflexible wages and prices. When it comes to economic history, economists are obviously lousy accident investigators. They have never developed a theory of business cycles but have instead assumed a decline in aggregate demand without asking why it occurred. In figurative terms, they have assumed the cause of the “accident” (the recession or the depression). Then they have made a further assumption that the failure of the “vehicle’s” (the economy’s) automatic guidance system to prevent (or mitigate) the accident was due to “faulty shock absorbers” (inflexible wages and prices).

Would an accident investigator fail to visit the scene of the accident? The economics profession has largely failed to investigate the flexibility of wages and prices even in the Great Depression, let alone the thirty-odd other economic contractions chronicled by the National Bureau of Economic Research. The work of researchers like Murray Rothbard, Vedder and Gallaway, Benjamin Anderson and Harris Warren overturns the mainstream presumption of free-market failure.

The biggest empirical failure of all is one ignored by Grant; namely, the failure to demonstrate policy success. If macroeconomic policy worked as advertised, then we would not have recessions in the first place and could reliably end them once they began. In fact, we still have cyclical downturns, we cannot use policy to end them, and macroeconomists can point to no policy successes to bolster their case.

Now we have this case study by James Grant that provides meticulous proof that deflation – full-blooded, deep-throated, hell-for-leather deflation in no uncertain terms – put a prompt, efficacious end to what must be called an economic depression.

Combine this with the 40-year-long research project conducted on Keynesian theory, culminating in its final discrediting by the early 1980s. Throw in the existence of the Austrian Business Cycle Theory, which combines the monetary theory of Ludwig von Mises and interest-rate theory of Knut Wicksell with the dynamic synthesis developed by F. A. Hayek. This theory cannot be called complete because it lacks a fully worked out capital theory to complete the integration of monetary and value theory. (We might think of this as the economic version of the Unified Field Theory in the natural sciences.) But an incomplete valid theory beats a discredited theory every time.

In other words, free-market economics has an explanation for why the accident repeatedly happens and why its effects can be mitigated by the economy’s automatic guidance mechanism without the need for policy action by government. It also explains why policy actions are ineffective at both remedying and preventing these accidents.

James Grant’s book will take its place in the pantheon of economic history as the outstanding case study to date of a self-curing depression.

DRI-259 for week of 2-2-14: Kristallnacht for the Rich: Not Far-Fetched

An Access Advertising EconBrief:

Kristallnacht for the Rich: Not Far-Fetched

Periodically, the intellectual class aptly termed “the commentariat” by The Wall Street Journal works itself into a frenzy. The issue may be a world event, a policy proposal or something somebody wrote or said. The latest cause célèbre is a submission to the Journal’s letters column by a partner in one of the nation’s leading venture-capital firms. The letter ignited a firestorm; the editors subsequently declared that Tom Perkins of Kleiner Perkins Caufield & Byers “may have written the most-read letter to the editor in the history of The Wall Street Journal.”

What could have inspired the famously reserved editors to break into temporal superlatives? The letter’s rhetoric was both penetrating and provocative. It called up an episode in the 20th century’s most infamous political regime. And the response it triggered was rabid.

“Progressive Kristallnacht Coming?”

“…I would call attention to the parallels of fascist Nazi Germany to its war on its ‘one percent,’ namely its Jews, to the progressive war on the American one percent, namely ‘the rich.’” With this icebreaker, Tom Perkins made himself a rhetorical target for most of the nation’s commentators. Even those who agreed with his thesis felt that Perkins had no business using the Nazis in an analogy. The Wall Street Journal editors said “the comparison was unfortunate, albeit provocative.” They recommended reserving Nazi comparisons for tyrants like Stalin.

On the political Left, the reaction was less measured. The Anti-Defamation League accused Perkins of insensitivity. Bloomberg View characterized his letter as an “unhinged Nazi rant.”

No, this bore no traces of an irrational diatribe. Perkins had a thesis in mind when he drew an analogy between Nazism and Progressivism. “From the Occupy movement to the demonization of the rich, I perceive a rising tide of hatred of the successful one percent.” Perkins cited the abuse heaped on workers traveling on Google buses from the cities to the California peninsula. Their high wages allowed them to bid up real-estate prices, thereby earning the resentment of the Left. Perkins’ ex-wife Danielle Steel placed herself in the crosshairs of the class warriors by amassing a fortune writing popular novels. Millions of dollars in charitable contributions did not spare her from criticism for belonging to the one percent.

“This is a very dangerous drift in our American thinking,” Perkins concluded. “Kristallnacht was unthinkable in 1930; is its descendant ‘progressive’ radicalism unthinkable now?” Perkins’ point is unmistakable; his letter is a cautionary warning, not a comparison of two actual societies. History doesn’t repeat itself, but it does rhyme. Kristallnacht and Nazi Germany belong to history. If we don’t mend our ways, something similar and unpleasant may lie in our future.

A Short Refresher Course in Early Nazi Persecution of the Jews

Since the current debate revolves around the analogy between Nazism and Progressivism, we should refresh our memories about Kristallnacht. The name itself translates loosely into “Night of Broken Glass.” It refers to the shards of broken window glass littering the streets of cities in Germany and Austria on the night and morning of November 9-10, 1938. The windows belonged to houses, hospitals, schools and businesses owned and operated by Jews. These buildings were first looted, then smashed by elements of the German paramilitary SA (the Brownshirts) and SS (security police), led by the Gauleiters (regional leaders).

In 1933, Adolf Hitler was elevated to the German chancellorship after the Nazi Party won a plurality of votes in the national election. Almost immediately, laws placing Jews at a disadvantage were passed and enforced throughout Germany. The laws were the official expression of the philosophy of German anti-Semitism that dated back to the 1870s, the time when German socialism began evolving from the authoritarian roots of Otto von Bismarck’s rule. Nazi officialdom awaited a pretext on which to crack down on Germany’s sizable Jewish population.

The pretext was provided by the assassination of German official Ernst vom Rath, shot on Nov. 7, 1938 by a 17-year-old Polish-Jewish youth named Herschel Grynszpan. The boy was apparently upset by German policies expelling his parents from the country. Ironically, vom Rath’s sentiments were anti-Nazi and opposed to the persecution of Jews. Vom Rath’s death on Nov. 9 was the signal for release of Nazi paramilitary forces on a reign of terror and abduction against German and Austrian Jews. Police were instructed to stand by and not interfere with the SA and SS as long as only Jews were targeted.

According to official reports, 91 deaths were attributed directly to Kristallnacht. Some 30,000 Jews were spirited off to jails and concentration camps, where they were treated brutally before finally winning release some three months later. In the interim, though, some 2,000-2,500 Jews died in the camps. Over 7,000 Jewish-owned or operated businesses were damaged. Over 1,000 synagogues in Germany and Austria were burned.

The purpose of Kristallnacht was not only wanton destruction. The assets and property of Jews were seized to enhance the wealth of the paramilitary groups.

Today we regard Kristallnacht as the opening round of Hitler’s Final Solution – the policy that produced the Holocaust. This strategic primacy is doubtless why Tom Perkins invoked it. Yet this furious controversy will just fade away, merely another media preoccupation du jour, unless we grasp its enduring significance. Obviously, Tom Perkins was not saying that the Progressive Left’s treatment of the rich is now comparable to Nazi Germany’s treatment of the Jews. The Left is not interning the rich in concentration camps. It is not seizing the assets of the rich outright – at least not on a wholesale basis. It is not reducing the homes and businesses of the rich to rubble – not here in the U.S., anyway. It is not passing laws to discriminate systematically against the rich – at least, not against the rich as a class.

Tom Perkins was issuing a cautionary warning against the demonization of wealth and success. This is a political strategy closely associated with the philosophy of anti-Semitism; that is why his invocation of Kristallnacht is apropos.

The Rise of Modern Anti-Semitism

Despite the politically correct horror expressed by the Anti-Defamation League toward Tom Perkins’ letter, reaction to it among Jews has not been uniformly hostile. Ruth Wisse, professor of Yiddish and comparative literature at Harvard University, wrote an op-ed for The Wall Street Journal (02/04/2014) defending Perkins.

Wisse traced the modern philosophy of anti-Semitism to the philosopher Wilhelm Marr, whose heyday was the 1870s. Marr “charged Jews with using their skills ‘to conquer Germany from within.’” Marr was careful to distinguish his philosophy of anti-Semitism from prior philosophies of anti-Judaism. Jews “were taking unfair advantage of the emerging democratic order in Europe with its promise of individual rights and open competition in order to dominate the fields of finance, culture and social ideas.”

Wisse declared that “anti-Semitism channel[ed] grievance and blame against highly visible beneficiaries of freedom and opportunity.” “Are you unemployed? The Jews have your jobs. Is your family mired in poverty? The Rothschilds have your money. Do you feel more secure in the city than you did on the land? The Jews are trapping you in the factories and charging you exorbitant rents.”

The Jews were undermining Christianity. They were subtly perverting the legal system. They were overrunning the arts and monopolizing the press. They spread Communism, yet practiced rapacious capitalism!

This modern German philosophy of anti-Semitism long predated Nazism. It accompanied the growth of the German welfare state and German socialism. The authoritarian political roots of Nazism took hold under Otto von Bismarck’s conservative socialism, and so did Nazism’s anti-Semitic cultural roots. The anti-Semitic conspiracy theories ascribing Germany’s every ill to the Jews were not the invention of Hitler, but of Wilhelm Marr over half a century before Hitler took power.

The Link Between the Nazis and the Progressives: the War on Success

As Wisse notes, the key difference between modern anti-Semitism and its ancestor – what Wilhelm Marr called “anti-Judaism” – is that the latter abhorred the religion of the Jews while the former resented the disproportionate success enjoyed by Jews much more than their religious observances. The modern anti-Semitic conspiracy theorist pointed darkly to the predominance of Jews in high finance, in the press, in the arts and running movie studios and asked rhetorically: How do we account for the coincidence of our poverty and their wealth, if not through the medium of conspiracy and malefaction? The case against the Jews is portrayed as prima facie and morphs into per se through repetition.

Today, the Progressive Left operates in exactly the same way. “Corporation” is a pejorative. “Wall Street” is the antonym of “Main Street.” The very presence of wealth and high income is itself damning; “inequality” is the reigning evil and is tacitly assigned a pecuniary connotation. Of course, this tactic runs counter to the longtime left-wing insistence that capitalism is inherently evil because it forces us to adopt a materialistic perspective. Indeed, environmentalism embraces anti-materialism to this day while continuing to bunk in with its progressive bedfellows.

We must interrupt with an ironic correction. Economists – according to conventional thinking the high priests of materialism – know that it is human happiness and not pecuniary gain that is the ultimate desideratum. Yet the constant carping about “inequality” looks no further than money income in its supposed solicitude for our well-being. Thus, the “income-inequality” progressives – seemingly obsessed with economics and materialism – are really anti-economic. Economists, supposedly green-eyeshade devotees of numbers and models, are the ones focusing on human happiness rather than ideological goals.

German socialism metamorphosed into fascism. American Progressivism is morphing from liberalism to socialism and – ever more clearly – homing in on its own version of fascism. Both employed the technique of demonization and conspiracy to transform the mutual benefit of free voluntary exchange into the zero-sum result of plunder and theft. How else could productive effort be made to seem fruitless? How else could success be made over into failure? This is the cautionary warning Perkins was sounding.

The Great Exemplar

The great Cassandra of political economy was F.A. Hayek. Early in 1929, he predicted that Federal Reserve policies earlier in the decade would soon bear poisoned fruit in the form of a reduction in economic activity. (His mentor, Ludwig von Mises, was even more emphatic, foreseeing “a great crash” and refusing a prestigious financial post for fear of association with the coming disaster.) He predicted that the Soviet economy would fail owing to lack of a functional price system; in particular, missing capital markets and interest rates. He predicted that Keynesian policies begun in the 1950s would culminate in accelerating inflation. All these came true, some of them within months and some after a lapse of years.

Hayek’s greatest prediction was really a cautionary warning, in the same vein as Tom Perkins’ letter but much more detailed. The 1944 book The Road to Serfdom made the case that centralized economic planning could operate only at the cost of the free institutions that distinguished democratic capitalism. Socialism was really another form of totalitarianism.

The reaction to Hayek’s book was much the same as the reaction to Perkins’ letter. Many commentators who should have known better accused both men of fascism. They also accused both of describing a current state of affairs when each was really trying to avoid a dystopia.

The flak Hayek took was especially ironic because his book actually served to prevent the outcome he feared. But instead of winning him the acclaim of millions, this earned him the scorn of intellectuals. The intelligentsia insisted that Hayek had predicted that totalitarianism would inevitably follow the imposition of a welfare state. When welfare states in Great Britain, Scandinavia and South America failed to produce barbed wire, concentration camps and German Shepherd dogs, the Left advertised this as proof of Hayek’s “exaggerations” and “paranoia.”

In actual fact, Great Britain underwent many of the changes Hayek had feared and warned against. The notorious Control of Engagement Order, for instance, was an attempt by the postwar Labour government to centrally control the British labor market – to specify an individual’s work and wage rather than allowing free choice in an impersonal market to do the job. The attempt failed just as dismally as Hayek and other free-market economists had foreseen it would. In the 1980s, it was Hayek’s arguments, wielded by Prime Minister Margaret Thatcher, that paved the way for the rolling back of British socialism and the taming of inflation. It’s bizarre to charge the prophet of doom with inaccuracy when his prophecy is the savior, but that’s what the Left did to Hayek.

Now they are working the same familiar con on Tom Perkins. They begin by misconstruing the nature of his argument. Later, if his warnings are successful, they will use that against him by claiming that his “predictions” were false.

Enriching Perkins’ Argument

This is not to say that Perkins’ argument is perfect. He has instinctively fingered the source of the threat to our liberties. Perkins himself may be rich, but his argument isn’t; it is threadbare and skeletal. It could use some enriching.

The war on the wealthy has been raging for decades. The opening battle is lost to history, but we can recall some early skirmishes and some epic brawls prior to Perkins.

In Europe, the war on wealth used anti-Semitism as its spearhead. In the U.S., however, the popularity of Progressives in academia and government made antitrust policy a more convenient wedge for their populist initiatives against success. Antitrust policy was a crown jewel of the Progressive movement in the early 1900s; Presidents Theodore Roosevelt and William Howard Taft cultivated reputations as “trust busters.”

The history of antitrust policy exhibits two pronounced tendencies: the use of the laws to restrict competition for the benefit of incumbent competitors and the use of the laws by the government to punish successful companies for various political reasons. The sobering research of Dominick Armentano shows that antitrust policy has consistently harmed consumer welfare and economic efficiency. The early antitrust prosecution of Standard Oil, for example, broke up a company that had consistently increased its output and lowered prices to consumers over long time spans. The Orwellian rhetoric accompanying the judgment against ALCOA in the 1940s reinforces the notion that punishment, not efficiency or consumer welfare, was behind the decision. The famous prosecutions of IBM and AT&T in the 1970s and 80s each spawned book-length investigations showing the perversity of the government’s claims. More recently, Microsoft became the latest successful firm to incur the government’s wrath for having the temerity to revolutionize an industry and reward consumers throughout the world.

The rise of the regulatory state in the 1970s gave agencies and federal prosecutors nearly unlimited, unsupervised power to work their will on the public. Progressive ideology combined with self-interest to create a powerful engine for the demonization of success. Prosecutors could not only pursue their personal agendas but also climb the career ladder by making high-profile cases against celebrities. The prosecution of Michael Milken of Drexel Burnham Lambert is a classic case of persecution in the guise of prosecution. Milken virtually created the junk-bond market, thereby originating an asset class that has enhanced the wealth of investors by untold billions or trillions of dollars. For his pains, Milken was sent to jail.

Martha Stewart is a high-profile celebrity who was, in effect, convicted of the crime of being famous. She was charged with and convicted of lying to federal investigators in a case in which the only underlying crime could have been insider trading. But she herself was the trader, and she was never charged with insider trading. The utter triviality of the matter and the absence of any damage to consumers or society at large make it clear that she was targeted because of her celebrity – that is, her success.

Today, the impetus for pursuing successful individuals and companies comes primarily from the federal level. Harvey Silverglate (author of Three Felonies a Day) has shown that virtually nobody is safe from the depredations of prosecutors out to advance their careers by racking up convictions at the expense of justice.

Government is the institution charged with making and enforcing law, yet government has now become the chief threat to law. At the state and local level, governments hand out special favors and tax benefits to favored recipients – typically those unable to attain success through their own efforts – while making up the revenue from the earned income of taxpayers at large. At the federal level, Congress fails in its fundamental duty and ignores the law by refusing to pass budgets. The President appoints czars to make regulatory law, while choosing at his discretion to obey the provisions of some laws and disregard others. In this, he fails his fundamental executive duty to execute the laws faithfully. Judges treat the Constitution as a backdrop for the expression of their own views rather than as a subject for textual fidelity. All parties interpret the Constitution to suit their own convenience. The overarching irony here is that the least successful institution in America has united in a common purpose against the successful achievers in society.

The most recent Presidential campaign was conducted largely as a jihad against the rich and successful in business. Mitt Romney was forced to defend himself against the charge of succeeding too well in his chosen profession, as well as the corollary accusation that his success came at the expense of the companies and workers in which his private-equity firm invested. Either his success was undeserved or it was really failure. There was no escape from the double bind against which he struggled.

It is clear, then, that the “progressivism” decried by Tom Perkins dates back over a century and that it has waged a war on wealth and success from the outset. The tide of battle has flowed – during the rampage of the Bull Moose, the Depression and New Deal, and the recent Great Recession and financial crisis – and ebbed – under Eisenhower and Reagan. Now the forces of freedom have their backs to the sea.

It is this much-richer context that forms the backdrop for Tom Perkins’ warning. Viewed in this panoramic light, Perkins’ letter looks less like the crazed rant of an isolated one-percenter and more like the battle cry of a counter-revolution.