DRI-186 for week of 5-10-15: How Can the Framework of Economics Help Us Assign Responsibility for War Crimes in World War II?

An Access Advertising EconBrief:

How Can the Framework of Economics Help Us Assign Responsibility for War Crimes in World War II?

The previous EconBrief explains how the classical theory of voluntary exchange and the moral concept of individual responsibility mutually reinforce each other. The mutually beneficial character of voluntary exchange allows individuals to assume responsibility for their own actions in a free society. Individual responsibility permits voluntary exchange to function without the necessity of, say, review of each transaction by a neutral third party to ensure fairness. The role of government in a voluntary society is minimal – to enforce contracts and prevent coercion.

Recently, the issue of responsibility for war crimes committed during World War II has been raised by various independent events. In Germany, a 93-year-old man is standing trial as an accessory to war crimes committed while he worked at the Auschwitz concentration camp during World War II. His presence in the camp is known, but his actual role and behavior are disputed. Should the prosecution have to prove he actually committed crimes, or would his participation as (say) a guard be enough to warrant his conviction as a war criminal?

A recent column in The Wall Street Journal by Bret Stephens (“From Buchenwald to Europe,” 05/05/2015) observes that many people in Germany were victims of Nazism, not Nazis – including many non-Jews. How should this affect Germany’s national policies today on European union, immigration and its attitude toward the systematic anti-Semitism and misogyny practiced by Muslim immigrants? “It isn’t easy, or ultimately wise, [for Germany] to live life in a state of perpetual atonement,” Mr. Stephens thinks.

Japan’s Prime Minister Shinzo Abe has publicly marveled at the transformation in relations between Japan and America, two countries that became deadly rivals in the late 1930s and waged total war in the 1940s, culminating in mankind’s only nuclear attack. Today we are two of the planet’s closest trading partners. Abe clearly wants to enlist the cooperation of the U.S. in Japan’s efforts to re-arm against the imminent threat of mainland China’s sabre-rattling territorial ambitions. But Abe has also made disturbing noises in domestic politics, worshipping at the shrine of Japan’s war dead and speaking equivocally about Japan’s aggressive invasion of its Asian neighbors in the 1930s. These speeches are a rough Japanese analogue to Holocaust denial.

In deciding what to make of these events, our analytical anchor is once again the economic logic of individual responsibility arising in a context of voluntary exchange.

The Flawed Notion of National Responsibility for War Crimes

In his Wall Street Journal piece, Bret Stephens depicts “the drama of postwar Germany” as its “effort to bury the Nazi corpse,” which “haunts Germany at every turn.” This phrasing is troubling. It implies that Germany’s residents bear a collective burden for sins committed long before most of them were even born.

Not surprisingly, this burden hasn’t just been heavy – it has been unshakeable. “Should Germany’s wartime sins be expiated by subsidizing the spendthrift habits of corrupt Greek governments? Should fear of being accused of xenophobia require Germans to turn a blind eye to Jew-hatred and violent misogyny when the source is Germany’s Muslim minority?” These questions, posed rhetorically by Mr. Stephens, should be placed in the pantheon of pointlessness with queries about the angel-carrying capacity of pinheads.

Even before World War II ended, many people realized that the Axis powers would have to be called to account for their sins. Members of the German and Japanese governments and military had committed acts that plumbed new depths of depravity. Civilization had institutions and standards for judging and punishing the familiar forms of crime, but the scope and magnitude of Axis atrocities persuaded the Allies to hold separate war-crimes tribunals for Germany and Japan. And the defendants at every trial were individual human beings, not collective entities called “Germany” or “Japan.”

To be sure, there were arguments – some of them almost as bitter as the fighting that preceded the trials – about which individuals should be tried. At least some of the disagreement probably reflected disappointment that the most deserving defendants (Hitler, Goebbels, et al.) had cheated the hangman by committing suicide beforehand. But nobody ever entertained the possibility of putting either nation on trial. In the first place, it would have been a practical impossibility. And without an actual trial, the proceedings would have been a travesty of justice. Even beyond that, though, the greater travesty would have been to suggest that the entirety of either nation had been at fault for acts such as the murder of millions of Jews by the Nazis.

We need look no further than Stephens’ own article to substantiate this. He relates the story of his father-in-law, Hermann, who celebrated his 11th birthday on VE-Day, May 8th, 1945. He was the namesake of his father, a doctor who died in a German prison camp, where he had been imprisoned for the crime of xenophilia – showing friendly feelings toward foreign workers. The elder Hermann apparently treated inhabitants of forced-labor camps and had spoken of the likelihood of an ultimate Russian victory over Germany. Not only was he not committing atrocities, he was trying to compensate for their effects and got killed for his pains. Were we supposed to prosecute his 11-year-old son? What madness that would have been! As Stephens put it, “what was a 10-year-old boy, whose father had died at Nazi hands, supposed to atone for?”

History tells us that Germany also harbored its own resistance movement, which worked behind the scenes to oppose Fascism in general and the war in particular. In fact, the Academy Award for Best Actor in 1943 went not to Humphrey Bogart, star of Best Picture winner Casablanca, but instead to Paul Lukas, who played a German who risked his life fighting the Nazis in the movie Watch on the Rhine. The Freiburg School, a German free-market school of economists formed before the war, openly opposed Fascist economic policies even during World War II. Their prestige was such that the Nazis did not dare kill them, instead preferring to suppress their views and prevent their professional advancement. Then there was the sizable number of Germans who did not join the Nazi Party and were not politically active.

Hold every contemporary German criminally accountable for the actions of Hitler, Goebbels, Hess, Goering, Mengele and the rest? Unthinkable. In which case, how can we even contemplate asking today’s Germans, who had no part in the war crimes, weren’t even alive when they were committed and couldn’t have prevented them even if inclined to try, to “atone” for them?

The longer we think about the notion of contemporary national guilt for war crimes, the more we wonder how such a crazy idea ever wandered into our heads in the first place. Actually, we shouldn’t wonder too long about that. The notion of national, or collective, guilt came from the same source as most of the crazy ideas extant.

It came from the intellectual left wing.

The Origin of “Social Wholes”

There is no more painstaking and difficult pastime than tracing the intellectual pedigree of ideas. The modern concept of the “social whole,” or national collective, seems traceable to the French philosopher Claude Henri de Rouvroy, Comte de Saint-Simon (hereinafter Saint-Simon). Saint-Simon is rightfully considered the father of Utopian Socialism. Born an aristocrat in 1760, he lived three lives – the first as a French soldier who fought for America in the Revolution, the second as a financial speculator who made and lost several fortunes, the third as an intellectual dilettante whose personal writings attracted the attention of young intellectuals and made him the focus of a cult.

Around age 40, Saint-Simon decided to focus his energies on intellectual pursuits. He was influenced by the intellectual ferment within France’s Ecole polytechnique, where the sciences of mathematics, chemistry, physics and physiology turned out distinguished specialists such as Lavoisier, Lagrange and Laplace. Unfortunately, Saint-Simon himself was able to appreciate genius but not to emulate it. Even worse, he was unable to grasp any distinction between the natural sciences and social sciences such as economics. In 1803, he wrote a pamphlet in which he proposed to attract funds by subscription for a “Council of Newton,” composed of twenty of the world’s most distinguished men of science, to be elected by the subscribers. They would be deemed “the representatives of God on earth,” thus displacing the Pope and other divinely ordained religious authorities, but with additional powers to direct the secular affairs of the world. According to Saint-Simon, these men deserved this authority because their competence in science would enable them to consciously order human affairs more satisfactorily than heretofore. Saint-Simon had received this plan in a revelation from God.

“All men will work; they will regard themselves as laborers attached to one workshop whose efforts will be directed to guide human intelligence according to my divine foresight [emphasis added]. The Supreme Council of Newton will direct their works… Anybody who does not obey their orders will be treated … as a quadruped.” Here we have the beginnings of the collective concept: all workers work for a single factory, under one central administration and one boss.

We can draw a direct line between this 1803 publication of Saint-Simon and the 20th century left-wing “Soviet of engineers” proposed by institutional economist Thorstein Veblen, the techno-socialism of J. K. Galbraith and the “keep the machines running” philosophy of Clarence Ayres. “Put government in the hands of technical specialists and give them absolute authority” has been the rallying cry of the progressive left wing since the 19th century.

Saint-Simon cultivated a salon of devotees who propagated his ideas after his death in 1825. These included most notably Auguste Comte, the founder of the “science” of sociology, which purports to aggregate all the sciences into one collective science of humanity. Comte inherited Saint-Simon’s disregard for individual liberty, referring contemptuously to “the anti-social dogma of the ‘liberty of individual conscience.'” It is no coincidence that socialism, which had its beginnings with Saint-Simon and his salon, eventually morphed into Nazism, which destroyed individual conscience so completely as to produce the Holocaust. That transformation from socialism to Nazism was described by Nobel laureate F. A. Hayek in The Road to Serfdom.

Today, the political left is committed to the concept of the collective. Its political constituencies are conceived in collective form: “blacks,” “women,” “labor,” “farmers,” “the poor.” Each of these blocs is represented by an attribute that blots out all trace of individuality: skin color, gender, economic class (or occupation), income. The collective concept implies automatic allegiance, unthinking solidarity. This is convenient for political purposes, since any pause for thought before voting might expose the uncomfortable truth that the left has no coherent policy program or set of ideas. The left traffics exclusively in generalities that attach themselves to social wholes like pilot fish to sharks: “the 1%,” “the 99%,” “Wall St. vs. Main St.,” “people, not profit,” “the good of the country as a whole.” This is the parlor language of socialism. The left finds it vastly preferable to nitty-gritty discussion of the reality of socialism, which is so grim that it couldn’t even be broached on college campuses without first issuing trigger warnings to sensitive students.

The left-wing rhetoric of the collective has special relevance to the question of war crimes. Actual war crimes are committed by individual human beings. Human beings live discrete, finite lives. But a collective is not bound by such limitations. For example, consider the business concept of a corporation. Every single human being whose efforts comprise the workings of the corporation will eventually die, but the corporation itself is – in principle – eternal. Thus, it is a collective entity that corresponds to left-wing notions because it acts as if animated by a single will and purpose. And the left constantly laments the obvious fact that the U.S. does not and cannot act with this singular unanimity of purpose. For decades, left-wing intellectuals such as Arthur Schlesinger and John Kenneth Galbraith have looked back with nostalgia at World War II because the U.S. united around the single goal of winning the war and subordinated all other considerations to it.

The Rhetorical Convenience of Collective Guilt

Given its collective bent, we would expect to find the left in the forefront of the “collective guilt” school of thought on the issue of war crimes. And we do. For the left, “the country” is one single organic unity that never dies. When “it” makes a ghastly error, “it” bears the responsibility and guilt until “it” does something to expiate the sin. That explains why Americans have been figuratively horsewhipped for generations about the “national shame” and “original sin” of slavery. It is now more than 150 years after the Emancipation Proclamation and 150 years since the end of the Civil War, when a half-million Americans died to prevent slaveholding states from seceding from the Union. Following the Civil War, the U.S. Constitution was amended specifically to grant black Americans rights previously denied them. Yet “we” – that is, the collective entity of “the country” on which left-wing logic rests – have not yet expunged this legacy of slavery from “our” moral rap sheet. Exactly how the slate should be wiped clean is never clearly outlined – if it were, then the left wing would lose its rhetorical half-Nelson on the public debate over race – but each succeeding generation must carry this burden on its shoulders in a race-reversed reprise of the song “Ol’ Man River” from the musical Show Boat. “Tote that barge, lift that bale” refers in this case not to cotton but to the moral burden of being responsible for things that happened a century or more before our birth.

If this burden can be made heavy enough, it can motivate support for legislation like forced school busing, affirmative action and even racial reparations. Thus, the collective concept is a potentially powerful one. As Bret Stephens observes, it is now being pressed into service to prod Germany into bailing out Greeks, whose status as international deadbeats is proverbial. Exactly how were Greeks victimized by Germans? Were they somehow uniquely tyrannized by the Nazis – more so than, say, the Jews who later emigrated to Israel? No, Germany’s Nazism of seventy or eighty years ago is merely a handy pig bladder with which to beat today’s Germans over the head to extract blackmail money for the latest left-wing cause du jour. Since the money must come from the German government, German taxpayers must fork it over. A justification must be found for blackmailing German taxpayers. The concept of collective guilt is the ideal lever for separating Germans from their cash. Every single German is part of the collective; therefore, every single German is guilty. Voila!

The Falsity of Social Wholes

In The Counter-Revolution of Science (1952), Nobel laureate F.A. Hayek meticulously traced the pedigree of social wholes back to their roots. He sketched the life and intellectual career of Saint-Simon and his disciple Auguste Comte. Hayek then carefully exposed the fallacies behind the holistic method and explained why the unit of analysis in the social sciences must be the individual human being.

Holistic concepts like “the country” are abstract concepts that have no concrete referent because they are not part of the data of experience for any individual. Nobody ever interacts directly with “the country,” nor does “the country” ever interact directly with any other “country.” The only meaning possible for “the country” is the sum of all the individual human beings that comprise it, and the only possible theoretical validity for social wholes generally arises when they are legitimately constructed from their individual component parts. Indeed, Hayek views one role for social scientists as the application of this “compositive” method of partial aggregation as a means of deriving theories of human interaction.

The starting point, though, must be the individual – and theory can proceed only as far as individual plans and actions can be summed to produce valid aggregates. The left-wing historical modus operandi has reversed this procedure, beginning with one or more postulated wholes and deriving results, sometimes drawing conclusions about individual behavior but more often subsuming individuals completely within a faceless mass.

An example may serve to clarify the difference in the two approaches. The individualist approach, common to classical and neoclassical economics, is at home with the multifarious differences in gender, race, income, taste, preferences, culture and historical background that typify the human race. There is only one assumed common denominator among people – they act purposefully to achieve their ends. (For purposes of simplicity, those ends are termed “happiness.”) Then economic theory proceeds to show how the price system tends to coordinate the plans and behavior of people despite the innumerable differences that otherwise characterize them.

In contrast, the aggregative or holistic theory begins with certain arbitrarily chosen aggregates – such as “blacks.” It assumes that skin color is the defining characteristic of members of this aggregate; that is, skin color determines both the actions of the people within the aggregate and the actions of non-members toward those in the aggregate. The theory derived from this approach is correct if, and only if, this assumption holds. The equivalent logic holds true of other aggregates like “women,” “labor,” et al., with respect to the defining characteristic of each. Since this basic assumption is transparently false to the facts, holistic theories – beginning with Saint-Simonian socialism, continuing with Marxism, syndicalism and the theories of Fourier, the Fabian socialists, Lenin, Sombart, Trotsky, and the various modern socialists and Keynesians – have had to make numerous ad hoc excuses for the “deviationism” practiced by some members of each aggregate and for the failure of each theory.

The Hans Lipschis Case

Is it proper in principle that Hans Lipschis, a former employee of Auschwitz and now ninety-three years old, be repatriated to Germany from the U.S. and tried as an accessory to the murder of 300,000 inmates of the notorious World War II death camp? Yes. The postwar tribunals, notably at Nuremberg, reaffirmed the principle that “following orders” of duly constituted authority is not a license to aid and abet murder.

Lipschis’s defense is that he was a cook, not a camp guard. But a relatively new legal theory, used to convict another elderly war-crimes defendant, John Demjanjuk, is that the only purpose of camps like Auschwitz was to inflict death upon inmates. Thus, the defendant’s presence at the camp as an employee is sufficient to provide proof of guilt. Is this theory valid? No. A cook’s actions benefitted the inmates; a guard’s actions harmed them. If guards refused to serve, the camps could not have functioned. But if cooks refused to serve, the inmates would have died of starvation.

Verdicts such as that in the Demjanjuk case were undoubtedly born of the extreme frustration felt by prosecutors and men like Simon Wiesenthal and other Nazi hunters. It is almost beyond human endurance to have lived through World War II and then be forced to watch justice be cheated time after time after time. First the leading Nazis escaped or committed suicide. Then some of them were recruited to aid Western governments. Then some were sheltered by governments in South America and the Middle East. Over time, attrition eventually overtook figures such as Josef Mengele. Occasionally, an Adolf Eichmann was brought to justice – but even he had to be kidnapped by Israeli secret agents before he could be prosecuted. Now the job of legally proving actual criminal acts committed by minor functionaries fifty, sixty or seventy years after the fact becomes too difficult. So we cannot be surprised when desperate prosecutors substitute legal fancies for the ordinary rules of evidence.

Nevertheless, if the prosecution cannot prove that Lipschis committed actual crimes, then he must be acquitted. This has nothing to do with his age or the time lapse between the acts and the trial. Any other decision is a de facto application of the bogus principle of collective guilt.

Shinzo Abe and Guilt for Japanese Aggression in World War II

Japanese Prime Minister Abe is a classic politician. Like the Roman god Janus, he wears two faces, one when speaking abroad to foreign audiences and another when seeking reelection by domestic voters. His answers to questions about whether he was repudiating the stance taken by a previous Prime Minister in 1996 – that Japan was indeed guilty of aggression for which the Japanese government formally apologized – were delicately termed “equivocal” by the U.S. magazine U.S. News and World Report. That is a euphemism meaning that Abe was lying by indirection, a political tactic used by politicians the world over. He wanted his answer to be interpreted one way by Japanese voters without having to defend that interpretation to the foreign press.

Abe’s behavior was shameful. But that has absolutely nothing to do with the question of Japanese guilt for war crimes committed during and prior to World War II. That guilt was borne by specific individual Japanese and established by the Tokyo war-crimes tribunal. Indeed, one government spokesman eventually admitted this in just those words, albeit grudgingly, after Abe’s comments had attracted worldwide attention and criticism.

The implications of this are that Japanese today bear no “collective guilt” for the war crimes committed by previous Japanese. (It would be wrong to use the phrase “by their ancestors,” since presumably few Japanese today are related by blood to the war criminals of seventy or eighty years ago.) The mere coincidence of common nationality does not constitute common ancestry except in the broad cultural sense, which is meaningless when discussing moral guilt. Are we really supposed to believe, for example, that the surviving relatives of Jesse James or Billy the Kid should carry around a weighty burden of guilt for the crimes of their forebear? In a world where the lesson of the Hatfields and McCoys remains unlearned in certain precincts, this presumption seems too ridiculous for words.

Similarly, the fact that Japanese leaders in the 1920s, 30s and 40s were aggressively militaristic does not deny Japanese today the right to self-defense against a blatantly aggressive Chinese military establishment.

Much is made of Abe’s unwillingness to acknowledge the “comfort women” – women from Korea, China and other Asian nations who were held captive as prostitutes by Japanese troops. Expecting politicians to behave as historians is futile. If Japanese war criminals remain at large, apprehend and indict them. If new facts are unearthed about the comfort women or other elements of Japanese war crimes, publish them. But using these acts as a club against contemporary Japanese leaders is both wrong and counterproductive.

Besides, it’s not as if no other ammunition was available against Abe. He has followed Keynesian fiscal policies and a monetary policy of quantitative easing since becoming prime minister. These may not be crimes against humanity, but they are crimes against human reason.

Macro vs. Micro

Academic economics today is divided between macroeconomics and microeconomics. The “national economy” is the supposed realm of macroeconomics, the study of economic aggregates. But as we have just shown, it is the logic of individual responsibility that actually bears on the issue of war crimes committed by the nations of Germany and Japan – because the crimes were committed by individuals, not by “nations.”

One of the most valuable lessons taught by classical economic theory is that the unit of analysis is the individual – in economics or moral philosophy.

DRI-315 for week of 4-20-14: Is GDP NDG in the Digital Age?

An Access Advertising EconBrief:

Is GDP NDG in the Digital Age?

For years, we have heard the story of stagnant American wages, of the supposed stasis in which the real incomes of the middle and lower class are locked while the rich get richer. Various sophisticated refutations of this hypothesis have appeared. Households have been getting smaller, so the fact that “household income” is falling reflects mainly the fact that fewer people are earning the incomes that comprise it. “Wages” do not include the (largely untaxed) benefits that have made up a steadily larger share of workers’ real incomes ever since World War II.

But there is something else going on, something more visceral than statistics, that leads us to reject this declinism. It is the evidence of our own senses, our eyes and ears. As we go about our daily lives, neither we nor the people around us exhibit the symptoms of a people growing materially worse off.

For over thirty years, we have been forsaking the old broadcast trinity of network television stations, at first in favor of cable television and recently for a broadening array of alternative media. For over twenty years, our work and home lives have been dominated by desktop computers that have revolutionized both. For over ten years, an amazing profusion of digital products has taken over the way we live. Cell phones, smart phones, tablets, pads and other space-age electronic wonders have shot us out of a consumer cannon into a new world.

Can it really, truly be that we are worse off than we were before all this happened? As the late John Wayne would say if he were here to witness this phenomenon: “Not hardly.”

The pace of this technological revolution has not only been too fast for most of us to stay abreast of it. It has left many of our 20th century institutions blinking in the dust and gasping for breath. Mainstream economic theory and national income accounting, in particular, are trying to gauge the impact of a 21st-century revolution using the logic and measurement tools they developed in the first half of the 20th century.

The Case Study of Music

Music was one of the great consumer success stories of the 20th century. Thomas Edison’s invention of the phonograph paved the way for the recording of everything from live artistic performances to studio recordings of musicians and singers to the use of recorded sound tracks for motion pictures. The recordings themselves were contained on physical media that ranged from metal discs to vinyl to plastic. At first, these “records” were sold to consumers and played on phonographs. Sales were in the hundreds of millions. Artists included some of the century’s most visible and talented individuals. The monetary value of these sales grew into billions of dollars.

Since recordings were consumer goods rather than capital goods, sales of records were recorded in the national income and product accounts. Or rather, the value added in the final, or retail, transaction was included. The value-added style of accounting was developed with the inauguration of the accounts in the late 1930s and early 40s in order to do three things: (1) show activity at the various stages of production; (2) highlight the new production of consumption goods each year, reflecting the fact that the end-in-view behind all economic activity is consumption; and (3) include only the additional value created at each stage, so as to avoid double-counting.
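A minimal numerical sketch may make the value-added principle concrete. The stages and dollar figures below are invented purely for illustration; they are not drawn from the national accounts.

```python
# Hypothetical stages of production for a record album (illustrative numbers only).
# Each stage buys inputs from the previous stage and sells its output onward;
# "value added" is sale price minus the cost of purchased inputs.
stages = [
    ("raw materials (vinyl, sleeve)", 0.00, 1.00),   # (name, purchased inputs, sale price)
    ("pressing plant",                1.00, 3.00),
    ("record label / distributor",    3.00, 6.00),
    ("retail store",                  6.00, 10.00),
]

total_sales = sum(sale for _, _, sale in stages)                    # naive sum double-counts
total_value_added = sum(sale - inputs for _, inputs, sale in stages)

print(f"Sum of all transactions: ${total_sales:.2f}")         # $20.00 -- overstates output
print(f"Sum of value added:      ${total_value_added:.2f}")   # $10.00 -- equals the retail price
```

Only the final $10 retail price – the sum of the value added at every stage – enters the accounts, no matter how many intermediate transactions precede it.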

As the 20th century came to a close, however, record albums were replaced by small audio discs that could be played on more compact devices. And these were soon supplanted by computers – that is, the playing medium became a computer and the music itself was housed within a computer file rather than a substantial physical object. As technology advanced, in other words, the media grew smaller and less substantial. But the message itself was unaffected; indeed, it was even improved.

How do we know that the value people derive from music has not been adversely affected by this transition to digitization? In The Second Machine Age, authors Erik Brynjolfsson and Andrew McAfee consider the question at length. In terms of physical units, sales of music have fallen off the table. Just in the years 2004-2008, they fell from roughly 800 million units to less than 400 million units – a decline of over 50% in four years! And the total revenue from sales of music fell 40% from $12.3 billion to $7.4 billion over the same period. By the standards we usually apply to business, this sounds like an industry in freefall.
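As a quick check on the arithmetic implied by those figures (which are taken from the passage above, not recomputed from any other source):

```python
# Unit sales and revenue figures for 2004-2008 as cited above.
units_2004, units_2008 = 800e6, 400e6        # "roughly 800 million" to "less than 400 million"
revenue_2004, revenue_2008 = 12.3e9, 7.4e9   # dollars

unit_decline = (units_2004 - units_2008) / units_2004
revenue_decline = (revenue_2004 - revenue_2008) / revenue_2004

print(f"Unit decline:    {unit_decline:.0%}")     # 50% (over 50% if 2008 units are below 400 million)
print(f"Revenue decline: {revenue_decline:.0%}")  # roughly 40%
```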

In this case, though, those standards are misleading. During that same time span, the total unit-volume of music purchased still grew when purchases of digitized music were factored in. And acquisitions of music free of charge by various means swelled the total far beyond that. One of the things economists are best at is analyzing non-traditional markets, which is why Joel Waldfogel of the University of Minnesota was able to infer that the quality of music available to consumers has actually increased in the digital era. Today, anybody with a smartphone can access some 20 million songs via services like Spotify and Rhapsody. For those of us who recall the days of LPs and phonograph needles, the transition to today has been dizzying.

But the economics of the digital age have driven prices through the floor. As Brynjolfsson and McAfee observe, it is the same process that has driven the newspaper business to the wall and its readers online; the same one that has driven classified-advertising from newspapers to Craigslist; the same one that impels us to share photos on Facebook rather than buying prints for friends and family. “Analog dollars,” they conclude, “are becoming digital pennies.”

This creates an unprecedented marketplace anomaly. Measured by the value it creates for human beings, which is how economists want to measure it, the music industry is booming. But measured in dollars’ worth of marketplace transactions, which is how economists are currently able to measure it, the music industry is declining rapidly.

GDP RIP?

If the music industry were a singularity, we might treat it as a mere curiosity. It is not, of course; the gap between price/quantity product and value created yawns wide across the spectrum of industry. “By now, the number of pages of digital text and images on the Web is estimated to exceed one trillion…children with smartphones today have access to more information in real time via the mobile web than the President of the United States had twenty years ago. [!] Wikipedia alone claims to have over fifty times as much information as Encyclopedia Britannica, the premier compilation of knowledge for most of the twentieth century.”

“…Bits are created at virtually zero cost and transmitted almost instantaneously worldwide. What’s more, a copy of a digital good is exactly identical to the original… Because they have zero price, these services are virtually invisible in the official statistics. They add value to the economy, but not dollars to GDP… When a business traveler calls home to talk to her children via Skype, that may add zero to GDP, but it’s hardly worthless. Even the wealthiest robber baron would have been unable to buy this service [in the 19th century]. How do we measure the benefits of free goods or services that were unavailable at any price in previous eras?”

This understates the case. As Brynjolfsson and McAfee acknowledge, most of the new digital services substitute for existing services whose sales contribute to GDP. Thus, the digital bonanza actually lowers measured GDP at the same time that our well-being rises. In economic jargon, the effect on GDP’s function as index of national welfare is perverse.

This leads many people, including these authors, to the conclusion that GDP is no longer an adequate measure of national output. If this is true, it makes our monthly, quarterly and annual preoccupations with the growth rate of GDP seem pretty silly. The government agency whose task is the compilation of economic statistics is the U.S. Bureau of Economic Analysis. Its definition of the economy’s “information sector” aggregates sales of software, publishing, movies, audio recordings, broadcasting, telecommunications, and data processing and information services. These sales account for about 4% of measured GDP today. Yet we are commonly understood to be chest-deep in a new “economy of information” that is replacing the economy of tangible goods and services. Either this perception or that 4% metric is wrong; the latter seems vastly more probable.

What’s more, the irrelevance of GDP increases by the nanosecond.

New Products

Of course, not all digital products and services are substitutes for existing counterparts. Some of them are genuinely new. If these are similarly hard to incorporate in GDP, the distortion may be only half as great as that described above. But the digital revolution has displayed a propensity for creating things that were unknown heretofore but that soon became necessary accoutrements of daily life.

Longtime macroeconomist and textbook author Robert Gordon estimated the value of new goods and services added but missed by GDP at about 0.4% of GDP. That may not sound like much, but since the long-term average annual rate of productivity growth is around 2%, it would mean that we are overlooking 20% of annual productivity.
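The arithmetic behind that 20% figure is simple enough to restate; the sketch below assumes only the two numbers just cited.

```python
# Gordon's estimate of new goods and services missed by GDP, set against trend productivity growth.
missed_output_share = 0.004      # about 0.4% of GDP per year
productivity_growth = 0.02       # long-term average annual productivity growth, roughly 2%

share_overlooked = missed_output_share / productivity_growth
print(f"Share of annual productivity growth overlooked: {share_overlooked:.0%}")  # 20%
```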

GDP and Investment: The Bad News Gets Worse

GDP is failing because it neglects to measure the tremendous increases in consumption and well-being conferred by the digital age. But GDP also measures investment, or purports to. Are its failings on the consumption side mitigated by its performance with investment?

No, they are magnified. The production of digital goods and services is heavily dependent on intangible assets rather than the familiar plant and equipment that are the focus of traditional investment. Brynjolfsson and McAfee identify four categories of these intangibles: intellectual property, organizational capital, user-generated content and human capital. It comes as no surprise to find that the measurement of these assets largely eludes GDP as well.

Intellectual property encompasses any creation of the human mind to which legal ownership can be attached. Patents and copyrights form the backbone of this category. A great deal of spending on research and development (R&D) constitutes investment in intellectual property.

Yet R&D has long been recognized as almost impossible to measure accurately, because only its cost is transparent, while the value (i.e., the capital) it creates often escapes measurement.

Organizational capital is an even broader concept intended to capture the value inhering in brands, processes, techniques and conceptual structures owned by particular businesses. This category long predates the digital age but is epitomized by companies like Apple, whose brand and unique corporate style complement its portfolio of intellectual property to create perhaps the world’s most productive company. Accountants have long sought to put a price tag on things like “good will” and “brand name.” We have observed that the transition to a computer-savvy work force has necessitated investment in procedures and processes far greater than the initial spending on the computer hardware and software – spending that doesn’t show up in the national income accounts as investment.

User-generated content is a true digital innovation. Facebook, Twitter, YouTube, Pinterest, Instagram, Yelp and countless other websites are largely created by their users. The value of this approach is both undeniable and subjective, as anybody who has ever previewed a restaurant on Yelp or planned a vacation with TripAdvisor can testify. The feedback generated by these sites provides an object lesson in the generation of information – the kind of information that economists had to assume that people already knew because we didn’t know how markets could make it available to them. Now we do.

Human capital was a concept invented and popularized by economists Theodore Schultz and Gary Becker decades before the Internet existed. The talents, skills and training that we receive make us better productive “machines,” which inspired the analogy with physical capital.

How important are these intangible assets in the modern economy? Nobody knows with certainty, but – as always – economists have made educated guesses. Brynjolfsson and McAfee estimate the value of organizational assets as some $2 trillion. The preeminent theorist of investment, Dale Jorgenson, estimated that human capital is worth 5-10 times as much as the stock of all physical capital in the U.S. Investment in R&D has been estimated at roughly 3% of GDP in recent decades.

The degree of distortion in GDP numbers – specifically in measures of productivity, which compares growth in inputs and output – is harder to gauge in this case than in the consumption example. Some intangible assets, like R&D and human capital, are longtime thorns in the sides of statisticians; their measurement has always been bad and may be no worse now than before. In some cases, the distortions in investment may offset those in consumption, so that the measure of productivity may be accurate even though the numerator and denominator of the ratio are inaccurate. But the elements most closely associated with the digital revolution, such as user-generated content, impart a huge downward bias to measured productivity in the national income accounts.

A New, Improved GDP?

Economists and other commentators have done a good job of diagnosing the havoc wreaked on GDP by the digital revolution. Alas, they have rested on those laurels. In the “solutions and policy proposals” section of their work, they have fallen back on the tried and trite. GDP was a sibling of macroeconomics; the economic logic underlying the two is the same, with the operative word being “lying.” Macroeconomists are loath to repudiate their birthright, so their reflex is to cast about for ways to mend the measurement holes in GDP rather than abandon it as a bad job. Hence the rosy glow cast by Brynjolfsson and McAfee over nebulous concoctions like the “Social Progress Index” and the “Gallup-Healthways Well-Being Index.” As for Bhutan’s touted “Gross National Happiness” index, the less said about this laughable fantasy (treated in a previous EconBrief), the better.

The authors cite the comments of Joseph Stiglitz, whom they call “Joe” to profit by the implied familiarity with a Nobel laureate: “…Changes in society and the economy may have heightened the problems at the same time that advances in economics and statistical techniques may have provided opportunities to improve our metrics.” The “improvements” don’t seem to have included the ability to stop the scandalous misuse of the concept of “statistical significance” that has plagued the profession for many decades.

In fact, GDP has been known to be a failure almost since inception. Introductory economics textbooks routinely drill students in the shortcomings of GDP as a “welfare index” by listing a roster of flaws that predate the digital age, the Internet and computers. It has ignored the value of household services (predominantly provided by women), ignored the value created by secondary transactions in used goods of all kinds, undervalued services and thrown up its figurative hands when confronted by non-market transactions of all kinds. Its continued use has been a grim tribute to Lord Kelvin’s dubious dictum that “science is measurement,” the implication being that measuring badly must be better than not measuring at all.

What’s more, the blame cannot be laid at the feet of economic theory. It is certainly true that the digital age has brought with it a veritable flood of “free” goods – seemingly in contradiction with Milton Friedman’s famous aphorism that “there is no such thing as a free lunch.” Hearken back to Brynjolfsson and McAfee’s words that “bits are created at virtually zero cost.” A fundamental principle – perhaps the fundamental principle – of neoclassical microeconomics is that price should equal marginal cost, so that the value placed on an additional unit of something by consumers should equal its (opportunity) cost of production. When marginal cost equals zero, there is nothing inherently perverse about a price approaching zero. No, the laws of economics have not been suspended on the Internet.

Careful comparison of the age-old flaws of GDP and its current failure to cope with the challenges posed by digital innovation reveals a common denominator. Both evince a neglect of real factors for lack of a monetary nexus. The source of this insistence upon monetary provenance is the Keynesian economic theory to which the national income accounts owe their origin. Keynesian theory dropped the classical theory of interest in favor of a superficial monetary theory of liquidity preference. That theory is now proving bogus, as witness the failure of Federal Reserve interest-rate policies since the 1960s. Keynesian theory gives spending pride of place among economic activities and relegates saving and assets to a subordinate role. Indeed, the so-called “paradox of thrift” declares saving bad and spending good. No wonder, then, that the national income accounts fail to account for assets and capital formation in a satisfactory manner.

Instead of tinkering around the margins with new statistical techniques and gimmicks when they have not even mastered basic statistical inference, economists should instead rip out the rotting growth root and branch. Reform of macroeconomics and reform of the national income accounts go hand in hand.

End the Reign of GDP

The digital age has merely exposed the inherent flaws of GDP and widened its internal contradictions to the breaking point. It is time to dump it. The next measure of national output must avoid making the same mistakes as did the founders of the national income accounts nearly 80 years ago.

The next EconBrief will outline one new proposal for reform of the national income accounts and explain both its improvements and shortcomings.

DRI-248 for week of 1-26-14: Economics as Movie ‘Spoiler’: Some Famous Cases

An Access Advertising EconBrief:

Economics as Movie ‘Spoiler’: Some Famous Cases

Motion pictures evolved into the great popular art form of the 20th century. In the 21st century, many popular cultural references derive from movies. One of these is the “spoiler” – prematurely revealing the ending of a book, play, movie or presentation of any kind.

Economists sometimes experience a slightly different sort of “spoiler.” Their specialized understanding often defeats the internal logic of a presentation, completely spoiling the author’s intended effect. Movies are especially vulnerable to this effect.

The casual perception is that our attitude toward movies is distorted by the high quotient of improbably beautiful and talented people who populate them. While it is true that physical beauty has always been highly prized by Hollywood, it is also true that plain or even ugly people like Wallace Beery, Marie Dressler, Jean Gabin and Rodney Dangerfield have become champions of the movie box office. The locus of unreality in movies has actually been the stories told.

Movies are best regarded as fairy tales for adults. They over-emphasize dramatic conflict and exaggerate the moral divide between protagonist and antagonist. It is difficult to find a real-world referent to the “happy ending” that resolves the typical movie. Protagonists are all too often “heroes” whose actions exceed the normal bounds of human conduct. In recent years, this tendency has escalated; veteran screenwriter William Goldman has complained that movie protagonists are now not heroes but “gods” whose actions exceed the bounds of physics and other natural laws.

In this context, it is hardly surprising that movie plots have sometimes ignored the laws of economics in order to achieve the stylized dramatic effects demanded by the medium. Since public knowledge of economics is, if anything, less well developed than knowledge of natural science, these transgressions have generally gone unremarked. Indeed, the offending movies are often praised for their realism and power. Thus, it is worthwhile to correct the mistaken economic impressions left by the movies, some of which have found their way into popular folklore.

In each of the following movies, the major plot point – the movie’s resolution – rests on an obvious fallacy or failure to apply economic logic.

Scrooge (U.S. title: A Christmas Carol) (1951)

We know the plot of this most classic of all Christmas tales by heart. Victorian businessman Ebenezer Scrooge, famed miser and misanthrope, abhors the spirit of Christmas. He is visited by three ghosts, emblematic of his youthful past, his empty present life and the lonely, friendless end that awaits him in the future. Their guidance awakens him to the waste of his single-minded pursuit of material gain and rejection of personal affection and warmth. He realizes the cruelty he has visited upon his clerk, the good-hearted family man, Bob Cratchit. Most of all, he keenly regrets the fate of Cratchit’s crippled son, Tiny Tim, who seems doomed by Cratchit’s poverty.

Having witnessed Scrooge’s emotional reformation, the audience is now primed for the culminating moment. On the day after Christmas, Bob Cratchit shows up at Scrooge’s office, a bit late and encumbered by holiday festivities. Fearfully, he tiptoes to his desk, only to be brought up short by Scrooge’s thunderous greeting. Expecting a verbal pink slip, Cratchit receives instead the news that Scrooge is doubling his wage – and that their working relationship will be hereafter cordial. Tiny Tim’s future is redeemed, and the audience has experienced one of the most cathartic moments on film.

Unless, that is, the viewer happens to be an economist – in which case, the reaction will be a double take accompanied by an involuntary blurt like “I beg your pardon?” For this is a resolution that simply makes no sense. In order to understand why, the first thing to realize is that the scriptwriter (translating Charles Dickens’ timeless story to the screen) is asking us to believe that Bob Cratchit has heretofore been working for half of what Scrooge is now proposing to pay him.

In the 19th century, historical novelists like Charles Dickens played the role played by filmmakers in the 20th century. They brought history alive to their audiences. Ideally, they stimulated further study of their subject matter – indeed, many famous historians have confessed that their initial stimulus came from great storytellers such as Dickens and Dumas. But many readers searched no further than the stories told by these authors for explanations of the course taken by events. Dickens was an exponent of what the great black economist Thomas Sowell called “volitional economics.” In this case, for example, the wage paid by Scrooge and received by Cratchit ostensibly depended on Scrooge’s will or volition, and nothing else. No role existed for a labor market. Cratchit was not a partisan in his own cause, but rather a passive pawn of fate.

This is not a theory likely to commend itself to an economist. Scrooge and Cratchit are working to produce services purchased by their customers. Who are these? Well might you ask, for neither Dickens nor the filmmakers chose to clutter up the narrative with such extraneous considerations. Yet it is this consumer demand that governs the demand for Scrooge’s output, which in turn values the productivity of Cratchit’s work. In a competitive labor market, the market wage will gravitate toward the marginal value product of labor; i.e., the value of Cratchit’s product at the margin translated into money with the aid of the market price for Scrooge’s services. And in crowded London, there is no doubt about the competitive demand for the low-skilled labor provided by Bob Cratchit. That is what attracted the Bob Cratchits of the world to London in the first place during the Industrial Revolution.

Two possibilities suggest themselves. Either Bob Cratchit was working for half of his marginal value product previously and is only now being elevated to that level, or Scrooge is now proposing to pay Cratchit a wage equal to twice Cratchit’s marginal value product. The first possibility requires us to believe not only that Cratchit was and is a complete idiot, but that he – not Scrooge, as Dickens clearly implies – is responsible for Tiny Tim’s tenuous medical situation. After all, all Cratchit had to do was step outside Scrooge’s firm and wander off a block or two in order to better his circumstances dramatically and pay Tiny Tim’s medical tab without having to bank on Scrooge’s miraculous reformation. Cratchit was guaranteed a job at slightly less than double his then-current wage by simply underbidding the market wage slightly. But he inexplicably continued to work for Scrooge at half the wage his own productivity commanded.

Alternatively, consider possibility number two. Scrooge is now going to pay Cratchit a wage equal to twice his (Cratchit’s) marginal value product. If Scrooge insists on raising his price commensurate with this wage hike, he will go out of business. If he keeps his price the same, he will now be working for much less net income than all the other business owners in his position. (See below for the implications of this.)
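A stylized calculation may make the two possibilities concrete. The numbers below are invented purely for illustration – neither Dickens nor the film supplies Cratchit’s actual wage or productivity.

```python
# Purely hypothetical numbers. Suppose Cratchit's marginal value product (MVP) --
# the market value of his weekly output at the margin -- is 30 shillings.
mvp = 30.0

# Possibility 1: Scrooge has been paying half of MVP and now raises the wage to MVP.
old_wage = mvp / 2           # 15 shillings -- any rival employer would pay nearly 30
new_wage = 2 * old_wage      # 30 shillings -- merely the competitive wage, no generosity required

# Possibility 2: Cratchit was already earning the competitive wage (MVP),
# and the doubled wage now exceeds the value of his output.
competitive_wage = mvp
doubled_wage = 2 * competitive_wage   # 60 shillings -- a loss of 30 per week to Scrooge

print(f"Possibility 1: wage rises from {old_wage} to {new_wage}, equal to the MVP of {mvp}")
print(f"Possibility 2: wage rises to {doubled_wage}, which is {doubled_wage - mvp} above MVP")
```

Under the first set of numbers, Cratchit needed only to cross the street to earn the higher wage; under the second, Scrooge is paying for output he never receives.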

There is no third possibility here. Either Cratchit was (is) crazy or Scrooge is. And either way, it completely upsets Dickens’ cozy suggestions that all’s right with the world, Scrooge has restored the natural order of things and everybody lived happily ever after.

Of course, Scrooge may have accumulated considerable assets over the course of his life and business career. He may choose to make an ongoing gift to Cratchit in the form of a wage increase, as opposed to a bonus or an outright transfer of cash. But it is important to note that this is not what Dickens or the filmmakers imply. The tone and tenor of Dickens’ original story and subsequent films adapted from it unambiguously suggest that Scrooge has righted a wrong. He has not committed a random act of generosity. In other words, Dickens implies – absurd as it now clearly seems – that possibility number one above was his intention.

It is clear to an economist that Dickens has not provided a general solution to the problem of poverty in 19th century England. What if Scrooge were the one with the sick child – would his acquisitive ways then be excusable? Dickens makes it clear that Scrooge’s wealth flows directly from his miserliness. But if miserliness produces wealth and good-heartedness promotes poverty, economic growth and happiness are simply mutually exclusive. After all, the message of the movie is that Scrooge promises to reform year-round, not just one day per year. Henceforward, when approached by collectors for charity, he will refuse not out of meanness but out of genuine poverty, his transformation having stripped him of the earning power necessary to contribute to charity.

In actual fact, of course, Scrooge never existed. Neither did Cratchit. And they are not reasonable approximations of actual 19th-century employers or workers, either. But these figments of Dickens’ imagination have been tragically influential in shaping opinions about the economic history of Victorian England.

The Man in the White Suit (1951)

This comedy from England’s famed Ealing Studios (the world’s oldest movie studio) is justly famous, but for the wrong reasons. It highlights the inefficiency of British socialism and the growing welfare state, but its fame derives from its plot highlight. Alec Guinness plays an inventor who worms his way into the R&D division of a local textile business, where he develops a fabric so durable that it will never wear out. Instead of gaining him the wealth and immortality he craves, the invention earns him the opprobrium of the textile owners, who fear that the fabric will ruin them by cutting replacement sales to zero. They block his efforts at production and the film ends when his formula is revealed to contain a flaw – which he may or may not ever get the chance to de-bug, since he is now a pariah in the industry.

The film is often cited as an example of how big business prevents new technology from empowering consumers – that is, it is cited as if it were a factual case study rather than a fictional movie. Actually, it is a classic example of the failure to deploy economic logic.

Would a textile firm find it profitable to produce an “indestructible” fabric of the sort depicted in the film? Certainly. The firm would achieve a monopoly in the supply of fabric and could obtain finance to expand its operations as necessary to meet the immediate demand. In practice, of course, such a fabric would not really be indestructible in the same sense as, say, Superman’s costume. It would be impervious to normal wear but would suffer damage from tearing, fire, water and other extreme sources. Changes in fashion would also necessitate replacement production. Nevertheless, we can safely grant the premise that the invention would drastically reduce the replacement demand for fabric. But that would not deter an individual firm from developing the invention – far from it.

The film depicts textile firms striving in combination to buy out the inventor. Perhaps overtures of that kind might be made in reality. They would be doomed to failure, though, because in order to afford to pay the inventor’s price the firms would have to compensate the inventor for the discounted present value of the monopoly profits available in prospect. But in order to raise an amount of money equal to those monopoly profits, the firms would themselves have to be monopolists willing to mortgage their future monopoly profits. Textile companies may enjoy legislative protection from foreign competition in the form of tariffs and/or quotas, but they will still not possess the kind of market power enabling them to do this, even if they were so predisposed. Thus, both of the movie’s key plot points are undermined by economic logic.
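The present-value point can be sketched with hypothetical numbers; the profit figures, discount rate and horizons below are assumptions chosen only to show the order-of-magnitude gap, not estimates of any actual textile market.

```python
# Hypothetical comparison: what the textile firms could raise versus what the
# inventor's monopoly is worth. All figures are invented for illustration.
def present_value(annual_profit, discount_rate, years):
    """Discounted present value of a constant annual profit stream."""
    return sum(annual_profit / (1 + discount_rate) ** t for t in range(1, years + 1))

r = 0.08                      # assumed discount rate
competitive_profit = 1.0      # normal annual profit of one textile firm (index units)
monopoly_profit = 10.0        # annual profit of the sole supplier of the indestructible fabric

firm_value = present_value(competitive_profit, r, 30)       # roughly 11 index units
invention_value = present_value(monopoly_profit, r, 15)     # roughly 86, even over a shorter horizon

print(f"Value of a competitive firm:     {firm_value:.1f}")
print(f"Value of the monopoly invention: {invention_value:.1f}")
```

Even over a shorter horizon, the monopoly profit stream dwarfs the value of a competitive firm, which is why the firms in the film could not plausibly raise the inventor’s price.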

This reasoning explains why there is so little proof for longstanding allegations that large corporations buy off innovators. While it will often be profitable to acquire competitors, it will normally be prohibitively expensive to buy and suppress revolutionary inventions. The value of a competitive firm reflects its competitive rate of return. The value of a revolutionary innovation reflects the value of a (temporary) monopoly, heavily weighted toward the relatively near future.

The Formula (1980)

The Formula was one of the most eagerly awaited movies of its day because it starred two of the most legendary stage and screen actors of all time, Marlon Brando and George C. Scott. It also boasted a topical plot describing a conspiracy to suppress a secret formula for producing synthetic gasoline. Who was behind the conspiracy? None other than “the big oil companies” – in the 1970s and 80s, as today, the oil companies were periodically trotted out as public whipping boys for the adverse effects of public policies on energy prices.

The film begins during World War II with the escape into Switzerland of a German military officer carrying secret documents. In the present day, Scott plays a homicide policeman investigating the grisly murder of his former supervisor. The decedent was working abroad for a large oil company at the time of his death, and his boss (Brando) reveals that his duties included making payoffs to Middle Eastern officials. Scott’s character also learns about the existence of a formula for conversion of coal into petroleum, supposedly developed secretly by German scientists during World War II and used by the Nazis to fuel their war machine.

Scott’s character seeks the killer and the formula for the remainder of the film. Each successive information source is murdered mysteriously after speaking with him. Eventually he learns the formula from its originator, who tells him that the oil companies plan to suppress it until its value is enormously enhanced by the extinction of remaining petroleum reserves. Brando’s character blackmails Scott’s character into relinquishing the formula and the film ends with the understanding that it will be suppressed indefinitely. The world is denied its chance at plentiful oil and the oil companies enforce an artificial oil shortage.

Novelist Steve Shagan also wrote the screenplay, but it should be noted that the version of the film released to theaters was the result of a conflict with director John G. Avildsen. Although no claim was advanced about the veracity of events depicted or information presented, the audience is clearly invited to take the film’s thesis seriously. Alas, history and economics preclude this.

The film makes much of the fact that Germany was able to conduct military operations around the world for a decade despite having no internal source of petroleum and only tenuous external sources. Germany must have had the ability to manufacture synthetic fuels, we think; otherwise, how could she have waged war so long and effectively?

The premise is sound enough. Germany’s oil refineries and synthetic-fuel plants were among the leading military targets of Allied bombing; both crude and refined oil were in critically short supply throughout the war years. And there really was a “formula” for synthetic fuel – or, more precisely, a chemical process. But the film’s conclusion is all wrong, almost banally so.

The Fischer-Tropsch process was invented by two German scientists – not in World War II, but in 1925. It was not secret, but rather a matter of public knowledge. German companies used it openly in the 1930s. During World War II, when Germany had little or no petroleum or refining capability, the process provided about 25% of the country’s auto fuels and a significant share of other fuels as well. After the war, the process traveled to the U.S. and several plants experimented with it. In fact, it is still used sparingly today. Possible feedstocks for conversion into petroleum are coal, natural gas and biomass.

The reason that few people know about it is that it is too expensive for widespread use. Biomass plants using it have gone broke. Natural gas is too valuable for direct use by consumers to waste on indirect conversion into petroleum. And coal conversion wavers on the edge of commercial practicality; just about the time it begins to seem feasible, something changes unfavorably.

In real life – as opposed to reel life – the problem is not that secret formulas for synthetic fuels are being hidden by the all-powerful oil cartel. It is that the open and above-board chemical processes for conversion to synthetic fuel are just too darned expensive to be economically feasible under current conditions.

Erin Brockovich (2000)

Erin Brockovich is the film that sealed the motion-picture stardom of Julia Roberts by earning her an Academy Award for Best Actress. It was based on events in the life of its title character. Erin Brockovich was an unemployed single mother of three who met liability attorney Ed Masry when he unsuccessfully represented her in her suit for damages in a traffic accident. She took a job with his firm interviewing plaintiffs in a real-estate settlement against Pacific Gas & Electric.

In the course of her interviews, Brockovich claimed (and the film portrayed) that she unearthed a laundry list of diseases and ailments suffered by the 634 plaintiffs, who were residents of Hinkley, CA. These included at least five different forms of cancer, asthma and various other complaints. Brockovich was surprised to learn that PG&E had paid the medical expenses of these residents because of the presence of chromium in the drinking water, despite having assured the residents that the water was safe to drink. Eventually, Brockovich interviewed a company employee who claimed that corporate officials at PG&E were aware of the presence of “hexavalent chromium” (i.e., chromium in its +6 oxidation state) in the drinking water and told employees in Hinkley to hide this information from residents. The whistleblower had been told to destroy incriminating documents but kept them instead and supplied them to Brockovich.

The film does everything but accuse the company of murder in so many words. It reports the jury verdict that awarded the Hinkley residents $333 million in damages. (The standard contingency fee to the law firm is 33%.) Brockovich received a $2 million bonus from her delighted boss. The film received a flock of award nominations in addition to Roberts’s Oscar, made a pile of money and got excellent reviews.

However, a few dissenting voices were raised in the scientific community. Scathing op-eds were published in The Wall Street Journal and The New York Times by scientists who pointed out that little or no science backed up the movie’s claims – or, for that matter, the legal case on which the movie was based.

It seems that the only scientific black mark against hexavalent chromium was lung cancer suffered by industrial workers who inhaled the stuff in large quantities. In contrast, the hexavalent chromium in Hinkley was ingested in trace amounts in drinking water. The first law of toxicology (the science of toxicity) is “the dose makes the poison.” Ingestion allows a substance to be attacked by digestive acids and eliminated via excretion; inhalation would permit it to be absorbed by organs like the lungs. Ironically, lung cancer wasn’t among the varieties identified by Brockovich.

What about the lengthy list of cancers grimly recited in the movie? Doesn’t that constitute a prima facie case of wrongdoing by somebody? No – just the reverse. As the scientists pointed out, biological or industrial agents are normally targeted in their effects; after all, they were usually created for some very specific purpose in the first place. So the likelihood of one agent, like hexavalent chromium, being the proximate cause of various diverse cancers is very remote. In any town or city, a medical census covering a reasonable time span will produce a laundry list of diseases like the one Brockovich compiled.

Economics provides equal grounds for skepticism of the movie’s conclusions. The movie imputes both wrongdoing and evil motives to a company. Somewhere within that company, human beings must have harbored the motives and committed the wrongs. But why? The standard motivation behind corporate wrongdoing is always money. The monetary category involved is normally profit. Presumably the imputed rationale would run somewhere along these lines: “Corporate executives feared that admitting the truth would result in adverse publicity and judgments against the company, costing the company profits and costing them their jobs.” But that motivation can’t possibly have applied to this particular case, because PG&E was a profit-regulated public utility.

Public-utility profits are determined by public-utility commissions in hearings. If a utility earns too much profit, its rates are adjusted downward. If it earns too little, its rates are adjusted upward. For over a century, economists have tried but failed to think up ways to get utility managers to behave efficiently by cutting costs. Economists have even argued in favor of allowing utilities to keep profits earned between rate hearings, in the hope that the prospect of retaining those profits would give managers an incentive to cut costs.

But here, according to the filmmakers, PG&E executives were so fanatically dedicated to safeguarding profits the company couldn’t keep anyway that they were willing to knowingly poison their customers. They were willing to risk losing their jobs and going to jail (if their deception was uncovered) in order to protect profits that were never going to be gained or lost in the first place. No economist will swallow this.

If the filmmakers had an explanation for this otherwise insane behavior, they didn’t offer it in the movie. And without a scientific case or an economic motive, it is impossible to accept the film’s scenario of corporate conspiracy at face value. Instead, the likely motivational scenario is that PG&E executives didn’t confess their crimes and beg forgiveness because they had absolutely no scientific reason to think they had committed any crimes. They didn’t warn Hinkley residents about “known dangers” because they didn’t know about any dangers. They didn’t need to admit the presence of chromium in the drinking water because everybody already knew there were trace amounts of chromium in the drinking water. But they certainly weren’t going to advertise the presence of non-existent dangers for fear that somebody would seize the opportunity to make a legal case where none really existed.

Movies are Fairy Tales for Adults

The moral to these cases is that movies are fairy tales for adults. Given that, the absence of economic logic in the movies is not hard to fathom. How much economic logic did we learn from the fairy tales we heard in childhood?

This is not to indict movies – or fairy tales, either. We need them for the emotional sustenance they provide. Fairy tales help cushion our childhood introduction to reality. Movies help us cope with the wear and tear of daily life by recharging our emotional batteries.

But we must never confuse the fairy tale world of movies with the rational world in which we live. Our ultimate progress as a species depends on our reliance on markets, rational choice and free institutions. Of necessity, movies operate according to the visual logic of dramatic action. We expect reel life to liberate us from the conventions of real life and this is why movies seldom make economic sense.

DRI-186 for week of 1-5-14: The Secular Stagnation of Macroeconomic Thought

An Access Advertising EconBrief:

The Secular Stagnation of Macroeconomic Thought

The topic du jour in economic-policy circles is “secular stagnation,” thanks to two recent speeches on that topic by high-powered macroeconomist Lawrence Summers. The term originated just after World War II when Keynesian economists, particularly Alvin Hansen, used it to justify their forecast of the high unemployment and low growth that ostensibly awaited the U.S. after the war.

Now, nearly 70 years later, it is back. In a recent Wall Street Journal op-ed, monetary economist John Taylor likened its re-emergence to a vampire arising from his crypt. There is indeed something ghoulish about the propensity of Keynesian economists to ransack outdated textbooks in search of conceptual support for their latest brainstorm.

The backstory behind secular stagnation is only half the story, though. The other half is the insight it offers into the mindset of its patrons.

The Birth of the Secular Stagnation Hypothesis

As World War II drew to a close, economists gradually turned their attention to a problem that had intermittently occupied them since the late 1930s. The Great Depression had soured the profession on the workings of free markets. The publication of John Maynard Keynes’ General Theory of Employment, Interest and Money had suggested a new framework for economic analysis that placed emphasis on unemployment and its elimination. While war mobilization had made this issue moot, the return of servicemen and readjustment to a peacetime economy brought it back to prominence.

Many Keynesians foresaw a return to mass unemployment and Depression. The leading American exponent, Alvin Hansen, developed a specific hypothesis along those lines. Keynes had posited a simple theory of aggregate consumption: consumption was a stable, linear function of income. These properties implied that, over time, it might become progressively more difficult to maintain full employment.

A numerical example using the simple Keynesian macroeconomic model will clarify this point. Y = real income or output, which is the sum of C (Consumption), I (Investment) and G (net Government spending). Further, C is a linear function of Y; that is, C = a + bY, where the “a” term reflects the influence on Consumption of factors other than real income and “b” (the slope of the Consumption function depicted diagrammatically) is the marginal propensity to consume from additional income acquired. Assume, purely for expository purposes, that a = 50, b = .75, I = 100 and G = 100. If Y = 1000, then C = 50 + .75 (1000) = 800. The influence of technology, which improves from year to year, will cause productivity to increase and output to increase over time, all other things equal. Assume, again purely for illustrative purposes, that this increase is 5%. In that case, the full employment level of income will increase from 1000 to 1050. But C does not increase by 5% to 840; it increases only to 837.50. In order to preserve full employment (according to Keynesian logic), the sum of I and G will have to rise to 212.50 – an increase of 12.50, or 6.25%, over its previous value of 200 – which is more than 5%. Over time, this putative annual shortfall in Consumption would get larger and larger, requiring successively larger doses of I and G to keep us at full employment.
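
As a check on this arithmetic, here is a minimal sketch in Python using only the illustrative values assumed above (a = 50, b = .75, I = 100, G = 100 and 5% output growth); nothing in it goes beyond the example in the text.

```python
# Simple Keynesian model from the example above: Y = C + I + G, with C = a + b*Y.
# All numbers are the purely illustrative values assumed in the text.
a, b = 50, 0.75              # autonomous consumption and marginal propensity to consume
I, G = 100, 100              # investment and net government spending

def consumption(Y):
    return a + b * Y

Y0 = 1000                             # initial full-employment income
print(consumption(Y0))                # 800.0, so C + I + G = 1000 = Y0

Y1 = Y0 * 1.05                        # 5% productivity growth raises full-employment income to 1050
C1 = consumption(Y1)                  # 837.5, not 840: consumption lags income growth
required_I_plus_G = Y1 - C1           # 212.5
growth = (required_I_plus_G - (I + G)) / (I + G)
print(C1, required_I_plus_G, growth)  # I + G must grow 6.25%, faster than the 5% growth of output
```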

Already we can see the germ of logic behind Hansen’s secular stagnation hypothesis, which is that Consumption over time will fall farther and farther behind the level necessary to preserve full employment. (The word “secular” does not reflect its customary meaning of “non-religious or worldly” but rather its technical economic meaning of “a long time series of indefinite duration.”) Underconsumption is a theme dear to the hearts of Keynesian economists. In this case, it depends as a first approximation on the algebraic structure of the simple Keynesian model, in which Consumption is a simple linear function of income (Y).

There was much more to the analysis than this. In principle, Consumption might increase for reasons unrelated to income. But Hansen predicted just the opposite. He believed the primary source of autonomous increases in Consumption was population growth, and he foresaw a sharp decline in U.S. population growth after the war. He was equally pessimistic about increases in autonomous Investment because he thought the highest-returning investments had already been tapped. Thus, by default, government deficit spending was the only possible remedy for progressively worsening unemployment and stagnating economic growth – hence the term “secular stagnation.”

The Gruesome Death of the Secular Stagnation Hypothesis

Alvin Hansen was known as the “American Keynes.” Presumably this was because of the apostolic fervor with which he preached Keynesian gospel. In this case, he shared something else with Keynes: the thoroughness with which history repudiated his ideas.

Hansen predicted population decline. Instead, the U.S. experienced the biggest baby boom in history. Among other effects, this produced an explosion of household investment in consumer durables such as homes, automobiles and appliances. The shortages and government-imposed rationing of World War II had generated a pent-up demand that burst its boundaries in the postwar climate.

Rather than unemployment and depression, the U.S. enjoyed one of its biggest expansions ever in 1946. This eventually created problems when, during the Korean War, the Truman administration preferred to fund the war via money creation rather than employing the borrowing that had financed most defense expenditures during World War II. The result was inflation, which the Administration countered with wage and price controls.

The U.S. had borrowed to the max in its conquest of the Axis powers, with debt climbing to its highest level as a percentage of national output. In his recent book, David Stockman pointed out the important role played by the Eisenhower Administration in paying down this debt and returning a semblance of sanity to federal-government spending.

This combination of private-sector buoyancy and government fiscal retrenchment left no need or room for the Keynesian remedy proposed by Hansen. As the 1950s unfolded, economic theoreticians on all sides of the spectrum delivered the coup de grace to the secular stagnation hypothesis.

In 1957, Milton Friedman presented his “permanent income” hypothesis of consumption spending, which fleshed out the individual utility-maximizing theory of consumer behavior with the picture of a consumer whose spending is governed by an estimation of lifetime or “permanent” income. He or she will tend to dissave by borrowing when young and by drawing down accumulated assets when old, meanwhile accumulating assets via saving in prime earning years. It is not actual or realized income so much as this individualized conception of expected normal income that influences consumption spending.
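
A stylized sketch of that lifecycle pattern follows. Everything in it is invented for illustration (the income path, the zero interest rate and the simplifying assumption that consumption just equals average lifetime income); it is meant only to show consumption tracking permanent rather than realized income.

```python
# Stylized permanent-income/lifecycle sketch (all numbers invented; interest ignored for simplicity).
income = [20] * 10 + [60] * 30 + [0] * 15       # low earnings when young, prime earning years, retirement
permanent_income = sum(income) / len(income)    # average expected lifetime income per year
consumption = [permanent_income] * len(income)  # consumption smoothed at permanent income

saving = [round(y - c, 1) for y, c in zip(income, consumption)]
print(round(permanent_income, 1))               # about 36.4
print(saving[0], saving[20], saving[-1])        # borrows when young, saves in mid-career, dissaves when old
```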

Keynesian Franco Modigliani developed his own theory of “life cycle” consumption, broadly similar to Friedman’s, within the same time frame. Left-wing economist James Duesenberry developed a “relative income” hypothesis stating that consumption was influenced by the consumer’s income relative to that of others. While there were important theoretical and practical differences between the three theories, they all rejected the simple Keynesian linear dependence of consumption on income. And this drove a stake through the heart of the secularly widening gap between consumption and income. The slats had been kicked out from under the secular-stagnation platform.

The secular stagnation hypothesis had already been proved to be a resounding flop in practice. Now it was shown to be wrong in theory as well. Before Keynesian economics had even been adopted on a wholesale basis, it had suffered its first crushing defeat.

The Rise of the Undead: Secular Stagnation Rises from the Crypt

Broadway impresarios sometimes revive past productions, but they invariably choose to revive hit plays rather than flops. Based on its first run, secular stagnation would not seem to be a prime candidate for revival. Nevertheless, Lawrence Summers mounted a new version of the concept and took it out of town for a tryout in two recent speeches, supplemented by comments on subsequent blog posts.

In his first speech, made to the International Monetary Fund Research Council, Summers grappled with the theoretical issues involved in resurrecting Hansen’s ancient bogeyman. Paraphrasing Clemenceau on war and generals, Summers mused that “finance is too important to be left to financiers.” The U.S. quickly recovered from the financial panic of 2008-09, but the ensuing four years brought astonishingly little progress when measured in standard macroeconomic metrics like employment and output growth. Although the term “secular stagnation” has long been neglected by his profession, Summers now finds it “not…without relevance” in understanding our current situation.

If the U.S. suffered a mass power blackout, output would fall precipitously. It would be idiotic for economists to object that electricity constitutes “only 4%” of total output – obviously, its importance is not indicated by its fraction of total output. Finance should be viewed in the same light – as the intermediating, lubricating force that enables production of the bulk of our goods and services. If a power blackout did occur, we would naturally expect restoration of service to be followed by a catch-up period of increased output, rather than the sort of prolonged stagnation we have actually experienced after the financial crisis. So why hasn’t it happened?

Summers’ explanation to the IMF audience was technical – that the “natural rate of interest” is negative; i.e., below zero. “We may need to think about how we manage an economy in which the zero nominal interest rate is a chronic and systemic inhibitor of economic activity, holding our economies back, below their potential.” Summers means that the practical inability to charge negative rates of interest – i.e., to subsidize loans rather than charge money for them – is what is chaining the U.S. economy down.

In his second speech and follow-up blog comments, Summers elaborated on the policy implications of his musings. “Our economy is constrained by lack of demand rather than lack of supply. Increasing our capacity to produce will not translate into increased output unless there is more demand for goods and services.” Of course, this is the old-time Keynesian religion of underconsumption, set to the background music of Peter Allen’s “Everything Old Is New Again.” Secular stagnation has been brought down from the attic, fumigated with a dusting of demographics (the declining U.S. birth rate) to remove the stench of disgrace left by Hansen.

We need to “end the disastrous trends toward less and less government spending and employment each year.” In other words, the problem is not that we overspent and created too much sovereign debt in 2008-09; the problem is that we spent too little – and then cut spending after that. We should replace coal-fired power plants – that will necessitate a huge program of capital spending to keep the power on. Following Keynes, Summers stresses the importance of supporting domestic demand by improving the trade balance.

Just as this program begins to sound suspiciously like a hair of the dog that bit us – or maybe the entire hair coat – Summers removes all doubt. It is “a chimera to rely on regulation” to pop asset bubbles in the face of the monetary excess necessary to underpin his program.

At the close of his first speech, Summers provided the only saving grace with the caveat: “This may all be madness and I may not have this right at all.”

Krugman’s Endorsement of Summers: For This We Need Economists?

Summers’ revival of the secular stagnation hypothesis was the talk of policymaking circles. Half of the talk was probably devoted to wondering what Summers was saying; the other half to wondering why he was saying it. Perhaps trying to be helpful, Summers’ partner in Keynesian crime Paul Krugman weighed in with his own interpretation of Summers’ remarks.

Inevitably, Krugman’s own views crept into his discussion. The result was a blog post that could scarcely be believed even when read. (Readers with broad minds and strong stomachs are referred to “Secular Stagnation, Coalmines, Bubbles, and Larry Summers,” 11/16/2013, on the Krugman archive.)

Krugman begins with an uncharacteristic (and unrepeated) touch of humility. Noting the similarity between his own previous published diagnosis of our economic ills and Summers’ current one, he admits that Summers’ is “much clearer…more forceful, and altogether better.”

According to Krugman, he and Summers both view the U.S. economy as stuck in a “liquidity trap.” This is another Keynesian illustration of market pathology. As Keynes originally described the concept, a liquidity trap existed during an economic depression so intense that monetary policy was rendered impotent. Governments use banks as their tool for creating money; securities sold to the public are snapped up by banks, which in turn use them as the basis for making loans to businesses. But banks cannot force businesses to take out loans. If businesses decide that conditions are so bad that investing is too risky no matter how low the borrowing rate of interest, then monetary policymakers are helpless. In contrast, fiscal policy labors under no such constraint, since the government can always spend money for stimulative purposes. In a liquidity trap, though, monetary policy is likened to “pushing on a string” – a fruitless effort.

Krugman carries this notion further by identifying it with Summers’ evocation of a negative equilibrium interest rate. Investment demand is so weak and the desire to save so strong that the two are equilibrated only when “the” interest rate is below zero. In this climate, Krugman maintains, “the normal rules of economic policy don’t apply…virtue becomes vice and prudence becomes folly. Saving hurts the economy – it even hurts investment thanks to the paradox of thrift.” Krugman hereby drags in Keynesian anachronism #3. The so-called “paradox of thrift” states that the attempt to save more results in less saving because ex ante increases in saving will reduce income and employment, thus preventing the saving that consumers are trying to do, while reducing consumption as well. The only problem with this is that we have actually realized increases in saving and income at the same time, which is diametrically opposite to the effects predicted by the concept.
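
For readers who have never met the concept, here is a bare-bones sketch of the textbook mechanics being criticized (not endorsed) here, using the same simple Keynesian model as earlier with invented numbers: when households try to consume a smaller fraction of income, equilibrium income falls and realized saving stays pinned to the unchanged level of investment.

```python
# Textbook paradox-of-thrift mechanics in the simple Keynesian model (illustration only).
# C = a + b*Y; equilibrium requires Y = C + I, so Y* = (a + I) / (1 - b).
def equilibrium_income(a, b, I):
    return (a + I) / (1 - b)

a, I = 50, 100
for b in (0.75, 0.70):                    # households try to save more (lower propensity to consume)
    Y = equilibrium_income(a, b, I)
    realized_saving = Y - (a + b * Y)     # saving = income minus consumption
    print(round(Y, 1), round(realized_saving, 1))  # income falls from 600 to 500; saving stays at 100
```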

But these are trifles compared to the powerhouse contentions Krugman has coming up. Summers outlined a general program of public spending to increase demand and frankly admitted the futility of suppressing bubbles caused by the money creation necessary to finance the spending. Is Krugman troubled by this? Not merely “no,” but “Hell, no.”

“While productive spending is best, unproductive spending is still better than nothing…this isn’t just true of public spending. Private spending that is wholly or partially wasteful is also a good thing, unless it somehow stores up trouble for the future.” And how could that possibly happen? (See “Europe, Sovereign Debt of; Europe, Financial Crises of; Europe, Bailouts Multiply Across; Europe, Political Protests Blanket.”)

Krugman continues with an example of wasteful spending by U.S. corporations that produced virtually no payoff after three years. “Nevertheless, the resulting investment boom would have given us several years of much higher employment, with no real waste, since the resources employed would have otherwise been idle” [emphasis added]. F.A. Hayek characterized Keynesian economics as the negation of the market, a description well befitting this rationalization. In Krugman’s world, the labor market and relative prices might as well not exist, for all the effect they have. Microeconomics either does not exist or operates on a different plane of existence than the macroeconomic plane on which the statistical construct of aggregate demand wields its decisive influence. For this we need economists?

Krugman now arrives at “the radical part of Larry’s presentation” – as if the foregoing weren’t radical enough! He straightforwardly, even proudly admits what Summers guardedly suggests – that asset bubbles are a good thing. In fact, according to Krugman, U.S. prosperity has been built on bubbles for quite a while. “We now know that the economy of 2003-2007 was built on a bubble.” Krugman is being coy here since he made a celebrated statement in 2002 calling for the Federal Reserve to create a bubble in the housing market. Oddly enough, this attracted almost no attention at the time and has brought him no adverse reaction since then. “You can say the same about the latter part of the 90s expansion; and… about the later years…of the Reagan expansion, which was driven …by runaway thrift institutions and a large bubble in commercial real estate.”

Krugman’s recall of history is curiously defective, especially considering that he was employed in the Reagan Administration at the time, albeit in a minor position. The 1986 tax reform law was, and still is, pinpointed for tax-law changes that helped pop a real-estate bubble largely built on tax-deductibility. The political Left is fond of criticizing Reagan for claiming to have lowered taxes in the early 80s while actually raising them later on. The Left is even fonder of excoriating Reagan and Paul Volcker for ending inflation on the backs of the poor by stopping monetary expansion too abruptly. Now Krugman is criticizing Reagan for doing just the opposite!

Krugman’s pièce de résistance is his riposte to future critics who will object to the runaway inflation that the Summers/Krugman project will promote. Krugman unblinkingly admits that inflation “expropriates the gains of savers,” but replies that “in a liquidity trap, saving may be a personal virtue but it’s a social vice.” And in an economy facing secular stagnation, the liquidity trap is “the norm. Assuring people they can get a positive rate of interest on safe assets means promising them something the market doesn’t want to deliver.”

Krugman implicitly and explicitly assumes that markets are as dysfunctional as life-support patients with no respirator. But when he needs a justification for deep-sixing the life savings of hundreds of millions of people, he suddenly pulls out “the market” and gives its ostensible verdict a personal blessing of moral authority. Yet in this very same blog post, he cavalierly dismisses his critics with the complaint that “a lot of people [opponents of Krugman] want economics to be a morality play and they don’t care how many people suffer in the process” [!!] For the benefit of readers unfamiliar with the long-running debate between Krugman and his critics, those critics are free-market economists who want bubbles to end with unsustainable businesses being liquidated rather than bailed out, and the business cycle to be cut short rather than prolonged indefinitely with each iteration worse than the previous one.

Intellectual Stagnation, Not Economic

At this point, it is all too clear that secular stagnation has taken place. But the stagnation is intellectual, not economic. Keynesian economists are framing policy arguments using terms like “secular stagnation,” “liquidity trap” and “paradox of thrift.” These recondite terms went out of fashion over thirty years ago, along with the paleo-Keynesian economic theory that spawned them. They survive in the 20th-century textbooks and graduate-school memories of economists now approaching retirement.

The shocking character of the Summers/Krugman hypothesis doesn’t derive from its vintage, though. Its anti-economic character – relative prices are irrelevant, waste is a good thing, markets are worthless except when economic managers need a pretext for arbitrary action – is professionally repellent. Even more frightening is the hubris on display. Summers is a disgusting sight, standing up in front of an audience at the International Monetary Fund, pontificating with grandiose gravity about “managing an economy” – as if he were the CEO of a U.S. economy of some 315 million people and tens of thousands of businesses.

There are quite a few people who consider a large public corporation too unwieldy to manage effectively. The difficulty of one economist managing an entire economy must increase not merely linearly but exponentially, considering the interaction and feedback effects involved. At least Summers had the minimal presence of mind to recognize that he might be mistaken. Krugman, in contrast, displays the same mindset as his intellectual antecedent, John Maynard Keynes. Several biographers and friends – including F.A. Hayek, with whom his relations were cordial despite their opposing views – remarked that Keynes was obsessed with his own preeminence as a public intellectual rather than with mastery of economic theory as such. Hayek remarked that Keynes may have been the most brilliant man he ever encountered but was a bad economist. Summers and Krugman show no signs of possessing the intellectual diversity and flexibility of Keynes – only his arrogance and deep-seated need for personal attention.

There is another shocking aspect to this latest policy flap. Summers/Krugman are in the anomalous position of criticizing the results of their own policies. That is, even they cannot credibly maintain that we have lived under a regime of laissez-faire or tight fiscal or monetary discipline for the last five years. They can only insist that not enough was done. Of course, this is the standard big-government lament; when big-government fails, try bigger government. But in this case, they are telling us that the results they formerly called bad were really good and we should expect no more from them in the future. The friendliest left-winger would have to acknowledge that Summers/Krugman are confessing failure and telling us that this is the best we can do. Notice, for example, that neither man stressed the very short-term nature of their policy prescription or promised that once their strategy of fiscal inebriation reached its apogee, we could let the market take over. No, theirs was a counsel of despair reminiscent of late 1970s malaise.

You can’t get any more stagnant than that.

DRI-309 for week of 8-25-13: What Does ‘Social’ Mean Today?

An Access Advertising EconBrief:

What Does ‘Social’ Mean Today?

For decades, European political parties have rallied around the banner of “social democracy.” Today, Catholic churches throughout the world solemnly urge their congregants to work for “social justice.” Businesses have long been advised to practice “social responsibility.” Certain investment funds are now organized around the principle of “social investing.” Celebrities advertise the possession of a finely honed “social conscience.”

The rhetorical weight carried by the word “social” has never been heavier. Judging by this, one would suppose that the adjective’s meaning is well-defined and universally understood. Assuming that to be so, it should be relatively easy to explain its meanings above, as well as many other similar ones.

That turns out to be far from true. A great economist and social theorist called “social” the great “weasel word” of our time. In the words of the old popular song, how long has this been going on?

Well over two hundred years, believe it or not. The great English historian Lord Acton accurately observed that, “Few discoveries are more irritating than those which expose the pedigree of ideas.” Those people who invoke the word “social” as a holy sacrament will be outraged to learn its pedigree. For the rest of us, though, the knowledge should prove illuminating.

“What is Social?”
One man above all others made it his business to learn the history and meaning of the word “social.” F.A. Hayek was a leading European economist before World War II, and among his friends were the Freiburg School of German economists who styled themselves the “Soziale Marktwirtschaft” or “Social Market” economists. Why, Hayek wondered, didn’t they simply call themselves “free-market” economists? What magic did the word “social” weave to gain precedence over the idea of freedom?

Over the years, Hayek morphed from world-class economist to world-renowned social philosopher. His fascination with the rhetorical preeminence of “social” eventually produced the article “What Is ‘Social?’ What Does It Mean?” It was published in 1957, then reworked and republished in 1961. In it, Hayek performed feats of semantic archaeology in order to expose the pedigree of “social” in economics and political philosophy.

Hayek’s research produced a scathing assessment. He declared that “the word ‘social’ has become an adjective which robs of its clear meaning every phrase it qualifies and transforms it into a phrase of unlimited elasticity, the implications of which can always be distorted if they are unacceptable, and the use of which…serves merely to conceal the lack of any real agreement between men regarding a formula upon which… they are supposed to be agreed.” It is symptomatic of “an attempt to dress up slogans in a guise acceptable to all tastes.” The word “always confuses and never clarifies;” “pretends to give an answer where no answer exists,” and “is so often used as camouflage for aspirations that have nothing to do with the common interest… .” It has served as a “magical incantation” and used to justify end-runs around traditional morality.

Whew. Can one word that is thrown around so casually and so widely really justify this indictment? Let’s briefly take one example of its usage and try on a few of Hayek’s criticisms for size.

The Example of “Social Justice”
A popular reference source (Wikipedia) has this to say about the concept of “social justice.” “Social justice is justice exercised within a society, particularly as it is applied to and among the various social classes of a society. A socially just society is one based on the principles of equality and solidarity;” it “understands and values human rights as well as recognizing the dignity of every human being.”

The origin of the phrase is ascribed to a Jesuit priest in 1840. It was used to justify the concept of a “living wage” in the late 19th century. The Fascist priest Father Coughlin (curiously, his Fascism goes unremarked by Wikipedia) often employed the term. It became a mainspring of practical Catholic teaching and of the Protestant Social Gospel. Social theorist John Rawls developed a theory of equity intended to give substance to a secular version of social justice.

We can easily locate all of the characteristics identified by Hayek even in this short précis. The definition of “social justice” as “justice exercised within a society” is tautological; this expresses the communal syrup that the word pours over every subject it touches. The “principles of equality and solidarity” sound satisfactorily concrete, but the trouble is that there are no such principles – unless you’re willing to sign off on the notion that everybody is supposed to be equal in all respects. “Solidarity,” of course, is the complementary noun to “social;” each purportedly sanctifies without really saying anything substantial. As such, solidarity became the all-purpose buzzword of the international labor movement. It implies fidelity to an unimpeachable ideal without defining the ideal, just as “social” implies an ideal without defining it.

The reference to “human rights” may well seem obscure to those unfamiliar with the age-old left-wing dichotomy between “property rights” and “human rights” – a false distinction, since all rights are human rights by implication. There may some day be a society that recognizes the dignity of every human being, but the sun has not yet shone on it. Thus, social justice illustrates Hayek’s reference to an underlying lack of agreement masked by a façade of universal accord. The roll call of dubious subscribers to the concept, ranging from Fascists to socialists to left-wing extremists and simplistic activists, dovetails perfectly with a concept of “unlimited elasticity,” which masquerades “in the guise acceptable to all tastes” as a “magical incantation” used to justify dubious means to achieve allegedly noble ends.

The Basic Uses of “Social”
Devotees of the various “social” causes have used the word in certain basic recurring ways. Each of them displays Hayek’s characteristics. We can associate these generic uses with specific “social” causes and government actions.

First, there is the plea for inclusiveness. As originally developed, this had considerable justification. As Hayek admitted, “in the last [19th] century…political discussion and the taking of political decisions were confined to a small upper class.” The appending of “social” was a shorthand way of reminding the upper classes that “they were responsible for the fate of the most numerous and poorest sections of the community.” But the concept “seems somewhat of an anachronism in an age when it is the masses who wield political power.” This is probably the dawn of the well-worn injunction to develop a “social conscience.” We associate the mid-19th century with famous “social” legislation ranging from the end of debtor’s prisons and reform of poor laws to the repeal of the Corn Laws in England.

Second, “social” is a plea to view personal morality abstractly rather than concretely by assigning to it remote consequences as well as immediate ones. For example, traditional ethics implores the businessman to treat his employees and customers fairly by respecting their rights and not hurting them. But “social responsibility” demands that businessmen know, understand and affect the consequences of all their buying decisions as well. They should refuse to buy inputs produced using labor that is paid “too little,” even though this benefits their own customers and workers, because it ostensibly hurts the workers who produce those inputs.

This stands the economic logic of free markets on its head. Businessmen are experts on their own business and the wants of their customers. Free markets allow them to know as little as possible about the input goods they buy because this economizes on information – which is scarce – and on the use of businessmen’s time – which is likewise scarce. But the illogic of “social responsibility” demands that businessmen specialize in learning things it is difficult or impossible for them to know instead of things they normally learn in the course of doing business. This is so absurdly inefficient it is downright crazy; instead of doing what they do best, businessmen are supposed to divert their attention to things they know little about and disregard the value generated by the free market.

The crowning absurdity is that “social responsibility” expects businessmen to accept on faith the assertions of activists that buying goods produced with low-wage labor hurts the workers who produce those goods. And this is dead wrong, since it does just the opposite – by increasing the demand for the goods labor produces, it increases the value of labor’s marginal product and hence labor’s wage. The same illogic is sometimes extended even farther to consumers, who are even less well placed to gauge the remote consequences of their personal buying decisions and, thus, are even more at the mercy of the bad economics propounded by “social” theory activists.

Third, “social” theory demands that government also reverse its traditional ethical role by treating individuals concretely rather than abstractly. The traditional Rule of Law requires government to judge individuals by abstract rules of justice – and that the same abstract rules apply to all individuals. But “social justice” requires government to judge individuals according to their respective merits, which requires treating different individuals by different rules; i.e., repealing the traditional Rule of Law. Contemporary examples of this repeal abound: affirmative action, bailouts for firms adjudged “too big to fail,” eminent domain for the benefit of private business, augmented rights granted to certain politically identifiable groups while basic rights are denied to others, and on and on, ad nauseam.

Finally, “social” theory clearly implies the upsetting of traditional morality by the substitution of “social” criteria for traditional moral criteria. Although it seems superficially that traditional moral criteria are without rational foundation, this is misleading. In fact, those criteria evolved over thousands of years because they were conducive to a successful order within humanity. As the Spanish philosopher Ortega y Gasset reminds us, “order is not a pressure imposed on society from without, but an equilibrium which is set up from within.” The word “equilibrium” implies the existence of change which culminates in a new, improved order. Social evolution is thus comparable to economic equilibrium, in which new goods and services are subject to a market test and accepted or rejected. Surviving moral criteria are abstract rules that may not benefit every single individual in every single case but that have demonstrated powerful survival value for humanity over thousands of years. And these rules are subject to a powerful evolutionary test over time.

In contrast, “social” theory substitutes the concrete, ad hoc rules adapted to each situation by self-appointed social theorists. These self-appointed experts reject free competition in both economics and political philosophy; thus, these social theories do not receive the same rigorous evolutionary tests that vetted traditional morality.

Both the impersonal workings of the free economic market and the abstract, impersonal workings of the “market” for morality and social philosophy seem to be harsh because there is no inherent spokesman or advocate to explain their operation to the public. Economists have failed to perform this task for free markets, while the influence of moral arbiters like clergymen and philosophers has waned in recent decades. The plans of “social” theorists appear to be kind because they are designed with appearance in mind rather than to actually attain the results they advertise.

Corollaries to the Uses of “Social”
Certain corollary effects of these uses are implied and have, in fact, emerged. When the appeal to the communal or “social” effects of our actions predominates over their personal effects, our personal responsibility for our own lives and welfare erodes. And sure enough, the widespread reluctance to take responsibility for individual actions is palpable. Why should we take responsibility for saving when the federal government takes our money by force for the ostensible purpose of saving and investing it for our individual retirement uses? Thus does saving decline, asymptotically approaching zero. Why should we accept responsibility for our own errors when we are forced to take responsibility for the errors of others by taxation, criminal justice, economic policy and a host of other coercive actions by government? Hence the growing tendency to claim universal “rights” to goods and services such as food, health care, housing and more.

The irony is that each of us is the world’s leading expert on our self. “Social” policy forces us to shoulder responsibility for people and things we aren’t, and can never be, expert on, while forswearing responsibility for the one person on whom our expertise is preeminent. In economic terms, this cannot possibly be an efficient way to order a society.

This leads to another important point about information. The demands of “social” theory imply that certain select individuals possess talents and information denied to the rabble. These are the people who decide which particular distribution of income or wealth is “socially just,” which business actions are or aren’t “socially responsible,” what linguistic forms are or aren’t “socially aware,” and so forth.

The elevation of some people above others is practiced predominantly by government. In order to reward people according to merit, government must in principle have knowledge about the particular circumstances of individuals that justify the rewards (or deny them, as the case may be). In practice, of course, government is so distant from most individuals that it cannot begin to possess that kind of knowledge. That is why the concept of group rights has emerged, since it is often possible to identify individual membership in a group. Race, gender, religion, political preference and other group affiliations are among the various identifiers used to justify preferential treatment by government.

The blatant shortcomings of this philosophy have now become manifest to all. One need not be a political philosopher steeped in the Rule of Law to appreciate that envy now plays a pivotal role in politics and government. Rather than concentrate on producing goods and services, people now focus on redistributing real income and wealth in their own favor. This is the inevitable by-product of a “social” theory focused on fairness rather than growth. The laws of economics offer a straightforward path toward growth, but there is no comparably unambiguous theory of fairness that will satisfy the competing claimants of “social” causes.

And once again, the shortcomings of “social” theory are magnified by a further irony. For decades, government welfare programs have been recognized as failures by researchers, the general public and welfare recipients themselves. Only “social theorists,” bureaucrats and politicians still support them. This is bad enough. But even worse, the rejection of free markets by “social” theorists has eliminated the only practical means by which individual merit might be used as the basis for compensatory social action. Although you are the world’s leading expert on you and I am the leading authority on me, you will sometimes gain authoritative information about me. By allowing you to keep your own real income and the freedom to utilize it as you see fit, I am also allowing you to conduct your own personal policy of “social justice.” This concept of neighborhood or community charity is one form of tribalism that has persisted for thousands of years because it is clearly efficient and has survival value. Yet it is one of the first victims of government-imposed “social justice.” Bureaucrats resent the competition provided by private charity. Even more, they resent watching money used privately when it could have been siphoned off for their own use.

Socialism
What is the relation between the adjective “social” and the noun “socialism?” Socialism had roots traceable at least to the Middle Ages, but its formal beginnings go back to the French philosophers Saint-Simon and Comte in the late 18th and early 19th centuries. It was Saint-Simon who visualized “society” as one single organic unity and longed to organize a nation’s productive activity as if it were one single unified factory.

It is this pretense that defines the essence of socialism and the appeal of its adjectival handmaiden, “social.” Participation in the sanctifying “social” enterprise at once washes the participant clean of sin and cloaks pursuit of personal gain in the guise of altruism and nobility. It makes the participant automatically virtuous and popular and “one with the universe” – well, part of a subset of like-minded people, anyway.

Socialism sputtered to life in 19th-century revolutionary Europe and enjoyed various incarnations throughout the late 19th and 20th centuries. It has failed uniformly, not just in achieving “the principles of equality and solidarity” but in providing goods and services for citizens. Failure was most complete in those polities where the approach to classical socialism was closest. (In this regard, it should be remembered that the Scandinavian welfare states fell far short of Great Britain on the classic socialist criterion of industrial nationalization.) Yet socialism as an ideal still thrives while capitalism, whose historical preeminence is inarguable, languishes in bad odor.

Hayek’s criticisms of “social” explain this paradox. Socialism’s shortcomings are its virtues. Its language encourages instant belief and acceptance. It smoothes over differences, enveloping them in a fog of good feeling and obscurantism. It promises an easy road to salvation, demanding little of the disciple and offering much. Words are valued for their immediate effects, and the immediate effects of “social” are favorable to the user and the hearer. True, it is an obstacle to clear thinking – but when the immediate products of clear thinking are unpalatable, who wants clear thinking, anyway?

“Social” keeps the ideal of socialism alive while burying its reality. As long as “social” prefaces anything except an “ism,” the listener has license to dissociate the adjective from the noun and luxuriate in the visceral associations of the former while ignoring the gruesome history of the latter.

Just One Little, Itsy-Bitsy, Teeny-Weenie Word
F.A. Hayek closed his essay on “social” by saying, “it seems to me that a great deal of what today professes to be social is, in the deeper and truer sense of the word, thoroughly and completely anti-social.” Hayek was right that “such a little word not only throws light upon the process of the evolution of ideas and the story of human error, but … also exercises an irrational power which becomes apparent only when… we lay bare its true meaning.”

Who would have dreamed that one word could say so much?

DRI-316 for week of 6-23-13: Is It Time to Start Selling The Wall Street Journal at the Grocery Store Checkout?

An Access Advertising EconBrief:

Is It Time to Start Selling The Wall Street Journal at the Grocery Store Checkout?

For three-quarters of a century, The Wall Street Journal has been one of the world’s financial periodicals of record. It has battled London’s Financial Times for supremacy since about the time that the sun set on the British Empire; i.e., the close of World War II. World leadership is a responsibility not lightly assumed or relinquished, and the Journal has taken its role seriously.

The American press has long followed the Journal‘s lead on financial and economic stories, citing it as the primary source of information and analysis. Business schools and economics departments at colleges and universities have offered Journal subscriptions at reduced rates, both to whet student interest and improve the student’s vocabulary and grasp of fundamentals. The need for an intermediary between the public and the professional elite of business, finance and economics is perennial. It has never been greater than today, when the tools of the trade are sophisticated and complex and the threat of economic collapse looms.

The advent of the Internet drove a stake through the heart of the American newspaper business, which had preyed on its public. High advertising rates, particularly for classified ads, were its life’s blood. Its left-leaning editorial stance gradually alienated its readership. Despite declining circulation, the industry obstinately continued to disdain its customers. The Journal and USA Today were lonely exceptions to this trend. Virtually alone, they have maintained circulation and operational scale.

Still, ominous signs are visible. The most disturbing has been the obvious divergence in viewpoint between the editorial page and reporting in the Journal. Technically, reporting is not supposed to reveal a viewpoint; it is supposed to present the “who, what, when, where, why” of the news impartially. Nowadays, though, this is apparently considered passé. And the viewpoint of Journal reporters is left of center, albeit to the right of (say) Mother Jones or The Nation. Meanwhile, Wall Street Journal editors remain a rare bulwark of support for free markets and political conservatism in the print media.

But that disturbing divergence is not stable. The Journal‘s reporting is showing increasing signs of the economic illiteracy and rabid anti-business, anti-market populism that characterizes the popular press. The latest example appeared in the Tuesday, June 25, Wall Street Journal‘s “Marketplace” section. The headline read: “McKesson CEO Due a Pension of $159 Million.”

The $159 Million “Pension”

The article’s byline belongs to one Mark Maremont. The piece opens with the unfavorable characterization of “executive pension plans [that] sometimes grow to a hefty size…as extra retirement cushions for long-serving CEOs.” The curious description of a retirement plan as an extra retirement cushion is advance notice that class warfare is about to break out. After all, the executive pension itself may run to millions of dollars, so it is difficult to justify the pejorative use of these words except as a lubricant for class envy.

That is merely the warm-up for the lead that, contrary to standard journalistic practice, follows in the second paragraph. “Then there’s the record $159 million pension benefit of John Hammergren, the current chairman and CEO of drug distributor McKesson Corp. That’s how much he would have received in a lump-sum benefit had he voluntarily departed on March 31,” according to a company proxy filing.

Mr. Maremont cites unnamed compensation consultants who declare this the largest pension on record for a public-company CEO, and maybe the largest in history. His one named source, James Reda of Gallagher HR Consulting, calls the pension “excessive” because Mr. Hammergren has been very highly paid in recent years. (Mr. Hammergren’s annual compensation has averaged over $50 million.) Understandably, Mr. Hammergren refused the opportunity to comment on Mr. Maremont’s findings.

That is understandable because it is difficult to remain both calm and lucid in the face of anything so outrageous. Begin with the obvious; namely, the article’s characterization of the $159 million as a “pension.” To just about everyone in the English-speaking world, the word “pension” connotes a fixed, periodic (classically, annual) payment made according to the terms of a retirement plan. It is essentially synonymous with what insurance calls a “guaranteed life annuity.” Merriam-Webster’s Collegiate Dictionary lists “a fixed sum paid regularly to a person” as the first definition of “pension.” According to Wikipedia:

A pension is a contract for a fixed sum to be paid regularly to a person, typically following retirement from service. Pensions should not be confused with severance pay; the former is paid in regular installments, while the latter is paid in one lump sum. The terms retirement plan and superannuation refer to a pension granted upon retirement of the individual.

Given the everyday meaning of the word "pension," a $159 million pension certainly ought to be a record. The sight of that headline must have set eyes rolling and brows rising throughout the world, because it said that McKesson's CEO would receive $159 million a year for the rest of his life, the equivalent of winning a lottery every year. And it is perfectly obvious that he will get nothing of the kind.

Those two words, "lump-sum benefit," in paragraph two of Mr. Maremont's article, are the tipoff to the charade he is staging. Diehard readers who inspect all 24 paragraphs of the article and its accompanying 13-paragraph sidebar detailing the calculation of Mr. Hammergren's "pension" eventually learn the truth; namely, that what Mr. Maremont calls a pension is closer to what Wikipedia calls severance pay.

In journalism as it was formerly practiced – that is to say, reputable journalism – accuracy and clarity were the watchwords. Style and viewpoint were never issues. And political issues such as the fairness of CEO compensation belonged on the editorial page, not the front page of the “Marketplace” section. Ambiguity about the meaning of the word “pension” would never, ever have been allowed to mar the meaning of a story.

If It Doesn’t Waddle Like a Duck, Quack Like a Duck or Have Feathers…

One has the paralyzing suspicion that, when pressed to account for his conduct, Mr. Maremont would reply with a shrug, “But McKesson calls it a pension. I just accepted their terminology. It wasn’t my place to question or interpret it.”

If the Journal received a fisherman’s report touting a “world-record sardine” weighing a ton, possessing a pointed dorsal fin, skin instead of scales, double rows of sharp teeth and two unblinking eyes, would it unquestioningly print this account unedited and unqualified? Would it put “world-record sardine” in the headline, burying the likelihood that the sardine was really a shark somewhere in the fine print of the article?

Presumably not. But that is what the Journal has done here. We know this because Mr. Maremont himself grudgingly discloses it over the course of the article. After having spent the first 17 paragraphs implicitly flogging McKesson (and, by extension, Mr. Hammergren) for trying to put something over on shareholders and the public, Mr. Maremont comes clean about what is really going on. In paragraph 18, he discloses that in 2006 federal regulators changed the rules of disclosure for executive pensions. Under the new rules, companies were required to calculate a present discounted value for the pension each year under the hypothetical assumption that the executive retired that year.

Like many regulations, this initially seems weird and pointlessly complicated but actually has a logical purpose behind it. One obvious way to enable shareholders and analysts to understand how much executives are being compensated by boards of directors is to compare their compensation with that of executives at other firms. But coming up with a valid comparison for executives in different circumstances – different ages, different health profiles, etc. – is impossible without finding a format in which relevant values can be expressed in a common denominator. Thus, regulatory requirements have demanded that companies express compensation in just that common-denominator form – the present discounted value of the future pension payments. This is evaluated in various termination contexts, one of which is current termination of service.

And that is what Mr. Hammergren's reported McKesson "pension" represents – not a pension in the generally understood sense, but a figure in this special regulatory sense. There is no way under heaven that this pension – a lump sum representing the present discounted value of all future pension payments – belongs in the same comparative category as an ordinary "annual pension," which is only one of the payments that gets discounted in computing the lump sum. The lump sum can be compared to other executive pensions that are expressed identically – and this is the only comparison in which it should figure. Presumably, this is the comparison Mr. Maremont really refers to when he claims "record" status for it. Or rather, that is presumably what he would say if called upon to defend his article.

It is impossible to believe that a bylined reporter on the world’s leading financial daily could or would innocently mislead readers by suggesting one thing, then reluctantly and inferentially removing the suggestion through information supplied later. There is no innocent explanation; Mr. Maremont is either too incompetent to realize that he has confused his readers or he intended to profit from the confusion. That is, he is either a fool or a knave. There is no third possibility.

Either way, the episode indicates that The Wall Street Journal should take its place alongside The National Enquirer and News of the World on the tabloid rack at the grocery-store checkout.

Never Ascribe to Venality That Which Can be Explained by Mere Stupidity

Was Mr. Maremont devious or merely stupid? That remains to be determined. There is evidence on both sides.

Go back to the beginning of the article, where Mr. Maremont cites James Reda’s denunciation of McKesson’s lump-sum pension benefit as “excessive” because Mr. Hammergren is already making so much money. This is bizarre, to say the least. Every compensation specialist knows that the foundation of the pension calculation is the employee’s salary or wage. Mr. Hammergren is making lots of money, so of course his lump-sum benefit will be relatively large; that is how the system works.

But that's not all. For years, the political left has complained about executive benefits that are not tied to performance. Here, Mr. Hammergren's benefit is tied to his performance, which is a key factor in his high pay. "Under his leadership, the company's stock price has more than tripled, significantly outperforming the overall market." This came after his ascension to co-CEO in 1999 "after an accounting scandal decimated the company's top ranks and led its stock to tank."

Next, consider Mr. Maremont's handling of the discount rate, which is usually the most contentious issue whenever the matter of present discounted value arises. In order to value benefits scheduled to occur in the future, a means to reduce the future value to its current equivalent must be found. The theoretically correct way is by "discounting" them; that is, dividing each benefit by a discount factor that embodies the relevant opportunity cost of generating the benefit. A single payment due one year from now is divided by 1 + i, where i represents the interest rate or discount rate that reflects the opportunity cost; in the special case of a perpetuity – a fixed sum paid every year forever – the annual benefit is simply divided by i. The general principle applies in slightly more complicated form to other streams of future benefits, which are discounted payment by payment and summed to yield the present discounted value.
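
To make the arithmetic concrete, here is a minimal sketch of the discounting calculation in Python. The figures are invented purely for illustration – a hypothetical $5 million annual pension paid for 30 years – and are not drawn from McKesson's proxy filing or Mr. Maremont's chart.

```python
# A minimal sketch of present-discounted-value (PDV) arithmetic.
# All figures are hypothetical illustrations, not McKesson's actual numbers.

def pdv_annuity(annual_payment, rate, years):
    """PDV of a fixed payment received at the end of each year for `years` years."""
    return sum(annual_payment / (1 + rate) ** t for t in range(1, years + 1))

def pdv_perpetuity(annual_payment, rate):
    """PDV of a fixed payment received every year forever: payment divided by i."""
    return annual_payment / rate

# A hypothetical $5 million annual pension paid for 30 years:
print(f"{pdv_annuity(5_000_000, 0.037, 30):,.0f}")   # roughly $90 million at 3.7%
print(f"{pdv_annuity(5_000_000, 0.01, 30):,.0f}")    # roughly $129 million at 1%
print(f"{pdv_perpetuity(5_000_000, 0.05):,.0f}")     # $100 million as a perpetuity at 5%
```

Even at this toy scale, the gap between the 3.7% and 1% valuations runs to tens of millions of dollars, which is why the choice of rate matters so much in the comparisons that follow.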

Mr. Maremont implies that the discount rate used by McKesson inflates the value of Mr. Hammergren's pension; i.e., that it would give him too much money if he retired this year. Moreover, this point is anything but academic since he also says that "McKesson pays its executive pensions as a lump sum rather than annually." Apparently, the company computes the pension in the conventional manner, using a formula based primarily on the executive's final salary, years of service and performance. Then, using actuarial life expectancy to determine the length of the future payment stream, it calculates the present discounted value of the pension and pays it out as a single sum.

Mr. Maremont claims, quite correctly, that the selection of the discount rate can influence the outcome of the calculation. In general, the lower the discount rate, the higher will be the present discounted value. Mr. Maremont claims that “most firms” use “a rate set by the Internal Revenue Service of about 3.7% for a similar-age person [to Mr. Hammergren].” Thus, according to Mr. Maremont, McKesson’s chosen rate of 1% results in a windfall of some $52.4 million to Mr. Hammergren, compared to the IRS rate.

A good way to start a fistfight in a roomful of economists is to pose a problem involving selection of a discount rate. No problem is more contentious among forensic economists, the occupational niche whose members specialize in hiring out their services for legal testimony on valuation, regulation and antitrust. There is an excellent a priori case for assuming that "most firms" accept the IRS valuation simply because they don't want to start a beef with the IRS, not because the government is a better economic evaluator of discount rates than private firms.

But rather than debate that point at length, ponder the implications of another admission by Mr. Maremont. “Mr. Hammergren will be eligible for full retirement benefits next year, when he turns 55. If interest rates rise, his lump-sum pension could decline [my emphasis].” Mr. Maremont’s conditional prediction clearly suggests that McKesson chooses the discount rate by linking it to current market rates, perhaps by pegging it to an interest-rate index. Indeed, this is just the sort of discount-rate-choice mechanism that appeals to economists generally. And this is tremendously important, not only to Mr. Hammergren but to our analysis, since it means that the CEO’s pension is running interest-rate risk. This would be important under any circumstances, but in today’s world it makes Mr. Maremont’s casual remark that “his lump-sum pension could decline” just about the biggest understatement since Noah said it looks like rain. If the difference between 3.7% and 1% is worth a $52 million gain to Mr. Hammergren, how big do you suppose the loss would be if interest rates rose from the (indexed) 1% up to 7%, or 10%, or 15% or 20%?
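
To see how large that exposure could be, here is a rough sketch of the interest-rate risk, again in Python and again with invented figures rather than McKesson's actual pension terms, showing how the same lump sum shrinks as the discount rate rises from the article's 1% toward more ordinary levels:

```python
# A rough sketch of the interest-rate risk the article leaves unexamined.
# The pension figures below are invented for illustration, not taken from the proxy filing.

def lump_sum(annual_payment, rate, years):
    """Present discounted value of a level annual pension paid for `years` years."""
    return sum(annual_payment / (1 + rate) ** t for t in range(1, years + 1))

annual_payment = 5_000_000    # hypothetical annual pension
years = 30                    # hypothetical payout horizon
baseline = lump_sum(annual_payment, 0.01, years)   # valued at the 1% rate the article criticizes

for rate in (0.037, 0.07, 0.10, 0.15, 0.20):
    value = lump_sum(annual_payment, rate, years)
    drop = (baseline - value) / 1e6
    print(f"at {rate:.1%}: lump sum shrinks by about ${drop:,.0f} million")
```

Under these assumed numbers, a move from 1% to double-digit rates wipes out well over half of the lump sum, which is precisely the risk Mr. Maremont declines to analyze.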

The Fed has just announced its eagerness to wind down the QE process. The market reaction has fallen just short of complete hysteria. Interest rates have already risen markedly. An interest-rate rise that could cost Mr. Hammergren tens of millions of dollars between now and next year is entirely plausible. This loss of value would occur within his retirement account, on the putative eve of his retirement. Elsewhere in the article, Mr. Maremont bemoans the special privilege ostensibly given to executives by these pensions and the detriment suffered by "rank-and-file employees" who have been stuck with defined-contribution plans instead of defined-benefit (i.e., pension) plans. Yet now, in the face of his own admission of the special risk Mr. Hammergren is running, Mr. Maremont suddenly becomes analytically mute.

This is clear evidence of bad faith on his part – venality rather than mere stupidity.

Come to think of it, wouldn't it have made much more sense to run a piece on the phenomenon of companies paying executive pensions in lump-sum form rather than in standard life-annuity form? This would have alerted the general public to the existence of this new form of "pension" while simultaneously providing a forum in which to compare executive pensions in a truly valid way. For example, Mr. Maremont's chart shows two line items – "added years of service" and "boosting bonus in formula" – that account for $50 million worth of the $159 million. We cannot evaluate them because he does not give us any information about them, but a genuine news article might have had the time and space to remedy that omission. This would have been The Wall Street Journal operating in traditional fashion: reporting the news and the facts, educating the public while informing them. But Mr. Maremont couldn't very well have profited from the ambiguity attached to a "pension" by dispelling that ambiguity. That would have nixed the class-envy angle that apparently motivated him, and these days it's the angle that drives the story.

Just about the time that the triumph of venality seems complete, the reader’s eye is caught by Mr. Maremont’s assertion that McKesson’s 1% discount rate “increased the size of its chief’s lump-sum pension benefit 52%, according to Bolton Partners Inc.” Wait a minute – Mr. Maremont’s chart shows a gain of $52 million (actually, $52.4 million), but now the gain is instead 52%? Confusing dollars with percentages is just plain stupid.

It's beginning to be clear where our aphorism came from. It may derive from the fact that stupidity and venality are not independent of each other but positively correlated. Consequently, when they arise it becomes difficult to sort out their effects.

It isn’t difficult, alas, to associate their incidence with the decline of a great newspaper.

DRI-190 for week of 12-30-12: Stereotypes Overturned: Race, Hollywood and the Jody Call

An Access Advertising EconBrief:

Stereotypes Overturned: Race, Hollywood and the Jody Call

The doctrine often referred to as "political correctness" ostensibly aims to overturn reigning stereotypes governing matters such as race. Yet all too often it results in the substitution of new stereotypes for old. Economics relies on reason and motivation rather than political programming to explain human choices. Nothing could be more subversive of stereotypes than that.

What follows is a tale of Hollywood, race and the American military. At the time, each of these elements was viewed through a stylized, stereotypical lens – as they still are to some extent. But in no case did this tale unfold according to type. The reasons for that were economic.

The Movie Battleground (1949)

In 1949, Metro Goldwyn Mayer produced one of the year's biggest box-office hit movies, Battleground. It told the story of World War II's Battle of the Bulge as seen through the eyes of a single rifle squad in the 101st Airborne Division of the U.S. Army. In late 1944, Germany teetered on the edge of defeat. Her supreme commanders conceived the idea of a desperate mid-winter offensive to grab the initiative and rock the Allies back on their heels. The key geographic objective was the town of Bastogne, Belgium, located at the confluence of seven major roads serving the Ardennes region and Antwerp harbor. Germany launched an attack that drove such a conspicuous salient into the Allied line that the engagement acquired the title of the "Battle of the Bulge."

The Screaming Eagles of the 101st Airborne were the chief defenders of Bastogne. This put them somewhat out of their element, since their normal role was that of attack paratroopers. Despite this, they put up an unforgettable fight even though outnumbered ten to one by the German advance. The film’s scriptwriter and associate producer, Robert Pirosh, was among those serving with the 101st and trapped at Bastogne.

Battleground accurately recounted the Battle of the Bulge, including an enlisted man’s view of the legendary German surrender demand and U.S. General McAuliffe’s immortal response: “Nuts.” But the key to the film’s huge box-office success – it was the second-leading film of the year in ticket receipts – was its continual focus on the battle as experienced by the combat soldier.

The men display the range of normal human emotions, heightened and intensified out of proportion by the context. Courage and fear struggle for supremacy. Boredom and the Germans vie for the role of chief nemesis. The film’s director, William Wellman, had flown in the Lafayette Escadrille in World War I and was one of Hollywood’s leading directors of war films, including the first film to win a Best Picture Oscar, Wings.

Some of MGM’s leading players headed up the cast, including Van Johnson, George Murphy, John Hodiak, and Ricardo Montalban. The film was nominated for six Academy Awards and won two, for Pirosh’s story and screenplay and Paul Vogel’s stark black-and-white cinematography. In his motion-picture debut, James Whitmore was nominated for Best Supporting Actor and won a Golden Globe Award as the tobacco-chewing sergeant, Kinnie.

Whitmore provides the dramatic highlight of the film. Starving and perilously low on ammunition, the men of the 101st grimly hold out. They are waiting for relief forces led by General George Patton. Overwhelming U.S. air superiority over the Germans is of no use because fog and overcast have Bastogne completely socked in, grounding U.S. planes. Whitmore’s squad is cut off, surrounded and nearly out of bullets. Advised by Whitmore to save their remaining ammo for the impending German assault, the men silently fix bayonets to their rifles and await their death. Hobbling back to his foxhole on frozen feet, Whitmore notices something odd that stops him in his tracks. Momentarily puzzled, he soon realizes what stopped him. He has seen his shadow. The sun has broken through the clouds – and right behind it come American planes to blast the attacking German troops and drop supplies to the 101st. The shadow of doom has been lifted from “the battered bastards of Bastogne.”

1949 audiences were captivated by two scenes that bookended Battleground. After the opening credits and scene-setting explanation, soldiers are seen performing close-order drill led by Whitmore. These men were not actors or extras but were actual members of the 101st Airborne. They executed Whitmore’s drill commands with precise skill and timing while vocalizing a cadence count in tandem with Whitmore. This count would eventually attain worldwide fame and universal acceptance throughout the U.S. military. It began:

You had a good home but you left

You’re right!

You had a good home but you left

You’re right!

Jody was there when you left

You’re right!

Your baby was there when you left

You’re right!

Sound Off – 1,2

Sound Off – 3,4

Cadence Count – 1,2,3,4

1,2 – 3-4!

At the end of the movie, surviving members of Whitmore’s squad lie exhausted beside a roadway. Upon being officially relieved and ordered to withdraw, they struggle to their feet and head toward the rear, looking as worn out and numb as they feel. They meet the relief column marching towards them, heading to the front. Not wishing for the men to seem demoralized and defeated, Van Johnson suggests that Whitmore invoke the cadence count to bring them to life. As the movie ends, the squad marches smartly off while adding two more verses to the cadence count, supported by the movie’s music score:

Your baby was lonely as lonely could be

Until he provided company

Ain’t it great to have a pal

who works so hard to keep up morale?

Sound Off – 1,2

Sound Off – 3,4

Cadence Count – 1,2,3,4

1,2 – 3-4!

You ain’t got nothing to worry about

He’ll keep her happy ’till I get out

And I won’t get out ’till the end of the war

In Nineteen Hundred and Seventy-four

Sound Off – 1,2

Sound Off – 3,4

Cadence Count – 1,2,3,4

1,2 – 3-4!

The story of this cadence count, its inclusion in Battleground, its rise to fame and the fate of its inventor and his mentor are the story-within-the-story of the movie Battleground. This inside story speaks to the power of economics to overturn stereotypes.

The Duckworth Chant

In early 1944, a black Army private named Willie Lee Duckworth, Sr., was returning to Fort Slocum, NY, from a long, tiresome training hike with his company. To pick up the spirits of his comrades and improve their coordination, he improvised a rhythmic chant. According to Michael and Elizabeth Cavanaugh in their blog, "The Duckworth Chant, Sound Off and the Jody Call," this was the birth of what later came to be called the Jody (or Jodie) Call.

Duckworth's commanding officer learned of the popularity of Duckworth's chant. He encouraged Duckworth to compose additional verses for training purposes. Soldiers vocalized the words of the chant along with training commands as a means of learning and coordinating close-order drill. Duckworth's duties exceeded those of composer – he also taught the chant to white troops at Fort Slocum. It does not seem overly imaginative to envision episodes like this as forerunners to the growth of rap music, although it would be just as logical to attribute both phenomena to a different common ancestor.

Who is Jody (or Jodie)? The likely derivation is from a character in black folklore, Joe de Grinder, whose name would have been shortened first to Jody Grinder, then simply to Jody. The word "grind" has a sexual connotation, and Jody's role in the cadence count has indeed been to symbolize the proverbial man back home and out of uniform, who threatens to take the soldier's place with his wife or girlfriend.

Already our story has turned certain deeply ingrained racial stereotypes upside down. In 1944, America was a segregated nation, not just in the South but North, East and West as well. This was also true of our armed forces. Conventional thinking (as distinct from conventional wisdom) holds that a black Army private had no power to influence his fate and was little more than a pawn under the thumb of larger forces.

Yet against all seeming odds and expectations, a black draftee from the Georgia countryside spontaneously introduced his own refinement into military procedure – and that refinement was not only accepted but wholeheartedly embraced. The black private was even employed to train white troops – at a point when racial segregation was the status quo.

Pvt. Duckworth’s CO was not just any commanding officer. He was Col. Bernard Lentz, the senior colonel in the U.S. Army at that time. Col. Lentz was a veteran of World War I, when he had developed the Cadence System of Teaching Close-Order Drill – his own personal system of drill instruction using student vocalization of drill commands. When Lentz heard of Duckworth’s chant, he immediately recognized its close kinship with his own methods and incorporated it into Fort Slocum’s routine.

The public-choice school of economics believes that government bureaucrats do not serve the "public interest." Partly, this is because there is no unambiguous notion of the public interest for them to follow. Consequently, bureaucrats can scarcely resist pursuing their own ends, since it is easy to fill the objective-function vacuum with a personal agenda. This is a case in which the public interest was served by a bureaucrat pursuing his own interests.

Col. Lentz had a psychological property interest in the training system that he personally developed. He had a vocational property interest in that system, since its success would advance his military career. And in this case, there seems to be little doubt that the Duckworth Chant improved the productivity of troop training. Its use spread quickly throughout the Army. According to the Cavanaughs, it was being used in the European Theater of Operations (ETO) by V-E Day. Eventually, Duckworth's name recognition faded, to be replaced by that of his chant's eponymous character, Jody. But the Jody Call itself remains to this day as a universally recognized part of the military experience.

Thus, the stereotypes of racial segregation and bureaucratic inertia were overcome by the economic logic of property rights. And the morale of American troops has benefitted ever since.

Hollywood as User and Abuser – Another Myth Exploded

The name of Pvt. Willie Lee Duckworth, Sr. does not exit the pages of history with the military’s adoption of his chant as a cadence count. Far from it. To paraphrase the late Paul Harvey, we have yet to hear the best of the rest of the story.

As noted above, the Duckworth chant spread to the ETO by early 1945. It was probably there that screenwriter Robert Pirosh encountered it and germinated the idea of planting it in his retelling of the Battle of the Bulge. When Battleground went into production, MGM representative Lily Hyland wrote to Col. Lentz asking if the cadence count was copyrighted and requesting permission to use it in the film.

Col. Lentz replied, truthfully, that the cadence count was not under copyright. But he sincerely requested compensation for Pvt. Duckworth and for a half-dozen soldiers who were most responsible for conducting training exercises at Fort Slocum. The colonel suggested monetary compensation for Duckworth and free passes to the movie for the other six. MGM came through with the passes and sent Pvt. Duckworth a check for $200.

As the Cavanaughs point out, $200 sounds like a token payment today. But in 1949, $200 was approximately the monthly salary of a master sergeant in the Army, so it was hardly trivial compensation. This is still another stereotype shot to pieces.

Hollywood has long been famed in song and story – and in its own movies – as a user and abuser of talent. In this case, the casual expectation would have been that a lowly black soldier with no copyright on a rhyming chant he had first made up on the spur of the moment, with no commercial intent or potential, could expect to be stiffed by the most powerful movie studio on earth. If nothing else, we would have expected that Duckworth’s employer, the Army, would have asserted a proprietary claim for any monies due for the use of the chant.

That didn't happen because the economic interests of the respective parties favored compensating Duckworth rather than stiffing him. Col. Lentz wanted the Army represented in the best possible light in the film, but he particularly wanted the cadence count shown to best advantage. If Pvt. Duckworth had come forward with a public claim against the film, that would have hurt the colonel's psychological and vocational property interests. The last thing MGM wanted was a lawsuit by a soldier whose claim would inevitably resonate with the public, making him seem an exploited underdog and the studio look like a bunch of chiseling cheapskates – particularly when they could avoid it with a payment of significant size to him but infinitesimal as a fraction of a million-dollar movie budget.

A Hollywood Ending – Living Happily Ever After

We have still not reached the fadeout in our story of Col. Lentz and Pvt. Duckworth. Carefully observing the runaway success of Battleground, Col. Lentz engaged the firm of Shapiro, Bernstein & Co. to copyright an extended version of the Duckworth chant in 1950 under the title of "Sound Off." Both he and Willie Lee Duckworth, Sr. were listed as copyright holders. In 1951, Vaughn Monroe recorded the first of many commercial versions. In 1952, a film titled Sound Off was released. All these commercial exploitations of "Sound Off" resulted in payments to the two men.

How much money did Pvt. Duckworth receive as compensation for the rights to his chant, you may ask? By 1952, Duckworth was apparently receiving about $1,800 per month. In current dollars, that would amount to an income well in excess of $100,000 per year. Of course, like most popular creations, the popularity of “Sound Off” rose, peaked and then fell off to a whisper. But the money was enough to enable Duckworth to buy a truck and his own small pulpwood business. That business supported him, his wife and their six children. It is fair to say that the benefits of Duckworth’s work continued for the rest of his life, which ended in 2004.

If you are still dubious about the value of what MGM gave Duckworth, consider this. The showcase MGM provided for Duckworth's chant amounted to advertising worth many thousands of dollars. Without it, the subsequent success of "Sound Off" would have been highly problematic, to put it mildly. It seems unlikely that Col. Lentz would have been inspired to copyright the cadence count, and any benefits received by the two men would have been minuscule in comparison.

The traditional Hollywood movie ending is a fadeout following a successful resolution of the conflict between protagonist and antagonist, after which each viewer inserts an individual conception of perpetual bliss as the afterlife of the main characters. In reality, as Ernest Hemingway reminds us, all true stories end in death. But Willie Lee Duckworth, Sr.’s story surely qualifies as a reasonable facsimile of “happily ever after.”

This story is not the anomaly it might seem. Although Hollywood itself was not a powerful engine of black economic progress until much later, free markets were the engine that pulled the train to a better life for 20th century black Americans. Research by economists like Thomas Sowell has established that black economic progress long preceded black political progress in the courts (through Brown vs. Topeka Board of Education) and the U.S. Congress (through legislation like the Civil Rights Act of 1964).

The Movie that Toppled a Mogul

There were larger economic implications of Battleground. These gave the film the sobriquet of “the movie that toppled a mogul.” As Chief Operating Officer of MGM, Louis B. Mayer had long been the highest-paid salaried employee in the U.S. The size of MGM’s payroll made it the largest contributor on the tax rolls of Southern California. Legend had endowed Mayer with the power to bribe police and influence politicians. Seemingly, this should have secured his job tenure completely.

Battleground was a project developed by writer and executive Dore Schary while he worked at rival studio RKO. Schary was unable to get the movie produced at RKO because his bosses there believed the public’s appetite for war movies had been surfeited by the wave of propaganda-oriented pictures released during the war. When Schary defected to MGM, he brought the project with him and worked ceaselessly to get it made.

Mayer initially opposed Battleground for the same reasons as most of his colleagues in the industry. He called it “Schary’s Folly.” Yet the movie was made over his objections. And when it became a blockbuster hit, the fallout caused Mayer to be removed as head of the studio that bore his name. To add insult to this grievous injury, Schary replaced Mayer as COO.

For roughly two decades, economists had supported the hypothesis of Adolf Berle and Gardiner Means that American corporations suffered from a separation of ownership and control. Ostensibly, corporate executives were not controlled by boards of directors who safeguarded the interests of shareholders. Instead, the executives colluded with boards to serve their joint interests. If ever there was an industry to test this hypothesis, it was the motion-picture business, dominated by a tightly knit group of large studios run by strong-willed moguls. MGM and Louis B. Mayer were the locus classicus of this arrangement.

Yet the production, success and epilogue of Battleground made it abundantly clear that it was MGM board chairman Nicholas Schenck, not Mayer, who was calling the shots. And Schenck had his eye fixed on the bottom line. Appearances to the contrary notwithstanding, Louis B. Mayer was not the King of Hollywood after all. Market logic, not market failure, reigned. Economics, not power relationships, ruled.

Thanks to Battleground, stereotypes were dropping like soldiers of the 47th Panzer Corps on the arrival of Patton's Third Army in Bastogne.

No Happy Ending for Hollywood

Battleground came at the apex of American movies. Average weekly cinema attendance exceeded the population of the nation. The studio system was a smoothly functioning, vertically integrated machine for firing the popular imagination. It employed master craftsmen at every stage of the process, from script to screen.

Although it would have seemed incredible at the time, we know now that it was all downhill from that point. Two antitrust decisions in the late 1940s put an end to the Hollywood studio system. One particular abomination forbade studios from owning chains of movie theaters; another ended up transferring creative control of movies away from the studios.

The resulting deterioration of motion pictures took place in slow motion because the demand for movies was still strong and the studio system left us with a long-lived supply of people who still preserved the standards of yore. But the vertically integrated studio system has been gone for over half a century. Today, Hollywood is a pale shadow of its former self. Most movies released by major studios do not cover their costs through ticket sales. Studio profits result from sales of ancillary merchandise and rights. Theater profits are generated via concession sales. Motion-picture production is geared toward those realities and targeted predominantly toward the very young. Subsidies by local, state and national governments are propping up the industry throughout the world. And those subsidies must disappear sooner or later – probably sooner.

This has proved to be the ultimate vindication of our thesis that economics, not stereotypical power relationships, governed the movie business in Hollywood’s Golden Age. Free markets put consumers and shareholders in the driver’s seat. The result created the unique American art form of the 20th century. We still enjoy its fruits today on cable TV, VHS, DVD and the Internet. Misguided government attempts to regulate the movie business ended up killing the golden goose or, more precisely, reducing it to an enfeebled endangered species.

DRI-391: The Man Who Won World War II

World War II was the transcendent historical event of the 20th century. It brought some 100 million men, representing most of the world's nations, under arms. Between 50 and 70 million people died. In an event of this size and scope, who would be so foolish as to assign credit for victory to one particular individual?

Five-star General of the Army Dwight D. Eisenhower, that’s who. Reviewing the war in his memoirs, Eisenhower named one person as “the man who won the war for us.” That man was Andrew Jackson Higgins.

Today few remember Higgins. Reviewing his story does more than restore a forgotten hero to his rightful place. Higgins' story teaches one of the most important lessons in economics.

Andrew Higgins and the Development of the Higgins Boat

Andrew Higgins was born in Nebraska in 1889. After dropping out of high school, Higgins entered the business world. In the 1920s, he started a lumber import/export business and operated it with his own large fleet of sailing ships. To service them, he built a shipyard. A few years along, Higgins designed a shallow-draft vessel with a spoonbill-shaped bow for coastal and river use, which he named Eureka. This boat was able to run up onto and off riverbanks to unload people and cargo. Higgins proved to be a genius at boat design and eventually shipbuilding replaced lumber trade as the primary business of Higgins Industries.

The Eureka attracted the interest of the U.S. Marine Corps for use as a landing craft. It beat a boat designed by the Navy's Bureau of Construction and Repair in tests in 1938 and 1939. The Eureka's only drawback was that men and cargo had to be offloaded over the side, risking exposure to enemy fire.

Since 1937, the Japanese navy had utilized ramps on its landing craft. Higgins directed his engineer to emulate this principle. The result was the unique landing craft whose technical name was “Landing Craft, Vehicle, Personnel,” (LCVP) but which became better known as the “Higgins Boat.”

The Higgins Boat was one of the most revolutionary advances in the history of military technology. Heretofore, maritime invasion forces had to disembark at port cities – a substantial disadvantage since the opponent could predict the likely arrival location(s) and accordingly prepare bristling defenses. A large army had to travel in large troop ships which, of necessity, were deep-draft vessels. Troop ships would run aground and founder if they tried to dock at most coastal locations. Only at ports with natural deep harbors could they safely dock and unload.

The Higgins Boat allowed men and equipment to be transferred from troop ships to smaller, shallow-draft vessels that could run right onto an ordinary beach, drop their ramps, unload men and equipment and then withdraw to repeat the process. This meant that opponents now had to worry about defending the majority of a coastline, not just one or a few ports.

Amazing as it seems today, the Higgins Boat did not win immediate acceptance upon rolling off the assembly line. Its fight for survival paralleled that of the Allies in the early stage of the war itself.

The Entrepreneur vs. the Bureaucracy

In the early part of World War II, authority for designing and building the Navy’s landing craft was entrusted to the Bureau of Ships. Higgins was viewed coolly by the Bureau. What could a Nebraskan teach the Navy about shipbuilding? Besides, the department had designed and built its own model. Somehow, the Navy could always find an excuse for dismissing Higgins’ designs even when they defeated the Navy’s boats in head-to-head competition.

There was one other minor obstacle standing between Higgins and the bureaucrats at the Bureau of Ships. He hated their guts, viewed their work with contempt and said so openly. "If the 'red tape' and the outmoded and outlandish Civil War methods of doing business were eliminated, the work could be done in the Bureau of Ships very efficiently with about one-sixth the present personnel," Higgins observed. Fortunately for Higgins, he managed to sell his ideas to other friendly powers, thus keeping himself in business even though lacking honor in his own land. "If the power structure in place during the early months of the war had stayed in place," said historians Burton and Anita Folsom in their book FDR Goes to War, "Higgins would have been out of work and Americans, according to Eisenhower, would have either lost the war or had victory long delayed."

Two unlikely saviors came to Higgins' aid. The first of them was none other than Franklin Delano Roosevelt. The President had waged a bitter fight with Republicans in favor of his New Deal economic policies for more than two terms in office, only to watch those policies fail to restore the U.S. economy to its pre-Depression levels of income and employment. His Treasury Secretary, Henry Morgenthau, Jr., confessed to his diary that administration policymakers had tried everything they could think of to revive the economy – and failed. Despite Roosevelt's unwavering faith in the New Deal – and in himself – he sensed that he would need the support of the business community in order to win the war.

Thus, FDR changed tactics abruptly, going from excoriating businessmen as "economic royalists" to urging them to ramp up production of war materiel in the national interest. Suddenly, it became politically correct to view business as part of the team, pulling together to win the war. Roosevelt announced this change in philosophy even before Pearl Harbor, in a fireside chat of May 26, 1940. He was fond of describing the change as a switch from "old Dr. New Deal" to "Dr. Win the War" – a characterization that reveals as much about Roosevelt's view of himself as Physician-In-Chief for the country as it does about his strategy.

Part of Roosevelt’s new emphasis involved the creation of the Truman Committee, headed by then-Senator Harry Truman of Missouri, to investigate government waste and mismanagement. Truman’s efforts in his home state had made him very popular, so when the Marines went to bat with his committee on Higgins’ behalf, the combination was too much for the Bureau of Ships to resist. Truman told the Bureau to produce a landing craft and test it in competition with a Higgins Boat. The test took place on May 25, 1942.

Each boat carried a thirty-ton tank through rough seas. The Navy’s craft barely avoided the loss of cargo and all hands. The Higgins Boat delivered the tank to its destination. The Committee declared Higgins’ design the winner.

Truman was scathing in his verdict on the conduct of the Bureau of Ships. “The Bureau of Ships has, for reasons known only to itself, stubbornly persisted for over five years in clinging to an unseaworthy …design of its own… Higgins Industries did actually design and build a superior [design],” only to run up against the Bureau’s “flagrant disregard for the facts, if not for the safety and success of American troops.”

The Entrepreneur vs. the Rules

Higgins’ trials and tribulations did not cease when he won government contracts to produce his landing craft (and other boats, including PT boats). He succeeded in scrounging up the capital necessary to expand his boatbuilding plant in New Orleans into a facility capable of supplying the armies of the Free World. But in 1942, fully automated manufacturing plants did not exist – Higgins next faced the problem of attracting the labor necessary to man the plant. Even in peacetime, that problem would have been daunting. In the wartime environment of wage and price controls, the chief legal inducement for attracting labor, a wage increase, was limited.

Higgins' attitude to this and other problems can be appreciated from his own summation of his personal philosophy: "I don't wait for opportunity to knock. I send out a welcoming committee to drag the old harlot in." Higgins raised wages to the allowable maximum. Then he helped to set a precedent that persists to the present day by offering free medical care to his employees. Since this did not qualify as a wage, it was exempt from the controls and from the confiscatory wartime income-tax rates as well.

One plentiful source of labor was black workers. But this was New Orleans; segregation reared its ugly head. Higgins gritted his teeth and complied, while providing equal wages and benefits to all workers.

Shortages of metals and minerals were a throbbing headache. Higgins treated it by buying steel on the black market and stealing some items (such as bronze) that he couldn’t buy. (He later paid for stolen materials.)

Victory in Europe

Andrew Higgins went from employing 50 people in a plant worth $14,000 to hiring 20,000 employees to work seven huge plants. Over 10,000 Higgins Boats were produced, comprising most U.S. landing craft. His plants also built PT and antisubmarine boats.

Prior to landings in Sicily and North Africa in early 1943, Eisenhower moaned that “when I die, my coffin should be in the shape of a landing craft,” since they were killing him with worry. By D-Day, Higgins Boats had forced Hitler to stretch his defenses thin along the French coast. Although favoring the port of Pas-de-Calais, Hitler set up a string of defenses in Normandy as well. The Germans had concentrated nearly enough firepower to defeat the American landing at Omaha Beach, but “nearly” wasn’t enough to thwart the eventual beachhead. Meanwhile, the other four landings went relatively smoothly; the Higgins Boats had made it impossible for Hitler to keep all his bases covered. As Rommel and other high-level strategists recognized, once the landings succeeded, the war’s outcome was a foregone conclusion. Even Hitler couldn’t conceal his admiration for Higgins, calling the boat builder the “new Noah.”

On Thanksgiving, 1944, Eisenhower gave thanks for the Higgins Boats. “Let us thank God,” he intoned, “for Higgins Industries, management, and labor which has given us the landing boats with which to conduct our campaign.” And after the war, in his memoirs, Eisenhower laid it on the line: “Andrew Higgins is the man who won the war for us… If Higgins had not designed and built those LCVPs, we never could have landed over an open beach. The whole strategy of the war would have been different.”

The Thanks of a Grateful Nation

Along with many of the other entrepreneurs whose Herculean efforts supplied the American, British, Chinese and Russian armies, Andrew Higgins was rewarded after the war with an IRS investigation into the “excess profits” earned by his firms during the war. Since his death in 1952, his name has been placed on a Navy ship and an expressway in Ohio. Recently, a memorial (including a statue) has been raised to him in his hometown of Columbus, NE.

At his death, Higgins held some 30 patents.

The Economic Lessons of Andrew Higgins and American Entrepreneurship in World War II: The Value of Profit Maximization

The story of Andrew J. Higgins is perhaps the most dramatic of many similar stories of American entrepreneurship in World War II. Jack Simplot developed a process for dehydrated potatoes that enabled him to feed American soldiers around the globe. After the war, he turned his expertise to frozen foods and ended by supplying frozen French fries to the McDonald’s fast-food restaurant chain. Henry Kaiser was the preeminent wartime shipbuilder. He cut average construction time per ship by a factor exceeding ten (!). Like Higgins, he sometimes resorted to buying steel on the black market. Before the war, he built roads. After the war, he switched to steel and aluminum.

Men like Higgins, Simplot and Kaiser were entrepreneurs of demonstrable, and demonstrated, skill. Today, we relate their exploits with something resembling awe, yet it should have been no surprise that they succeeded. Their success often came on the heels of government’s failure at the tasks they undertook; this should likewise come as no surprise. The fact that government actively resisted their best efforts should dismay us, but not surprise us. After all, we have seen the same lessons repeated since the war.

Consider the test of the Higgins Boat, in which the Navy landing craft designed by the Bureau of Ships faced off against the Higgins Boat. Had the Higgins Boat lost the contract, the Allies would have lost the war or been seriously threatened with losing it. (So said Dwight Eisenhower, the man who led the Allied war effort.) The tacit premise behind government regulation of business is that – of course – government regulators will always act in the “public interest” while private businessmen act from greedy self-interest which must run athwart the general welfare. Yet in this case, government bureaucrats spent years acting in a manner directly contradictory to the public interest, albeit apparently in their own interest. (So said Harry Truman, certainly no advocate of laissez-faire capitalism.)

Should we be surprised that it was the profit-maximizer who won the war and the government bureaucrats who pursued their own interest at the risk of losing it? Certainly not. Higgins had spent his life in private business, where he could gain success and happiness only by building boats that worked. The naval bureaucrats did not have to build boats that worked in order to “succeed” in their domain; e.g., remain bureaucrats and keep their staffs and budgets intact. We know this because they succeeded in thwarting Higgins’ design for five years in spite of Higgins’ triumphs in testing against the Navy. Indeed, granting a contract to the boat that worked would have threatened their success, since their own design and model would have been replaced.

The only surprising thing about the episode is how close America came to losing the war. Had FDR not done two things that were utterly unexpected – namely, abandon his allegiance to the New Deal and set up the Truman Committee to overrule wasteful measures undertaken by bureaucrats – we might have done just that. In that sense, it might with some justice be said that it was really FDR who won the war. And, in fact, Roosevelt made several additional decisions that were crucial in determining the war's outcome. Naming George Marshall as Chief of Staff is one such decision. Marshall chose men like Eisenhower, Omar Bradley and George Patton for key commands, in each case jumping them over many other men with seniority and more impressive resumes.

The problem with calling FDR the guarantor of victory is that each of his good decisions only offset other decisions that were dreadfully bad. FDR wouldn't have had to abandon the New Deal had he not adopted such a disastrous mélange of counterproductive and unworkable policies in the first place. The appointment of Marshall merely undid the damage done by the appointment of mediocre yes-men like Harold Stark and Henry Stimson to high administrative and cabinet positions in the military bureaucracy.

The Second Lesson: Abandonment of the New Deal

FDR’s abandonment of the New Deal illustrates the second lesson to be drawn from the example of Higgins and his fellow entrepreneurs. Conventional thinking posits wartime as the time of preoccupation with “guns,” whereas in peacetime we can safely concentrate on “butter.” Butter production, so tradition tells us, is effected using markets and a price system, but guns are a different proposition entirely. Combating militarism demands that we use the methods of the militarists by turning the country into an armed camp, as did Germany and Japan.

However difficult it may have been to see through this fallacy at the time, it is obvious in retrospect. America won the war with production, by supplying not only its own needs but those of Great Britain, Russia and China as well. Those needs included not only military goods narrowly defined, but foodstuffs, clothing, medicines and all manner of civilian goods as well. It is highly significant that both Roosevelt and Churchill concurred in urging major motion picture studios to maintain the volume and quality of their products rather than sacrificing resources to the war effort. They realized that movies were part of the war effort.

Put in this light, Roosevelt’s decision to substitute “Dr. Win-the-War” for “Dr. New Deal” takes on vastly more meaning. Without even consciously realizing it, he was admitting the failure of his own policies in peacetime as well as their unsuitability for war. And war’s end brought this lesson home with stunning force. During the war, Roosevelt had abandoned New Deal staples like the WPA and the CCC. After the war, President Truman was able to retain various “permanent” features of the New Deal, like Social Security, banking and stock-market regulation and pro-union regulations. Pervasive control of the price system faded away along with the gradual obsolescence of wartime price controls.

FDR had predicted that it would take total employment of 60 million to absorb the ramped-up levels of total production reached during the war. (Before the war, total employment had been only 47 million.) Keynesian economists predicted a return to Depression, with unemployment ranging from 10-20%, after the war unless massive federal-government spending was undertaken. Instead – appalled at the unprecedented level of federal-government debt as a percentage of gross national product – the Republican Congress of 1946 cut spending and taxes. The result was an increase in civilian employment from 39 million to 55 million, while total employment (including government workers) reached Roosevelt's goal of 60 million without the New Deal-type spending he had envisaged. Unemployment was 3.6%. Annual growth in gross national product reached an all-time high of 30%.

Wartime entrepreneurship battered New Deal economic policy to its knees. 1946 delivered the coup de grace.

The Third Lesson: The Primacy of the Price System

The third and final lesson to be learned concerns the impatience of the entrepreneurs with bureaucracy, rules and laws. In particular, their resort to the black market was exactly what patriotic citizens were being implored not to do during the war. Should we be surprised that entrepreneurs won the war by ignoring the anti-market directives of the bureaucrats?

Hardly. Everybody seemed to take for granted that normal commercial attitudes and impulses should be suppressed during wartime, that the success of any collective goal requires the suppression of all individual wants. But upon reflection, this simply cannot be true.

As usual, the most cogent analysis of the problem was provided by F. A. Hayek, in a short essay called, “Pricing Versus Rationing.” He pointed out that in wartime politicians’ standard recourse is to controls on prices and rationing of “essential” war materials by queue. Any professional economist would, he declared, recognize the fatuity and futility of that modus operandi. “It deprives industry of all basis of rational calculation. It throws the burden of securing economy on a bureaucracy which is neither equipped nor adequate in number for the task. Even worse, such a system would deprive those in control of even the whole economic machine of essential guides for their plans and reduce major decisions of policy and even strategy to little more than guesswork.” This will “inevitably cause inefficiency and waste of resources.”

In other words, the best policy for allocating resources in wartime is the same as the best policy for allocating resources in peacetime; namely, use the price system to determine relative values. Where so many people go wrong is by blithely assuming that because so many military goods are now required, command and control must be used to bring them into existence by forbidding the production of other things. Among the many problems caused by this approach is that the production of any good for a military use means foreclosing the use of those resources for production of some other military good. Without a price system to determine relative values, the war itself cannot be run on any kind of rational or efficient basis. This is another reason why the Allies were lucky to win – the Axis powers were wedded to a Fascist, command-and-control economic system that forswore free markets even more than the Allies did.

Black markets are the outgrowth of prohibition and/or price controls. They arise because legal, licit markets are forbidden. Whether it is bootleg liquor during Prohibition, illicit drugs in contemporary America or under-the-table trading of ration coupons during wartime, a black market is a sign of a free market trying to escape confinement. Higgins, Kaiser, et al. were illustrating the logic of Hayek. Rationing and price controls are just as bad in wartime as in peacetime. The entrepreneurs were violating statutory law, but they were obeying the higher wartime law of salus populi suprema lex.

The Man Who Won World War II

The man who won World War II was not a soldier. He was a businessman. He won it by applying the great economic principles of free markets. This transcendent truth was acknowledged by World War II’s greatest soldier. The power and meaning of this should persuade even those unimpressed by the logic of economic theory itself.