DRI-254 for week of 7-6-14: The Selling of Environmentalism

An Access Advertising EconBrief:

The Selling of Environmentalism

The word “imperialism” long predates Lenin, but he popularized it as the name for a process of exploitation employed by developed Western nations on their undeveloped colonies. In recent years, though, it has been used in a completely different context – to describe the use of economic logic to explain practically everything in the world. Before the advent of the late Nobel laureate Gary Becker, economists were parochial in their studies, confining themselves almost exclusively to the study of mankind in its commercial and mercantile life. Becker trained the lens of economic theory on the household, the family and the institution of marriage. Ignoring the time-honored convention of treating “capital” as plant and equipment, he (along with colleagues like Theodore Schultz) treated human beings as the ultimate capital goods.

Becker ripped the lid off Pandora’s Box and the study of society will never be the same again. We now recognize that any and every form of human behavior might profitably be seen in this same light. To be sure, that does not mean employing the sterile and limiting tools of the professional economist; namely, advanced mathematics and formal statistics. It simply means subjecting human behavior to the logic of purposeful action.

Environmentalism Under the Microscope

The beginnings of the environmental movement are commonly traced to the publication of Silent Spring in 1962 by marine biologist Rachel Carson. That book sought to dramatize the unfavorable effects of pesticides, industrial chemicals and pollution upon wildlife and nature. Carson had scientific credentials – she had previously published a well-regarded book on oceanography – but this book, completed during her terminal illness, was a polemic rather than a sober scientific tract. Its scientific basis has been almost completely undermined in the half-century since publication. (A recent book devoted entirely to re-examination of Silent Spring by scientific critics is decisive.) Yet this book galvanized the movement that has since come to be called environmentalism.

An “ism” ideology is, or ought to be, associated with a set of logical propositions. Marxism, for example, employs the framework of classical economics as developed by David Ricardo but deviates in its creation of the concept of “surplus value” as generated by labor and appropriated by capitalists. Capitalism is a term intended invidiously by Marx but that has since morphed into the descriptor of the system of free markets, private property rights and limited government. What is the analogous logical system implied by the term “environmentalism?”

There isn’t one. Generically, the word connotes an emotive affinity for nature and a corresponding distaste for industrial civilization. Beyond that, its only concrete meaning is political. The problem of definition arises because, in and of itself, an affinity for nature is insufficient as a guide to human action. For example, consider the activity of recycling. Virtually everybody would consider it de rigueur as part of an environmentalist program. The most frequently stated purpose of recycling is to relieve pressure on landfills, which are ostensibly filling up with garbage and threatening to overwhelm humanity. The single greatest component of landfills is newsprint. But the leachates created by the recycling of newsprint are extremely harmful to “the environment”; their acidic content poisons soils and water, and they are very costly to divert. We have arrived at a contradiction – is recycling “good for the environment” or “bad for the environment?” There is no answer to the question as posed; the effects of recycling must be couched in terms of tradeoffs. In other words, the issue turns on economics, not emotion alone.

No matter where we turn, “the environment” confronts us with such tradeoffs. Acceptance of the philosophy of environmentalism depends on getting us to ignore these tradeoffs by focusing on one side and ignoring the other. Environmental advocates of recycling, for instance, customarily ignore the leachates and robotically beat the drums for mandatory recycling programs. When their lopsided character is exposed, environmentalists retreat to the carefully prepared position that the purity of their motives excuses any lapses in analysis and overrides any shortcomings in their programs.

Today’s economist does not take this attitude on faith. He notes that the political stance of environmentalists is logically consistent even if their analysis is not. The politics of environmentalism can be understood as a consistent attempt to increase the real income of environmentalists in two obvious ways: first, by redistributing income in favor of their particular preferences for consumption (enjoyment) of nature; and second, by enjoying real income in the form of power exerted over people whose freedom they constrain and whose real income they reduce through legislation and administrative and judicial fiat.

Thus, environmentalism is best understood as a political movement existing to serve economic ends. In order to do that, its adherents must “sell” environmentalism just as a producer sells a product. Consumers “buy” environmentalism in one of two ways: by voting for candidates who support the legislation, agencies, rules and rulings that further the environmental agenda; and by donating money to environmental organizations that provide real income to environmentalists by employing them and lobbying for the environmental agenda.

Like the most successful consumer products, environmentalism has many varieties. Currently, the most popular and politically successful one is called “climate change,” which is a model change from the previous product, “global warming.” In order to appreciate the economic theory of environmentalism, it is instructive to trace the selling of this doctrine in recent years.

Why Was the Product Called “Climate Change” Developed?

The doctrine today known as “climate change” grew out of a long period of climate research on a phenomenon called “global warming.” This began in the 1970s. Just as businessmen spend years or even decades developing products, environmentalists use scientific (or quasi-scientific) research as their product-development laboratory, in which promising products are developed for future presentation on the market. Although global warming was “in development” throughout the 1970s and 80s, it did not receive its full “rollout” as a full-fledged environmental product until the early 1990s. We can regard the publication of Al Gore’s Earth in the Balance in 1992 as the completed rollout of global warming. In that book, Gore presented the full-bore apocalyptic prophecy that human-caused global warming threatened the destruction of the Earth within two centuries.

Why was global warming “in development” for so long? And after spending that long in development limbo, why did environmentalists bring it “to market” in the early 1990s? The answers to these questions further cement the economic theory of environmentalism.

Global warming joined a long line of environmental products that were brought to market beginning in the early 1960s. These included conservation, water pollution, air pollution, species preservation, forest preservation, overpopulation, garbage disposal, inadequate food production, cancer incidence and energy insufficiency. The most obvious, logical business rationale for a product to be brought to market is that its time has come, for one or more reasons. But global warming was brought to market by a process of elimination. All of the other environmental products were either not “selling” or had reached dangerously low levels of “sales.” Environmentalists desperately needed a flagship product and global warming was the only candidate in sight. Despite its manifest deficiencies, it was brought to market “before its time” – that is, before its scientific merits had been demonstrated. In this regard, it differed from most (although not all) of the previous environmental products.

Those are the summary answers to the two key questions posed above. Global warming (later climate change) spent decades in development because its scientific merits were difficult if not impossible to demonstrate. It was brought to market in spite of that limitation because environmentalists had no other products with equivalent potential to provide real income and had to take the risks of introducing it prematurely in order to maintain the “business” of environmentalism as a going concern. Each of these contentions is fleshed out below.

The Product Maturation Suffered by Environmentalism

Businesses often find that their products lead limited lives. These limitations may be technological, competitive or psychological. New and better processes may doom a product to obsolescence. Competitors may imitate a product into senescence or even extinction. Fads may simply lose favor with consumers after a period of infatuation.

As of the early 1990s, the products offered by environmentalism were in various stages of maturity, decline or death.

Air pollution was a legitimate scientific concern when environmentalism adopted it in the early 1960s. It remains so today because the difficulty of enforcing private property rights in air makes a free-market solution to the problem of air pollution elusive. But by the early 1990s, even the inefficient solutions enforced by the federal government had reduced air pollution to manageable proportions.

Between 1975 and 1991, the six air pollutants tracked by the Environmental Protection Agency (EPA) fell by between 24% and 94%. Even if we go back to 1940 as a standard of comparison – forcing us to use emissions as a proxy for the pollution we really want to measure, since the latter wasn’t calculated prior to 1975 – we find that three of the six, as well as total emissions, were lower in 1991. (Other developed countries showed similar progress during this time span.)

Water pollution was already decreasing when Rachel Carson wrote and continued to fall throughout the 1960s, 70s and 80s. The key was the introduction of wastewater treatment facilities to over three-quarters of the country. Previously polluted bodies of water like the Cuyahoga River, the Androscoggin River, the northern Hudson River and several of the Great Lakes became pure enough to host sport-fishing and swimming. The Mississippi River became one of the industrialized world’s purest major rivers. Unsafe drinking water became a non-problem. Again, this was accomplished despite the inefficient efforts of local governments, the worst of these being the persistent refusal to price water at the margin to discourage overuse.

Forests were thriving in the early 1990s, despite the rhetoric of environmental organizations that inveighed against “clear-cutting” by timber companies. In reality, the number of wooded acres in the U.S. had grown by 20% over the previous two decades. In the late nineteenth century, forest covered about 35% of Vermont; by the early 1990s, that coverage had risen to 76%.

This improvement was owed to private-sector timber companies, which practiced the principle of “sustainable yield” timber management. By the early 1990s, annual timber growth had exceeded harvest every year since 1952. By 1992, the actual timber harvest was a minuscule 384,000 acres, six-tenths of 1% of the land available for harvest. Average annual U.S. wood growth was three times greater than in 1920.

Environmentalists whined about the timberlands opened up for harvest by the federal government in the national forests and wildlife refuges, but less logging was occurring in the National Forests than at any time since the early 1950s. Clear-cut timber was being replaced with new, healthier stands that attracted more wildlife diversity than the harvested “old-growth” forest.

As always, this progress occurred in spite of government, not because of it. The mileage of roads hacked out of national-forest land by the Forest Service is three times greater than that of the federal Interstate highway system. The subsidized price at which the government sells logging rights on that land is a form of corporate welfare for timber companies. But the private sector bailed out the public in a manner that would have made John Muir proud.

Garbage disposal and solid-waste management may have been the most unheralded environmental victory won by the private sector. At the same time that Al Gore complained that “the volume of garbage is now so high that we are running out of places to put it,” modern technology had solved the problem of solid-waste disposal. The contemporary landfill had a plastic bottom and clay liner that together prevented leakage. It was topped with dirt to prevent odors and run-off. The entire U.S. estimated supply of solid waste for the next 500 years could be safely stored in one landfill 100 yards deep and 20 miles on a side. The only real difficulty with landfills was siting, owing to the NIMBY (“not in my back yard”) philosophy fomented by environmentalism. Whatever benefit recycling offered could be had from private markets, which recycle only those materials whose benefits (sales revenue) exceed their reclamation costs (including a “normal” profit).
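
The landfill claim above is easy to sanity-check with back-of-the-envelope arithmetic. In the sketch below (Python), the landfill dimensions come from the paragraph above, while the annual waste tonnage and compaction density are rough illustrative assumptions rather than figures from the original:

    # Back-of-the-envelope check of the single-landfill claim (inputs marked "assumed" are illustrative).
    YARDS_PER_MILE = 1760

    side_yd = 20 * YARDS_PER_MILE          # landfill 20 miles on a side
    depth_yd = 100                         # and 100 yards deep
    volume_yd3 = side_yd ** 2 * depth_yd   # capacity in cubic yards

    msw_tons_per_year = 250e6              # assumed U.S. municipal solid waste per year, in tons
    compacted_density = 0.75               # assumed tons per cubic yard of compacted, buried waste

    capacity_tons = volume_yd3 * compacted_density
    years_of_capacity = capacity_tons / msw_tons_per_year
    print(f"{capacity_tons:.2e} tons of capacity, about {years_of_capacity:.0f} years of waste")

With those rough inputs the capacity works out to several centuries’ worth of waste – the same order of magnitude as the 500-year figure. The point is the form of the arithmetic, not the precise inputs.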

Overpopulation was once the sales leader of environmentalism. In 1968’s The Population Bomb, leading environmentalist Paul Ehrlich wrote that “the battle to feed all of humanity is over. In the 1970s, the world will undergo famines – hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now. At this late date, nothing can prevent a substantial increase in the world death rate….” Ehrlich also predicted food riots and plummeting life expectancy in the U.S. and biological death for a couple of the Great Lakes.

Ehrlich was a great success at selling environmentalism. His book, and its 1990 sequel The Population Explosion, sold millions of copies and recruited untold converts to the cause. Unfortunately, his product had a limited shelf life because his prophecies were spectacularly inaccurate. The only famines were politically, not biologically, triggered; deaths were in the hundreds of thousands, not millions. Death rates declined instead of rising. The Great Lakes did not die; they were completely rehabilitated. Even worse, Ehrlich made a highly publicized bet with economist Julian Simon that the prices of five metals handpicked by Ehrlich would rise in real terms over a ten-year period. (The loser would pay the algebraic sum of the price changes incurred.) The prices went down in nominal terms despite the rising general level of prices over the interval – another spectacular prophetic failure by Ehrlich.
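
The settlement rule of the Simon-Ehrlich wager is simple enough to sketch in Python. The five metals below are the ones Ehrlich chose; the price changes are hypothetical placeholders, since the point here is the payoff rule rather than the historical data:

    # Sketch of the Simon-Ehrlich bet's settlement rule; the price changes below are hypothetical.
    stakes = {"copper": 200.0, "chromium": 200.0, "nickel": 200.0,
              "tin": 200.0, "tungsten": 200.0}           # the $1,000 bundle, $200 per metal

    # Hypothetical inflation-adjusted price multipliers over the decade
    # (0.8 means the real price fell 20%; these are NOT the historical figures).
    real_price_multiplier = {"copper": 0.8, "chromium": 0.9, "nickel": 0.95,
                             "tin": 0.4, "tungsten": 0.7}

    end_value = sum(stake * real_price_multiplier[metal] for metal, stake in stakes.items())
    settlement = end_value - sum(stakes.values())        # the "algebraic sum" of the real price changes

    if settlement > 0:
        print(f"Real prices rose: Simon pays Ehrlich ${settlement:.2f}")
    else:
        print(f"Real prices fell: Ehrlich pays Simon ${-settlement:.2f}")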

It’s not surprising that Ehrlich, rather than the population, bombed. In the 1960s, the world’s annual population growth was about 2.0%. By the 1990s, it would fall to 1.6%. (Today, of course, our problem is falling birth rates – the diametric opposite of that predicted by environmentalism.)

Thus the phantom population growth predicted by environmentalism never materialized as one component of the inadequate food supply foreseen with such uncanny inaccuracy. Ehrlich and others had foreseen a Malthusian scenario in which rising population growth overtook diminishing agricultural productivity. They were just as wrong about productivity as about population. The Green Revolution ushered in by Norman Borlaug et al. led one of the world’s leading agricultural economists to declare that “the scourge of famine due to natural causes has been almost conquered….”

The other leg of environmentalism’s collapsing doomsday scenario of inadequate food was based on cancer incidence. Not only would the food supply prove insufficient, according to environmentalists, it was also unsafe. Industrial chemicals and pesticides were entering the food supply through food residues and additives. They were causing cancer. How did we know this? Tests on animals – specifically, on mice and rats – proved it.

There was only one problem with this assertion. Scientifically speaking, it was complete hooey. The cancer risk of one glass of wine was about 10,000-12,000 times greater than that posed by the additives and pesticide residues (cumulatively) in most food products. Most of our cancer risk comes from natural sources, such as sunlight and natural pesticides produced by plants, some of which occur in common foods. Meanwhile, cancer rates had remained steady or fallen over the previous fifty years, except for lung cancers attributable to smoking and melanomas attributable to ultraviolet light. Cancer rates among young adults had decreased rapidly. Age-adjusted death rates had mostly fallen.

Energy insufficiency had been brought to market by environmentalists in the 1970s, during the so-called Energy Crisis. It sold well when OPEC was allowed to peg oil prices at stratospheric levels. But when the Reagan administration decontrolled prices, domestic production rose and prices fell. As the 1990s rolled around, environmentalists were reduced to citing “proven reserves” of oil (45 years) and natural gas (63 years) as “proof” that we would soon run out of fossil fuels and energy prices would then skyrocket. Of course, this was more hooey; proven reserves are the energy equivalent of inventory. Businesses hold inventory as the prospective benefits and costs dictate. Current inventories say nothing about the long-run prospect of shortages.

In 1978, for example, proven reserves of oil stood at 648 billion barrels, or 29.2 years’ worth at current levels of usage. Over the next 14 years, we used about 84 billion barrels, but – lo and behold – proven reserves rose to nearly a trillion barrels by 1992. That happened because it was now profitable to explore for and produce oil in a newly free market of fluctuating oil prices, making it cost-efficient to hold larger inventories of proven reserves. (And in today’s energy market, it is innovative technologies that are driving discoveries and production of new shale oil and gas.) Really, it is an idle pastime to estimate the number of years of “known” resources remaining because nobody knows how much of a resource remains. It is not worth anybody’s time to make an accurate estimate; it is easier and more sensible to simply let the free market take its course. If the price rises, we will produce more and discover more reserves to hold as “inventory.” If we can’t find any more, the resultant high prices will give us the incentive to invent new technologies and find substitutes for the disappearing resource. That is exactly what has just happened with the process called “fracking.” We have long known that conventional methods of oil drilling left 30-70% of the oil in the ground because it was too expensive to extract. When oil prices rose high enough, fracking allowed us to get at those sequestered supplies. We knew this in the early 1990s, even if we didn’t know exactly what technological process we would ultimately end up using.
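
The “years of reserves” figure in this argument is just a reserves-to-consumption ratio, and a short sketch shows why that ratio need not shrink as oil is used. The annual consumption below is implied by the 1978 numbers above; the annual reserve-addition figure is a hypothetical assumption for illustration:

    # Proven reserves as inventory: "years of supply" is a reserves-to-consumption ratio, not a countdown.
    reserves_1978 = 648e9                       # barrels, from the figure cited above
    years_quoted = 29.2                         # years of supply quoted for 1978
    annual_use = reserves_1978 / years_quoted   # implied consumption, about 22 billion barrels per year
    print(f"Implied annual consumption: {annual_use / 1e9:.1f} billion barrels")

    # If exploration keeps adding proven reserves, the ratio holds steady or grows even as oil
    # is burned. The annual-additions figure here is a hypothetical assumption.
    annual_additions = 30e9
    reserves = reserves_1978
    for year in range(1979, 1993):              # 14 years, 1979 through 1992
        reserves = reserves - annual_use + annual_additions
    print(f"1992 reserves: {reserves / 1e9:.0f} billion barrels, "
          f"or {reserves / annual_use:.1f} years at the old consumption rate")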

Conservation was the first product packaged and sold by environmentalism, long predating Rachel Carson. It dated back to the origin of the national-park system in Theodore Roosevelt’s day and the times of John Muir and John James Audubon. By the early 1990s, conservation was a mature product. The federal government was already the biggest landowner in the U.S. We already had more national parks than the federal government could hope to manage effectively. Environmentalists could no longer make any additional sales using conservation as the product.

Just about the only remaining salable product the environmentalists had was species preservation. Environmentalism flogged it for all it was worth, but that wasn’t much. After the Endangered Species Act was passed and periodic additions made to its list, what was left to do? Not nearly enough to support the upper-middle-class lifestyles of a few million environmentalists. (It takes an upper-middle-class income to enjoy the amenities of nature in all their glory.)

Environmentalism Presents: Global Warming

In the late 1980s, the theory that industrial activity was heating up the atmosphere by increasing the amount of carbon dioxide in the air began to gain popular support. In 1989, Time Magazine modified its well-known “Man of the Year” award to “Planet of the Year,” which it gave to “Endangered Earth.” It described the potential effects of this warming process as “scary.” The Intergovernmental Panel on Climate Change, an organization of environmentalists dedicated to selling their product, estimated that warming could average roughly 0.5 degrees Fahrenheit per decade over the next century, resulting in a 5.4-degree increase in average temperature. This would cause polar ice caps to melt and sea levels to rise, swamping coastal settlements around the world – and that was just the beginning of the adverse consequences of global warming.

No sooner had rollout begun than the skepticism rolled in along with the new product. Scientists could show that atmospheric carbon dioxide was increasing and that industrial activity was behind that, but they could not prove that carbon dioxide was causing the amount of warming actually measured. As a matter of fact, there wasn’t actually an unambiguous case to be made for warming. What warming could be found had mostly occurred at night, in the winter and in the Southern Hemisphere (not the locus of most industrial activity). And to top it all off, it was not clear whether warming should be ascribed to the very long-run cyclical forces that have alternated the Earth between Ice Ages and tropical warming periods for many thousands of years. By 1994, Time Magazine (which needed a continuous supply of exciting new headlines just as much as environmentalists needed a new supply of products with which to scare the public) had given up on global warming and resuscitated a previous global-climate scare from the 1970s, the “Coming Ice Age.”

It is easy to see the potential benefits of the global-warming product for environmentalists. Heretofore, almost all environmentalist products had an objective basis. That is, they spotlighted real problems. Real problems have real solutions, and the hullabaloo caused by purchase of those products led to varying degrees of improvement in the problems. Note this distinction: the products themselves did not cause or lead to the improvement; it was the uproar created by the products that did the job. Most of the improvement was midwifed by economic measures, and environmentalism rejects economics the way vampires reject the cross. This put environmentalists in an anomalous position. Their very (indirect) success had worked against them. Their real income was dependent on selling environmentalism in any of various ways. Environmentalists cannot continue to sell more books about (say) air pollution when existing laws, regulations and devices have brought air quality to an acceptable level. They cannot continue to pass more coercive laws and regulations when the legally designated quality has been reached. Indeed, they will be lucky to maintain sales of previously written books to any significant degree. They cannot continue to (credibly) solicit donations on the strength of a problem that has been solved, or at least effectively managed.

Unfortunately for environmentalists, the environmental product is not like an automobile that gives service until worn out and needs replacement, ad infinitum. It is more like a vaccine that, once taken, needn’t be retaken. Once the public has been radicalized and sensitized to the need for environmentalism, it becomes redundant to keep repeating the process.

Global warming was a new kind of product with special features. Its message could not be ignored or softened. Either we reform or we die. There was no monkeying around with tradeoffs.

Unlike the other environmental products, global warming was not a real problem with real solutions. But that was good. Real problems get solved – which, from the environmentalist standpoint, was bad. Global warming couldn’t even be proved, let alone solved. That meant that we were forced to act and there could be no end to the actions, since they would never solve the problem. After all, you can’t solve a problem that doesn’t exist in the first place! Global warming, then, was the environmentalist gift that would keep on giving, endlessly beckoning the faithful, recruiting ever more converts to the cause, ringing the cash register with donations and decorating the mast of environmentalism for at least a century. Its very scientific dubiety was an advantage, since that would keep it in the headlines and keep its critics fighting against it – allowing environmentalists the perfect excuse to keep pleading for donations to fend off the evil global-warming deniers. Of course, lack of scientific credibility is also a two-edged sword, since environmentalists cannot force the public to buy their products and can never be quite sure when the credibility gap will turn the tide against them.

When you’re selling the environmentalist product, the last thing you want is certainty, which eliminates controversy. Controversy sells. And selling is all that matters. Environmentalists certainly don’t want to solve the problem of global warming. If the problem is solved, they have nothing left to sell! And if they don’t sell, they don’t eat, or at least they don’t enjoy any real income from environmentalism. Environmentalism is also aimed at gaining psychological benefits for its adherents by giving their lives meaning and empowering them by coercing people with whom they disagree. If there is no controversy and no problem, there is nothing to give their lives meaning anymore and no basis for coercing others.

The Economic Theory of Environmentalism

Both environmentalists and their staunchest foes automatically treat the environmental movement as a romantic crusade, akin to a religion or a moral reform movement. This is wrong. Reformers or altruists act without thought of personal gain. In contrast, environmentalists are self-interested individuals in the standard tradition of economic theory. Some of their transactions lie within the normal commercial realm of economics and others do not, but all are governed by economic logic.

That being so, should we view environmentalism in the same benign light as we do any other industry operating in a free market? No, because environmentalists reject the free market in favor of coercion. If they were content to persuade others of the merits of their views, their actions would be unexceptional. Instead, they demand subservience to their viewpoint via legal codification and all forms of legislative, executive, administrative and judicial tyranny. Their adherents number a few would-be dictators and countless petty dictators. Their alliance with science is purely opportunistic; one minute they accuse their opponents of being anti-scientific deniers and the next they are praying to the idol of Gaia and Mother Earth.

The only thing anti-environmentalists have found to admire about the environmental movement is its moral fervor. That concession is a mistake.

DRI-292 for week of 6-29-14: One in Six American Children is Hungry – No, Wait – One in Five!

An Access Advertising EconBrief:

One in Six American Children is Hungry – No, Wait – One in Five!

You’ve heard the ad. A celebrity – or at least somebody who sounds vaguely familiar, like singer Kelly Clarkson – begins by intoning somberly: “Seventeen million kids in America don’t know where their next meal is coming from or even if it’s coming at all.” One in six children in America is hungry, we are told. And that’s disgraceful, because there’s actually plenty of food, more than enough to feed all those hungry kids. The problem is just getting the food to the people who need it. Just make a donation to your local food pantry and together we can lick hunger in America. This ad is sponsored by the Ad Council and Feeding America.

What was your reaction? Did it fly under your radar? Did it seem vaguely dissonant – one of those things that strikes you wrong but leaves you not quite sure why? Or was your reaction the obvious one of any intelligent person paying close attention – “Huh? What kind of nonsense is this?”

Hunger is not something arcane and mysterious. We’ve all experienced it. And the world is quite familiar with the pathology of hunger. Throughout human history, hunger has been mankind’s number one enemy. In nature, organisms are obsessed with absorbing enough nutrients to maintain their body weight. It is only in the last few centuries that tremendous improvements in agricultural productivity have liberated us from the prison of scratching out a subsistence living from the soil. At that point, we began to view starvation as atypical, even unthinkable. The politically engineered famines that killed millions in the Soviet Union and China were viewed with horror; the famines in Africa attracted sympathy and financial support from the West. Even malnutrition came to be viewed as an aberration, something to be cured by universal public education and paternalistic government. In the late 20th century, the Green Revolution multiplied worldwide agricultural productivity manifold. As the 21st century dawned, the end of mass global poverty and starvation beckoned within a few decades and the immemorial problem of hunger seemed at last to be withering away.

And now we’re told that in America – for over a century the richest nation on Earth – our children – traditionally the first priority for assistance of every kind – are hungry at the ratio of one in six?

WHAT IS GOING ON HERE?

The Source of the Numbers – and the Truth About Child Hunger

Perhaps the most amazing thing about these ads, which constitute a full-fledged campaign, is the general lack of curiosity about their origins and veracity. Seemingly, they should have triggered a firestorm of criticism and investigation. Instead, they have been received with yawns.

The ads debuted last Fall. They were kicked off with an article in the New York Times on September 5, 2013, by Jane L. Levere, entitled “New Ad Campaign Targets Childhood Hunger.” The article is one long promotion for the ads and for Feeding America, but most of all for the “cause” of childhood hunger. That is, it takes for granted that a severe problem of childhood hunger exists and demands close attention.

The article cites the federal government as the source for the claim that “…close to 50 million Americans are living in ‘food insecure’ households,” or ones in which “some family members lacked consistent access throughout the year to adequate food.” It claims that “…almost 16 million children, or more than one in 5, face hunger in the United States.”

The ad campaign is characterized as “the latest in a long collaboration between Ad Council and Feeding America,” which supplies some 200 food banks across the country that in turn supply more than 61,000 food pantries, soup kitchens and shelters. Feeding America began in the late 1990s as another organization, America’s Second Harvest, which enlisted the support of A-list celebrities such as Matt Damon and Ben Affleck. This was when the partnership with the Ad Council started.

Priscilla Natkins, a vice-president of the Ad Council, noted that in the early days “only” one out of 10 Americans was hungry. Now the ratio is 1 out of 7, and more than 1 out of 5 children. “We chose to focus on children,” she explained, “because it is a more poignant approach to illustrating the problem.”

Further research reveals that, mirabile dictu, this is not the first time that these ads have received skeptical attention. In 2008, Chris Edwards of Cato Institute wrote about two articles purporting to depict “hunger in America.” That year, the Sunday supplement Parade Magazine featured an article entitled “Going Hungry in America.” It stated that “more than 35.5 million Americans, more than 12% of the population and 17% of our children, don’t have enough food, according to the Department of Agriculture.” Also in 2008, the Washington Post claimed that “about 35 million Americans regularly go hungry each year, according to federal statistics.”

Edwards’ eyebrows went up appropriately high upon reading these accounts. After all, this was even before the recession had been officially declared. Unlike the rest of the world, though, Edwards actually resolved to verify these claims. This is what he found upon checking with the Department of Agriculture.

In 2008, the USDA declared that approximately 24 million Americans were living in households that faced conditions of “low food security.” The agency defined this condition as eating “less varied diets, participat[ing] in Federal food-assistance programs [and getting] emergency food from community food pantries.” Edwards contended that this meant those people were not going hungry – by definition. And indeed, it is semantically perverse to define a condition of hunger by describing the multiple sources of food and change in composition of food enjoyed by the “hungry.”

The other 11 million (of the 35 million figure named in the two articles) people fell into a USDA category called “very low food security.” These were people whose “food intake was reduced at times during the year because they had insufficient money or other resources for food” [emphasis added]. Of these, the USDA estimated that some 430,000 were children. These would (then) comprise about 0.6% of American children, not the 17% mentioned by Parade Magazine, Edwards noted. Of course, having to reduce food on one or more occasions to some unnamed degree for financial reasons doesn’t exactly constitute “living in hunger” in the sense of not knowing where one’s next meal was coming from, as Edwards observed. The most that could, or should, be said was that the 11 million and the 430,000 might constitute possible candidates for victims of hunger.

On the basis of this cursory verification of the articles’ own sources, Chris Edwards concluded that hunger in America ranked with crocodiles in the sewers as an urban myth.

We can update Edwards’ work. The USDA figures come from survey questions distributed and tabulated by the Census Bureau. The most recent data available were released in December 2013 for calendar year 2012. About 14.5% of households fell into the “low food security” category and about 5.7% of households were in the “very low food security” pigeonhole. Assuming the current average of roughly 2.58 persons per household, this translates to approximately 34 million people in the first category and just under 13.5 million people in the second category. If we assume the same fraction of children in these at-risk households as in 2008, that would imply about 635,000 children in the high-risk category, or less than 0.9 of 1% of the nation’s children. That is a far cry from the 17% of the nation’s children mentioned in the Parade Magazine article of 2008. It is a farther cry from the 17,000,000 children mentioned in current ads, which would be over 20% of America’s children.
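
Those two child-share figures are easy to sanity-check. In the sketch below, the total number of American children (roughly 74 million under age 18 at the time) is an outside assumption supplied for the arithmetic, not a number taken from the articles:

    # Sanity check of the child-share percentages; the child-population total is an assumption.
    us_children = 74e6               # approximate U.S. population under age 18 (assumed)

    high_risk_children = 635_000     # updated "very low food security" estimate derived above
    ad_claim_children = 17e6         # number of "hungry" children cited in the ads

    print(f"High-risk share: {high_risk_children / us_children:.1%}")   # roughly 0.9%
    print(f"Ad-campaign share: {ad_claim_children / us_children:.1%}")  # roughly 23%, i.e. "over 20%"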

The USDA’s Work is From Hunger

It should occur to us to wonder why the Department of Agriculture – Agriculture, yet – should now reign as the nation’s arbiter of hunger. As it happens, economists are well situated to answer that question. They know that the federal food-stamp program began in the 1940s primarily as a way of disposing of troublesome agricultural surpluses. The federal government spent the decade of the 1930s throwing everything but the kitchen sink at the problem of economic depression. Farmers were suffering because world trade had imploded; each nation was trying to protect its own businesses by taxing imports from foreign producers. Since the U.S. was the world’s leading exporter of foodstuffs, its farmers were staggering under this impact. They were swimming in surpluses and bled so dry by the resulting low prices that they burned, buried or slaughtered their own output without bringing it to market in an effort to raise food prices.

The Department of Agriculture devised various programs to raise agricultural prices, most of which involved government purchases of farm goods to support prices at artificially high levels. Of course, that left the government with lots of surplus food on its hands, which it stored in Midwestern caves in a futile effort to prevent spoilage. Food distribution to the poor was one way of ridding itself of these surpluses, and this was handled by the USDA which was already in possession of the food.

Just because the USDA runs the food-stamp program (now run as a debit-card operation) doesn’t make it an expert on hunger, though. Hunger is a medical and nutritional phenomenon, not an agricultural one. Starvation results from an intake of too few calories to sustain life; malnutrition results from a deficiency or imbalance of nutrients, vitamins and minerals. Does the Census Bureau survey doctors on the nutritional status of their patients to provide the USDA with its data on “food insecurity?”

Not hardly. The Census Bureau simply asks people questions about their food intake and solicits their own evaluation of their nutritional status. Short of requiring everybody to undergo a medical evaluation and submit the findings to the government, it could hardly be otherwise. But this poses king-sized problems of credibility for the USDA. Asking people whether they ever feel hungry or sometimes don’t get “enough” food is no substitute for a medical evaluation of their status.

People can and do feel hungry without coming even close to being hungry in the sense of risking starvation or even suffering a nutritional deficit. Even more to the point, their feelings of hunger may signal a nutritional problem that cannot be cured by money, food pantries, shelters or even higher wages and salaries. The gap between the “low food security” category identified by the USDA and starving peoples in Africa or Asia is probably a chasm the size of the Grand Canyon.

The same America that is supposedly suffering rampant hunger among both adults and children is also supposedly suffering epidemics of both obesity and diabetes. There is only one way to reconcile these contradictions: by recognizing that our “hunger” is not the traditional sort of starvation or malnutrition but rather the kind associated with diabetes and, hence, obesity. Over-ingestion of simple carbohydrates and starches can cause upward spikes in blood sugar among susceptible populations, triggering the release of insulin that stores the carbohydrate as fat. Since the carbohydrate is stored as fat rather than burned for energy, the body remains starved for energy and hungry even though it is getting fat. Thus do hunger and obesity coexist.

The answer is not more government programs, food stamps, food pantries and shelters. Nor, for that matter, is it more donations to non-profit agencies like Feeding America. It is not more food at all, in the aggregate. Instead, the answer is a better diet – something that millions of Americans have found out for themselves in the last decade or so. In the meantime, there is no comparison between the “hunger” the USDA is supposedly measuring and the mental picture we form in our minds when we think of hunger.

This is not the only blatant contradiction raised by the “hunger in America” claims. University of Chicago economist Casey Mulligan, in his prize-winning 2012 book The Redistribution Recession, has uncovered over a dozen government program and rule changes that reduced the incentive to work and earn. He assigns these primary blame for the huge drop in employment and lag in growth that the U.S. has suffered since 2007. High on his list are the changes in the food-stamp program that substituted a debit card for stamps, eliminated means tests and allowed recipients to remain on the program indefinitely. A wealthy nation in which 46 million out of 315 million citizens are on the food dole cannot simultaneously be suffering a problem of hunger. Other problems, certainly – but not that one.

What About the Real Hunger?

That is not to say that real hunger is completely nonexistent in America. Great Britain’s BBC caught word of our epidemic of hunger and did its own story on it, following the New York Times, Washington Post, Parade Magazine party line all the way. The BBC even located a few appropriately dirty, ragged children for website photos. But the question to ask when confronted with actual specimens of hunger is not “why has capitalism failed?” or “why isn’t government spending enough money on food-security programs?” The appropriate question is “why do we keep fooling ourselves into thinking that more government spending is the answer when the only result is that the problem keeps getting bigger?” After all, the definition of insanity is doing the same thing over and over again and expecting a different result.

The New York Times article in late 2013 quoted two academic sources that were termed “critical” of the ad campaign. But they said nothing about its blatant lies and complete inaccuracy. No, their complaint was that it promoted “charity” as the solution rather than their own pet remedies, a higher minimum wage and more government programs. This calls to mind the old-time wisecrack uttered by observers of the Great Society welfare programs in the 1960s and 70s: “This year, the big money is in poverty.” The real purpose of the ad campaign is to promote the concept of hunger in America in order to justify big-spending government programs and so-called private programs that piggyback on the government programs. And the real beneficiaries of the programs are not the poor and hungry but the government employees, consultants and academics whose jobs depend on the existence of “problems” that government purports to “solve” but that actually get bigger in order to justify ever-more spending for those constituencies.

That was the conclusion reached, ever so indirectly and delicately, by Chris Edwards of Cato Institute in his 2008 piece pooh-poohing the “hunger in America” movement. It applies with equal force to the current campaign launched by non-profits like the Ad Council and Feeding America, because the food banks, food pantries and shelters are supported both directly and indirectly by government programs and the public perception of problems that necessitate massive government intervention. It is the all-too-obvious answer to the cry for enlightenment made earlier in this essay.

In this context, it is clear that the answer to any remaining pockets of hunger is indeed charity. Only private, voluntary charity escapes the moral hazard posed by the bureaucrat/consultant class that has no emotional stake in the welfare of the poor and unfortunate but a big stake in milking taxpayers. This is the moral answer because it does not force people to contribute against their will but does allow them to exercise free will in choosing to help their fellow man. A moral system that works must be better than an immoral one that fails.

Where is the Protest?

The upshot of our inquiry is that the radio ads promoting “hunger in America” and suggesting that America’s children don’t know where their next meal is coming from are an intellectual fraud. There is no evidence that those children exist in large numbers, but their existence in any numbers indicts the current system. Rather than rewarding the failure of our current immoral system, we should be abandoning it in favor of one that works.

Our failure to protest these ads and publicize the truth is grim testimony to how far America has fallen from its origins and ideals. In the first colonial settlements at Jamestown and Plymouth, colonists learned the bitter lesson that entitlement was not a viable basis for civilization and work was necessary for survival. We are in the process of re-learning that lesson very slowly and painfully.

DRI-259 for week of 2-2-14: Kristallnacht for the Rich: Not Far-Fetched

An Access Advertising EconBrief:

Kristallnacht for the Rich: Not Far-Fetched

Periodically, the intellectual class aptly termed “the commentariat” by The Wall Street Journal works itself into a frenzy. The issue may be a world event, a policy proposal or something somebody wrote or said. The latest cause célèbre is a submission to the Journal’s letters column by a partner in one of the nation’s leading venture-capital firms. The letter ignited a firestorm; the editors subsequently declared that Tom Perkins of Kleiner Perkins Caufield & Byers “may have written the most-read letter to the editor in the history of The Wall Street Journal.”

What could have inspired the famously reserved editors to break into temporal superlatives? The letter’s rhetoric was both penetrating and provocative. It called up an episode in the 20th century’s most infamous political regime. And the response it triggered was rabid.

“Progressive Kristallnacht Coming?”

“…I would call attention to the parallels of fascist Nazi Germany to its war on its ‘one percent,’ namely its Jews, to the progressive war on the American one percent, namely ‘the rich.’” With this icebreaker, Tom Perkins made himself a rhetorical target for most of the nation’s commentators. Even those who agreed with his thesis felt that Perkins had no business using the Nazis in an analogy. The Wall Street Journal editors said “the comparison was unfortunate, albeit provocative.” They recommended reserving Nazi comparisons for genuine tyrants like Stalin.

On the political Left, the reaction was less measured. The Anti-Defamation League accused Perkins of insensitivity. Bloomberg View characterized his letter as an “unhinged Nazi rant.”

No, this bore no traces of an irrational diatribe. Perkins had a thesis in mind when he drew an analogy between Nazism and Progressivism. “From the Occupy movement to the demonization of the rich, I perceive a rising tide of hatred of the successful one percent.” Perkins cited the abuse heaped on workers traveling on Google buses from the cities to the California peninsula. Their high wages allowed them to bid up real-estate prices, thereby earning the resentment of the Left. Perkins’ ex-wife Danielle Steel placed herself in the crosshairs of the class warriors by amassing a fortune writing popular novels. Millions of dollars in charitable contributions did not spare her from criticism for belonging to the one percent.

“This is a very dangerous drift in our American thinking,” Perkins concluded. “Kristallnacht was unthinkable in 1930; is its descendant ‘progressive’ radicalism unthinkable now?” Perkins’ point is unmistakable; his letter is a cautionary warning, not a comparison of two actual societies. History doesn’t repeat itself, but it does rhyme. Kristallnacht and Nazi Germany belong to history. If we don’t mend our ways, something similar and unpleasant may lie in our future.

A Short Refresher Course in Early Nazi Persecution of the Jews

Since the current debate revolves around the analogy between Nazism and Progressivism, we should refresh our memories about Kristallnacht. The name itself translates loosely into “Night of Broken Glass.” It refers to the shards of broken window glass littering the streets of cities in Germany and Austria on the night and morning of November 9-10, 1938. The windows belonged to houses, hospitals, schools and businesses owned and operated by Jews. These buildings were first looted, then smashed by elements of the German paramilitary SA (the Brownshirts) and SS (security police), led by the Gauleiters (regional leaders).

In 1933, Adolf Hitler was elevated to the German chancellorship after the Nazi Party won a plurality of votes in the national election. Almost immediately, laws placing Jews at a disadvantage were passed and enforced throughout Germany. The laws were the official expression of the philosophy of German anti-Semitism that dated back to the 1870s, the time when German socialism began evolving from the authoritarian roots of Otto von Bismarck’s rule. Nazi officialdom awaited a pretext on which to crack down on Germany’s sizable Jewish population.

The pretext was provided by the shooting of German official Ernst vom Rath on Nov. 7, 1938 by Herschel Grynszpan, a 17-year-old Polish Jewish youth. The boy was apparently upset by German policies expelling his parents from the country. Ironically, vom Rath’s sentiments were anti-Nazi and opposed to the persecution of Jews. Vom Rath’s death on Nov. 9 was the signal for release of Nazi paramilitary forces on a reign of terror and abduction against German and Austrian Jews. Police were instructed to stand by and not interfere with the SA and SS as long as only Jews were targeted.

According to official reports, 91 deaths were attributed directly to Kristallnacht. Some 30,000 Jews were spirited off to jails and concentration camps, where they were treated brutally before finally winning release some three months later. In the interim, though, some 2,000-2,500 Jews died in the camps. Over 7,000 Jewish-owned or operated businesses were damaged. Over 1,000 synagogues in Germany and Austria were burned.

The purpose of Kristallnacht was not only wanton destruction. The assets and property of Jews were seized to enhance the wealth of the paramilitary groups.

Today we regard Kristallnacht as the opening round of Hitler’s Final Solution – the policy that produced the Holocaust. This strategic primacy is doubtless why Tom Perkins invoked it. Yet this furious controversy will just fade away, merely another media preoccupation du jour, unless we retain its enduring significance. Obviously, Tom Perkins was not saying that the Progressive Left’s treatment of the rich is now comparable to Nazi Germany’s treatment of the Jews. The Left is not interning the rich in concentration camps. It is not seizing the assets of the rich outright – at least not on a wholesale basis, anyway. It is not reducing the homes and businesses of the rich to rubble – not here in the U.S., anyway. It is not passing laws to discriminate systematically against the rich – at least, not against the rich as a class.

Tom Perkins was issuing a cautionary warning against the demonization of wealth and success. This is a political strategy closely associated with the philosophy of anti-Semitism; that is why his invocation of Kristallnacht is apropos.

The Rise of Modern Anti-Semitism

Despite the politically correct horror expressed by the Anti-Defamation League toward Tom Perkins’ letter, reaction to it among Jews has not been uniformly hostile. Ruth Wisse, professor of Yiddish and comparative literature at Harvard University, wrote an op-ed for The Wall Street Journal (02/04/2014) defending Perkins.

Wisse traced the modern philosophy of anti-Semitism to the philosopher Wilhelm Marr, whose heyday was the 1870s. Marr charged Jews with using their skills “to conquer Germany from within.” He was careful to distinguish his philosophy of anti-Semitism from prior philosophies of anti-Judaism. Jews “were taking unfair advantage of the emerging democratic order in Europe with its promise of individual rights and open competition in order to dominate the fields of finance, culture and social ideas.”

Wisse declared that “anti-Semitism channel[ed] grievance and blame against highly visible beneficiaries of freedom and opportunity.” “Are you unemployed? The Jews have your jobs. Is your family mired in poverty? The Rothschilds have your money. Do you feel more secure in the city than you did on the land? The Jews are trapping you in the factories and charging you exorbitant rents.”

The Jews were undermining Christianity. They were subtly perverting the legal system. They were overrunning the arts and monopolizing the press. They spread Communism, yet practiced rapacious capitalism!

This modern German philosophy of anti-Semitism long predated Nazism. It accompanied the growth of the German welfare state and German socialism. The authoritarian political roots of Nazism took hold under Otto von Bismarck’s conservative socialism, and so did Nazism’s anti-Semitic cultural roots. The anti-Semitic conspiracy theories ascribing Germany’s every ill to the Jews were not the invention of Hitler, but of Wilhelm Marr over half a century before Hitler took power.

The Link Between the Nazis and the Progressives: the War on Success

As Wisse notes, the key difference between modern anti-Semitism and its ancestor – what Wilhelm Marr called “anti-Judaism” – is that the latter abhorred the religion of the Jews while the former resented the disproportionate success enjoyed by Jews much more than their religious observances. The modern anti-Semitic conspiracy theorist pointed darkly to the predominance of Jews in high finance, in the press, in the arts and running movie studios and asked rhetorically: How do we account for the coincidence of our poverty and their wealth, if not through the medium of conspiracy and malefaction? The case against the Jews is portrayed as prima facie and morphs into per se through repetition.

Today, the Progressive Left operates in exactly the same way. “Corporation” is a pejorative. “Wall Street” is the antonym of “Main Street.” The very presence of wealth and high income is itself damning; “inequality” is the reigning evil and is tacitly assigned a pecuniary connotation. Of course, this tactic runs counter to the longtime left-wing insistence that capitalism is inherently evil because it forces us to adopt a materialistic perspective. Indeed, environmentalism embraces anti-materialism to this day while continuing to bunk in with its progressive bedfellows.

We must interrupt with an ironic correction. Economists – according to conventional thinking the high priests of materialism – know that it is human happiness and not pecuniary gain that is the ultimate desideratum. Yet the constant carping about “inequality” looks no further than money income in its supposed solicitude for our well-being. Thus, the “income-inequality” progressives – seemingly obsessed with economics and materialism – are really anti-economic. Economists, supposedly green-eyeshade devotees of numbers and models, are the ones focusing on human happiness rather than ideological goals.

German socialism metamorphosed into fascism. American Progressivism is morphing from liberalism to socialism and – ever more clearly – homing in on its own version of fascism. Both employed the technique of demonization and conspiracy to transform the mutual benefit of free voluntary exchange into the zero-sum result of plunder and theft. How else could productive effort be made to seem fruitless? How else could success be made over into failure? This is the cautionary warning Perkins was sounding.

The Great Exemplar

The great Cassandra of political economy was F.A. Hayek. Early in 1929, he predicted that Federal Reserve policies earlier in the decade would soon bear poisoned fruit in the form of a reduction in economic activity. (His mentor, Ludwig von Mises, was even more emphatic, foreseeing “a great crash” and refusing a prestigious financial post for fear of association with the coming disaster.) He predicted that the Soviet economy would fail owing to lack of a functional price system; in particular, missing capital markets and interest rates. He predicted that Keynesian policies begun in the 1950s would culminate in accelerating inflation. All these came true, some of them within months and some after a lapse of years.

Hayek’s greatest prediction was really a cautionary warning, in the same vein as Tom Perkins’ letter but much more detailed. His 1944 book The Road to Serfdom made the case that centralized economic planning could operate only at the cost of the free institutions that distinguished democratic capitalism. Socialism was really another form of totalitarianism.

The reaction to Hayek’s book was much the same as the reaction to Perkins’ letter. Many commentators who should have known better accused both of them of fascism. They also accused both men of describing a current state of affairs when both were really trying to avoid a dystopia.

The flak Hayek took was especially ironic because his book actually served to prevent the outcome he feared. But instead of winning the acclaim of millions, this earned him the scorn of intellectuals. The intelligentsia insisted that Hayek predicted the inevitable succession of totalitarianism after the imposition of a welfare state. When welfare states in Great Britain, Scandinavia, and South America failed to produce barbed wire, concentration camps and German Shepherd dogs, the Left advertised this as proof of Hayek’s “exaggerations” and “paranoia.”

In actual fact, Great Britain underwent many of the changes Hayek had feared and warned against. The notorious “Rules of Engagements,” for instance, were an attempt by a Labor government to centrally control the English labor market – to specify an individual’s work and wage rather than allowing free choice in an impersonal market to do the job. The attempt failed just as dismally as Hayek and other free-market economists had foreseen it would. In the 1980s, it was Hayek’s arguments, wielded by Prime Minister Margaret Thatcher, which paved the way for the rolling back of British socialism and the taming of inflation. It’s bizarre to charge the prophet of doom with inaccuracy when his prophecy is the savior, but that’s what the Left did to Hayek.

Now they are working the same familiar con on Tom Perkins. They begin by misconstruing the nature of his argument. Later, if his warnings are successful, they will use that against him by claiming that his “predictions” were false.

Enriching Perkins’ Argument

This is not to say that Perkins’ argument is perfect. He has instinctively fingered the source of the threat to our liberties. Perkins himself may be rich, but his argument isn’t; it is threadbare and skeletal. It could use some enriching.

The war on the wealthy has been raging for decades. The opening battle is lost to history, but we can recall some early skirmishes and some epic brawls prior to Perkins.

In Europe, the war on wealth used anti-Semitism as its spearhead. In the U.S., however, the popularity of Progressives in academia and government made antitrust policy a more convenient wedge for their populist initiatives against success. Antitrust policy was a crown jewel of the Progressive movement in the early 1900s; Presidents Theodore Roosevelt and William Howard Taft cultivated reputations as “trust busters.”

The history of antitrust policy exhibits two pronounced tendencies: the use of the laws to restrict competition for the benefit of incumbent competitors and the use of the laws by the government to punish successful companies for various political reasons. The sobering research of Dominick Armentano shows that antitrust policy has consistently harmed consumer welfare and economic efficiency. The early antitrust prosecution of Standard Oil, for example, broke up a company that had consistently increased its output and lowered prices to consumers over long time spans. The Orwellian rhetoric accompanying the judgment against ALCOA in the 1940s reinforces the notion that punishment, not efficiency or consumer welfare, was behind the judgment. The famous prosecutions of IBM and AT&T in the 1970s and 80s each spawned book-length investigations showing the perversity of the government’s claims. More recently, Microsoft became the latest successful firm to reap the government’s wrath for having the temerity to revolutionize industry and reward consumers throughout the world.

The rise of the regulatory state in the 1970s gave agencies and federal prosecutors nearly unlimited, unsupervised power to work their will on the public. Progressive ideology combined with self-interest to create a powerful engine for the demonization of success. Prosecutors could not only pursue their personal agenda but also climb the career ladder by making high-profile cases against celebrities. The prosecution of Michael Milken of Drexel Burnham Lambert is a classic case of persecution in the guise of prosecution. Milken virtually created the junk-bond market, thereby originating an asset class that has enhanced the wealth of investors by untold billions or trillions of dollars. For his pains, Milken was sent to jail.

Martha Stewart is a high-profile celebrity who was, in effect, convicted of the crime of being famous. She was charged with and convicted of lying to federal investigators about a case in which the only crime could have been the offense of insider-trading. But she was the trader and she was not charged with insider-trading. The utter triviality of the matter and the absence of any damage to consumers or society at large make it clear that she was targeted because of her celebrity – that is, her success.

Today, the impetus for pursuing successful individuals and companies comes primarily from the federal level. Harvey Silverglate (author of Three Felonies a Day) has shown that virtually nobody is safe from the depredations of prosecutors out to advance their careers by racking up convictions at the expense of justice.

Government is the institution charged with making and enforcing law, yet government has now become the chief threat to law. At the state and local level, governments hand out special favors and tax benefits to favored recipients – typically those unable to attain success through their own efforts – while making up the revenue from the earned income of taxpayers at large. At the federal level, Congress fails in its fundamental duty and ignores the law by refusing to pass budgets. The President appoints czars to make regulatory law, while choosing at discretion to obey the provisions of some laws and disregard others. In this, he fails his fundamental executive duty to execute the laws faithfully. Judges treat the Constitution as a backdrop for the expression of their own views rather than as a subject for textual fidelity. All parties interpret the Constitution to suit their own convenience. The overarching irony here is that the least successful institution in America has united in a common purpose against the successful achievers in society.

The most recent Presidential campaign was conducted largely as a jihad against the rich and successful in business. Mitt Romney was forced to defend himself against the charge of succeeding too well in his chosen profession, as well as the corollary accusation that his success came at the expense of the companies and workers in which his private-equity firm invested. Either his success was undeserved or it was really failure. There was no escape from the double bind against which he struggled.

It is clear, then, that the “progressivism” decried by Tom Perkins dates back over a century and that it has waged a war on wealth and success from the outset. The tide of battle has flowed – during the rampage of the Bull Moose, the Depression and New Deal and the recent Great Recession and financial crisis – and ebbed – under Eisenhower and Reagan. Now the forces of freedom have their backs to the sea.

It is this much richer context that forms the backdrop for Tom Perkins’ warning. Viewed in this panoramic light, Perkins’ letter looks less like the crazed rant of an isolated one-percenter and more like the battle cry of a counter-revolution.

DRI-201 for week of 1-12-14: No Bravos for Bernanke

An Access Advertising EconBrief:

No Bravos for Bernanke

Last weekend’s Wall Street Journal featured an op-ed by the former Chairman of President Obama’s Council of Economic Advisers, Austan Goolsbee. Goolsbee’s tenure obviously familiarized him with the chief requirement for policymaking success in a Democrat regime; namely, the ability to define success down. His op-ed, “Bravo for Bernanke and the QE Era,” is a spectacular example of the art.

Goolsbee contrasts the enthusiastic reception given to Federal Reserve Chairman Bernanke’s farewell address to the American Economic Association convention with the wide-ranging criticism directed at Bernanke from across the political spectrum. He briefly summarizes the Fed’s policies under Bernanke, confining himself to the last 3 ½ years of the so-called QE (quantitative easing) Era. Bernanke’s imminent departure and the start of the “exit-strategy countdown” signaled by QE tapering mean that “it is time to take stock of the QE Era – and time for the critics to admit they were wrong.”

Partisan divisions being what they are, it is a foregone conclusion that Goolsbee’s call will not resonate with many on either side of the political spectrum. But it is not necessary to invoke partisan politics to criticize it. Bernanke’s policies – and Goolsbee’s – are anathema to free-market economists. But one need not espouse laissez faire in order to gaze askance at Goolsbee’s case for Bernanke’s actions. Bernanke’s tenure should be viewed as a disaster regardless of one’s political preference or economic philosophy.

Indeed, the propriety of Bernanke’s policy choices is not really up for debate at this point. Those choices are what economists would call a sunk cost; their costs have been incurred (or are unavoidable) and can’t be changed or escaped. No doubt Bernanke should have chosen differently and we would be better off if he had done so. But the question before the house is: What were the actual results of his choices? Goolsbee finds those results to be very good and purports to explain why. We can and should quarrel with his verdict and his explanations.

Bernanke and Stimulus

“…Looking back…it is clear that the Fed was right to try to help improve the [economy] and the critics were wrong [about inflation].” Goolsbee assigns Bernanke’s policies a grade of A for activism.  The implication is that it is better for a Federal Reserve Chairman to do something rather than nothing, even if activism requires a program of unprecedented scope and unknown impact.

“Think back to the days before the 2008 crisis or recession. If confronted with the scenario that would follow – five years of GDP growth of only around 2% a year, five years of unemployment rates around or above 7%, core inflation consistently below 2% – the near-universal response of economists would have been for the Fed to cut interest rates.” But would economists have reacted that way knowing that all of these effects accompanied a policy of zero interest rates? It’s one thing to say “we should have cut interest rates and all these bad things wouldn’t have happened,” but quite another to say “all these bad things happened anyway in spite of our interest rate cuts.” An objective observer would have to consider the possibility that the interest rate cuts were the wrong medicine. Of course, Goolsbee pretends that this is unthinkable; that the only possible action in the face of adversity is cutting rates. But if that were really true, his review of Bernanke’s reign is a mere formality; Bernanke’s decisions were right by definition, regardless of result. In reality, of course, the zero-interest-rate policy was not a foregone conclusion but rather a deliberate policy choice subject to serious question. And its results do not constitute a ringing endorsement.

To appreciate the truth of this, ponder the wildly varying conclusions reached by Keynesian economists who are not ideologically hostile to Bernanke and Goolsbee. Larry Summers considers macroeconomic policy under Obama a failure because the U.S. suffers from “secular stagnation.” He prescribes a cure of massive public spending to offset the structural collapse of private investment and the chronic excess of private saving. While Bernanke cannot take the blame for the failings of fiscal policy, Summers criticizes him for not doing more to provide liquidity and increase (!) inflation. Summers’ colleague Paul Krugman is even more emphatic. Bernanke should crank up the printing press to create bubbles because wasteful spending by government and the private sector is necessary to create employment. Without waste and profligacy, unemployment will persist or even rise. Alan Blinder has the temerity to point out what free-market economists noticed years ago – that the money created by Bernanke was mostly sitting idle in excess bank reserves because the Fed had chosen to pay interest on excess reserves. But Blinder is too gentlemanly to ask the obvious question: If the money creation is supposed to be “economic stimulus,” why has Bernanke prevented the money from actually stimulating?

These Keynesian economists are the farthest thing from free-market, laissez faire doctrinaires. But they are not about to give Ben Bernanke a passing grade merely for showing up, trying hard and looking very busy.

To be sure, Goolsbee does make a case that Bernanke actually succeeded in stimulating the U.S. economy. He names two of his colleagues, fellow attendees at the AEA convention, whose “research indicates that [Bernanke’s] Fed policies have helped the economy, albeit modestly… they lowered long-term Treasury rates by about 30 basis points and a bit more for mortgage spreads and corporate bond yields…Americans were able to refinance their homes at more affordable rates, and the drop led to an increase in consumer spending on automobiles and other durables.” Fifty years ago, John Maynard Keynes’ picture appeared on the cover of Time Magazine as an avatar of the end of the business cycle. Now, our Fed prosecutes a policy characterized by a leading English central banker as “the greatest government-bond bubble in history,” and economists have to do research in order to dig up “modest benefits” of the policy that would otherwise go unnoticed. And Goolsbee offers them up with a straight face as the blessings that justify our gratitude to Ben Bernanke.

Bernanke and Inflation

“The QE Era did not create inflation. Not even close. The people who said it would were looking only at the growth in the monetary base… the people arguing that QE means simply printing money (it doesn’t, really) didn’t recognize that the policy was simply offsetting the reverse printing of money resulting from the tight credit channels in the damaged financial system.” Milton Friedman devoted the bulk of his career to refuting the claims of this type of thinking; it would require a lengthy article to review his insights and a book-length analysis to review the economics issues raised by Goolsbee’s nonchalant assertions. But one sentiment popularized by Friedman suffices to convey the concern of “the people” Goolsbee dismisses so condescendingly: “Inflation is always and everywhere a monetary phenomenon.”

At no point in human history has a monetary expansion like Bernanke’s occurred without leading to hyperinflation. So Bernanke’s critics are not a gaggle of tinfoil-hat-sporting, tennis-shoe-wearing conspiracy theorists. They have history on their side. Goolsbee’s confident assurance that Bernanke leaves office with inflation under control is based on a planted axiom the size of an iceberg; namely, that Bernanke’s successor(s) can somehow corral the several trillion dollars of excess reserves still loitering around the financial system before they emerge into circulation in the form of expenditures chasing a limited stock of goods and services. But that’s only the beginning; the Fed must also conduct this money wrangling in such a way as to keep interest rates from rising too greatly and thereby dealing the economy a one-two death blow of overwhelming government debt service and private-sector constriction of economic activity. It is not immediately obvious how this might be accomplished.

Friedman made a case that the Great Depression began in the early 1930s with bank failures that had a multiple contractionary effect on the U.S. money supply. Like generals always fighting the last war, the Fed has since been grimly determined not to be hung for monetary tightness. As F.A. Hayek pointed out, a central bank that always errs on the side of loose money and inflation and never on the side of tight money or deflation will inevitably bias its policy toward inflation. That is the status quo today. Japan’s longtime low inflation is miscalled “deflation,” thereby providing a rhetorical justification for revving up the inflationary printing press. A similar boomlet is building here in the U.S.

Presumably, this explains Goolsbee’s reference to QE credit creation as an offset to credit destruction. But whether you accept Friedman’s analysis or not, Goolsbee’s rationale doesn’t hold water. The bank bailouts of 2008-09 – which were forced on sound banks and shaky ones alike by the Fed and the Treasury – were explicitly sold as a means of guaranteeing financial liquidity. QE did not come along until mid-2010. By then, banks had already repaid or were repaying TARP loans. Thus, Goolsbee cannot sell the QE Era as the solution to a problem that had already been solved. Instead, the evidence favors QE as the palliative for the financial problems of the U.S. Treasury and the spending addiction of the U.S. Congress – matters that Goolsbee delicately overlooks.

Bernanke and Greenspan: The Perils of Premature Congratulation

When Alan Greenspan left office, he had presided over nearly two decades of economic prosperity. The news media had dubbed him “The Maestro.” It is not hard to understand why he was showered with accolades upon retirement. Yet within a few short years his reputation was in tatters. Bernanke gave us an industrial-strength version of Greenspan’s loose-money policies. But the economy spent most of Bernanke’s tenure in the tank. And Bernanke leaves office having bequeathed us a monetary sword of Damocles whose swing leaves our hair blowing in its breeze.

With the example of Greenspan still fresh in mind, we can justifiably withhold judgment on Bernanke without being accused of rank political prejudice.

Bernanke as Savior

“…We should all be able to agree that fashion standards during a polar vortex shouldn’t be the same as in normal times.” Goolsbee is suggesting that Bernanke has adopted the stern measures called for by the hard times thrust upon him. This is indeed the leitmotiv for economic policy throughout the Obama Administration, not merely monetary policy – hey, just imagine how much worse things could have been, would have been, had we not done what we did. In order for this alibi to stand up, there must be general agreement about the nature and size of the problem(s) and the remedy(ies). Without that agreement, we cannot be sure that Bernanke hasn’t worsened the situation rather than helping it – by addressing non-existent problems and/or applying inappropriate solutions.

In this case, we have had only the word of Chairman Bernanke and Treasury Secretary Paulson (under President Bush) that economic collapse was threatened in 2008. Despite the wild talk of imminent “meltdown,” none occurred. Indeed, there is no theoretical event or sequence that would meet that description in economics. General economic activity worsened markedly – after the bailout measures were authorized by Congress. The emergency stimulus program did not affect this worsening, nor did it effect the official recovery in June 2009; stimulus funds were so slow to reach the economy that the recovery was well underway by the time they arrived.

The QE program itself has been advertised as “economic stimulus” but is notable for not living up to this billing. (To be sure, this is misleading advertising for the reasons cited above.) If anybody feels grateful to Bernanke for launching it, it is presumably officials of the Treasury and Congress – the former because QE prevented interest rates from rising to normal levels that would have swamped the federal budget in a debt-service tsunami, the latter because the precious spending programs beloved of both parties were spared. But Goolsbee comes nowhere within sight or shouting distance of these financial truths.

It makes sense to hail a savior only when you have reached safety. We haven’t even crossed the icy waters yet, because we’ve had the benefits – tenuous though they’ve been – of QE without having to bear the costs. In other words, the worst is yet to come. Bernanke has made all of us protagonists in an old joke. A man jumps out of a skyscraper. As he falls toward earth, the inhabitants of the building on each successive lower floor hear him mutter, “Well – so far, so good.”

The Politicization of Economics

Why make so much of Austan Goolsbee’s valedictory salute to Ben Bernanke? If the quality of Bernanke’s economic policy is a sunk cost at this point, doesn’t that also moot our assessment of his job performance? If Austan Goolsbee has badly misjudged that performance, that doesn’t say much for Goolsbee, but why should we care? After all, Goolsbee is no longer employed by the Obama Administration; he is now safely ensconced back in academia.

Goolsbee’s judgments matter because they are clearly motivated by politics. They are part of a disturbing pattern in which liberal economists provide a thin veneer of economics – or sometimes no economics at all – to cover their espousal of left-wing causes. Goolsbee pooh-poohs the claim that QE was both dangerous and unnecessary, claiming that the rise in the stock market is not a bubble because it has “tracked increases in corporate earnings.” But earlier in the same article, Goolsbee claimed that QE lowered long-term interest rates on Treasuries and corporate bonds (thus reducing costs of corporate finance) and increased spending on consumer durables. So QE induced increases in corporate earnings that wouldn’t otherwise have occurred, causing increased stock prices – but is absolved from charges of creating a stock bubble because the stock prices were caused by autonomous increases in corporate earnings? Goolsbee claims credit for QE on Bernanke’s behalf at one stage, and then disclaims QE’s influence on exactly the same point at another. This is the type of circular contradiction masquerading as economics that Goolsbee and other Keynesians use to sell their politics.

“Forgoing the Fed’s unconventional monetary policies – inviting real and quantifiable damage to the economy – just to prevent the possibility of a potentially dangerous bubble forming somewhere in the economy would have been cruel and unnecessary,” Goolsbee concludes. Foregoing the “modest benefits” that Goolsbee’s pals managed to dig up merely because the Fed had to create “the greatest government-bond bubble in history” in order to generate them would have been “cruel and unnecessary.” Oh, wait – what about the loss of interest income suffered by hundreds of millions of Americans – many of them retirees, the disabled and other fixed-income investors – thanks to the zero-interest-rate policy ushered in by the QE Era? Was this cruel and/or unnecessary? Goolsbee delicately avoids the subject.

But Goolsbee’s fellow Keynesian, Paul Krugman, is not so circumspect. Krugman comes right out and says that nobody has the right to expect a positive interest return on safe assets while the economy was in a depression; they can either make do with an infinitesimal interest return or lose the value of their money to inflation. (In the same blog post, Krugman had previously accused his critics of callous indifference to the pain caused by business liquidations in a depression.)

This is not economics. It is half-baked value judgment hiding behind the mask of social science. Similarly, Austan Goolsbee’s evaluation of Ben Bernanke’s term as Federal Reserve Chairman may have the imprimatur of economics, but it lacks any of the substance.

DRI-358 for week of 10-7-12: More Expensive Free Lunches

An Access Advertising EconBrief:

More Expensive Free Lunches

Last week’s EconBrief developed the economic concept of the free lunch. “There’s no such thing as a free lunch” may be the most famous of all economic aphorisms. Often credited to Milton Friedman, it owes much to that late Nobel laureate’s astonishing talent for exposition. Friedman pointed out that the notion of a free lunch violated the principle of opportunity cost, which undergirds the very subject of economics. Since resources have alternative uses, anything produced using scarce resources must be costly. The highest-valued alternative output foregone constitutes the opportunity cost of production.

Neglect of opportunity cost is the hallmark of the free lunch. Another distinctive feature is the underpricing of a scarce good or resource on the pretext of improving welfare. The pretext obscures the true purpose of the free lunch, which is to grow government. Expansion of government regulation, agencies, bureaus and programs is another characteristic of the free lunch. Finally, the presence of unintended collateral damage – often the result of overindulgence in the underpriced “free” good – is an unmistakable sign of a free lunch.

Water Subsidies in the West

The lure of the free lunch acts as a sort of political Venus flytrap, luring unwary citizens within range and then swallowing them up. Once gobbled up by the system, nobody emerges whole.

Farmers in the western United States were sucked in during the late 1800s and early 1900s. In this case, the lure was not a free lunch but a free drink – of water. The federal government built huge reservoirs and accompanying dams that it used to provide electric power. It made the water available to farmers in California’s Central Valley for purposes of irrigation.

As with free lunches to schoolchildren, the accompanying rhetoric was redolent with poetry and sentiment. The irrigation would “make the desert bloom.” And so it did. Eventually, farmers were able to grow crops on land that previously had little agricultural use. The federal government paid the farmers not to grow crops on the land, causing the farmers to set aside acreage and farm remaining land much more intensively, using larger amounts of water, fertilizer and pesticides. Then the federal government bought the crop surpluses produced by the farmers to artificially support crop prices, using taxpayer funds to pay for storage.

The tilled land was burned out by over-cultivation. Insects became resistant to the pesticides. Water shortages plagued the West. This was particularly irksome to farmers on better land located closer to the dams and reservoirs, who nevertheless needed some water for irrigation. Waterfowl and other wildlife species died by the tens of thousands as wetlands habitat dried up.

Acrimonious political divisions developed between Central Valley farmers and outsiders. Needless to say, farmers defended their subsidies, which had become a lucrative source of income.

Economists saw the problem as another free lunch gone wrong. The government undercharged Central Valley farmers for the water provided to them. Indeed, most communities throughout the U.S. and the world do not charge a true economic price for water. That is, a government-owned and operated water monopoly charges a flat rate for water usage instead of charging a price per unit of water consumed. This flat rate is economically equivalent to a price of zero.

Why? Because it is unchanged whenever the consumer increases or decreases water consumption. Thus, the marginal sacrifice (in terms of consumption foregone) for additional water consumed is zero. Under these circumstances, people are moved to treat water as a free good, to consume the maximum amount of it. Water is not a free good; additional quantities of it must be discovered, pumped, purified sufficiently for human consumption and made available to the consumer. Thus, by lying to the public, government encourages people to consume water well past the point where the personal value people place on additional water consumption is equal to the cost of producing and supplying that additional water. Because the cost exceeds the value, we are made poorer by the government policy.
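To see the arithmetic behind this claim, consider a minimal sketch in Python. The demand schedule, the $2-per-unit supply cost and the quantities below are hypothetical numbers chosen for illustration, not actual water-utility data. The point is simply that a flat bill sets the marginal price at zero, so consumption expands until another unit of water is worth almost nothing to the user even though it remains costly to supply.

```python
# Illustrative sketch with hypothetical numbers -- not data from the essay.
# Assume a consumer's marginal value of water falls as consumption rises:
#   marginal_value(q) = 10 - 0.01 * q   (dollars per unit)
# and that supplying each additional unit costs a constant $2.

MARGINAL_COST = 2.0           # assumed cost of pumping and purifying one more unit

def marginal_value(q):        # consumer's willingness to pay for the q-th unit
    return max(0.0, 10.0 - 0.01 * q)

def consumption_at_price(p):
    """Consumer keeps using water until its marginal value falls to the price charged."""
    q = 0
    while marginal_value(q) > p:
        q += 1
    return q

q_per_unit  = consumption_at_price(MARGINAL_COST)  # metered at true marginal cost
q_flat_rate = consumption_at_price(0.0)            # flat bill: marginal price is zero

# Every unit consumed beyond q_per_unit is worth less to the consumer
# than it costs to supply -- the waste the essay describes.
waste = sum(MARGINAL_COST - marginal_value(q)
            for q in range(q_per_unit, q_flat_rate))

print(f"Metered use: {q_per_unit} units, flat-rate use: {q_flat_rate} units")
print(f"Value destroyed by the 'free' water: ${waste:,.2f}")
```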

Once more, practically everybody loses from the government “free lunch” policy on Western water. The possible exceptions are Central Valley farmers and government employees in the Bureau of Reclamation and the Department of the Interior, who have made careers out of serving their constituents in the Central Valley. And, of course, proponents of big government can point to the irrigation projects and blooming desert and harvested crops. What can opponents point to? About all they can do is point to the environmental blight, the dead and dying wildlife. But farmers can deny that this has anything to do with them – they are just tillers of the soil, hardworking, God-fearing families who developed the desert in good faith.

Once again, the pattern is clear. A form of subsidy that benefits a few at the expense of the rest becomes entrenched and cannot be eradicated. No wonder, then, that economists wince when these programs start up. No wonder they take such an irritating, unyielding, hidebound stance against them. Once the programs are in place, dynamite cannot dislodge them.

Road Use

Americans are accustomed to climbing in their automobiles and taking off across the open road without let or hindrance. Public ownership of most roads has contributed to the fiction that road use is free, a fiction assiduously promoted by government policy. Sometimes a toll is charged for the use of a particular road, but traditionally the toll merely amortizes the debt incurred in construction. Expenses of road maintenance and repair are covered by revenues raised from the tax on gasoline purchases.

In fact, road use is not a free lunch. Roads are a capital good that must be built, maintained and replaced. The resources required for this are scarce and have alternative uses. Not only that, my use of the road at any point in time precludes use by somebody else, which violates the economic definition of a free good.

Opportunity costs of road use arise in both production and consumption. The resources necessary to build, maintain, repair and replace roads have alternative uses. This argues in favor of a price to place a value on use of the road. Consumers could then compare their personal valuation to the value of the alternative output foregone by making the road available for individual use. Driving would occur as long as the personal value exceeded the value of the output foregone in producing a usable road. But what usually happens is that the “free-lunch good” is underpriced, causing overuse. That is particularly true of road use at certain times and locations.

The overuse causes congestion. This congestion causes all sorts of collateral damage, including time wasted sitting in traffic queues, delays, road damage, accidents, gasoline wastage and air pollution. The cumulative effect is hardly trivial, since the time lost to traffic delays is estimated to have quintupled in the last 30 years.

As always, the free lunch has served the ends of big government. The Department of Transportation owes its existence to it, as do various sub-agencies and bureaus like the Federal Highway Administration and the National Highway Traffic Safety Administration. State highway departments dole out money for care, maintenance and policing of state highways.

The fundamental problem with this free lunch is ownership. Public ownership of the roads implies that nobody has an incentive to earn profits from them, maintain them to conserve their profit-earning capability or price their services to serve the needs of consumers. The conventional thinking (as opposed to “wisdom”) has always been that profits are bad – and prices are bad because they lead to profits. Experience has taught us the folly of this line of thinking. Profits point to goods and services that consumers want more of. Prices allow consumers to compare their valuation of each additional unit to the value of the resources used to produce it (i.e., the value of output foregone due to production). Prices and profits are the rational tools markets use to govern economic life.

The best way to reform the public road system is to privatize it. The second-best way is to use free-market techniques within a government context. Lease public roads to private firms for conversion to toll roads. Institute time-of-day pricing, or congestion pricing, to charge higher prices for road use during rush hours. This will divert some traffic away from rush hours to off-peak hours, which is a cheaper and more efficient solution than building more roads with peak usage points that last only 2-4 hours per day.
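A toy sketch of congestion pricing makes the incentive concrete. The peak hours, toll levels and dollar valuations below are assumptions invented for the example, not figures from any actual toll authority; the logic is simply that a driver stays in the rush hour only when the trip is worth more to him than the peak premium.

```python
# Hypothetical congestion-pricing sketch: tolls rise when demand for road
# space peaks, nudging drivers with flexible schedules into off-peak hours.

PEAK_HOURS = {7, 8, 9, 16, 17, 18}       # assumed morning and evening rush hours
PEAK_TOLL, OFF_PEAK_TOLL = 6.00, 1.50    # assumed toll levels in dollars

def toll(hour):
    """Return the toll charged for entering the road at a given hour (0-23)."""
    return PEAK_TOLL if hour in PEAK_HOURS else OFF_PEAK_TOLL

def chooses_peak(extra_value_of_peak_trip):
    """A driver stays in the rush hour only if the extra convenience of traveling
    at the peak is worth more than the toll difference."""
    return extra_value_of_peak_trip > (PEAK_TOLL - OFF_PEAK_TOLL)

# A commuter who values the peak-hour slot at $10 extra keeps driving at 8am;
# a shopper who values it at only $2 extra shifts to mid-day, freeing road space.
print(toll(8), chooses_peak(10.00))   # 6.0 True
print(toll(13), chooses_peak(2.00))   # 1.5 False
```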

These approaches are now being followed, albeit on a small scale. Various states have leased highways to private firms as toll roads. In the Manhattan borough of New York City, all fixed tolls have been converted to congestion tolls in an effort to divert some of the city’s fearsome rush-hour traffic to off-peak points.

Military Conscription – A Ghost of Free Lunches Past

The opportunity to discuss a failed government policy in the past tense is so rare that it should not be bypassed. Military conscription in the U.S. began with the Civil War and continued through two World Wars and various lesser conflicts. The practice became hotly controversial during the Vietnam War, when many young men of draft age left the country to avoid involuntary induction into the military. The unpopularity of the draft undoubtedly led to its discontinuance in 1973.

Today the all-volunteer military operates smoothly and accepts both men and women. Occasional calls for a return to conscription echo the arguments made for the practice during its heyday. Foremost among these is that conscription saves money by allowing government to force inductees to work for lower wages than they would accept voluntarily.

It is indubitably correct that conscription forces many recruits to work for less pay than they would otherwise demand in a voluntary setting. Whether this amounts to “saving money” is a semantic question. What it certainly does not do, though, is to reduce the economic cost of providing national defense. When it comes to acquiring enlisted personnel for the military, the cost of service is not simply the monetary payment received by those soldiers. It is the value of the output they could have produced in their highest-valued alternative occupation, or the value of their marginal product. In a competitive market, their wage will be bid up until it reaches this level.

If a conscript is forced to accept a draftee wage of (say) $20,000 per year instead of the $30,000 per year he could have earned in civilian life, this is not really a fiscal victory for the draft. The government has merely levied an implicit tax on his labor, with the incidence of the tax falling on the conscript and the rest of us. The conscript earns $10,000 less than he would have otherwise; we receive military services that are less valuable than the civilian goods or services he could have produced instead.
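The arithmetic of that example can be laid out explicitly. The short Python sketch below simply restates the essay's own numbers: the draft shrinks the defense budget's wage bill but leaves the true economic cost of the conscript's service unchanged, shifting $10,000 of it onto the conscript as an implicit tax.

```python
# The essay's example: a conscript paid $20,000 who could earn $30,000 as a civilian.
draftee_wage  = 20_000   # what the government actually pays
civilian_wage = 30_000   # value of the output he would have produced instead

budget_cost   = draftee_wage                    # what shows up in the defense budget
economic_cost = civilian_wage                   # the true opportunity cost to society
implicit_tax  = civilian_wage - draftee_wage    # burden shifted onto the conscript

print(f"Apparent budget 'saving' from the draft: ${economic_cost - budget_cost:,}")
print(f"Implicit tax on the conscript:           ${implicit_tax:,}")
print(f"True cost of his year of service:        ${economic_cost:,} -- unchanged by conscription")
```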

Conscription has traditionally been portrayed in purely emotional terms. Proponents cite submission to compulsory military service as a patriotic duty. Those unwilling to defend their country are unworthy to enjoy the rights and privileges of citizenship. Opponents perceive state compulsion of its citizens as immoral; why should people be forced to fight and die for a cause they reject?

Economists cut this philosophical Gordian knot. Patriots want to win wars. The best way to do that is to put the best and most willing fighters on the front line, the best strategists in the war room and the best suppliers in the factories. Conscription completely louses up efficient resource allocation by forcing the unwilling and training the less able to fight; by making sergeants out of factory superintendents and officers out of the politically connected. More technically, conscription makes unskilled labor artificially cheap, encouraging the military to use too much of it and not enough capital goods (skilled labor and sophisticated weaponry). Indeed, this argument applies just as forcefully in peacetime.

The problem with the arguments of moralists is that they are neither necessary nor sufficient to deal with the problems posed by war and the necessity of raising and keeping a military force. In this pinched worldview, there is no stopping point between conscription on the one hand and unilateral disarmament and full-blown pacifism on the other hand. But voluntarism respects the arguments of the moralists while still allowing for optimal prosecution of war and national defense.

That conclusion is not merely theoretical. The record of the U.S. military in armed combat worldwide since the changeover to a voluntary force has been nonpareil. Recruits are now better educated – almost all are high-school graduates – and score higher on aptitude tests at enlistment than under conscription. The military exhibits better morale, discipline and experience as a voluntary force. The logic developed above has been borne out in practice. If we were to revert to conscription, as is occasionally proposed, the resulting lower quality of recruits would raise the pecuniary costs of training to offset any monetary benefits accruing from lower wages.

In its heyday, conscription fit the free-lunch pattern like a glove. The opportunity cost of the conscript’s labor time – his or her civilian output – was ignored. The conscript’s work was underpriced, thereby distorting his or her use by government. Conscription contributed to the growth of big government in the form of the Veterans Administration, a massive bureaucracy devoted to processing, caring for and subsidizing citizen-soldiers rather than an army of professionals. And the collateral damage of the draft included not only the inefficiency of the armed services, but the contempt that it brought upon them and the lives it blighted.

Government Money Creation

Long before there were public schools to provide free lunches to, the foundations of the free-lunch concept were poured by the oligarchs of antiquity. They clipped coins, adulterated the metallic content of the money stock and generally debased the exchange value of the monetary media.

With the advent of paper money in the age of the printing press, monetary manipulation came into its own. Governments could pretend to create wealth by creating money – working the printing presses overtime creating currency for the public to spend. Prosperity is no farther away than the printer’s office. Happy days are here at last.

Unfortunately, money is not wealth. It merely allows the holder to acquire title to goods and services. Rapid money creation by government merely causes a mad scramble by holders of money to exchange it for the things they really want. Since the short interval between distribution of printed money and purchase does not allow for wholesale expansion of output, the result is more and more money holders waving currency and chasing a fixed supply of goods and services. The effect is to bid up prices.
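The mechanism can be illustrated with the textbook quantity-of-money identity, M x V = P x Y (money stock times velocity equals the price level times real output). The Python sketch below uses assumed round numbers rather than actual data: if velocity and output are roughly fixed over the short interval described above, a larger money stock shows up as a higher price level.

```python
# Toy illustration of the quantity identity M * V = P * Y, with assumed values.
V = 2.0        # velocity of circulation (assumed constant in the short run)
Y = 1_000.0    # real output, assumed fixed over the short interval in question

def price_level(money_stock):
    """Solve M * V = P * Y for the price level P."""
    return money_stock * V / Y

M_before, M_after = 500.0, 750.0    # government 'prints' 50% more money (assumed)
p0, p1 = price_level(M_before), price_level(M_after)

print(f"Price level before: {p0:.2f}, after: {p1:.2f}")
print(f"With output fixed, a 50% rise in money yields {100 * (p1 / p0 - 1):.0f}% higher prices")
```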

The term “inflation” has come to be associated with the effect of money creation on prices when it might better be applied to the cause of the process; namely, the inflating of the money supply. The distinction is crucial, but we are only belatedly realizing that. It is sometimes true that the inflating of the money supply causes only some prices to rise. Even when all prices rise, they virtually never rise in perfect synchrony. And the world is now experiencing the most unusual case of all – a vast increase in government money creation with comparatively little effect on prices of goods and services (as of yet) but considerable effect on interest rates and the pattern of investment.

Governments today perceive virtually no monetary cost in money creation, since it is now effected by computer keystrokes to bank reserve accounts. The opportunity cost is that money that would otherwise be efficiently used in exchange and investment is now used inefficiently. Not only is its general purchasing power diluted – an effect that holds for all or most goods, services and assets – but the distortion of relative values distorts specific markets such as housing, real estate, agriculture and many more. Once again, we see overuse of the free good whose value has been artificially cheapened by the free-lunch policy. As the supply of money rises, the urgency heightens to spend it before its value declines further.

And once again, we see the growth of government as beneficiary of the free lunch. The U.S. Federal Reserve was created in order to afford government control over the supply of money and credit. The Fed’s tentacles have spread until it now controls the banking and investment sectors, usurping not only private functions but even some functions of other government agencies.

Trying to sort out the direct from the collateral damage is somewhat arbitrary. For example, the entire financial crisis and ensuing Great Recession can be viewed as the collateral damage of the Fed’s money creation earlier in the decade, since the crisis would have been unthinkable in the absence of the monetary excess. But no matter how you allocate it, the overall damage has been enormous.

How Many Free Lunches Can We Afford?

As we have seen, the worst thing about economic free lunches is that they cost so much. That is the paradox of the free lunch – that it inherently promises what it logically cannot deliver. If this were all, perhaps we could write off the free lunch as a noble experiment. But the attempt to get something for nothing carries with it a big price tag. First there is inefficiency – neglect of opportunity cost means that resources are wasted and we become poorer. Then there is collateral damage – water shortages, water and air pollution, slaughter of wildlife, land devastation, road damage, highway gridlock, rush-hour tie-ups, inflation, malinvestment and unemployment add up to a gruesome butcher’s bill for just the four cases we discussed.

The Western world is currently undergoing a protracted financial crisis traceable to government overspending and debt. The crisis has its origins in the expensive free lunch.

DRI-391: The Man Who Won World War II

World War II was the transcendent historical event of the 20th century. It brought some 100 million men, representing most of the world’s nations, under arms. Between 50 and 70 million people died. In an event of this size and scope, who would be so foolish as to assign credit for victory to one particular individual?

Five-star General of the Army Dwight D. Eisenhower, that’s who. Reviewing the war in his memoirs, Eisenhower named one person as “the man who won the war for us.” That man was Andrew Jackson Higgins.

Today few remember Higgins. Reviewing his story does more than restore a forgotten hero to his rightful place. Higgins’ story teaches one of the most important lessons in economics.

Andrew Higgins and the Development of the Higgins Boat

Andrew Higgins was born in Nebraska in 1889. After dropping out of high school, Higgins entered the business world. In the 1920s, he started a lumber import/export business and operated it with his own large fleet of sailing ships. To service them, he built a shipyard. A few years along, Higgins designed a shallow-draft vessel with a spoonbill-shaped bow for coastal and river use, which he named Eureka. This boat was able to run up onto and off riverbanks to unload people and cargo. Higgins proved to be a genius at boat design and eventually shipbuilding replaced lumber trade as the primary business of Higgins Industries.

The Eureka attracted the interest of the U.S. Marine Corps for use as a landing craft. It beat a boat designed by the Navy’s Bureau of Construction and Repair in tests in 1938 and 1939. The Eureka’s only drawback was that men and cargo had to be offloaded over the side, risking exposure to enemy fire.

Since 1937, the Japanese navy had utilized ramps on its landing craft. Higgins directed his engineer to emulate this principle. The result was the unique landing craft whose technical name was “Landing Craft, Vehicle, Personnel,” (LCVP) but which became better known as the “Higgins Boat.”

The Higgins Boat was one of the most revolutionary advances in the history of military technology. Heretofore, maritime invasion forces had to disembark at port cities – a substantial disadvantage since the opponent could predict the likely arrival location(s) and accordingly prepare bristling defenses. A large army had to travel in large troop ships which, of necessity, were deep-draft vessels. Troop ships would run aground and founder if they tried to dock at most coastal locations. Only at ports with natural deep harbors could they safely dock and unload.

The Higgins Boat allowed men and equipment to be transferred from troop ships to smaller, shallow-draft vessels that could run right onto an ordinary beach, drop their ramps, unload men and equipment and then withdraw to repeat the process. This meant that opponents now had to worry about defending the majority of a coastline, not just one or a few ports.

Amazing as it seems today, the Higgins Boat did not win immediate acceptance upon rolling off the assembly line. Its fight for survival paralleled that of the Allies in the early stage of the war itself.

The Entrepreneur vs. the Bureaucracy

In the early part of World War II, authority for designing and building the Navy’s landing craft was entrusted to the Bureau of Ships. Higgins was viewed coolly by the Bureau. What could a Nebraskan teach the Navy about shipbuilding? Besides, the department had designed and built its own model. Somehow, the Navy could always find an excuse for dismissing Higgins’ designs even when they defeated the Navy’s boats in head-to-head competition.

There was one other minor obstacle standing between Higgins and the bureaucrats at the Bureau of Ships. He hated their guts, viewed their work with contempt and said so openly. “If the ‘red tape’ and the outmoded and outlandish Civil War methods of doing business were eliminated, the work could be done in the Bureau of Ships very efficiently with about one-sixth the present personnel,” Higgins observed. Fortunately for Higgins, he managed to sell his ideas to other friendly powers, thus keeping himself in business even though lacking honor in his own land. “If the power structure in place during the early months of the war had stayed in place,” said historians Burton and Anita Folsom in their book FDR Goes to War, “Higgins would have been out of work and Americans, according to Eisenhower, would have either lost the war or had victory long delayed.”

Two unlikely saviors came to Higgins’ aid. The first of them was none other than Franklin Delano Roosevelt. The President had waged a bitter fight with Republicans in favor of his New Deal economic policies for more than two terms in office, only to watch those policies fail to restore the U.S. economy to its pre-Depression levels of income and employment. His Treasury Secretary, Henry Morgenthau, Jr., confessed to his diary that administration policymakers had tried everything they could think of to revive the economy – and failed. Despite Roosevelt’s unwavering faith in the New Deal – and in himself – he sensed that he would need the support of the business community in order to win the war.

Thus, FDR changed tactics abruptly, going from excoriating businessmen as “economic royalists” to adjuring them to ramp up production of war materiel in the national interest. Suddenly, it became politically correct to view business as part of the team, pulling together to win the war. Roosevelt announced this change in philosophy even before Pearl Harbor, in a fireside chat of May 26, 1940. He was fond of describing the change as a switch from “old Dr. New Deal” to “Dr. Win the War” – a characterization that reveals as much about Roosevelt’s view of himself as Physician-In-Chief for the country as it does about his strategy.

Part of Roosevelt’s new emphasis involved the creation of the Truman Committee, headed by then-Senator Harry Truman of Missouri, to investigate government waste and mismanagement. Truman’s efforts in his home state had made him very popular, so when the Marines went to bat with his committee on Higgins’ behalf, the combination was too much for the Bureau of Ships to resist. Truman told the Bureau to produce a landing craft and test it in competition with a Higgins Boat. The test took place on May 25, 1942.

Each boat carried a thirty-ton tank through rough seas. The Navy’s craft barely avoided the loss of cargo and all hands. The Higgins Boat delivered the tank to its destination. The Committee declared Higgins’ design the winner.

Truman was scathing in his verdict on the conduct of the Bureau of Ships. “The Bureau of Ships has, for reasons known only to itself, stubbornly persisted for over five years in clinging to an unseaworthy …design of its own… Higgins Industries did actually design and build a superior [design],” only to run up against the Bureau’s “flagrant disregard for the facts, if not for the safety and success of American troops.”

The Entrepreneur vs. the Rules

Higgins’ trials and tribulations did not cease when he won government contracts to produce his landing craft (and other boats, including PT boats). He succeeded in scrounging up the capital necessary to expand his boatbuilding plant in New Orleans into a facility capable of supplying the armies of the Free World. But in 1942, fully automated manufacturing plants did not exist – Higgins next faced the problem of attracting the labor necessary to man the plant. Even in peacetime, that problem would have been daunting. In the wartime environment of wage and price controls, the chief legal inducement for attracting labor, a wage increase, was limited.

Higgins’ attitude to this and other problems can be appreciated from his own summation of his personal philosophy: “I don’t wait for opportunity to knock. I send out a welcoming committee to drag the old harlot in.” Higgins raised wages to the allowable maximum. Then he helped to set a precedent that persists to the present day by offering free medical care to his employees. Since this did not qualify as a wage, it was exempt from the controls and from the confiscatory wartime income-tax rates as well.

One plentiful source of labor was black workers. But this was New Orleans; segregation reared its ugly head. Higgins gritted his teeth and complied, while providing equal wages and benefits to all workers.

Shortages of metals and minerals were a throbbing headache. Higgins treated it by buying steel on the black market and stealing some items (such as bronze) that he couldn’t buy. (He later paid for stolen materials.)

Victory in Europe

Andrew Higgins went from employing 50 people in a plant worth $14,000 to hiring 20,000 employees to work seven huge plants. Over 10,000 Higgins Boats were produced, comprising most U.S. landing craft. His plants also built PT and antisubmarine boats.

Prior to landings in Sicily and North Africa in early 1943, Eisenhower moaned that “when I die, my coffin should be in the shape of a landing craft,” since they were killing him with worry. By D-Day, Higgins Boats had forced Hitler to stretch his defenses thin along the French coast. Although expecting the main blow at the Pas-de-Calais, Hitler set up a string of defenses in Normandy as well. The Germans had concentrated nearly enough firepower to defeat the American landing at Omaha Beach, but “nearly” wasn’t enough to thwart the eventual beachhead. Meanwhile, the other four landings went relatively smoothly; the Higgins Boats had made it impossible for Hitler to keep all his bases covered. As Rommel and other high-level strategists recognized, once the landings succeeded, the war’s outcome was a foregone conclusion. Even Hitler couldn’t conceal his admiration for Higgins, calling the boat builder the “new Noah.”

On Thanksgiving, 1944, Eisenhower gave thanks for the Higgins Boats. “Let us thank God,” he intoned, “for Higgins Industries, management, and labor which has given us the landing boats with which to conduct our campaign.” And after the war, in his memoirs, Eisenhower laid it on the line: “Andrew Higgins is the man who won the war for us… If Higgins had not designed and built those LCVPs, we never could have landed over an open beach. The whole strategy of the war would have been different.”

The Thanks of a Grateful Nation

Along with many of the other entrepreneurs whose Herculean efforts supplied the American, British, Chinese and Russian armies, Andrew Higgins was rewarded after the war with an IRS investigation into the “excess profits” earned by his firms during the war. Since his death in 1952, his name has been placed on a Navy ship and an expressway in Ohio. Recently, a memorial (including a statue) has been raised to him in his hometown of Columbus, NE.

At his death, Higgins held some 30 patents.

The Economic Lessons of Andrew Higgins and American Entrepreneurship in World War II: The Value of Profit Maximization

The story of Andrew J. Higgins is perhaps the most dramatic of many similar stories of American entrepreneurship in World War II. Jack Simplot developed a process for dehydrated potatoes that enabled him to feed American soldiers around the globe. After the war, he turned his expertise to frozen foods and ended by supplying frozen French fries to the McDonald’s fast-food restaurant chain. Henry Kaiser was the preeminent wartime shipbuilder. He cut average construction time per ship by a factor exceeding ten (!). Like Higgins, he sometimes resorted to buying steel on the black market. Before the war, he built roads. After the war, he switched to steel and aluminum.

Men like Higgins, Simplot and Kaiser were entrepreneurs of demonstrable, and demonstrated, skill. Today, we relate their exploits with something resembling awe, yet it should have been no surprise that they succeeded. Their success often came on the heels of government’s failure at the tasks they undertook; this should likewise come as no surprise. The fact that government actively resisted their best efforts should dismay us, but not surprise us. After all, we have seen the same lessons repeated since the war.

Consider the test of the Higgins Boat, in which the Navy landing craft designed by the Bureau of Ships faced off against the Higgins Boat. Had the Higgins Boat lost the contract, the Allies would have lost the war or been seriously threatened with losing it. (So said Dwight Eisenhower, the man who led the Allied war effort.) The tacit premise behind government regulation of business is that – of course – government regulators will always act in the “public interest” while private businessmen act from greedy self-interest which must run athwart the general welfare. Yet in this case, government bureaucrats spent years acting in a manner directly contradictory to the public interest, albeit apparently in their own interest. (So said Harry Truman, certainly no advocate of laissez-faire capitalism.)

Should we be surprised that it was the profit-maximizer who won the war and the government bureaucrats who pursued their own interest at the risk of losing it? Certainly not. Higgins had spent his life in private business, where he could gain success and happiness only by building boats that worked. The naval bureaucrats did not have to build boats that worked in order to “succeed” in their domain; i.e., remain bureaucrats and keep their staffs and budgets intact. We know this because they succeeded in thwarting Higgins’ design for five years in spite of Higgins’ triumphs in testing against the Navy. Indeed, granting a contract to the boat that worked would have threatened their success, since their own design and model would have been replaced.

The only surprising thing about the episode is how close America came to losing the war. Had FDR not done two things that were utterly unexpected – namely, abandon his allegiance to the New Deal and set up the Truman Committee to overrule wasteful measures undertaken by bureaucrats – we might have done just that. In that sense, it might with some justice be said that it was really FDR who won the war. And, in fact, Roosevelt made several additional decisions that were crucial in determining the war’s outcome. Naming George Marshall as Chief of Staff is one such decision. Marshall chose men like Eisenhower, Omar Bradley and George Patton for key commands, in each case jumping them over many other men with seniority and more impressive resumes.

The problem with calling FDR the guarantor of victory is that each of his good decisions only offset other decisions that were dreadfully bad. FDR wouldn’t have had to abandon the New Deal had he not adopted such a disastrous mélange of counterproductive and unworkable policies in the first place. The appointment of Marshall merely undid the damage done by the appointment of mediocre yes-men like Harold Stark and Henry Stimson to high administrative and cabinet positions in the military bureaucracy.

The Second Lesson: Abandonment of the New Deal

FDR’s abandonment of the New Deal illustrates the second lesson to be drawn from the example of Higgins and his fellow entrepreneurs. Conventional thinking posits wartime as the time of preoccupation with “guns,” whereas in peacetime we can safely concentrate on “butter.” Butter production, so tradition tells us, is effected using markets and a price system, but guns are a different proposition entirely. Combating militarism demands that we use the methods of the militarists by turning the country into an armed camp, as did Germany and Japan.

However difficult it may have been to see through this fallacy at the time, it is obvious in retrospect. America won the war with production, by supplying not only its own needs but those of Great Britain, Russia and China as well. Those needs included not only military goods narrowly defined but also foodstuffs, clothing, medicines and all manner of civilian goods. It is highly significant that both Roosevelt and Churchill concurred in urging major motion picture studios to maintain the volume and quality of their products rather than sacrificing resources to the war effort. They realized that movies were part of the war effort.

Put in this light, Roosevelt’s decision to substitute “Dr. Win-the-War” for “Dr. New Deal” takes on vastly more meaning. Without even consciously realizing it, he was admitting the failure of his own policies in peacetime as well as their unsuitability for war. And war’s end brought this lesson home with stunning force. During the war, Roosevelt had abandoned New Deal staples like the WPA and the CCC. After the war, President Truman was able to retain various “permanent” features of the New Deal, like Social Security, banking and stock-market regulation and pro-union regulations. But pervasive control of the price system faded away as wartime price controls gradually lapsed.

FDR had predicted that it would take total employment of 60 million to absorb the ramped-up levels of total production reached during the war. (Before the war, total employment had been only 47 million.) Keynesian economists predicted a postwar return to Depression, with unemployment ranging from 10 to 20 percent, unless massive federal-government spending was undertaken. Instead – appalled at the unprecedented level of federal-government debt as a percentage of gross national product – the Republican Congress elected in 1946 cut spending and taxes. The result: civilian employment rose from 39 million to 55 million, and total employment (including government workers) reached Roosevelt’s goal of 60 million without the New Deal-type spending he had envisaged. Unemployment was 3.6%. Annual growth in gross national product reached an all-time high of 30%.

Wartime entrepreneurship battered New Deal economic policy to its knees. 1946 delivered the coup de grace.

The Third Lesson: The Primacy of the Price System

The third and final lesson to be learned concerns the impatience of the entrepreneurs with bureaucracy, rules and laws. In particular, their resort to the black market was exactly what patriotic citizens were being implored not to do during the war. Should we be surprised that entrepreneurs won the war by ignoring the anti-market directives of the bureaucrats?

Hardly. Everybody seemed to take for granted that normal commercial attitudes and impulses should be suppressed during wartime, that the success of any collective goal requires the suppression of all individual wants. But upon reflection, this simply cannot be true.

As usual, the most cogent analysis of the problem was provided by F. A. Hayek in a short essay called “Pricing Versus Rationing.” He pointed out that in wartime politicians’ standard recourse is to controls on prices and rationing of “essential” war materials by queue. Any professional economist would, he declared, recognize the fatuity and futility of that modus operandi. “It deprives industry of all basis of rational calculation. It throws the burden of securing economy on a bureaucracy which is neither equipped nor adequate in number for the task. Even worse, such a system would deprive those in control of even the whole economic machine of essential guides for their plans and reduce major decisions of policy and even strategy to little more than guesswork.” This will “inevitably cause inefficiency and waste of resources.”

In other words, the best policy for allocating resources in wartime is the same as the best policy for allocating resources in peacetime; namely, use the price system to determine relative values. Where so many people go wrong is by blithely assuming that because so many military goods are now required, command and control must be used to bring them into existence by forbidding the production of other things. But among the many problems caused by this approach is that the production of any good for a military use means the foreclosure of resource use for production of some other military good. Without a price system to determine relative values, the war itself cannot be run on any kind of rational or efficient basis. This is another reason why the Allies were lucky to win – the Axis were wedded to a Fascist, command-and-control economic system that forswore free markets even more than the Allies did.
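To make the point about relative values concrete, consider a deliberately simplified sketch. Every number below is hypothetical – it describes no actual wartime program – but it shows how a fixed resource devoted to one military good forecloses production of another, and how prices (standing in for relative values) make that opportunity cost visible.

```python
# Hypothetical illustration of opportunity cost under a fixed resource constraint.
# All figures are invented for the example; none describes an actual WWII program.

steel_available_tons = 1_000

# Assumed steel requirements per unit of output.
steel_per_landing_craft = 20   # tons
steel_per_tank = 25            # tons

# Assumed valuations standing in for the relative values a price system would reveal.
value_per_landing_craft = 30_000   # dollars
value_per_tank = 35_000            # dollars

# Value produced if the entire steel stock goes to one use or the other.
value_all_craft = (steel_available_tons / steel_per_landing_craft) * value_per_landing_craft
value_all_tanks = (steel_available_tons / steel_per_tank) * value_per_tank

print(f"All landing craft: ${value_all_craft:,.0f}")   # $1,500,000
print(f"All tanks:         ${value_all_tanks:,.0f}")   # $1,400,000

# The opportunity cost of an all-tank program is the landing-craft value forgone.
print(f"Opportunity cost of choosing tanks: ${value_all_craft - value_all_tanks:,.0f}")  # $100,000
```

A planner rationing steel by queue, with no prices to consult, has no way to make even this crude comparison.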

Black markets are the outgrowth of prohibition and/or price controls. They arise because open, legal markets are forbidden. Whether it is bootleg liquor during Prohibition, illicit drugs in contemporary America or under-the-table trading of ration coupons during wartime, a black market is a sign of a free market trying to escape confinement. Higgins, Kaiser, et al were illustrating the logic of Hayek: rationing and price controls are just as bad in wartime as in peacetime. They were violating statutory law, but they were obeying the higher wartime law of salus populi suprema lex.

The Man Who Won World War II

The man who won World War II was not a soldier. He was a businessman. He won it by applying the great economic principles of free markets. This transcendent truth was acknowledged by World War II’s greatest soldier. The power and meaning of this should persuade even those unimpressed by the logic of economic theory itself.

DRI-460: Can Economics Help Man Peacefully Coexist With Dogs and (Big) Cats?

An old journalistic maxim states that it is only news when man bites dog, not vice-versa. These days, the demand for raw material that can be fashioned into a retail product called “news” far outstrips the supply. Thus, animal bites become “attacks” and injuries escalate into “maulings.” Two incidents in 2012 have attracted considerable attention and comment. They are a springboard for a look at the relationship between man and animals, using economics as our analytical tool.

Dog Bites Denver Anchorwoman; Cheetahs Bite Scottish Wife and Daughter

The first incident occurred a few months ago in Denver, CO. A local fireman rescued a dog from icy waters. The rescue made NBC affiliate KUSA-TV’s local news. Anchorwoman Kyle Dyer decided to invite the fireman and dog onto her midday interview show for a “feel-good” interview segment.

During the course of the program, Dyer approached the dog and knelt next to it – apparently to pet or embrace it. The dog bit her in the face. After cosmetic surgery and a significant recuperative period, the newswoman is back on the job.

The second episode took place last week in South Africa. A Scottish family on vacation visited a game preserve that included a petting zoo. Among the animals there were two cheetahs. After being assured that the cheetahs were “tame,” the wife and her 8-year-old daughter approached and began playing with the cheetahs.

One of the cheetahs grabbed the girl by the leg. The woman, who had been lying with or on top of the other cheetah, was bitten and scratched. The husband, who was photographing the visit to the petting zoo, caught all this on camera.

The Implicit Theory of the Victims

There is an implicit or unconscious theory behind the actions of the Denver anchorwoman and the Scottish couple. Ms. Dyer’s avowedly “feel-good” segment would utilize time-worn techniques to elicit sympathy for the dog and the fireman by manipulating the audience’s emotions.

Those techniques required the physical presence of dog and rescuer so as to figuratively reenact the rescue. Questions directed to the rescuer would be one means of effecting this result. The anchorwoman would coax “spontaneous” reactions from the dog in order to milk emotion from the audience. The entire segment would take on the character of a theatrical production, with each participant playing a role intended to get specific reactions from the audience.

Unfortunately, the dog was not up on his part and did not follow his cues. The reaction of the station’s operations manager – that the dog exhibited “behavior nobody predicted or understood” – is not unlike the excuse an employer might offer for having hired an employee who later committed a crime: “He had no previous criminal history. How could we know that he would go bad?”

The reaction of the Denver authorities suggested that blame for the biting should be assigned to the dog’s owner – even though the dog was held on short leash at the time. Law enforcement announced that the owner would be prosecuted under local law for “not having control of the dog at all times” and “allowing it to bite” (!).

The notion that the anchorwoman herself might have induced the bite was never even contemplated. The reasons for this lapse are clear. Her intentions were perfectly benign, even noble. How could she possibly be at fault, then? And it is, indeed, perfectly obvious that she never considered the possibility that she might be bitten – an outcome even more professionally hazardous than it was personally injurious. (How many successful news readers with scarred faces come readily to mind?)

Yet the comments of professional dog behaviorists left no doubt about where they placed blame for the biting. One insisted that “the dog was trying to tell her [Dyer] ‘I am going to bite you.'” Another summed up the incident by saying that “she [Dyer] did everything wrong.”

Man-made Law vs. Dog Rules

The implicit theory followed by the station, the anchorwoman and the DA was that “good” dogs are those that obey man-made statutes, while “bad” dogs do not. Moreover, dogs apparently may “go bad” suddenly and unpredictably, in the manner of people who “go postal” and commit random acts of violence.

This implicit theory runs completely counter to what we know about dogs. Dogs act according to dog rules, which have nothing to do with man-made statutes. Contrary to the impression created by countless movies and television shows, dogs are not moral creatures. While this is not quite the same thing as saying that they are unaffected by what we call emotions, it does mean that we should quit judging dogs by human standards of behavior.

The anchorwoman approached unhesitatingly, without waiting for a sign of approval from the dog. While this is common practice in human society and convenient for theatrical purposes, it is a clear violation of dog rules. Dogs are pack animals. They automatically treat non-pack members with caution and suspicion, an instinct inherited from their wolf ancestors. In the wild, failure to be on guard against strangers can cost your life.

The anchorwoman bent over the dog – preparatory to delivering a hug or kiss, perhaps. This is a friendly gesture among humans. But towering directly over a dog is perceived by the dog as a gesture of dominance. It may be tolerated – if the dog is submissive. But a dominant dog may aggressively reject such an overture, by growling or biting. Since dogs do not wear signs saying “I am dominant” or “I am submissive,” humans should avoid this behavior with unfamiliar dogs. Thousands of children suffer facial bites each year when bending down and attempting to kiss dogs.

Another implicit theory popular in human circles treats a dog’s bite as a hostile gesture, comparable to a blow struck in anger by one human against another. Lacking the multiple modes of expression available to us, dogs economize by assigning multiple meanings to their limited vocabulary. Biting is used to warn, as in this case. Dogs also bite each other when playing. Biting is an instinctive, reflex response used in ways analogous to a human shout or cry.

Although dogs certainly bite to inflict injury or death, their teeth are poorly adapted for this purpose, being rather short and blunt – one reason dogs are omnivores rather than strict carnivores. It is difficult for a dog to inflict severe damage with a single bite without precise delivery to a strategic location, such as the face or groin; dogs, like wolves, do their greatest damage by working in packs and tearing flesh with their carnassial teeth. Cats, in contrast, are pure carnivores whose long, sharp teeth make them ideal killers. (That is why the danger of infection is so much greater for cat bites than for dog bites.)

Reports of the incident sometimes referred to it as an “attack.” This is semantically dubious, since an attack is, by definition, an offensive action and our explication of dog rules marks this particular bite as a defensive reaction.

We can sum up the attitude implicit in the conventional thinking of the public and the law about dogs: “Dogs are smaller, four-legged versions of human beings, with limited powers of reasoning but a mental structure and moral outlook nonetheless comparable to ours. When they act in ways we find inappropriate, we may feel free to judge them by human standards; e.g., by punishing them for their sins.”

The Implicit Theory of Cheetah Tameness

The Scottish couple who visited the South African game preserve was also applying an implicit theory of human-animal interaction. We will call it the theory of “cheetah tameness.”

The idea of cheetah tameness may strike many readers as inherently absurd. Aren’t cheetahs carnivorous wild animals, like lions and tigers? Isn’t a “tame cheetah” an oxymoron, like a chaste prostitute?

Actually, mankind has been taking cheetahs into captivity and taming them for centuries, at least since the days of ancient Egypt. Sometimes the objective has been perpetuation of the species, as when cheetahs are bred in zoos. Sometimes the cats have been privately owned and raised as pets. The upshot is that the appellation “tame” has often been applied to cheetahs, unlike other big cats. The question is: What practical implications and limits apply to this status?

Accounts of the bites suffered by the woman and her daughter suggest that the couple interpreted the concept of tameness in roughly the following way: “Because the two cheetahs were born in captivity and raised as pets by human beings, the cheetahs essentially considered themselves human and accordingly behaved in ways consonant with that belief – subject to the mental and physical limitations imposed by their actual physical status as cheetahs, of course. The family could behave toward the cheetahs as if they were human, by approaching them, playing with them, even lounging with and lying upon them – all without fear of suffering physically at the cheetahs’ hands… er, paws.”

Put so baldly, this sounds naïve, even idiotic. Yet nothing less could explain the couple’s behavior. Why else would they allow their 8-year-old daughter to play with the animals? Why would the husband complacently photograph their interactions with the cats – and continue to do so even as the action became bloody and dangerous? Why would the wife – who described the cheetah’s thought processes as being “like children” – feel free to lie down upon one of the cheetahs?

Cheetahs are large felines. Cats are playful animals who bat each other with their paws. Cats sometimes bite owners who act in non-approved (by the cats) ways; e.g., by scratching the cats’ stomachs or teasing them by waving their hands or other objects within reach. In other words, the cheetahs were behaving exactly as cats behave – merely on a larger and more dangerous scale.

Here again, accounts of the cheetah episode refer to it as an “attack.” Even more than in the Denver case, this word is an obvious misnomer. A cheetah is the fastest land animal in the world, built by nature to run down and kill wild game and defend it in a world of competing predators. If the cheetahs had really attacked the Scottish family, those people would be dead. An attacking cheetah would not have grabbed an 8-year-old girl – small enough to qualify as “prey” in the cheetah’s perception – by the leg. The cheetah would have bitten her in the throat and made short work of her. The “mauled” wife would not have been in any shape to give interviews to the press afterwards; the husband would not have been able to complain that “the authorities should have made sure that the petting zoo was safe before allowing tourists” to visit it. Husband and wife would both have been in the morgue.

The Implicit Theory of Justice

The issue of intent is worth raising because the implicit theory of justice applied to animals in such cases is so heavily dependent on it. For example, dogs that bite are often killed because they are “proven biters” or because “we cannot risk it happening again” – arguments devoid of justification because all dogs are biters and there is no logical or empirical link between one bite and a subsequent one.

Human beings are moral creatures. Punishment for misbehavior achieves at least three worthwhile purposes – protection from actual misbehavers, deterrence from misbehavior by potential misbehavers and revenge against misbehavers to ease pain felt by victims and relatives. Cognizance of our mistakes is the logical prerequisite for this. Indeed, that is the basis for the most common legal definition of insanity – the inability to distinguish right from wrong.

None of these arguments applies to animals, which have no moral sense, cannot be deterred from reacting instinctively and are unaware that their statutory punishment constitutes revenge for misbehavior. The implicit theory of justice behind punishing animals for injuring humans rests on a false analogy between animals and humans. We use words like “euthanasia” to soften the impact of the fact that we kill animals in reaction to our own mistakes; pejoratives like “attack” and “maul” frame the animals for crimes in order to justify capital punishment.

The issue is further clouded by irrelevant considerations like “animal rights” and allegations of “sentiment for animals.” The assignment of rights to creatures incapable of comprehending or exercising them raises too many logical obstacles to warrant serious consideration. Insistence on logic in the evaluation of human/animal interactions does not elevate sentiment – it downplays it. The sentimentalists are those who judge animals by our standards.

Having said all that, it is not difficult to see where the romantic, sentimental tendency to humanize animals came from.

Modern Neuroscience and the Economic Theory of Knowledge

Over half a century ago, a future Nobel laureate in economics, F. A. Hayek, published a revolutionary study in theoretical psychology called The Sensory Order. Among the book’s numerous insights was one that has subsequently been picked up by modern neuroscience – that we gradually create and improve our fragmentary understanding of a vast, complex reality by developing theories about the world to substitute for the unavoidable gaps in our specific knowledge.

Hayek integrated his theory of human psychology with his economic theory. The latter stressed the importance of markets as disseminators of knowledge that is dispersed and decentralized in the brains of billions of people the world over. Throughout the 20th century, economists made extensive use of the concept of equilibrium, a state of affairs in which nobody can gain from changing their current pattern of behavior. The most frequent example of equilibrium is the clearing of markets – that is, the state in which prices equate ex ante quantities offered for sale with desired purchases.

Hayek pointed out that economists overlooked – or deliberately refrained from mentioning – that equilibrium presupposed full knowledge about production and consumption opportunities. Only the operation of free markets could in fact generate that knowledge and make it widely available. And since human perception is fragmentary and subjective, markets perform the irreplaceable function of gradually bringing the subjective perceptions of billions of people into line with the true, objective facts of reality. They do this by bringing to light information that would otherwise not exist or be transmitted to those most in need of it.
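Market clearing can be stated concretely. The toy model below uses purely hypothetical linear demand and supply schedules; it merely solves for the price at which desired purchases equal the quantities offered for sale – the equilibrium that, on Hayek’s account, presupposes knowledge no central authority actually possesses in advance.

```python
# Toy illustration of market clearing with hypothetical linear schedules.
# Demand falls as price rises; supply rises with price. All coefficients are assumed.

def quantity_demanded(price):
    return 100 - 2 * price   # assumed demand schedule

def quantity_supplied(price):
    return 10 + 4 * price    # assumed supply schedule

# Equilibrium: solve 100 - 2p = 10 + 4p for p.
equilibrium_price = (100 - 10) / (2 + 4)

print(equilibrium_price)                     # 15.0
print(quantity_demanded(equilibrium_price))  # 70.0 units desired by buyers
print(quantity_supplied(equilibrium_price))  # 70.0 units offered by sellers
```

In the real economy nobody knows these schedules in advance; the market process itself is what discovers and disseminates the information they summarize.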

The Evolution of Human Understanding of Animals

The interaction of human beings and animals goes back as far as our knowledge of the human species. Most animals were a source of labor, food or observation. Domesticated dogs and cats threw in their lot with man thousands of years ago – well, actually, it is unclear who joined up with whom. Improbable as the implicit theories described above may seem, their underlying basis has been accepted throughout mankind’s history, until quite recently. (Competing theories have tended to be even less plausible, such as those elevating animals to the status of deities, as was done in ancient Egypt and in India.) After all, we really had little choice except to compare animals to what we knew.

When did the traditional view of animals change? What caused the change? Hayek pointed out that, far from requiring rationality on the part of market participants, markets tend to promote rationality by rewarding it. Here, the rewards began with the Industrial Revolution and accelerated dramatically with the surge in economic growth that accompanied the 20th century.

For most of human history, domestic animals served two primary purposes – companionship and labor. Their value in those uses was stable but low. As a source of companionship, domestic animals rewarded knowledge that improved their health and longevity, but those advances had to await comparable innovations in human medicine and nutrition. Meanwhile, the companion value of dogs and cats provided a sustaining incentive to maintain the population.

Dogs and cats provided a source of labor on farms for centuries, during which time human life was predominantly rural. There was little scope for improvements in agricultural productivity generally or the factor productivity of domestic animals, so while the romantic view of animals had obvious shortcomings there was little motivation or ability to improve on it.

When the worldwide engine of economic growth shifted into overdrive, this opened up opportunities for domestic animals. As humans became healthier and started living longer, the companion value of domestic animals increased. Improvements in human medicine gradually percolated down to veterinary medicine, and the increased companion value made it economically feasible to employ them.

Wealthier societies can afford to devote more time, energy and resources to aesthetic concerns like the humane treatment of animals, which has produced more, better and healthier domestic animals. More and more uses for the animals have been developed. Seeing-eye dogs were followed by bomb-sniffing dogs and medical-therapy dogs. That last category now includes dogs specialized to detect their owners’ seizures and low-blood-sugar episodes and act upon the discovery.

Higher productivity meant that domestic animals became better investments, which meant that learning what made them tick became more economically desirable and feasible. Books and television programs by veterinarians and trainers became big sellers. Thus, objective truth about animals was transmitted from those best qualified to apprehend it to those who wanted and needed it the most.

This economic process picked up speed and momentum in the second half of the 20th century and became the avalanche of today. It is now commonplace to see articles in the popular and business press marveling at the time and money spent on pets, but much less space is devoted to the enormous increase in the value created by them and enjoyed by their owners. And yet economic theory suggests that domestic animals are capital goods – long-lived producer goods used to produce other goods, assets that yield a stream of value over a period of years. Their value is the net present value of that stream of services.
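That capital-goods framing can be written down directly. The sketch below computes the net present value of a stream of annual “companionship services”; the dollar figure, the time horizon and the discount rate are all assumptions chosen purely for illustration.

```python
# Net present value of a hypothetical stream of annual companionship services.
# The service value, horizon and discount rate are illustrative assumptions only.

def net_present_value(annual_values, discount_rate):
    """Discount each year's service value back to the present and sum the results."""
    return sum(
        value / (1 + discount_rate) ** year
        for year, value in enumerate(annual_values, start=1)
    )

# Suppose a dog yields $2,000 of services per year for 12 years, discounted at 5%.
service_stream = [2_000] * 12
print(round(net_present_value(service_stream, 0.05), 2))  # roughly 17726.5
```

On this view, the rising expenditure on pets noted in the press is not extravagance but investment in an asset whose service stream has grown more valuable.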

At long last, we are shedding the romantic view of dogs as either good dogs or bad dogs, Lassie or the Hound of the Baskervilles. It is no longer sensible to assassinate a few million potential productive assets every year in municipal animal shelters – thus, the trend toward no-kill shelters, pet rescue and adoption programs.

Hayekian Equilibrium in the Market for Domestic Animals

Cases like the Denver and South African ones show that our subjective perceptions have still not caught up with the objective reality of animals. But our analysis also shows that markets have worked wonders in improving the lives of both people and animals – and can do even more if we let them.