DRI-284 for week of 7-13-14: Why Big Government is Rotten to the Core: The Tale of the Taxpayers’ Defender Inside Federal Housing

An Access Advertising EconBrief:

Why Big Government is Rotten to the Core: The Tale of the Taxpayers’ Defender Inside Federal Housing

Today the trajectory of our economic lives is pointed steeply downward. This space has been disproportionately devoted to explaining both how and why. That explanation has often cited the theory of government failure, in which the purported objects of government action are subordinated to the desires of politicians, bureaucrats, government employees and consultants. Economists have been excoriated for sins of commission and omission. The resulting loss of personal freedom and marketplace efficiency has been decried. The progressive march toward a totalitarian state has been chronicled.

A recent column in The Wall Street Journal ties these themes together neatly. Mary Kissel’s “Weekend Interview” column of Saturday/Sunday, July 12/13, 2014, is entitled “The Man Who Took On Fannie Mae.” It describes the working life of “career bureaucrat” and economist, Edward DeMarco, whose most recent post was acting director of the Federal Housing Finance Agency. Ms. Kissel portrays him as the man “who fought to protect American taxpayers” and “championed fiscal responsibility” in government. As we shall see, however, he is really integral to the malfunctioning of big government in general and economics in particular.

The Career of Edward DeMarco

Edward DeMarco is that contradictory combination, a career government bureaucrat who is also a trained economist. He received a Ph.D. in economics from the University of Maryland in the late 1980s and went to work for the General Accounting Office (GAO). As “low man on the totem pole,” he was handed the job of evaluating Fannie Mae and Freddie Mac. They had been around since the 1930s but were known to few and understood by fewer in Congress. The painful, decade-long series of savings-and-loan bailouts had scalded the sensibilities of representatives and regulators alike. DeMarco’s job was to determine whether Fannie and Freddie were another bailout landmine lying in wait for detonation.

His answer was: yes. The implicit taxpayer backstop provided to these two institutions – not written into their charter but tacitly acknowledged by everybody in financial markets – allowed them to borrow at lower interest rates than competitors. This meant that they attracted riskier borrowers, which set taxpayers up to take a fall. And the Congressional “oversight” supposedly placing the two under a stern, watchful eye was actually doing the opposite – acting in cahoots with them to expand their empire in exchange for a cut of the proceeds.

DeMarco sounded the alarm in his report. And sure enough, Congress acted. In 1992, it established the Office of Federal Housing Enterprise Oversight (OFHEO). A triumph for government regulation! A vindication of the role of economics in government! A victory for truth, justice and the American way!

Yeah, right.

DeMarco pinned the tail on this donkey right smack on the hindquarters. “‘The Fannie and Freddie Growth Act,’” he called it, “because it told the market ‘Hey, we really care about these guys, and we’re concerned about them because they’re really important.’” In other words, the fix was in: Congress would never allow Fannie and Freddie to fail, and their implicit taxpayer guarantee was good as gold.

This was the first test of DeMarco’s mettle. In that sense, it was the key test, because the result jibed with the old vaudeville punchline, “we’ve already agreed on what you are; now we’re just haggling about the price.” As soon as the ineffectual nature of OFHEO crystallized, DeMarco should have screamed bloody murder. But the “low man on the totem pole” in a government bureaucracy can’t do that and still hope for a career; DeMarco would have had to say sayonara to the security of government employment in order to retain his integrity. Instead, he kept his mouth shut.

Kissel discreetly overlooks this because it doesn’t jibe with her picture of DeMarco as heroic whistleblower. She is acting as advocate rather than journalist, as editor rather than reporter.

Any doubts about the fairness of this judgment are dispelled by Kissel’s narrative. “After stints at the Treasury and Social Security Administration, DeMarco found himself working at the very oversight office that his reports to Congress had helped create.” Oh, he “found himself” working there, did he? At the very office that had double-crossed and betrayed him? “It was 2006, when Fannie and Freddie’s growth had been turbocharged by the government’s mortgages-for-all mania. Mr. DeMarco recalls that during his ‘first couple of weeks’ at the agency, he attended a conference for supervision staffers organized to tell them ‘about great, new mortgage instruments’ – subprime loans, he says, with a sardonic chuckle.” But what exactly did he do about all this while it was in progress, other than chuckling sardonically?

The first twenty years of Edward DeMarco’s career illustrate the workings of big government to a T. They depict the “invisible handshake” between orthodox, mainstream economics and the welfare state that has replaced the “invisible hand” of the marketplace that economics used to celebrate.

The Mainstream Economist as Patsy for Politicians and Bureaucrats

Mainstream economists are trained to see themselves as “social engineers.” Like engineers, they are trained in advanced mathematics. Like engineers, they are trained as generalists in a wide-ranging discipline, but specialize in sub-disciplines – civil, mechanical and chemical engineering for the engineer, macroeconomics and microeconomics for the economist. Like engineers, economists hone their specialties even more finely into sub-categories like monetary economics, international economics, industrial organization, labor economics, financial economics and energy economics. Economists are trained to think of themselves as high theoreticians applying optimizing solutions to correct the failures of human society in general and markets in particular. They take it for granted that they will command both respect and power.

This training sets economists up to be exploited by practical men of power and influence. Lawyers utilize the services of economists as expert witnesses because economists can give quantitative answers to questions that are otherwise little more than blind guesses. Of course, the precision of those quantitative answers is itself suspect. If economists really could provide answers to real-world questions that are as self-assured and precise as they pretend on the witness stand, why would they be wasting their lives earning upper-middle-class money as expert witnesses? Why are they not fabulously rich from – let us say – plying those talents as traders in commodity or financial markets? Still, economists can fall back on the justified defense that nobody else can provide better estimates of (say) wages foregone by an injured worker or business profits lost due to tortious interference. The point is, though, that economists owe their status as experts to default; their claim on expertise is what the late Thorstein Veblen would call “ceremonial.”

When economists enter the realm of politics, they are the veriest babes in the savage wood. Politicians want to take other people’s money and use it for their own – almost always nefarious – purposes. They must present a pretense of legitimacy, competence and virtue. They will use anybody and everybody who is useful to them. Economists hold doctorates; they teach at universities and occupy positions of respect. Therefore, they are ideal fronts for the devices of politicians.

Politicians use economists. They hire them or consult with them or conspicuously call them to testify in Congress. This satisfies the politicians’ debt to legitimacy, competence, virtue and conscience (if they have one). Have they not conferred with the best available authority? And having done so, politicians go on to do whatever they intended to do all along. They either ignore the economist or twist his advice to suit their intentions.

That is exactly what happened to Edward DeMarco. His superiors gave him an assignment. Like a dutiful economist, he fulfilled it and sat back waiting for them to act on his advice. They acted, all right – by creating an oversight body that perverted DeMarco’s every word.

Deep down, mainstream economists envision themselves as philosopher kings – either as (eventual) authority figures or as Talleyrands, the men behind the throne who act as ventriloquists to power. When brought face-to-face with the bitter disillusion of political reality, they react either by retreating into academia in a funk or by retreating into their bureaucratic shell. There is a third alternative: occupational prostitution. Some economists abandon their economic principles and become willing mouthpieces for politicians. They are paid in money and/or prestige.

It is clear that DeMarco took the path of bureaucratic compliance. Despite the attempt of WSJ’s Kissel to glamorize his role, his career has obviously been that of follower rather than either leader or whistleblower. His current comments show that he harbors great resentment over being forced to betray his principles in order to make the kind of secure living he craved.

For our purposes, we should see him as the wrong man for the job of taxpayers’ defender. That job required an extraordinary man, not a bureaucrat.

DeMarco, DeMartyr

The second career of Edward DeMarco – that of “DeMarco, DeMartyr” to the cause of fiscal responsibility and taxpayer interests – began after the housing collapse and financial panic of 2008. After bailing out Fannie and Freddie, Congress had to decide whether to close them down or reorganize them. It fell back on an old reliable default option – create a new agency, the Federal Housing Finance Agency, whose job it was to ride herd on the “toxic twins.” When FHFA’s director, James Lockhart, left in August 2009, Treasury Secretary Timothy Geithner appointed DeMarco as acting director.

DeMarco began by raising executive salaries to stem the exodus of senior management. This got him bad press and hostility from both sides of the Congressional aisle. DeMarco set out to reintroduce the private sector to the mortgage market by reducing loan limits and shrinking the mortgage portfolios of Fannie and Freddie. But we shouldn’t get the wrong idea here – DeMarco wasn’t actually trying to recreate a free market in housing. “I wasn’t trying to price Fannie and Freddie out of the market so much as get the price closer so that the taxpayer capital is getting an appropriate rate of return and that, more important, we start selling off this risk,” DeMarco insists. He was just a meliorist, trying to fine-tune a more efficient economic outcome by the lights of the academic mainstream. Why, he even had the President and the Financial Stability Oversight Council (FSOC) on his side.

Ms. Kissel depicts DeMarco as a staunch reformer who was on his way to turning the housing market around. “Mr. DeMarco’s efforts started to show results. Housing prices recovered, both [Fannie and Freddie] started to make money – lots of it – and private insurers eyed getting back into the market. Then in August 2012 the Obama administration decided to ‘sweep’ Fannie and Freddie’s profits, now and in the future, into the government’s coffers. The move left the companies unable to build up capital reserves, and shareholders sued.”

That was just the beginning. DeMarco was pressured by Congress and the administration to write down principal on the loans of borrowers whose homes were “underwater,” i.e., worth less at current market value than the balance remaining on the mortgage. He also opposed creation of a proposed housing trust fund (or “slush fund,” as Kissel aptly characterizes it). Apart from the obvious moral hazard involved in systematically redrawing contracts to favor one side of the transaction, DeMarco noted the hazard to taxpayers in giving borrowers – 80% of whom were still making timely payments – an incentive to default or plead hardship in order to benefit financially. How could mortgage markets attract investment and survive in the face of this attitude?

This intelligent evaluation won him the undying hatred of “members of Congress [and] President Obama’s liberal allies [including] White House adviser Van Jones [who] told the Huffington Post “you could have the biggest stimulus program in America by getting rid of one person;” namely, DeMarco. “Realtors, home builders, the Mortgage Bankers Association, insured depositories and credit unions” fronted for the White House by pressuring DeMarco to “degrade lending standards” to the least creditworthy borrowers – a practice that epitomized the housing bubble at its frothiest. “Protestors organized by progressive groups showed up more than once outside [DeMarco's] house in Silver Spring, MD, demanding his ouster. A demonstration in April last year brought out 500 picketers with ‘Dump DeMarco’ signs and 15-foot puppets fashioned to look like him. ‘My first reaction was of course one of safety,’ [said DeMarco]. ‘When I first saw them, I was standing a few feet from the window of a ground-level family room and they’re less than 10 feet away through this pane of glass, and it was a crowd of people so big I couldn’t tell how many people were out there. And then all the chanting and yelling started.’ His wife had gone to pick up their youngest daughter…’so I had to get on the phone and tell her ‘Don’t come.’ Then he called the police, who eventually cleared the scene. ‘It was unsettling,’ he says. ‘I think it was meant to be unsettling… They wanted me to start forgiving debt on mortgages.’” This is what Ms. Kissel calls “the multibillion-dollar do-over,” to which “Mr. DeMarco’s resistance made him unpopular in an administration that was anxious to refire the housing market.” Ms. Kissel’s metaphor of government as arsonist is the most gripping writing in the article.

Epilogue at FHFA

Edward DeMarco was the “acting” director at FHFA. The Senate capitulated to pressure for his removal by approving Mel Watt, Majority Leader Harry Reid’s pick, as permanent director. Watt immediately began implementing the agenda DeMarco had resisted. DeMarco had successfully scheduled a series of increases in loan-guarantee fees as one of several measures to entice private insurers back into the market. Watt delayed them. He refused to lower loan limits for Fannie and Freddie from their $625,000 level. He directed the two companies to seek out “underserved, creditworthy borrowers;” i.e., people who can’t afford houses. He assured the various constituencies clamoring for DeMarco’s ouster that “government will remain firmly in control of the mortgage market.”

DeMarco’s valedictory on all this is eye-opening in more ways than one. Reviewing what Ms. Kissel primly calls “government efforts to promote affordable housing,” DeMarco dryly observes, “Let’s say it was a failed effort…To me, if you go through a 50-year period, and you do all these things to promote housing, and the homeownership rate is [the same as it was 50 years ago], I think the market’s telling you we’re at an equilibrium.” If we assume that only government can foster homeownership among people “below median income,” that “suggests a troubling view of markets themselves.”

And now the whole process is starting all over again. “If we have another [sic] recession, if there’s some foreign crisis that …affects our economy, it doesn’t matter whatever the instigating event is, the point is that if we have another round of house-price declines like we’ve had, we’re going to erode most of that remaining capital support.” Characteristically, he refuses to forthrightly state the full implications of his words, which are: We are tottering on the brink of full-scale financial collapse.

Edward DeMarco: Blackboard Economist

The late Nobel laureate Ronald Coase derided what he called “blackboard economists” – the sort who pretended to solve practical problems by proposing a theoretical solution that assumed they possessed information they didn’t and couldn’t have. (Usually the solution came in the form of either mathematical equations or graphical geometry depicted on a classroom blackboard, hence the term.)

Was Coase accusing his fellow economists of laziness? Yes and no. Coase believed that transactions costs were a key determinant of economic outcomes. Instead of investigating transactions costs of action in particular cases, economists were all too prone to assume those costs were either zero (allowing markets to work perfectly) or prohibitive (guaranteeing market failure). Coase insisted that this was pure laziness on the part of the profession.

But information isn’t just lying around in the open waiting for economists to discover it. One of Coase’s instructors at the London School of Economics, future Nobel laureate F.A. Hayek, pointed out that orthodox economic theory assumed that everybody already knew all the information needed to make optimal decisions. In reality, the relevant information was dispersed in fragmentary form inside the minds of billions of people rather than concentrated in easily accessible form. The market process was not a mere formality of optimization using given data. Instead, it was markets that created the incentives and opportunities for the generation and collation of this fragmented, dispersed information into usable form.

Blackboard economists were not merely lazy. They were unforgivably presumptuous. They assumed that they had the power to effectuate what could only be done by markets, if at all.

That lends a tragic note to Ms. Kissel’s assurance that “Mr. DeMarco isn’t against government support for housing – if done properly.” After spending his career as “the loneliest man in government” while fighting to stem the tide of the housing bubble, Edward DeMarco now confesses that he doesn’t oppose government interference in the housing market after all! The problem is that the government didn’t ask him how to go about it – they didn’t apply just the right optimizing formula, didn’t copy his equations off the blackboard.

And when President Obama and Treasury Secretary Geithner and the housing lobbyists and the realtors and builders and mortgage bankers and lenders and progressive ideologues hear this explanation, what is their reaction? Do they smack their foreheads and cry out in dismay? Do they plead, “Save us from ourselves, Professor DeMarco?”

Not hardly. The mounted barbarians run roughshod over Mr. DeMarco waving his blackboard formula and leave him rolling in the dust. They then park their horses outside Congress and testify: “See? He’s in favor of government intervention, just as we are – we’re just haggling about the price.” Politicians with a self-interested agenda correctly view any attempt at compromise as a sign of weakness, an invitation to “let’s make a deal.” It invites contempt rather than respect.

That is exactly what happened to Edward DeMarco. He is left licking the wounds of 25 years of government service and whining about the fact that politicians are self-interested, that government regulators do not really regulate but in fact serve the interests of the regulated, and that the political left wing will stop at nothing, including physical intimidation and force.

No spit, Spurlock. We are supposed to stand up and cheer for a man who is only now learning this after spending 25 years in the belly of the savage beast? Whose valiant efforts at reform consisted of recommending optimizing nips and tucks in the outrageous government programs he supervised? Whose courageous farewell speech upon being run out of office, a la Douglas MacArthur, is “I’m not against government support for housing if done properly?”

Valedictory for Edward DeMarco

The sad story of Edward DeMarco is surely one more valuable piece of evidence confirming the theory of big government as outlined in this space. Those who insist that government is really full of honest, hard-working, well-meaning people full of idealistic good intentions doing a dirty job the best they can will now have an even harder time saying it with a straight face. It is one thing when big government opposes exponents of laissez faire; we expect bank robbers to shoot at the police. But gunning down an innocent bystander for shaking his fist in reproof shows that the robber is a hardened killer rather than a starving family man. When the welfare state steamrolls over an Edward DeMarco’s efforts to reform it at the margins, it should be clear to one and all that big government is rotten to the core.

Even so, the fact that Edward DeMarco was and is an honest man who thought he was doing good does not make him a hero. Edward DeMarco is not a martyr. He is a cautionary example. The only way to counteract big government is to oppose it openly and completely by embracing free markets. Anything less fails while giving aid and comfort to the enemy. Failure coupled with career suicide can only be redeemed by service to the clearest and noblest of principles.

DRI-254 for week of 7-6-14: The Selling of Environmentalism

An Access Advertising EconBrief:

The Selling of Environmentalism

The word “imperialism” was popularized by Lenin to describe a process of exploitation employed by developed nations in the West on undeveloped colonies in the Eastern hemisphere. In recent years, though, it has been used in a completely different context – to describe the use of economic logic to explain practically everything in the world. Before the advent of the late Nobel laureate Gary Becker, economists were parochial in their studies, confining themselves almost exclusively to the study of mankind in its commercial and mercantile life. Becker trained the lens of economic theory on the household, the family and the institution of marriage. Ignoring the time-honored convention of treating “capital” as plant and equipment, he (along with colleagues like Theodore Schultz) treated human beings as the ultimate capital goods.

Becker ripped the lid off Pandora’s Box and the study of society will never be the same again. We now recognize that any and every form of human behavior might profitably be seen in this same light. To be sure, that does not mean employing the sterile and limiting tools of the professional economist; namely, advanced mathematics and formal statistics. It simply means subjecting human behavior to the logic of purposeful action.

Environmentalism Under the Microscope

The beginnings of the environmental movement are commonly traced to the publication of Silent Spring in 1962 by marine biologist Rachel Carson. That book sought to dramatize the unfavorable effects of pesticides, industrial chemicals and pollution upon wildlife and nature. Carson had scientific credentials – she had previously published a well-regarded book on oceanography – but this book, completed during her terminal illness, was a polemic rather than a sober scientific tract. Its scientific basis has been almost completely undermined in the half-century since publication. (A recent book devoted entirely to re-examination of Silent Spring by scientific critics is decisive.) Yet this book galvanized the movement that has since come to be called environmentalism.

An “ism” ideology is, or ought to be, associated with a set of logical propositions. Marxism, for example, employs the framework of classical economics as developed by David Ricardo but deviates in its creation of the concept of “surplus value” as generated by labor and appropriated by capitalists. Capitalism is a term intended invidiously by Marx but that has since morphed into the descriptor of the system of free markets, private property rights and limited government. What is the analogous logical system implied by the term “environmentalism?”

There isn’t one. Generically, the word connotes an emotive affinity for nature and corresponding distaste for industrial civilization. Beyond that, its only concrete meaning is political. The problem of definition arises because, in and of itself, an affinity for nature is insufficient as a guide to human action. For example, consider the activity of recycling. Virtually everybody would consider it de rigueur as part of an environmentalist program. The most frequent stated purpose of recycling is to relieve pressure on landfills, which are ostensibly filling up with garbage and threatening to overwhelm humanity. The single greatest component of landfills is newsprint. But the leachates created by the recycling of newsprint are extremely harmful to “the environment”; e.g., their acidic content poisons soils and water and they are very costly to divert. We have arrived at a contradiction – is recycling “good for the environment” or “bad for the environment?” There is no answer to the question as posed; the effects of recycling are couched in terms of tradeoffs. In other words, the issue is dependent on economics, not emotion only.

No matter where we turn, “the environment” confronts us with such tradeoffs. Acceptance of the philosophy of environmentalism depends on getting us to ignore these tradeoffs by focusing on one side and ignoring the other. Environmental advocates of recycling, for instance, customarily ignore the leachates and robotically beat the drums for mandatory recycling programs. When their lopsided character is exposed, environmentalists retreat to the carefully prepared position that the purity of their motives excuses any lapses in analysis and overrides any shortcomings in their programs.

Today’s economist does not take this attitude on faith. He notes that the political stance of environmentalists is logically consistent even if their analysis is not. The politics of environmentalism can be understood as a consistent attempt to increase the real income of environmentalists in two obvious ways: first, by redistributing income in favor of their particular preferences for consumption (enjoyment) of nature; and second, by enjoying real income in the form of power exerted over people whose freedom they constrain and real income they reduce through legislation and administrative and judicial fiat.

Thus, environmentalism is best understood as a political movement existing to serve economic ends. In order to do that, its adherents must “sell” environmentalism just as a producer sells a product. Consumers “buy” environmentalism in one of two ways: by voting for candidates who support the legislation, agencies, rules and rulings that further the environmental agenda; and by donating money to environmental organizations that provide real income to environmentalists by employing them and lobbying for the environmental agenda.

Like the most successful consumer products, environmentalism has many varieties. Currently, the most popular and politically successful one is called “climate change,” which is a model change from the previous product, “global warming.” In order to appreciate the economic theory of environmentalism, it is instructive to trace the selling of this doctrine in recent years.

Why Was the Product Called “Climate Change” Developed?

The doctrine today known as “climate change” grew out of a long period of climate research on a phenomenon called “global warming.” This began in the 1970s. Just as businessmen spend years or even decades developing products, environmentalists use scientific (or quasi-scientific) research as their product-development laboratory, in which promising products are developed for future presentation on the market. Although global warming was “in development” throughout the 1970s and 80s, it did not receive its full “rollout” as a full-fledged environmental product until the early 1990s. We can regard the publication of Al Gore’s Earth in the Balance in 1992 as the completed rollout of global warming. In that book, Gore presented the full-bore apocalyptic prophecy that human-caused global warming threatened the destruction of the Earth within two centuries.

Why was global warming “in development” for so long? And after spending that long in development limbo, why did environmentalists bring it “to market” in the early 1990s? The answers to these questions further cement the economic theory of environmentalism.

Global warming joined a long line of environmental products that were brought to market beginning in the early 1960s. These included conservation, water pollution, air pollution, species preservation, forest preservation, overpopulation, garbage disposal, inadequate food production, cancer incidence and energy insufficiency. The most obvious, logical business rationale for a product to be brought to market is that its time has come, for one or more reasons. But global warming was brought to market by a process of elimination. All of the other environmental products were either not “selling” or had reached dangerously low levels of “sales.” Environmentalists desperately needed a flagship product and global warming was the only candidate in sight. Despite its manifest deficiencies, it was brought to market “before its time;” i.e., before its scientific merits had been demonstrated. In this regard, it differed from most (although not all) of the previous environmental products.

Those are the summary answers to the two key questions posed above. Global warming (later climate change) spent decades in development because its scientific merits were difficult if not impossible to demonstrate. It was brought to market in spite of that limitation because environmentalists had no other products with equivalent potential to provide real income and had to take the risks of introducing it prematurely in order to maintain the “business” of environmentalism as a going concern. Each of these contentions is fleshed out below.

The Product Maturation Suffered by Environmentalism

Businesses often find that their products lead limited lives. These limitations may be technological, competitive or psychological. New and better processes may doom a product to obsolescence. Competitors may imitate a product into senescence or even extinction. Fads may simply lose favor with consumers after a period of infatuation.

As of the early 1990s, the products offered by environmentalism were in various stages of maturity, decline or death.

Air pollution was a legitimate scientific concern when environmentalism adopted it in the early 1960s. It remains so today because the difficulty of enforcing private property rights in air makes a free-market solution to the problem of air pollution elusive. But by the early 1990s, even the inefficient solutions enforced by the federal government had reduced the problem of air pollution to full manageability.

Between 1975 and 1991, the six air pollutants tracked by the Environmental Protection Agency (EPA) fell between 24% and 94%. Even if we go back to 1940 as a standard of comparison – forcing us to use emissions as a proxy for the pollution we really want to measure, since the latter wasn’t calculated prior to 1975 – we find that three of the six were lower in 1991 and total emissions were also lower in 1991. (Other developed countries showed similar progress during this time span.)

Water pollution was already decreasing when Rachel Carson wrote and continued to fall throughout the 1960s, 70s and 80s. The key was the introduction of wastewater treatment facilities to over three-quarters of the country. Previously polluted bodies of water like the Cuyahoga River, the Androscoggin River, the northern Hudson River and several of the Great Lakes became pure enough to host sport-fishing and swimming. The Mississippi River became one of the industrialized world’s purest major rivers. Unsafe drinking water became a non-problem. Again, this was accomplished despite the inefficient efforts of local governments, the worst of these being the persistent refusal to price water at the margin to discourage overuse.

Forests were thriving in the early 1990s, despite the rhetoric of environmental organizations that inveighed against “clear-cutting” by timber companies. In reality, the number of wooded acres in the U.S. had grown by 20% over the previous two decades. The state of Vermont had been covered 35% by forest in the late nineteenth century. By the early 1990s, this coverage had risen to 76%.

This improvement was owed to private-sector timber companies, who practiced the principle of “sustainable yield” timber management. By the early 1990s, annual timber growth had exceeded harvest every year since 1952. By 1992, the actual timber harvest was a minuscule 384,000 acres, six-tenths of 1% of the land available for harvest. Average annual U.S. wood growth was three times greater than in 1920.

Environmentalists whined about the timberlands opened up for harvest by the federal government in the national parks and wildlife refuges, but less logging was occurring in the National Forests than at any time since the early 1950s. Clear-cut timber was being replaced with new, healthier stands that attracted more wildlife diversity than the harvested “old-growth” forest.

As always, this progress occurred in spite of government, not because of it. The mileage of roads hacked out of national-forest land by the Forest Service is three times greater than that of the federal Interstate highway system. The subsidized price at which the government sells logging rights on that land is a form of corporate welfare for timber companies. But the private sector bailed out the public in a manner that would have made John Muir proud.

Garbage disposal and solid-waste management may have been the most unheralded environmental victory won by the private sector. At the same time that Al Gore complained that “the volume of garbage is now so high that we are running out of places to put it,” modern technology had solved the problem of solid-waste disposal. The contemporary landfill had a plastic bottom and clay liner that together prevent leakage. It was topped with dirt to prevent odors and run-off. The entire U.S. estimated supply of solid waste for the next 500 years could be safely stored in one landfill 100 yards deep and 20 miles on a side. The only problem with landfills was a siting problem, owing to the NIMBY (“not in my back yard”) philosophy fomented by environmentalism. The only benefit to be derived from recycling could be had from private markets by recycling only those materials whose benefits (sales revenue) exceeded their reclamation costs (including a “normal” profit).

Overpopulation was once the sales leader of environmentalism. In 1968’s The Population Bomb, leading environmentalist Paul Ehrlich wrote that “the battle to feed all of humanity is over. In the 1970s, the world will undergo famines – hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now. At this late date, nothing can prevent a substantial increase in the world death rate….” Ehrlich also predicted food riots and plummeting life expectancy in the U.S. and biological death for a couple of the Great Lakes.

Ehrlich was a great success at selling environmentalism. His book, and its 1990 sequel The Population Explosion, sold millions of copies and recruited untold converts to the cause. Unfortunately, his product had a limited shelf life because his prophecies were spectacularly inaccurate. The only famines were politically, not biologically, triggered; deaths were in the hundreds of thousands, not millions. Death rates declined instead of rising. The Great Lakes did not die; they were completely rehabilitated. Even worse, Ehrlich made a highly publicized bet with economist Julian Simon that the prices of five metals handpicked by Ehrlich would rise in real terms over a ten-year period. (The loser would pay the algebraic sum of the price changes incurred.) The prices went down in nominal terms despite the rising general level of prices over the interval – another spectacular prophetic failure by Ehrlich.

It’s not surprising that Ehrlich, rather than the population, bombed. In the 1960s, the world’s annual population growth was about 2.0%. By the 1990s, it would fall to 1.6%. (Today, of course, our problem is falling birth rates – the diametric opposite of that predicted by environmentalism.)

Therefore, the phantom population growth predicted by environmentalism did not comprise one component of the inadequate food supply foreseen with uncanny inaccuracy by environmentalists. Ehrlich and others had foreseen a Malthusian scenario in which rising population growth overtook diminishing agricultural productivity. They were just as wrong about productivity as about population. The Green Revolution ushered in by Norman Borlaug et al drove one of the world’s leading agricultural economists to declare that “the scourge of famine due to natural causes has been almost conquered….”

The other leg of environmentalism’s collapsing doomsday scenario of inadequate food was based on cancer incidence. Not only would the food supply prove insufficient, according to environmentalists, it was also unsafe. Industrial chemicals and pesticides were entering the food supply through food residues and additives. They were causing cancer. How did we know this? Tests on animals – specifically, on mice and rats – proved it.

There was only one problem with this assertion. Scientifically speaking, it was complete hooey. The cancer risk of one glass of wine was about 10,000-12,000 times greater than that posed by the additives and pesticide residues (cumulatively) in most food products. Most of our cancer risk comes from natural sources, such as sunlight and natural pesticides produced by plants. Some of these occur in common foods. Still, cancer rates had remained steady or fallen over the previous fifty years except for lung cancers attributable to smoking and melanomas attributable to ultraviolet light. Cancer rates among young adults had decreased rapidly. Age-adjusted death rates had mostly fallen.

Energy insufficiency had been brought to market by environmentalists in the 1970s, during the so-called Energy Crisis. It sold well when OPEC was allowed to peg oil prices at stratospheric levels. But when the Reagan administration decontrolled prices, domestic production rose and prices fell. As the 1990s rolled around, environmentalists were reduced to citing “proven reserves” of oil (45 years) and natural gas (63 years) as “proof” that we would soon run out of fossil fuels and energy prices would then skyrocket. Of course, this was more hooey; proven reserves are the energy equivalent of inventory. Businesses hold inventory as the prospective benefits and costs dictate. Current inventories say nothing about the long-run prospect of shortages.

In 1978, for example, proven reserves of oil stood at 648 billion barrels, or 29.2 years’ worth at current levels of usage. Over the next 14 years, we used about 84 billion barrels, but – lo and behold – proven reserves rose to nearly a trillion barrels by 1992. That happened because it was now profitable to explore for and produce oil in a newly free market of fluctuating oil prices, making it cost-efficient to hold larger inventories of proven reserves. (And in today’s energy market, it is innovative technologies that are driving discoveries and production of new shale oil and gas.) Really, it is an idle pastime to estimate the number of years of “known” resources remaining because nobody knows how much of a resource remains. It is not worth anybody’s time to make an accurate estimate; it is easier and more sensible to simply let the free market take its course. If the price rises, we will produce more and discover more reserves to hold as “inventory.” If we can’t find any more, the resultant high prices will give us the incentive to invent new technologies and find substitutes for the disappearing resource. That is exactly what has just happened with the process called “fracking.” We have long known that conventional methods of oil drilling left 30-70% of the oil in the ground because it was too expensive to extract. When oil prices rose high enough, fracking allowed us to get at those sequestered supplies. We knew this in the early 1990s, even if we didn’t know exactly what technological process we would ultimately end up using.
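For readers who want the “years of reserves” arithmetic spelled out, here is a minimal sketch in Python. The 1978 figures come from the paragraph above; the variable names and the helper function are purely illustrative additions, not part of the original article.

# Back-of-envelope "years of proven reserves" arithmetic (illustrative only).
# The 1978 figures come from the paragraph above; the point is that the ratio
# is an inventory snapshot, not a forecast of exhaustion.

reserves_1978 = 648e9        # barrels of proven reserves cited for 1978
years_worth_1978 = 29.2      # "29.2 years' worth at current levels of usage"

implied_annual_usage = reserves_1978 / years_worth_1978   # roughly 22 billion barrels/year

def years_of_reserves(proven_reserves_barrels, annual_usage_barrels):
    # Rises whenever new discoveries outpace consumption, which is what the
    # article says happened between 1978 and 1992.
    return proven_reserves_barrels / annual_usage_barrels

print(round(implied_annual_usage / 1e9, 1), "billion barrels per year of implied usage")
print(round(years_of_reserves(reserves_1978, implied_annual_usage), 1), "years' worth in 1978")

The sketch makes the inventory point concrete: the “years remaining” figure is just reserves divided by current usage, so it rises whenever additions to proven reserves outpace consumption.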

Conservation was the first product packaged and sold by environmentalism, long predating Rachel Carson. It dated back to the origin of the national-park system in Theodore Roosevelt’s day and the times of John Muir and John James Audubon. By the early 1990s, conservation was a mature product. The federal government was already the biggest landowner in the U.S. We already had more national parks than the federal government could hope to manage effectively. Environmentalists could no longer make any additional sales using conservation as the product.

Just about the only remaining salable product the environmentalists had was species preservation. Environmentalism flogged it for all it was worth, but that wasn’t much. After the Endangered Species Act was passed and periodic additions made to its list, what was left to do? Not nearly enough to support the upper-middle-class lifestyles of a few million environmentalists. (It takes an upper-middle-class income to enjoy the amenities of nature in all their glory.)

Environmentalism Presents: Global Warming

In the late 1980s, the theory that industrial activity was heating up the atmosphere by increasing the amount of carbon dioxide in the air began to gain popular support. In 1989, Time Magazine modified its well-known “Man of the Year” award to “Planet of the Year,” which it gave to “Endangered Earth.” It described the potential effects of this warming process as “scary.” The Intergovernmental Panel on Climate Change, an organization of environmentalists dedicated to selling their product, estimated that warming could average as much as 0.5 degrees Fahrenheit per decade over the next century, resulting in a 5.4 degree increase in average temperature. This would cause polar ice caps to melt and sea levels to rise, swamping coastal settlements around the world – and that was just the beginning of the adverse consequences of global warming.

No sooner had rollout begun than the skepticism rolled in along with the new product. Scientists could prove that atmospheric carbon dioxide was increasing and that industrial activity was behind that, but it could not prove that carbon dioxide was causing the amount of warming actually measured. As a matter of fact, there wasn’t actually an unambiguous case to be made for warming. What warming could be found had mostly occurred at night, in the winter and in the Southern Hemisphere (not the locus of most industrial activity). And to top it all off, it is not clear whether or not we should ascribe warming to very long-run cyclical forces that have alternated the Earth between Ice Ages and tropical warming periods for many thousands of years. By 1994, Time Magazine (which needed a continuous supply of exciting new headlines just as much as environmentalists needed a new supply of products with which to scare the public) had given up on global warming and resuscitated a previous global-climate scare from the 1970s, the “Coming Ice Age.”

It is easy to see the potential benefits of the global-warming product for environmentalists. Heretofore, almost all environmentalist products had an objective basis. That is, they spotlighted real problems. Real problems have real solutions, and the hullabaloo caused by purchase of those products led to varying degrees of improvement in the problems. Note this distinction: the products themselves did not cause or lead to the improvement; it was the uproar created by the products that did the job. Most of the improvement was midwived by economic measures, and environmentalism rejects economics the way vampires reject the cross. This put environmentalists in an anomalous position. Their very (indirect) success had worked against them. Their real income was dependent on selling environmentalism in any of various ways. Environmentalists cannot continue to sell more books about (say) air pollution when existing laws, regulations and devices have brought air quality to an acceptable level. They cannot continue to pass more coercive laws and regulations when the legally designated quality has been reached. Indeed, they will be lucky to maintain sales of previously written books to any significant degree. They cannot continue to (credibly) solicit donations on the strength of a problem that has been solved, or at least effectively managed.

Unfortunately for environmentalists, the environmental product is not like an automobile that gives service until worn out and needs replacement, ad infinitum. It is more like a vaccine that, once taken, needn’t be retaken. Once the public has been radicalized and sensitized to the need for environmentalism, it becomes redundant to keep repeating the process.

Global warming was a new kind of product with special features. Its message could not be ignored or softened. Either we reform or we die. There was no monkeying around with tradeoffs.

Unlike the other environmental products, global warming was not a real problem with real solutions. But that was good. Real problems get solved – which, from the environmentalist standpoint, was bad. Global warming couldn’t even be proved, let alone solved. That meant that we were forced to act and there could be no end to the actions, since they would never solve the problem. After all, you can’t solve a problem that doesn’t exist in the first place! Global warming, then, was the environmentalist gift that would keep on giving, endlessly beckoning the faithful, recruiting ever more converts to the cause, ringing the cash register with donations and decorating the mast of environmentalism for at least a century. Its very scientific dubiety was an advantage, since that would keep it in the headlines and keep its critics fighting against it – allowing environmentalists the perfect excuse to keep pleading for donations to fend off the evil global-warming deniers. Of course, lack of scientific credibility is also a two-edged sword, since environmentalists cannot force the public to buy their products and can never be quite sure when the credibility gap will turn the tide against them.

When you’re selling the environmentalist product, the last thing you want is certainty, which eliminates controversy. Controversy sells. And selling is all that matters. Environmentalists certainly don’t want to solve the problem of global warming. If the problem is solved, they have nothing left to sell! And if they don’t sell, they don’t eat, or at least they don’t enjoy any real income from environmentalism. Environmentalism is also aimed at gaining psychological benefits for its adherents by giving their lives meaning and empowering them by coercing people with whom they disagree. If there is no controversy and no problem, there is nothing to give their lives meaning anymore and no basis for coercing others.

The Economic Theory of Environmentalism

Both environmentalists and their staunchest foes automatically treat the environmental movement as a romantic crusade, akin to a religion or a moral reform movement. This is wrong. Reformers or altruists act without thought of personal gain. In contrast, environmentalists are self-interested individuals in the standard tradition of economic theory. Some of their transactions lie within the normal commercial realm of economics and others do not, but all are governed by economic logic.

That being so, should we view environmentalism in the same benign light as we do any other industry operating in a free market? No, because environmentalists reject the free market in favor of coercion. If they were content to persuade others of the merits of their views, their actions would be unexceptional. Instead, they demand subservience to their viewpoint via legal codification and all forms of legislative, executive, administrative and judicial tyranny. Their adherents number a few would-be dictators and countless petty dictators. Their alliance with science is purely opportunistic; one minute they accuse their opponents of being anti-scientific deniers and the next they are praying to the idol of Gaia and Mother Earth.

The only thing anti-environmentalists have found to admire about the environmental movement is its moral fervor. That concession is a mistake.

DRI-292 for week of 6-29-14: One in Six American Children is Hungry – No, Wait – One in Five!

An Access Advertising EconBrief:

One in Six American Children is Hungry – No, Wait – One in Five!

You’ve heard the ad. A celebrity – or at least somebody who sounds vaguely familiar, like singer Kelly Clarkson – begins by intoning somberly: “Seventeen million kids in America don’t know where their next meal is coming from or even if it’s coming at all.” One in six children in America is hungry, we are told. And that’s disgraceful, because there’s actually plenty of food, more than enough to feed all those hungry kids. The problem is just getting the food to the people who need it. Just make a donation to your local food pantry and together we can lick hunger in America. This ad is sponsored by the Ad Council and Feeding America.

What was your reaction? Did it fly under your radar? Did it seem vaguely dissonant – one of those things that strikes you wrong but leaves you not quite sure why? Or was your reaction the obvious one of any intelligent person paying close attention – “Huh? What kind of nonsense is this?”

Hunger is not something arcane and mysterious. We’ve all experienced it. And the world is quite familiar with the pathology of hunger. Throughout human history, hunger has been mankind’s number one enemy. In nature, organisms are obsessed with absorbing enough nutrients to maintain their body weight. It is only in the last few centuries that tremendous improvements in agricultural productivity have liberated us from the prison of scratching out a subsistence living from the soil. At that point, we began to view starvation as atypical, even unthinkable. The politically engineered famines that killed millions in the Soviet Union and China were viewed with horror; the famines in Africa attracted sympathy and financial support from the West. Even malnutrition came to be viewed as an aberration, something to be cured by universal public education and paternalistic government. In the late 20th century, the Green Revolution multiplied worldwide agricultural productivity manifold. As the 21st century dawned, the end of mass global poverty and starvation beckoned within a few decades and the immemorial problem of hunger seemed at last to be withering away.

And now we’re told that in America – for over a century the richest nation on Earth – our children – traditionally the first priority for assistance of every kind – are hungry at the ratio of one in six?

WHAT IS GOING ON HERE?

The Source of the Numbers – and the Truth About Child Hunger

Perhaps the most amazing thing about these ads, which constitute a full-fledged campaign, is the general lack of curiosity about their origins and veracity. Seemingly, they should have triggered a firestorm of criticism and investigation. Instead, they have been received with yawns.

The ads debuted last Fall. They were kicked off with an article in the New York Times on September 5, 2013, by Jane L. Levere, entitled “New Ad Campaign Targets Childhood Hunger.” The article is one long promotion for the ads and for Feeding America, but most of all for the “cause” of childhood hunger. That is, it takes for granted that a severe problem of childhood hunger exists and demands close attention.

The article cites the federal government as the source for the claim that “…close to 50 million Americans are living in ‘food insecure’ households,” or ones in which “some family members lacked consistent access throughout the year to adequate food.” It claims that “…almost 16 million children, or more than one in 5, face hunger in the United States.”

The ad campaign is characterized as “the latest in a long collaboration between Ad Council and Feeding America, ” which supplies some 200 food banks across the country that in turn supply more than 61,000 food pantries, soup kitchens and shelters. Feeding America began in the late 1990s as another organization, America’s Second Harvest, which enlisted the support of A-list celebrities such as Matt Damon and Ben Affleck. This was when the partnership with the Ad Council started.

Priscilla Natkins, a Vice-President of Ad Council, noted that in the early days “only” one out of 10 Americans was hungry. Now the ratio is 1 out of 7 and more than 1 out of 5 children. “We chose to focus on children,” she explained, “because it is a more poignant approach to illustrating the problem.”

Further research reveals that, mirabile dictu, this is not the first time that these ads have received skeptical attention. In 2008, Chris Edwards of Cato Institute wrote about two articles purporting to depict “hunger in America.” That year, the Sunday supplement Parade Magazine featured an article entitled “Going Hungry in America.” It stated that “more than 35.5 million Americans, more than 12% of the population and 17% of our children, don’t have enough food, according to the Department of Agriculture.” Also in 2008, the Washington Post claimed that “about 35 million Americans regularly go hungry each year, according to federal statistics.”

Edwards’ eyebrows went up appropriately high upon reading these accounts. After all, this was even before the recession had been officially declared. Unlike the rest of the world, though, Edwards actually resolved to verify these claims. This is what Edwards found upon checking with the Department of Agriculture.

In 2008, the USDA declared that approximately 24 million Americans were living in households that faced conditions of “low food security.” The agency defined this condition as eating “less varied diets, participat[ing] in Federal food-assistance programs [and getting] emergency food from community food pantries.” Edwards contended that this meant those people were not going hungry – by definition. And indeed, it is semantically perverse to define a condition of hunger by describing the multiple sources of food and change in composition of food enjoyed by the “hungry.”

The other 11 million (of the 35 million figure named in the two articles) people fell into a USDA category called “very low food security.” These were people whose “food intake was reduced at times during the year because they had insufficient money or other resources for food” [emphasis added]. Of these, the USDA estimated that some 430,000 were children. These would (then) comprise about 0.6% of American children, not the 17% mentioned by Parade Magazine, Edwards noted. Of course, having to reduce food on one or more occasions to some unnamed degree for financial reasons doesn’t exactly constitute “living in hunger” in the sense of not knowing where one’s next meal was coming from, as Edwards observed. The most that could, or should, be said was that the 11 million and the 430,000 might constitute possible candidates for victims of hunger.
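As a quick sanity check of that “about 0.6%” figure, here is a minimal sketch in Python. The 430,000 count comes from the USDA estimate quoted above; the roughly 74 million total of U.S. children (circa 2008) is an outside assumption supplied only for illustration, not a number from the article.

# Verify the "about 0.6% of American children" arithmetic (illustrative only).
children_very_low_food_security = 430_000   # USDA estimate cited above
total_us_children = 74_000_000              # assumption: approx. U.S. population under 18 in 2008

share = children_very_low_food_security / total_us_children
print(f"{share:.1%} of American children")  # roughly 0.6%, versus the 17% claimed by Parade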

On the basis of this cursory verification of the articles’ own sources, Chris Edwards concluded that hunger in America ranked with crocodiles in the sewers as an urban myth.

We can update Edwards’ work. The USDA figures come from survey questions distributed and tabulated by the Census Bureau. The most recent data available were released in December 2013 for calendar year 2012. About 14.5% of households fell into the “low food security” category and about 5.7% of households were in the “very low food security” pigeonhole. Assuming the current average of roughly 2.58 persons per household, this translates to approximately 34 million people in the first category and just under 13.5 million people in the second category. If we assume the same fraction of children in these at-risk households as those in 2008, that would imply about 635,000 children in the high-risk category, or less than 0.9 of 1% of the nation’s children. That is a far cry from the 17% of the nation’s children mentioned in the Washington Post article of 2008. It is a farther cry from the 17,000,000 children mentioned in current ads, which would be over 20% of America’s children.

The USDA’s Work is From Hunger

It should occur to us to wonder why the Department of Agriculture – Agriculture, yet – should now reign as the nation’s arbiter of hunger. As it happens, economists are well situated to answer that question. They know that the federal food-stamp program began in the 1940s primarily as a way of disposing of troublesome agricultural surpluses. The federal government spent the decade of the 1930s throwing everything but the kitchen sink at the problem of economic depression. Farmers were suffering because world trade had imploded; each nation was trying to protect its own businesses by taxing imports of foreign producers. Since the U.S. was the world’s leading exporter of foodstuffs, its farmers were staggering under this impact. They were swimming in surpluses and bled so dry by the resulting low prices that they burned, buried or slaughtered their own output without bringing it to market in an effort to raise food prices.

The Department of Agriculture devised various programs to raise agricultural prices, most of which involved government purchases of farm goods to support prices at artificially high levels. Of course, that left the government with lots of surplus food on its hands, which it stored in Midwestern caves in a futile effort to prevent spoilage. Food distribution to the poor was one way of ridding itself of these surpluses, and this was handled by the USDA which was already in possession of the food.

Just because the USDA runs the food-stamp program (now run as a debit-card operation) doesn’t make it an expert on hunger, though. Hunger is a medical and nutritional phenomenon, not an agricultural one. Starvation turns on whether the intake of calories is sufficient to sustain life; malnutrition is caused by an inadequate or imbalanced intake of nutrients, vitamins and minerals. Does the Census Bureau survey doctors on the nutritional status of their patients to provide the USDA with its data on “food insecurity”?

Not hardly. The Census Bureau simply asks people questions about their food intake and solicits their own evaluation of their nutritional status. Short of requiring everybody to undergo a medical evaluation and submit the findings to the government, it could hardly be otherwise. But this poses king-sized problems of credibility for the USDA. Asking people whether they ever feel hungry or sometimes don’t get “enough” food is no substitute for a medical evaluation of their status.

People can and do feel hungry without coming even close to being hungry in the sense of risking starvation or even suffering a nutritional deficit. Even more to the point, their feelings of hunger may signal a nutritional problem that cannot be cured by money, food pantries, shelters or even higher wages and salaries. The gap between the “low food security” category identified by the USDA and starving peoples in Africa or Asia is probably a chasm the size of the Grand Canyon.

The same America that is supposedly suffering rampant hunger among both adults and children is also supposedly suffering epidemics of both obesity and diabetes. There is only one way to reconcile these contradictions: by recognizing that our “hunger” is not the traditional sort of starvation or malnutrition but rather the kind associated with diabetes (and hence with obesity). Over-ingestion of simple carbohydrates and starches can often cause upward spikes in blood sugar among susceptible populations, triggering the release of insulin that stores the carbohydrate as fat. Since the carbohydrate is stored as fat rather than burned for energy, the body remains starved for energy and hungry even though it is getting fat. Thus do hunger and obesity coexist.

The answer is not more government programs, food stamps, food pantries and shelters. Nor, for that matter, is it more donations to non-profit agencies like Feeding America. It is not more food at all, in the aggregate. Instead, the answer is a better diet – something that millions of Americans have found out for themselves in the last decade or so. In the meantime, there is no comparison between the “hunger” the USDA is supposedly measuring and the mental picture we form in our minds when we think of hunger.

This is not the only blatant contradiction raised by the “hunger in America” claims. University of Chicago economist Casey Mulligan, in his prize-winning 2012 book The Redistribution Recession, has uncovered over a dozen government program and rule changes that reduced the incentive to work and earn. He assigns these changes primary blame for the huge drop in employment and lag in growth that the U.S. has suffered since 2007. High on his list are the changes in the food-stamp program that substituted a debit card for stamps, eliminated means tests and allowed recipients to remain on the program indefinitely. A wealthy nation in which 46 million out of 315 million citizens are on the food dole cannot simultaneously be suffering a problem of hunger. Other problems, certainly – but not that one.

What About the Real Hunger?

That is not to say that real hunger is completely nonexistent in America. Great Britain’s BBC caught word of our epidemic of hunger and did its own story on it, following the New York Times, Washington Post, Parade Magazine party line all the way. The BBC even located a few appropriately dirty, ragged children for website photos. But the question to ask when confronted with actual specimens of hunger is not “why has capitalism failed?” or “why isn’t government spending enough money on food-security programs?” The appropriate question is “why do we keep fooling ourselves into thinking that more government spending is the answer when the only result is that the problem keeps getting bigger?” After all, the definition of insanity is doing the same thing over and over again and expecting a different result.

The New York Times article in late 2013 quoted two academic sources that were termed “critical” of the ad campaign. But they said nothing about its blatant lies and complete inaccuracy. No, their complaint was that it promoted “charity” as the solution rather than their own pet remedies, a higher minimum wage and more government programs. This calls to mind the old-time wisecrack uttered by observers of the Great Society welfare programs in the 1960s and 70s: “This year, the big money is in poverty.” The real purpose of the ad campaign is to promote the concept of hunger in America in order to justify big-spending government programs and so-called private programs that piggyback on the government programs. And the real beneficiaries of the programs are not the poor and hungry but the government employees, consultants and academics whose jobs depend on the existence of “problems” that government purports to “solve” but that actually get bigger in order to justify ever-more spending for those constituencies.

That was the conclusion reached, ever so indirectly and delicately, by Chris Edwards of the Cato Institute in his 2008 piece pooh-poohing the “hunger in America” movement. It applies with equal force to the current campaign launched by non-profits like the Ad Council and Feeding America, because the food banks, food pantries and shelters are supported both directly and indirectly by government programs and the public perception of problems that necessitate massive government intervention. It is the all-too-obvious answer to the cry for enlightenment made earlier in this essay.

In this context, it is clear that the answer to any remaining pockets of hunger is indeed charity. Only private, voluntary charity escapes the moral hazard posed by the bureaucrat/consultant class that has no emotional stake in the welfare of the poor and unfortunate but a big stake in milking taxpayers. This is the moral answer because it does not force people to contribute against their will but does allow them to exercise free will in choosing to help their fellow man. A moral system that works must be better than an immoral one that fails.

Where is the Protest?

The upshot of our inquiry is that the radio ads promoting “hunger in America” and suggesting that America’s children don’t know where their next meal is coming from are an intellectual fraud. There is no evidence that those children exist in large numbers, but their existence in any numbers indicts the current system. Rather than rewarding the failure of our current immoral system, we should be abandoning it in favor of one that works.

Our failure to protest these ads and publicize the truth is grim testimony to how far America has fallen from its origins and ideals. In the first colonial settlements at Jamestown and Plymouth, colonists learned the bitter lesson that entitlement was not a viable basis for civilization and work was necessary for survival. We are in the process of re-learning that lesson very slowly and painfully.

DRI-319 for week of 6-22-14: Redskins Bite the Dust – and So Do Free Markets

An Access Advertising EconBrief:

Redskins Bite the Dust – and So Do Free Markets

The Trademark Trial and Appeal Board (TTAB) of the United States Patent and Trademark Office (USPTO) recently cancelled the federal trademark registrations previously held by the Washington Redskins professional football team of the National Football League (NFL). The legal meaning of this action is actually much more complex than public opinion would have us believe. The importance of this action transcends its technical legal meaning, however. If we can believe polls taken to test public reaction to the case, 83% of the American public disapproves of the decision. They, too, sense that there is more at stake here than merely the letter of the law.

The Letter of the Law – and Other Letters

The federal Lanham Trademark Act of 1946 forbids the registration of “any marks that may disparage persons or bring them into contempt or disrepute.” That wording forms the basis for the current suit filed by a group of young Native American plaintiffs in 2006. The hearing was held before TTAB in March, 2013. This week the judges issued a 99-page opinion cancelling each of the 6 different trademark registrations of the name “REDSKINS” and the Redskins’ logo, an Indian brave’s head in silhouette with topknot highlighted on the left. The decision called the trademarks “disparaging to Native Americans at the respective times they were registered.” The wording was necessary to the verdict; indeed, the dissenting judge in the panel’s 2-1 ruling claimed that the majority failed to prove that the registrations were contemporaneously disparaging.

This was not the first attempt to invalidate the Redskins trademarks – far from it. The previous try came in 1999 when the TTAB also ruled against the team. That ruling was overturned on appeal. The grounds for rejection were both technical and substantive. The judges noted that the plaintiffs were well over the minimum filing age of 18 and that the registrations went as far back as the 1930s. Thus, the plaintiffs had undermined their claim to standing by failing to exercise their rights to sue earlier – if the trademarks were known to have been such an egregious slur, why hadn’t plaintiffs acted sooner? The plaintiffs also cited a resolution by the National Congress of American Indians in 1993 that denounced the name as offensive. The Congress claimed to represent 30% of all Native Americans, which the judges found insufficiently “substantial” to constitute a validation of plaintiffs’ claim.

Meanwhile, an Annenberg Public Policy Center poll found in 2004 that “90% of Native Americans [polled] said the name didn’t bother them,” as reported in the Washington Post. Team owner Daniel Snyder’s consistent position is that he will “never” change the team name since it was chosen to “honor Native Americans,” the same stand taken by NFL Commissioner Roger Goodell. Various Native American interest groups and celebrities, such as 10,000-meter Olympic track gold-medalist Billy Mills, have sided with the plaintiffs. Senate Majority Leader Harry Reid jumped at the chance to play a race card, calling the team name a “racial slur” that “disparages the American people” (!?). He vows to boycott Redskins’ games until the name is changed. Roughly half his Senate colleagues sent a letter to the team demanding a name change.

The Practical Effects of the Ruling

Numerous popular sources have opined that anybody is now “free” to use the name “Redskins” for commercial purposes without repercussions. Several lawyers have pointed out that this is not true. For one thing, this latest decision is subject to judicial review just as were previous ones. Secondly, it affects only the federal registration status of the trademarks, not the right to the name. The enforceability of the trademark itself still holds under common law, state law and even federal law as outlined in the Lanham Act. The law of trademark itself takes into account such concepts as “pervasiveness of use,” which reflects actual commercial practice. In this case, the name has been in widespread use by the team for over 80 years, which gives it a strong de facto claim. (If that sounds confusing, join the club.) Finally, the appeals process itself takes at least two years to play out, so even the registration status will not change officially for a while.

Thus, the primary impact of the ruling will be on public relations in the short run. The same commentators who cast doubt on the final result still urge Daniel Snyder to take some sort of token action – set up a foundation to benefit Native Americans, for instance – to establish his bona fides as a non-racist and lover of Native Americans.

Why the Law is an Ass

There are times when you’re right and you know why you’re right. There are other times when you’re right and you know you’re right, but you can’t quite explain why you’re right. The general public is not made up of lawyers. If judges say the trademark registrations are illegal, the public is prepared to grant it. But, like Charles Dickens’ character Mr. Bumble, they insist that the law is an ass. They just can’t demonstrate why.

The provision in the Lanham Act against disparaging trademarks is the kind of legal measure that governments love to pass. It sounds both universally desirable and utterly innocuous. Disparaging people and holding them up to ridicule and contempt is a bad thing, isn’t it? We’re against that, aren’t we? So why not pass a law against it – in effect – by forbidding disparaging trademarks? In 1946, when the Lanham Act passed, governments were big on passing laws that were little more than joint resolutions. The Employment Act of 1946, for example, committed the federal government to achieving “maximum employment, production and purchasing power.” There is no objective way to define these things and lawmakers didn’t try – they just passed the law as a way to show the whole world that they were really, really serious about doing good, not just kidding around the way legislatures usually are. Oh, and by the way, any time they needed an excuse for spending a huge wad of the taxpayers’ money, they now had one. (Besides, before the war a famous economist had said that it was all right to spend more money than you had.)

The law against disparaging trademarks was passed in the same ebullient mood as was the Employment Act of 1946. Government doesn’t actually have the power to guarantee maximum employment or income or purchasing power and it also doesn’t have the power to objectively identify disparagement. Unlike beauty, a slur is not in the eye of the beholder. It is in the brain of the author; it is subjective because it depends on intent. Men often call each other “bastard” or “son of a bitch”; each can be either deadly serious invective or completely frivolous, depending on the context. The infamous “n-word,” so taboo that it dare not speak its name, is in fact used by blacks toward each other routinely. It can be either a casual form of address or a form of disparagement and contempt – depending on the intent of the user.

Everybody – including even Native Americans – knows that Washington football team owner George Preston Marshall, one of the legendary patriarchs of the NFL, did not choose the team name “Redskins” in order to disparage Native Americans or hold up to ridicule or contempt. He chose it to emphasize the fighting and competitive qualities he wanted the team to exemplify, because Indians in the old West were known as fierce, formidable fighters. Whether he actually meant to honor Native Americans or merely to trade on their reputation is open to debate, but it is an open-and-shut, 100%, Good-Housekeeping-seal-of-approval-certified certainty that he was not using the word “Redskins” as a slur. Why? Because by doing so he would have been committing commercial suicide by slandering his own team, that’s why.

That brings us to the second area of resemblance between the Lanham Act and the Employment Act of 1946. The Employment Act was unnecessary because free markets, when left to their own devices, already do the best job of promoting high incomes, low unemployment and strong purchasing power that can be done. And free markets are the best guarantee against the use of disparaging trademarks, because the inherent purpose of a trademark is to promote identification with the business. Who wants their business identified with a slur? We don’t need a huge bureaucracy devoted to the business of rooting out and eradicating business trademarks that are really slurs. Free markets do that job automatically by driving offending businesses out of business. Why otherwise would businesses spend so much time and money worrying about public relations and agonizing over names and name changes?

If the only reason for the persistence of legislation like the Employment Act and the Lanham Act were starry-eyed idealism, we could write them off as the pursuit of perfect justice, the attempt to make government write checks it can’t cover in the figurative sense as well as the financial. Idealism may explain the origin of these laws but not their persistence long after their imposture has been exposed.

Absolute Democracy

By coincidence, another political-correctness scandal competed with the Redskins trademark revocation for headlines. The story was first reported as follows: A 3-year-old girl suffered disfiguring facial bites from three dogs (allegedly “pit bulls”). She was taken to a Kentucky Fried Chicken franchise by a parent where, after an order was placed for her favorite meal of sweet tea and mashed potatoes, she was asked to leave because her presence was “disrupting the other customers.” Her relatives took this story of “discrimination” to the news media.

Representatives of the parent corporation were guarded in their reaction to the accusation, but unreserved in the sympathy they expressed for the girl. They promised a donation of $30,000.00 to aid in treatment of her injuries and for her future welfare. They also promised to follow up to confirm what actually happened at the store.

What actually happened, according to their follow-up investigation, was nothing. This was the result of their internal probe and a probe by an independent company they hired to do its own investigation. Review of the store’s surveillance tape showed no sign of the girl or her relatives on the day in question. A review of transactions showed no order for “sweet tea and mashed potatoes” on that day, either. KFC released a finding that the incident was a hoax, a conclusion that was disputed by another relative of the girl who was not one of those supposedly present at the incident.

Perhaps the most significant part of this episode is that KFC did not retract their promise of a $30,000.00 donation to the girl – despite their announced finding that her relatives had perpetrated a hoax against the corporation.

The Redskins trademark case and the apparent KFC hoax are related by the desire of interested parties to use political correctness as a cover for extracting money using the legal system. Pecuniary extortion is crudely obvious in the KFC case; $30,000 is the blackmail that company officials are willing to pay to avoid being crucified in a public-relations scandal manufactured out of nothing.

Their investigation was aimed at avoiding a charge of “discrimination” against the girl, which might have resulted in a six- or seven-figure lawsuit and an even-worse PR scandal. But their willingness to pay blackmail suggests an indifference to the problem of “moral hazard,” something that clearly influences Daniel Snyder’s decision not to change the Redskins’ team name. Willingness to pay encourages more blackmail; changing the team name encourages more meddling by activists.

The Redskins case is more subtle. Commentators stress that plaintiffs are unlikely to prevail on the legal merits, but doubt that the team can stand the continuous heat put on it by the PR blowtorch lit by the TTAB verdict. That is where the money comes in – owner Daniel Snyder will have to pony up enough money to the various Native American interest groups to buy their silence. Of course, this will be spun by both sides as a cultural contribution, meant to make reparations for our history of injustice and brutality to the Native American, and so on.

Of course, Snyder may turn out to be as good as his word; he may never agree to change the Redskins’ team name. The NFL – either the Commissioner or the other owners exerting their influence – may step in and force a name change. Or Snyder may even sell the team rather than be forced to change its name against his will. That would leave the plaintiffs and Native American interest groups out in the cold – financially speaking. Does that invalidate the economic theory of absolute democracy as applied to this case?

No. Plaintiffs stand to benefit in an alternative manner. Instead of gaining monetary compensation for their efforts, they would earn psychological (psychic) utility. From everyday observation, as well as our own inner grasp of human nature, we realize that some people who cannot achieve nevertheless earn psychic pleasure from thwarting the achievements of others. In this particular case, the prospective psychic gains earned by some Native Americans from overturning the Redskins name and the prospective monetary gains earned from blackmailing the Redskins’ owner are substitute goods; the favorable verdict handed down by TTAB makes it odds-on that one or the other will be enjoyed.

This substitution potential is responsible for the rise and continued popularity of the doctrine of political correctness. “Race hustlers” like Jesse Jackson and Al Sharpton have earned handsome financial rewards for themselves and/or clients by demonizing innocuous words and deeds of whites as “racist.” What is seldom recognized, though, is the fact that their popularity among blacks at large is owed to the psychic rewards they confer upon the rank-and-file. When (let us say) a white English teacher is demoted or fired for teaching the wrong work by Mark Twain or Joseph Conrad, followers of Jackson and Sharpton delight. They know full well that the exercise is a con – that is the point. They feel empowered by the fact that they may freely use the n-word while whites are prevented from doing so. Indeed, this is simply a reversal of the scenario under Jim Crow, when blacks were forced to the back of the bus or to restricted drinking fountains. In both cases, the power of the law is used to earn psychic rewards by imposing psychic losses on others.

Legal action was necessary in the Redskins’ case because plaintiffs were bucking an institution that had been validated by the free market. The Washington Redskins have over 80 years of marketplace success on their record; the free market refused to punish their so-called slur against Native Americans. In fact, the better case is that the team has rehabilitated the connotation of the word “redskins” through its success on the field and its continuing visibility in the nation’s capital. Goodness knows, countless words have undergone this sort of metamorphosis, changing from insults to terms of honor.

When plaintiffs could not prevail through honest persuasion they adopted the modern American method – they turned to legal force. However tempting it might be to associate this tactic exclusively with the political correctness of the left, the truth is that it is the means of first resort for conservatives as well. That is the seeming paradox of absolute democracy, which represents the dictatorship of the law over free choice.

Inevitably, advocates of political correctness cite necessity as their justification. The free market is not free and does not work, so the government must step in. The planted axioms – that free markets usually fail while governments always work – are nearly 180 degrees out of phase. The failures of government highlight our daily lives, but the successes of the free market tend to be taken for granted. The famous episode of Little Black Sambo and its epilogue serves as a reminder.

The Little Black Sambo Stories and Sambo’s Restaurants

The character of Little Black Sambo and the stories about him have been redefined by their detractors – that is to say, demonized as racist caricatures that dehumanize and degrade American blacks. This is false. In the first place, the original character of Little Black Sambo, as first portrayed in stories written in the late 19th and early 20th centuries, was Tamil (Indian or Sri Lankan) – a reflection of the ecumenical reach exerted by the term “black” in those days. Eventually, the character was adapted to many nationalities and ethnic identities, including not only American black but also Japanese. (Indeed, he remains today a hero to children of Japan, who remain blissfully untouched by the political correctness familiar to Americans.) This is not surprising, since the stories portray a little boy whose heroic perseverance in the face of obstacles is an imperishable life lesson. Presumably, that is why the stories are among the bestselling children’s storybooks of all time.

When American versions of the story portrayed Little Black Sambo as an American or African black, this eventually caught the eye of disapproving blacks like the poet Langston Hughes, who called the picture-book depiction a classic case of the “pickaninny” stereotype. Defenders of the stories noted that when the single word “black” was removed and any similarity to American or African blacks deleted from the illustrations, the stories attracted no charges of racism. Yet black interest groups echoed the psychologist Alvin Poussaint, who claimed that “I just don’t see how I can get past the title and what it means,” regardless of any merit the stories might contain. The storybooks disappeared from schools, nurseries and libraries.

In 1957, two restaurant owners in Santa Barbara, CA, opened a casual restaurant serving ethnic American food. In the manner of countless others, they chose a name that combined their two nicknames, “Sam” (Sam Battistone) and “Bo” (Newell Bohnett). Over time, Sambo’s Restaurant’s popularity encouraged them to franchise their concept. It grew into a nationwide company with 1,117 locations. Many of these were decorated with pictures and statuary that borrowed from the imagery of the “Little Black Sambo” stories.

The restaurants were a marketplace success, based on their food, service and ambience. But in the 1970s, black interest groups began raising objections to the use of the “Sambo” name and imagery, calling it – you guessed it – racist. Defenders of the franchise cited the value and longstanding popularity of the stories. They noted the success and popularity of the restaurants. All to no avail. By 1981, the franchising corporation was bankrupt. Today, only the original Santa Barbara location remains.

This was certainly not a victory for truth and justice. But it was a victory for the American way – that is, the true American way of free markets. Opponents of Sambo’s Restaurants went to the court of public opinion and made their case. Odious though it seemed to patrons of the restaurants, the opponents won out.

So much for the notion that free markets are rigged against political correctness. In the case of Sambo’s Restaurants, people concluded that the name tended to stigmatize blacks and they voluntarily chose not to patronize the restaurants. The restaurants went out of business. This was the appropriate way to reach this outcome because the people who were benefitting from the restaurants decided that the costs of production outweighed the benefits, and chose to forego those benefits. The decisive factor was that bigotry was (apparently) a cost of production.

Instead of achieving their aim through legal coercion or blackmail, activists achieved it through voluntary persuasion. Alas, that lesson has now been forgotten by both the political Left and Right.

DRI-312 for week of 6-15-14: Wealth and Poverty: Blame and Causation

An Access Advertising EconBrief:

Wealth and Poverty: Blame and Causation

Among the very many cogent distinctions made by the great black economist Thomas Sowell is that between blame and causation. Blame is a moral or normative concept. Causation is a rational, cause-and-effect concept. “Sometimes, of course, blame and causation may coincide, just as a historic event may coincide with the Spring equinox,” Sowell declared in Economic Facts and Fallacies. “But they are still two different things, despite such overlap.”

Unfortunately, blame has overtaken causation in the public perception of how the world works. This is bad news for economics, which is a rational discipline rather than a morality play.

Economic Development

There is a specialized branch of economics called economic development. Not surprisingly, its precepts derive from the principles of general economic theory, adapted to apply in the special case of areas, regions and nation states whose productive capabilities rise from a primitive state to advanced status.

The public perception of economic development, though, is that of a historical morality play. Developed Western nations in Europe engaged in a practice called “imperialism” by colonizing nations in South America and Africa. Then they proceeded to exploit the colonial natives economically. This exploitation not only reduced their standard of living contemporaneously, it left them with a legacy of poverty that they have been subsequently unable to escape. Only government aid programs of gifts or loans, acting as analogues to the welfare programs for impoverished individuals in the Western countries, can liberate them and expiate the sins of the West.

The idea that moral opprobrium attaches to acts of national conquest has a considerable appeal. The conventional approach to what is loftily called “international law” – or, more soberly, “foreign policy” – is that military force applied aggressively beyond a country’s own international boundaries is wrong. But the impact of wrongful acts does not necessarily condemn a nation to everlasting poverty.

In fact, world history to date has been overwhelmingly a tale of conquest. For centuries, nations attained economic growth not through production but through plunder. Only since the Industrial Revolution has this changed. It is worthwhile to question the presumption that defeat automatically confers a legacy of economic stasis and inferiority.

That is why we must distinguish between blame and causation. We may assign blame to colonizers for their actions. But those actions and their effects occurred in the colonial era, prior to independence. Cause-and-effect relationships are necessarily limited to relationships in the same temporal frame; the past cannot hold the present prisoner. Even if we were to claim that (say) inadequate past investment under colonization is now responsible for constraining present economic growth, we would still have to explain why current investment cannot grow and eventually stimulate future economic growth.

Great Britain was the world’s leading economic power during the 18th and 19th centuries. She conquered and held a worldwide empire of colonies. She must have commanded great wealth, both military and economic, in order to achieve these feats. Yet Great Britain herself was conquered by the Romans and spent centuries as part of the Roman Empire. The “indigenous peoples” of the British Isles (perhaps excluding the Irish, who may have escaped the Roman yoke) must have recovered from the pain of being subjugated by the Romans. They must have overcome the humiliation of bestowing upon William the title of “Conqueror” after his victory at Hastings in 1066. They must – otherwise, how else could they have rebounded to conquer half the world themselves?

Great Britain’s legacy of military defeat, slavery and shame did not thwart its economic development. It did not stop the British pound sterling from becoming the vehicle currency for world trade, just as the U.S. dollar is today. If anything, Great Britain and Europe prospered under Roman domination and suffered for centuries after the collapse of the empire.

Germany has been an economic powerhouse since the 19th century. It survived utter devastation in two world wars and calumniation in their wake, only to rise from the ashes to new heights of economic prominence. Yet its legacy prior to this record of interrupted success was a history of squabbles and conflict between regional states. They, too, were subjugated by Rome and arose from a long period of primitive savagery. Why didn’t this traumatize the German psyche and leave them forever stunted and crippled?

It is hard to think of any nation that had a tougher row to hoe than China. True, China was the world’s greatest economic power over a millennium ago. But centuries of isolation squandered this bequest and left them a medieval nation in a modern world. As if this weren’t bad enough, they reacted by embracing a virulent Communism that produced the world’s worst totalitarian state, mass famine and many millions of innocent deaths. At the death of Mao Zedong in 1976, China was a feeble giant – the world’s most populous nation but unable to feed itself even a subsistence diet. Yet this legacy of terror, famine, defeat and death failed to prevent the Chinese from achieving economic development. Less than 40 years later, China is a contender for the title of world’s leading economic power.

It is certainly true that some countries in Africa and South America were colonized by European powers and subsequently experienced difficulty in raising their economic productivity. But it is also true that there are “countries mired in poverty that were never conquered.” Perhaps even more significantly, “for thousands of years, the peoples of the Eurasian land mass and the peoples of the Western Hemisphere were unaware of each other’s existence,” which constitutes a legacy of isolation even more profound and enduring than any residue left by the much shorter period of contact between them.

Economists have identified various causal factors that affect economic development much more directly and clearly than military defeat or personal humiliation suffered by previous generations. Most prominent among these are the geographic factors.

Mankind’s recorded history began with settlements in river valleys. A river valley combines two geographic features – a river and a valley. The river is important because it provides a source of water for drinking and other important uses. Rivers also serve as highways for transportation purposes. Finished goods, goods-in-process and primary inputs are all transported by water. In modern times, with the advent of swifter forms of transportation, only commodities with low value relative to bulk travel by water. But throughout most of human history, rivers were the main transportation artery linking human settlements. Oceans were too large and dangerous to risk for ordinary transportation purposes; lakes were not dispersed widely enough to be of much help.

If we contrast the kind and quality of rivers on the major continents, it is not hard to see why North America’s economic development exceeded that of Africa. Not only is North America plentifully supplied with rivers, but its largest rivers, the Mississippi and the Missouri, tend to be highly navigable. Its coastline contains many natural harbors. Africa’s rivers, in contrast, are much more problematic. While the Nile is navigable, its annual floods have made life difficult for nearby settlers. The Congo River’s navigability (including its access from the ocean) is hindered by three large falls. The African coastline contains comparatively few natural harbors and is often difficult or impossible for ships to deal with – a fact that hindered international trade between Africa and the outside world for decades. The Congo is the world’s second largest river in terms of water-volume discharged; the Amazon River in South America is the largest. Yet the tremendous hydropower potential of both rivers has hardly been tapped owing to various logistical and political obstacles.

Valleys contrast favorably with mountainous regions because they are more fertile and easier to traverse. Sowell quotes the great French historian Fernand Braudel’s observation that “mountain life lagged persistently behind the plain.” He cites mountainous regions like the Appalachians in the U.S., the mountains of Greece, the Rif Mountains in Morocco and the Scottish Highlands to support his generalization. Not only do both Africa and South America contain formidable mountain barriers, their flatlands are much less conducive to economic development than those of (say) North America. Both Africa and South America contain large rainforests and jungles, which not only make travel and transport difficult or impossible but are also hard to clear. As if that weren’t a big enough barrier, both continents face political hurdles to the exploitation of the rainforests.

South America differs from its northern neighbor particularly in topography. The Andes Mountains to the west have traditionally divided the continent and represented a formidable geographic barrier to travel and transportation. One of the great stories in the history of economic geography is the tale, told most vividly by legendary flier and author Antoine de Saint-Exupery in his prize-winning novel Night Flight, of the conquest of the Andes by airline mail-delivery companies in the formative days of commercial aviation. Unlike those of North America, the flatlands of South America do not consist primarily of temperate, arable plains; as noted above, much of the continent’s low country lies under rainforest and jungle.

Climate has similar effects on economic development. A priori, temperate climate is more suitable for agriculture and transportation than either the extremes of heat or cold. Both Africa and South America contain countries located within tropical latitudes, where heat and humidity exceed the more temperate readings typical of North America and Europe. Indeed, Africa’s average temperature makes it the hottest of all continents. While North America does contain some desert land, it cannot compare with northern Africa, where the Sahara approaches the contiguous U.S. in size. The barrenness of this climate makes it less suitable for human habitation and development than any area on Earth save the polar regions. Speaking of which, subarctic climates can be found on the highest mountain regions on each continent.

The economic toll taken by geographic barriers to trade can be visualized as akin to taxes. Nature is levying a specific tax on the movement of goods, services and people over distance. The impact of this “transport tax” can extend far beyond the obvious. As Sowell points out, the languages of Africa comprise 30% of the world’s languages but are spoken by only 13% of the world’s population. The geographic fragmentation and separation of the continent has caused cultural isolation that has produced continual fear, hatred, conflict and even war between nations. The civil war currently raging between Sunni, Shiite and Kurd is the same kind of strife that T.E. Lawrence sought to suppress during World War I almost a century ago. Thus, an understanding of basic geography is sufficient to convey the severe handicap imposed on most countries in Africa and South America compared to the nations of Europe and North America.

Political Economy

It is certainly true that geography alone placed Africa and South America behind the economic-development 8-ball. Still, each continent does contain a share of desirable topographies and climates. History even records some economic-development success stories there. Argentina was one of the world’s leading economic powers in the 19th century. Not only was its national income ranked among world leaders, its rate of growth was high and growing. Its share of world trade also grew. Today, its status is dismal, exactly the reverse of its prior prosperity – its GDP is barely one-tenth of ours. But it was not conquered by a colonial power, nor was it “exploited” by “imperialism.”

Argentina won its independence from Spain well before it rose to economic prominence. Unfortunately, its political system gradually evolved away from free-market economics and toward the dictatorial socialism epitomized by Juan Peron and his wife, Evita. This produced inflation, high taxes, loss of foreign trade and investment and a steady erosion of real income.

Elsewhere in South America, economic evolution followed a similar course, albeit by a different route. Most countries lacked the same experience with free markets and institutions that lifted Argentina to the heights. Even when independence from colonial rule brought republican government, this quickly morphed to one-party rule or military dictatorship. Although the political Left insists that South America has been victimized by capitalism, South America’s history really reeks of the same “crony capitalism” that reigns supreme in the Western nations today. This means authoritarian rule, unlimited government and favoritism exerted in behalf of individuals or constituent groups. Moreover, erosion of property rights has weakened a key bulwark of free-market capitalism in the West today, just as it did throughout the history of South America.

In Africa, the situation was even worse and has remained so until quite recently. After crying out for independence from colonial oppressors, native Africans surrendered their freedom to a succession of dictators who proved more oppressive, brutal and bloodthirsty than the colonizers. Now, with the rise of the Internet and digital technology, Africans at last possess the ability to exist and thrive independently of government. They also can overcome the costs of transacting to protest against dictatorship.

The importance of markets and institutions can be divined from a roll-call of the most successful countries. Great Britain, Japan, Hong Kong, Singapore and the Scandinavian countries all lack not only size but also an abundance of natural resources. One thing that Africa and South America did possess in quantities rivaling that of Europe and North America was resource wealth. But the ability to turn resources into goods and services requires the other things that Africa and South America lacked: not only favorable geography and climate, but also favorable institutions, laws and mores. Even in North America, the U.S. had all the favorable requisites, while Mexico lacked the legal and institutional environment and Canada lacked the favorable geography and climate.

Viewed in this light, it is not chauvinism to invoke a principle of “American exceptionalism;” it is just clear-eyed analysis. The country that later became the United States of America was blessed with ideal geography and climate. While it faced aboriginal opposition, that was much less fierce than it might have been. Great Britain’s colonial stewardship allowed the colonies to develop economically, albeit in a restricted framework. Moreover, the colonists developed a close acquaintanceship with British laws and institutions. This proved vital to the eventual birth of the American Declaration of Independence and Constitution. The U.S. was indeed the exception when it came to economic development because it faced few of the obstacles that hampered the development of almost all other countries. Coupled with the most favorable constitution ever written for free markets and a century and a half of virtually free immigration, the result was the growth of the world’s greatest economy.

Culture

Through the ages, historians have accorded culture an increasing emphasis in their studies. Oddly, though, it has seldom been linked to economics in general and almost never to economic development in particular. Yet even a cursory glance suggests it as an explanation for some of what otherwise would stand as paradoxes.

India has long ranked as the “phenom” of economic development – perennially expected to bust loose to assume its rightful place among the world’s economic powerhouses, and perennially a disappointment. As a legacy of centuries of colonial rule by Great Britain, it inherited a cadre of well-trained and educated civil servants. The world’s second-largest population provided a ready source of labor. The country did not lack for capital goods despite the abject poverty of most of its citizens, thanks to British investment. What, exactly, was holding India back?

The political left supplied its standard answer by attaching blame for India’s poverty to its “legacy of colonialism.” Movies like Gandhi portrayed British behavior toward Indians as beastly and sanctified Gandhi’s policy of passive resistance within a framework of civil disobedience. These answers were less than complete, however. They did not explain how the U.S., also a British colony and occasional victim of British beastliness for a century and a half, was able to succeed so brilliantly while India failed so dismally. Nor did they explain why India failed while employing the same socialist economic policies that England had incubated throughout the early 1900s before installing them at home just before granting India’s independence.

India’s adoption of socialism was the political complement to its cultural reverence for poverty, created and nurtured by Gandhi. India could hardly have picked a worse symbol for hero worship. Fortunately, India’s independence was delayed until after World War II, in which India refused to embrace Gandhi’s pacifism and participated significantly in her own defense and that of the Eastern theater. Then, after independence, India continued to stoke regional hostilities with neighbors China and Pakistan in subsequent decades, ignoring Gandhi’s views in the one context in which they might have done some good. Meanwhile, the country’s steadfast unwillingness to adopt a commercial ethic, root out public corruption and eradicate traditional taboos against the unhindered operation of markets foreclosed any possibility of real economic growth.

If there was ever a culture that seemed impervious to economic growth, it was India’s. Even China never seemed such a hopeless case, for Chinese who emigrated became the success story of Southeast Asia; clearly Chinese institutions were holding back economic development, not her culture. Well, India’s cultural head is still buried in the sands of the past, but her institutions have changed sufficiently to midwife noticeable economic growth beginning in the late 1990s.

Foreign Aid and Foreign Investment

Two great myths of economics relate to foreign aid and foreign investment. For decades, intellectuals and governments sang the praises of foreign aid as a recipe for prosperity and cure for poverty. Alas, institutions like the World Bank and International Monetary Fund – both of which were created for completely unrelated purposes – have failed miserably to promote economic development despite decades of trying and billions of dollars in loans, grants and consulting contracts.

The failures have been particularly glaring in Africa, where real incomes were the lowest in the world throughout the 20th century. In retrospect, it is hard to see why anyone expected international aid to succeed in raising real incomes. After all, one of the signature measures employed by newly independent regimes in Africa and South America was to expropriate wealth owned by foreigners through nationalization. This raised the incomes of government officials and their cronies but did not raise real incomes generally. As Sowell observes, “there is no more reason to expect automatic benefits from wealth transfers through international agencies than from wealth transfers through internal confiscations.” And indeed, “the incentives facing those disbursing the aid and those receiving it seldom make economic development the criterion of success.” Aid agencies simply strive to give money away; host governments simply strive to get money. And that is pretty much what happened.

Lenin developed a theory of imperialism to explain why capitalism did not succumb to revolution on schedule. When the declining profit from capital threatened their viability, capitalists would turn to the less-developed nations, where their foreign investment would earn “super profits” at the expense of the host peoples. Unfortunately, his theory was overturned by experience, which showed that capitalists in developed countries invested mostly in other developed countries. (Today’s neo-Marxism has returned full-circle to the exploitation theories of original Marxism with the newly popular theory of French economist Thomas Piketty. His theory postulates a return to “capital” that is greater than that from investment in labor, which promotes a greater level of (hypothesized) inequality in income and wealth. Having failed to sell a theory of inequality based on a declining rate of profit, the Left is switching tactics – the return on capital is too high, not declining.)
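
For reference, the relation at the center of Piketty’s argument is conventionally summarized as

\[
r > g,
\]

where \(r\) denotes the average rate of return on capital and \(g\) the economy’s rate of growth; the contention is that when \(r\) persistently exceeds \(g\), income and wealth become increasingly concentrated.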

The real recurring example of successful “foreign investment” has come through immigration. Welsh miners have come to the U.S. and mined successfully. Chinese entrepreneurs have migrated throughout Southeast Asia and dominated entrepreneurship in their adopted countries. Jews have migrated to countries throughout the world and dominated industries such as finance, clothing, motion pictures and education. German workers helped Argentina become a world leader in wheat production and export. Indian immigrants have become leading entrepreneurs in motels and hotels in the U.S. Italian and Lebanese immigrants migrated to Africa and the U.S. and achieved entrepreneurial success in various fields. Yet, ironically, immigration has typically been opposed by natives in spite of the consistent benefits it generates.

Causation, not Blame

History is a record of strife and conflict, of conquest and submission. At one time or another, practically every people has been conquered and subjugated. Colonial status has sometimes been disastrous to natives, as with some countries colonized by Spain in the Age of Exploration. Sometimes it has been relatively beneficial, as it was in the early stages of the American colonies. Often it turned out to be a mixed bag of benefits and drawbacks. But economic development has never been either guaranteed or foreclosed by the mere existence of a colonial past. Economic logic lists too many causal factors affecting development for us to play the blame game.

DRI-291 for week of 6-8-14: The (Latest) V.A. Scandal: So What Else is New?

An Access Advertising EconBrief:

The (Latest) V.A. Scandal: So What Else is New?

The news media has covered the recent medical-care scandal involving the Veterans’ Administration with its usual breathless urgency. Veterans of political economy find this ironic, since no feature of the political landscape is more ritualistic than the administrative scandal. Its elements are by now as stylized as those of the Japanese kabuki dance.

It begins with the uncovering of shocking facts – perhaps by journalistic investigation, perhaps by revelation from an internal source such as a whistleblower, perhaps by random circumstance. The facts are greeted first by denials, starting at the administrative level and proceeding upward – typically to the cabinet level, sometimes ending in assurance by the President of the United States that reports are greatly exaggerated.

Observers generally realize that it is the administrators and politicians who are exaggerating, not the journalists and whistleblowers, since few scandals emerge full-blown without any previous hint of their existence. Eventually, the fundamental truth of the allegations cannot be denied any longer, and the administrators and cabinet secretary in charge of the erring agency must fess up. This is the confessional stage of the scandal. It is characterized by admission of grievous fault, abject apology and plea for forgiveness, or at least understanding. The confession flagrantly contradicts previous insistence that the whole thing was an overblown attempt by political opponents to smear the present administration.

The last phase is the Presidential phase. The President is shocked, shocked to discover that error and scandal have invaded the administration of government on his watch. His attitude toward his administrative subordinates in the executive department is that of the admonishing schoolmaster: fair but firm, reluctant to punish but determined to root out all evil, to banish forever this unaccountable blot on the escutcheon of his tenure. The administrators must go, of course, even though they are able, noble, kind, determined, brave, clean and reverent. There will be an investigation, and when all the details are known, we will proceed to wipe this disgraceful episode from our memories and move on, to greater and more glorious triumphs…

The ellipsis reflects the fact that the entire purpose of the ritual is to pass through the period of scandal with the least possible political damage inflicted on the administration. The collective attitude is that of a child caught in a misdeed. The child is fully conscious of guilt; every word and action is oriented toward escaping punishment and returning to the status quo ante. Neither truth nor justice has any bearing on the child’s behavior. Likewise, they have no effect on the administration’s actions, either.

The recent V.A. scandal contains the classic elements. Not only is it predictable, it was predicted in this space in our previous discussions of the economics of medical care and Obamacare. Now the other shoe has dropped. The utter familiarity of the ritual means that the political aspects can be subordinated to our real object. It is the economic features that claim our interest.

The Details of the Scandal

The V.A. scandal concerns the provision of medical care for discharged members of the armed forces by the Veterans’ Health Administration in the Department of Veterans Affairs. This is only one of the functions performed by the Veterans’ Administration, the others being administration of veterans’ benefits and supervision of burials and memorials for veterans. The cost of medical care to veterans depends on the ability to pay – it is either free or accompanied by a co-pay. When a vet is discharged from the service, he must enroll in the V.A. system in one of three ways: by calling a toll-free number, going online or visiting one of the hundreds of V.A. clinics across the country. In order to complete the enrollment process, the vet must possess his DD214 discharge form. At enrollment, the vet is given a means test to determine qualification for a co-pay.

Once enrollment is complete and the vet is accepted within the system, new patients must be seen by a physician within 14 days. Existing patients (who have already been treated and, thus, have already seen a physician for evaluation) must see a doctor within 14-30 days. The failure to meet the stipulated deadlines for these initial appointments is the gravamen of the current scandal.

The head of the V.A.’s health affairs office, Robert Petzel, testified that he knew as early as 2010 that V.A. health clinics were “using inappropriate scheduling procedures” to defer these initial appointments. The deferrals were made out of necessity; the clinics were simply unable to meet the demand for initial appointments. The excess demand for appointments grew over time and the situation worsened until it reached the epidemic proportions now forming the basis for periodic new revelations. Scandals within the armed forces (and the government at large) are investigated by inspectors general (known as “IGs”). An interim report on the V.A. scandal by the V.A.’s IG called the practice of inappropriate scheduling “systemic.” It involved the use of false or phony waiting lists that were tailored to give the impression that the V.A. was meeting its initial-appointment goals rather than falling further and further short of them.

The scandal erupted after a doctor at the Phoenix, AZ V.A. clinic complained to the IG about treatment delays. It should be noted that this doctor waited until after his retirement to lodge these complaints. The complaints were made in letters written in December, 2013 but did not rise to the level of a public scandal until May, 2014. It transpired that some 1,700 vets were kept on waiting lists and the average vet waited for 115 days for his initial appointment. Meanwhile, official records were falsified to hide these delays.

On June 9, 2014, the Department of Veterans’ Affairs released preliminary results of an audit of 731 V.A. clinics showing that about 57,000 vets had been waiting more than 90 days, on average, for their initial appointments. Some 13% of V.A. schedulers say they have been ordered to falsify appointment-request logs to make them compliant with the rules. The IG calls the current 14-day goal for initial appointment “unattainable” due to the logistical obstacles posed by insufficient money and personnel.

The news from Phoenix triggered a chain reaction of similar revelations from V.A. hospitals and clinics across America. In Fort Collins, CO, clerks were specifically taught how to falsify records to paint a misleadingly favorable picture of initial appointments kept. A police detective found that in Miami, cover-ups were “ingrained into the hospitals’ culture” and drugs were routinely dealt out of hospital premises. In Pittsburgh, PA, an outbreak of Legionnaires’ Disease in 2011-12 was revealed to be the product of “human error” rather than the “faulty equipment” that had been blamed in Congressional testimony last year.

The delays in initial appointments are important because they represent a delay in the potential diagnosis and/or treatment of one or more medical conditions. Much has been made of the statement by IG Richard Griffin that “we didn’t conclude…that the delay[s] caused… death. It’s one thing to be on a waiting list; it’s another for that to be the cause of death.” But in the case of 52 patients seen by the Columbia, SC gastroenterology unit of the V.A., it certainly was determined that those patients had “disease associated with” treatment delays. We are urged every day to visit our doctors, not to put off visits or hide conditions in hopes that symptoms will disappear, and we are reminded that cancer and other diseases are curable with early detection. Now, suddenly, delays in seeing the doctor are downplayed as a factor in the actual incidence or severity of disease.

The medical facilities were not the only loci of dereliction. The War on Terror launched by the Bush Administration has produced an avalanche of disability claims filed by veterans of the Iraq and Afghanistan campaigns. In order to claim a compensable disability, a veteran must show not only the existence of a disability but also a likelihood exceeding 50% that it is due to military service. He is not allowed to hire a lawyer (unless the lawyer works pro bono) before the disability determination is made, so as to preclude the lust for private profit from luring private-sector contingency lawyers into the Klondike of military disability determination. But this process of disability determination has been stalled by (you guessed it) a massive backlog of claims waiting to be heard. This backlog reached a high of 611,000 in 2013 before the resulting publicity triggered a mini-scandal that forced action by Eric Shinseki, Secretary of the Department of Veterans Affairs. It now stands at about 300,000 claims, each of which has so far been pending for more than 125 days.

One of the most highly publicized features of the scandal has been the bonuses received by upper-level V.A. administrators, tied to compliance with V.A. rules for initial-appointment timeliness. These bonuses provided a clear-cut incentive for the falsification of records by lower-level employees operating under orders from their superiors.

The Economics of the V.A. System of Medical Care

Previous discussion of health care in this space touched on the V.A. system. Why should a separate system of medical care exist for military veterans?  Why should that separate system be administered by the federal government? If this separate system exists because it is superior to the one available to the rest of us, why not make it available to all? If it is not superior, why does it exist at all?

Some people have actually followed this logic to its ultimate conclusion. In 2011, the left-wing economist and political columnist Paul Krugman made the case that the V.A. does indeed constitute a superior system of medical care which should be broadened to the entire country. Part of his case rested on the V.A.’s success in meeting its initial-appointment guidelines. By doing so, he contended, it avoided the need for any rationing of care.

“Rationing” is the operative word applying to government provision of medical services. The whole purpose of designating government as the “single payer” for medical care is to sell the concept as “free medical care for all regardless of ability to pay.” Private producers cannot distribute goods for free, but this is a specialty of government. As always, the big problem government faces is bridging the gap between its expansive claims and its inability to deliver on them. A free good is one for which there is no opportunity cost of provision, hence no scarcity. Saying that a good is free doesn’t make it free; it merely encourages people to acquire as much of it as they can. Maximizing the demand for something is the worst possible way to deliver it free to everybody, because it places the biggest possible burden on the supply apparatus.

The V.A. headlines its medical services to veterans as free, but upon reading the fine print veterans discover that they will be subjected to a means test and requested to pony up a co-pay. Of course, this is not the same thing as a unit price to an economist, but it does involve a sacrifice of alternative consumption. But this is small potatoes compared to the real shock in store for any veteran who thinks that his military status entitles him to health care in perpetuity.

Reading current newspaper accounts of the scandal would leave the impression that discharged vets enroll for medical benefits on a first-come, first-served basis. This is not so. Upon applying for benefits, vets are assigned to one (or more) of 8 “eligible priority groups.” The word “priority” hints at the purpose of these groups; they decide whose applications get processed first and in what order. In other words, medical care for veterans is rationed by the Veterans Health Administration from the instant of application for enrollment.

To erase any doubts about the veracity of this statement, we have the word of the V.A. itself. “Unfortunately, the Veterans Health Administration does not have enough resources to provide care to all veterans who need it. To address this issue, the VA has created eight priority groups for enrollment.” There we have it – the dirty little secret of VA medical benefits. Veterans are lured into the system with the promise of free benefits. Before they are even accepted, they find out that the benefits aren’t free and they may not even get them – or, if they do, the effective price may include a hefty upcharge for waiting time. At worst, that upcharge may be the loss of their life.

Each of the 8 eligible priority groups contains multiple subcategories of prioritization. Any connection to medical need or severity is tenuous at best. Group 1, the highest priority for enrollment, includes vets who are 50% or more disabled due to service-connected disability, then picks up those deemed unemployable due to service-connected disability. Of course, it could be true that a 50% disability carries with it an immediate need to see a physician. It could also be utterly untrue; it depends on the specific medical circumstances.

Right away, we see that the criteria governing rationing are political and bureaucratic. Political because a disabled vet is a highly visible and ongoing political liability, much more so than a vet who dies awaiting treatment. In a free-market system, decisions about medical treatment are made by you and your doctor in consultation. You know your economic capabilities and your doctor knows you and your medical needs; together you can compare the value you would receive from each incremental bit of medical treatment with its cost. But in the VA, your medical decisions will ultimately be made by bureaucrats who know little or nothing about medicine. That is why criteria like “50% disabled” are necessary; they provide a pseudo-objective basis upon which medically untutored bureaucrats can affirm or deny treatment.

Group 2 includes 30-40% disabled vets. Group 3 is headlined by former prisoners of war, Purple Heart recipients, holders of the Medal of Honor, vets with lower disability status and those whose disability was actually caused by treatment or rehabilitation. Again, politics is evident in this ranking with the inclusion of POWs and medal-winners. Why should medical care be turned into a popularity contest? Then again, once we have excluded the free market from consideration, any other system of allocating benefits would be arbitrary.

The lower-ranking Groups introduce other arbitrary criteria like service in Vietnam and exposure to atomic radiation at Hiroshima, Nagasaki or test sites. Low-income vets receive precedence over high-income vets; willingness to fork over a co-pay buys the vet a higher place in line.

When we combine the economics of the V.A. system with the known facts of the current V.A. scandal, the latter becomes easier to understand but harder to stomach.

The Economics of the Scandal

Note the fundamental difference between scarcity as it exists in a free-market context and in the command-and-control context of a politically motivated bureaucracy. Economists define scarcity as the condition in which we cannot have all that we wish to consume and must choose the things we value most. Nobody is automatically or inherently excluded from consumption; price tells us the value that people place on a good and its cost in alternative (or foregone) output. People choose how much to buy based on their incomes and tastes; they can buy small, medium or large quantities and vary their consumption as their incomes change and prices vary. At the V.A., the government chooses what to give you and how much to give you based on (mostly) arbitrary criteria that ignore price and cost. It frankly admits that some people will be excluded – once more based on arbitrary criteria.

Economic logic tells us that the government system is wildly inefficient. Moreover, its inefficiencies will get worse and worse over time because it encourages customers to demand more medical care than can be supplied.

There is nothing remotely surprising or shocking about the current scandal. And as the Washington Post points out, “President Obama has been talking for years about fixing the system.” According to Press Secretary Jay Carney, “This is not a new issue to the President.” Here is one sure sign that Krugman et al. have missed the boat analytically: you don’t fix a system that is working brilliantly.

Everybody is acting as if the scandal is the result of something going terribly wrong with the system. But this is merely the system working as we expect a system of rationing to work – by excluding some people from service altogether. The V.A. itself says it is designed to do this and explains how it does it – just how surprised should we be when that is exactly what happens? The scandal is not that something has gone wrong with the system; the scandal is the system.

Economic logic tells us that the system is designed to ration care by excluding vets from medical benefits, thereby reducing the amount of medical care provided. This exclusion by rationing takes several forms. First, the vet may be excluded by not qualifying at all. Second, he may fall in the last (8th) eligible priority group, get tired of waiting to be processed and accepted and simply seek out paid care in the private sector. This relieves the V.A. of the burden of serving him. Third, he may die while waiting to be seen, as vets have done and continue to do. The larger the number of vets who face delays in acceptance and processing, the greater the likelihood that this will happen. And the longer the delays, the greater the likelihood that this will happen. Once more, this relieves the V.A. of the necessity of serving him. Fourth, the longer the delay faced by the vet, the worse (on net balance) will be his condition when he is finally accepted, seen and treated. This will shorten his life span and reduce the total amount of medical care the V.A. will be required to give him. (In this shorter time span, however, it will increase the need for greater spending on him, which will give the V.A. leverage to demand larger budget allocations in Congress. This is politically valuable to bureaucrats and their political sponsors.)

Of course, the V.A. can haul out testimonials from some vets who crow about the outstanding medical treatment they have received. In any bureaucracy – police, fire, public education, even the federal government itself – some individuals will stand out by ignoring the lack of incentives for performance and adhering to their own personal standards. And the fact that the V.A. picks and chooses whom it treats, when it treats them – “we will treat no veteran before his time” – and how it treats them allows the agency to provide good service to some vets. But claims of competitive superiority for the V.A. are a mockery considering that it is able to rig the game through rationing and, we now belatedly realize, rig its own statistics internally.

Claims by Krugman and others that the V.A. is a model for health care in general are false on their face. What little success the V.A. has enjoyed depends on the failures highlighted here. The V.A. cannot exist in its present form without the concurrent existence of a private-sector (or public-sector) alternative where its rejects can be dumped and where consumers can seek out consistently higher-quality treatment at a price. An attempt to impose the V.A. model on the country at large would simply produce the kind of socialist, “national health service” medicine found in countries such as Great Britain and Canada. These systems are characterized by long waits for care, lower-quality care, poorer medical technology and almost no new drug development. If Krugman were right, Americans should be clamoring for access to this supposedly superior care – they should be “health tourists,” traveling to Great Britain, France and Canada for treatment. Instead, the flow of health tourists runs the other way – into the U.S.

Democrats insist that the Bush Administration caused the V.A. scandal by overloading the system with applicants through its foreign wars. They cannot have it both ways. How can their system be superior if it falls apart when the demand for its product increases – the average business’s idea of paradise? Free markets use the flexibility of prices and quantities to handle variations in demand; they use higher prices to attract more resources into the system to serve the larger demand. It is command-and-control rationing systems, deprived of these vital pricing tools, that crumble under the pressure of demand increases.

Public shock over the incentive bonuses paid to V.A. administrators for initial-appointment compliance not actually attained is likewise naïve. After all, critics of free markets and corporations scream bloody murder when CEOs are not paid for performance. The V.A. was simply trying to curry favor with the public by mimicking the private sector’s performance incentives. The problem is, of course, that the V.A. is not the private sector. In a free market, a firm couldn’t get away with faking its performance because you can’t fake the bottom line; failure to perform will reduce profits. But there is no bottom line at the V.A. and no way (short of audit) to detect the kind of fakery that went on at the V.A. for years and years. Sure, veterans complained, but nothing happened because vets did not control the bureaucracy and had no political clout. The only reason the scandal was uncovered was that the doctor who blew the whistle had recently retired and no longer had to fear bureaucratic retaliation for his actions.

Speaking of political clout…

Cui Bono?

Why has a federal agency so inimical to the interests of a beloved constituency persisted – nay, thrived – since its inception in 1930? The great myth of big government is that it serves the interests of its constituents. But as we have seen, this is hardly true.

The real beneficiaries of big government are government employees, bureaucrats and politicians. The V.A. has metastasized into a cabinet-level bureaucracy with over 330,000 employees, including thousands of mid-level bureaucrats. Most of its employees belong to a powerful public-sector union. Employees and bureaucrats vote for the politicians who vote the appropriations that pay their salaries and lucrative benefits.

These people are invisible in the current scandal, except for the passive role they play as order-takers and functionaries. But they are the reason why the system is not “reformed.” There is no reforming this kind of system, only tinkering around the margins. Genuine reform would disband the V.A. altogether since its rationale is utterly misguided.

That will not happen. The falsity of the V.A.’s guiding premise is irrelevant. It is not really intended to serve veterans, so its failure to do so does not really matter to politicians. Its real purpose is to win votes by conferring benefits on employees and bureaucrats and it is fulfilling that purpose just as well, if not better, by failing veterans as it would by serving them.

That is why the stern promises to “fix the problem” are so much hypocritical cant. There will be no fix and no reform – only the next scandal.

Cant Rules in Public Discussions of the V.A.

Why do we watch numbly as the V.A. scandal unfolds – the latest in a never-ending series? By now, we know the ritual by heart. What is it that has us hypnotized?

Human beings mix reason with emotion, and we apparently remain enthralled by the cant that surrounds the V.A. “We love and revere our veterans – so much that we cannot entrust their physical well-being to the mundane ministrations of marketplace medicine. Veterans deserve only the very best. So, naturally, we put their welfare in the hands of the federal government, because it handles all our most important jobs and never fails to satisfy us. We will never rest until veterans are well-cared for, because their happiness and security is our first priority.”

In one part of our mind, this rationale reigns supreme. In the other part, we store all that we know about how the V.A. – and the federal government – actually operates. If those two parts ever commingled, they would probably short-circuit our mental processes indefinitely. We have not yet outgrown our fantasy of government as benevolent, omniscient, omnipotent parent.

In reality, the failures of government are all too painfully obvious. It is not that government has anything special against veterans, other than the fact that they keep showing up at the door expecting to be medically treated. No, government double-crosses and fails veterans just as it does the rest of us. When its failures become manifest, it lies about them. And the people who have placed their ideological and occupational bets on government lie, too.

DRI-275 for week of 6-1-14: The Triumph of Economics in Sports: Economics Takes the Field to Build Winning Teams

An Access Advertising EconBrief:

The Triumph of Economics in Sports: Economics Takes the Field to Build Winning Teams

In the previous two EconBriefs, we spoke of a popular attitude towards sports. It looks nostalgically to a hazy past, when men played a boys’ game with joyous abandon. Today, alas, sports are “just a business,” which is “all about the money.” As elsewhere, “greed” – a mysterious force no more explicable than a plague of locusts – has overtaken the men and robbed them of their childlike innocence.

This emotional theory of human behavior owes nothing to reason. It is the view now commonly bruited by those who describe the financial crisis of 2008 and the Great Recession as the outcome of free markets run rampant. People are irrational, so the result of “unfettered capitalism” must naturally be chaotic disaster.

Economics is the rational theory of human choice. For a half-century, it has opposed the irrationalists from two directions. Its free-market adherents have been led by the Chicago School of Frank Knight, Milton Friedman and George Stigler. That school embraced a theory of perfect rationality: perfect knowledge held by all market participants (later modified somewhat by a theory of information only slightly less heroic in its assumptions), perfectly competitive markets and (where necessary) perfectly benevolent government regulators and/or economist advisors.

The neo-Keynesian opponents of Chicago accepted individual rationality but asserted that individually rational actions produced perverse results in the aggregate, leading to involuntary unemployment and stagnant economies. Only counteracting measures by far-seeing government policymakers and regulators – following the advice of economist philosopher-kings – could rescue us from the depredations of free markets.

The debate, then, has largely been defined by people who saw market participants moved either by utter irrationality or complete rationality. But our analysis has revealed instead an evolutionary climate in which participants in professional sports pursued their own ends rationally within the limits imposed by their own knowledge and capabilities. The great free-market economist F.A. Hayek observed that capitalism does not demand that its practitioners be rational. Instead, the practice of capitalism itself makes people more rational than otherwise by continually providing the incentive to learn, adapt and adopt the most efficient means toward any end. Professional sports has exemplified Hayek’s dictum.

During baseball’s first century, the pursuit of individual self-interest left owners, players and fans at loggerheads. The first owner to address himself to the task of improving the product provided to sports fans was Bill Veeck, Jr., who introduced a host of business, financial and marketing innovations that not only enhanced his own personal wealth but also treated his fans as customers whose patronage was vital. The attitude of ownership toward fans prior to Veeck can be gleaned from the dismissal by New York Yankees’ general manager George Weiss of a proposed marketing plan to distribute Yankee caps to young fans. “Do you think I want every youngster in New York City walking around wearing a Yankees’ cap?” snorted Weiss. Veeck made owners and administrators realize that this was exactly what they should want.

Although few people seemed to realize it, economics had yet to play its trump card in the game of professional sports. Economics is the study of giving people what they want the most in the most efficient way. What sports fans want the most is a winning team – and that is exactly what economics had failed to give them. It failed because it had never been deployed toward that end. Even Bill Veeck, despite his success in improving the on-field performance of his teams, had not unlocked the secret to using economic principles per se to win pennants and World Series.

As sometimes happens in human endeavor, baseball had to traverse a Dark Age before this secret was finally revealed.

The Dark Age: Municipal Subsidies and the Growth of Revenue Potential

By the time of Bill Veeck’s swan song as a baseball owner in 1975-1981, baseball had entered the era of free agency. The reserve clause tying players to a single team had been drastically modified, allowing players eventually to migrate to the teams offering them the best financial terms. As we indicated earlier, this development – viewed in isolation – tilted the division of sports revenue from ownership to players.

This created the pretext by which owners were able to extract subsidies from municipalities throughout the nation. Owners could truthfully claim that they were earning less money as a result of free agency. What they left out was that they were earning more money for a host of other reasons. The obscure nature of player depreciation hid the true financial gains of sports-team ownership from the public. Moreover, the early years of free agency coincided with the advent of massive new revenue sources for owners. Television had brought baseball to millions of people who otherwise saw few games or none; broadcast rights were becoming a valuable asset of team ownership. Radio-broadcast rights increased in value as the increased visibility of teams and players enhanced their popularity. These increases were just gaining speed when the vogue of sports-team subsidies became a national pastime of its own.

The movement of baseball teams had long been viewed as analogous to the movement of businesses. Even the loss of popular teams like the Brooklyn Dodgers and New York Giants to westward expansion of baseball in Los Angeles and San Francisco was grudgingly accepted, since baseball still remained in New York City and the Mets were added as an expansion franchise in 1962. But when the Athletics moved from Kansas City to Oakland in 1967, Missouri Senator Stuart Symington decided that the federal government could not countenance “unfettered capitalism” in the baseball business. He demanded that major-league baseball replace Kansas City’s lost franchise. This opened the floodgates to the intrusion of politics in baseball.

If it was fair for politicians to dictate where major-league baseball should operate, then franchises should be able to demand favors from local governments – or so reasoned baseball owners. And demand them they did.

Owners demanded that cities build new, larger, better-appointed stadiums for their sports teams. Cities should fund construction, own the stadiums, operate them, maintain them and lease them to the sports teams for peanuts – otherwise, owners would pack up and move to a city that would meet their demands.

What was in it for the host city? After all, not everybody is a sports fan. Owners sensed that they needed something to offer the city at large. Thus was born one of the great con games of the 20th century: the notion of the sports team as an engine of local economic development. Owners seized on the same thinking that animated the dominant neo-Keynesian economic model. They sponsored “economic-impact studies” of the effect sports teams had on the local economy. In these studies, spending on sports took on a magical, mystical quality, as if jet-propelled by a multiplier ordained to send it rocketing through the local economy. And everybody “knew” that the more spending took place, the better off we all were.

It is hard to say what was worse, the economic logic of these studies or their statistical probity. It was not unusual to find that a study would add (say) the money spent on gasoline purchases at stations adjacent to the stadium to the “benefits” of sports team presence. Of course, this implies that locating the team as far as possible from the fans would increase the “benefits” dramatically; it is a case of cost/benefit analysis in which the costs are counted as benefits. This novel technique inevitably produces a finding of vast benefits.

As time went on, sale of team artifacts and memorabilia was added to the list of supplemental revenue. Larger stadiums, lucrative TV, radio and cable rights, team product sales – all these drove revenues to owners through the roof as the 20th century approached its close. With municipalities subsidizing the ownership, maintenance and improvement of stadiums, it is no wonder that the capital gains available to owners of sports teams were phenomenal. Ewing Kauffman bought the Kansas City Royals’ franchise for $1 million in 1968. At his death in 1993, the team’s value was estimated at well over $100 million.

One might have expected the usual left-wing suspects to recoil in horror from the income redistribution from ordinary taxpayers to rich owners and rich ballplayers – but no. Newspaper editorialists threw up their hands. The economists who supported free agency said that the major-market teams would get the best players, didn’t they? And hadn’t things worked out just that way, before free agency as well as after? If small-market taxpayers want to win – or even have a team at all – they’ll just have to ante up and face the fact that “this is how the game is played in today’s world.” Besides, doesn’t economic research show the economic-development benefits of sports teams?

Heretofore, economics had operated beneficially, albeit in a gradual, piecemeal way. Now the distortion of economics by the owners and their political allies meant that it was serving the ends of injustice.

Economics – and baseball fans – needed a hero. They got one – several, actually – from a pretty unlikely place.

Middle American Ingenuity to the Rescue

Bill James was born in tiny Holton, KS, in 1949. From childhood, he was a devoted sports fan. Like countless others before him, he was fascinated by the quantitative features of baseball and studied them obsessively. He was unique, though, in refusing to take on faith the value of conventional measures of baseball worth such as batting average, fielding average and runs batted in. James developed his own theories of baseball productivity and the statistical measures to back them up.

In 1977, he published the first edition of his Baseball Abstract, which subsequently became the Bible for his disciples and imitators. James was suspicious of batting average because it deliberately omitted credit for walks. (Ironically, walks were originally granted equivalent status with hits in computing batting average; “Tip” O’Neill’s famous top-ranking average of .485 in 1887 was accrued on this basis. The change to the modern treatment took place shortly thereafter.) While it may be technically true that a walk does not represent a “batting” accomplishment, it is certainly the functional equivalent of a single from the standpoint of run-producing productivity. (Veterans of youth baseball will recall their teammates urging them to wait out the opposing pitcher by chanting, “A walk’s as good as a hit, baby!”) Moreover, walks have many ancillary advantages. Putting the ball in play risks making an out. A walk forces the opposing pitcher to throw more pitches, thereby decreasing his effectiveness on net balance. Waiting longer in the count increases the chances that a hitter will get a more hittable pitch, one that may be driven with power. For all these reasons, James made a convincing case that on-base percentage (OBP) is superior to batting average as a measure of a hitter’s run-producing productivity.

Rather than the familiar totals of home runs and runs batted in, James argued in favor of a more comprehensive measure of power production in hitting called slugging percentage (SP), defined as total bases divided by at-bats. This includes all base hits, not just home runs. Instead of runs batted in, James created the category of runs created (RC), defined as hits plus walks, multiplied by total bases and divided by plate appearances. James also sought a substitute for the concept of “fielding average,” which stresses the absence of errors committed on fielding chances actually handled but says nothing about the fielder’s ability or willingness to reach balls and execute difficult plays that other players may not even attempt. Moreover, fielding must be evaluated on the same level as offensive production, since preventing a run by the opposing team is just as valuable as creating one for your own team.
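As a rough illustration only, here is a minimal sketch in Python of these three measures. The formulas are the simplified versions just described, plate appearances are approximated as at-bats plus walks, and the stat line is invented; it is not a reproduction of James’ full method, which includes refinements ignored here.

    # Minimal sketch of the basic sabermetric measures described above.
    # Formulas are the simplified versions given in the text; the stat line is invented.

    def obp(hits, walks, at_bats):
        """On-base percentage: times on base per (approximate) plate appearance."""
        return (hits + walks) / (at_bats + walks)

    def slugging(singles, doubles, triples, homers, at_bats):
        """Slugging percentage: total bases divided by at-bats."""
        total_bases = singles + 2 * doubles + 3 * triples + 4 * homers
        return total_bases / at_bats

    def runs_created(hits, walks, total_bases, at_bats):
        """Basic runs created: (hits + walks) * total bases / (at-bats + walks)."""
        return (hits + walks) * total_bases / (at_bats + walks)

    # Hypothetical season line: 500 at-bats, 150 hits (100 singles, 30 doubles,
    # 2 triples, 18 home runs) and 80 walks.
    tb = 100 + 2 * 30 + 3 * 2 + 4 * 18               # 238 total bases
    print(round(obp(150, 80, 500), 3))                # ~0.397
    print(round(slugging(100, 30, 2, 18, 500), 3))    # ~0.476
    print(round(runs_created(150, 80, tb, 500), 1))   # ~94.4

A hitter with a modest batting average but many walks scores well on these measures – which is precisely the kind of player the conventional statistics undervalued.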

These measures and maxims formed the core of Bill James’ theory of baseball productivity. His Baseball Abstract computed his measures for the major-league rosters each year and analyzed the play and management of the teams. Gradually, James became a cult hero. Others adopted his methods and measures. The Society for American Baseball Research (SABR) lent the movement its name: the intensive study of quantitative baseball – eventually, sports in general – came to be known as “sabermetrics.” Even with all this attention, it took decades for Bill James himself to be embraced by organized baseball. That, too, happened eventually, but not before sabermetrics left the realm of theory and invaded the pressbox, the front office and the baseball diamond itself.

Moneyball Takes the Field

Billy Beane was a high-school “phenom” (short for phenomenal), a term denoting a player whose all-round potential is so patent that he “can’t miss” succeeding at the major-league level. Like a disconcerting number of others, though, Beane did miss. He played only minimally at the major-league level for a few years before quitting to become a scout. He rose to the front office and was named general manager of the Oakland Athletics in 1997. Beane’s mentor, general-manager Sandy Alderson, taught him the fundamentals of Bill James’ theories of baseball productivity. To them, Beane added his own observations about player development – notably, that baseball scouts cannot accurately evaluate the future prospects of players at the high-school level because their physical, emotional and mental development is still too limited to permit it. Thus, major-league teams should concentrate on drafting prospects out of college in order to improve their draft-success quotient.

Beane hired a young college graduate from Harvard University – not as a player but as an administrative assistant. Paul DePodesta was an economics major who was familiar with the logic of marginal productivity theory. The theory of the firm says that managers should equalize marginal productivity per dollar (that is, the ratio of the extra output each unit of input produces at the margin to that input’s price) across inputs, continually adding more of any input with a higher ratio until the optimal output is reached. Of course, the problem in applying this or any other economic principle to baseball had always been that the principles were non-operational unless a meaningful measure of “output” could be found and the inputs contributing to that output could be identified. That was where Bill James and sabermetrics came in.
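To make the equalization rule concrete, here is a minimal sketch in Python with entirely invented numbers: two inputs (a “slugger” and an “on-base specialist”), an assumed diminishing marginal-product schedule and a fixed budget. It illustrates the textbook principle, not anything the Athletics actually computed.

    # Toy illustration of equalizing marginal productivity per dollar across two inputs.
    # All numbers are invented; marginal products are assumed to diminish with each unit hired.

    def marginal_product(units_already_hired):
        """Assumed diminishing schedule: 12, 11, 10, ... extra runs per additional unit."""
        return max(12 - units_already_hired, 0)

    prices = {"slugger": 4.0, "on_base_specialist": 2.0}   # hypothetical cost per unit, $ millions
    hired = {"slugger": 0, "on_base_specialist": 0}
    budget = 20.0

    while budget > 0:
        # Marginal product per dollar for the *next* unit of each affordable input
        ratios = {name: marginal_product(hired[name]) / price
                  for name, price in prices.items() if price <= budget}
        if not ratios or max(ratios.values()) <= 0:
            break
        best = max(ratios, key=ratios.get)   # hire where an extra dollar buys the most output
        hired[best] += 1
        budget -= prices[best]

    print(hired)   # spending gravitates toward the input with the higher product-per-dollar ratio

Run with these assumptions, the budget flows overwhelmingly to the cheaper input until its ratio falls into line with the expensive one – the logic Beane and DePodesta applied to ballplayers.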

In 2001, the Oakland team had won 102 games and reached the playoffs as the American League’s wild card. But their star player, Jason Giambi, had been wooed away by a seven-year, $120 million contract offered by the New York Yankees. It was the age-old story, the “Curse of the Bambino” all over again in microcosm. Oakland’s success had ramped up the value of its players on the open market; replacing those players with comparable talent at market rates would bust the payroll budget. Various other Oakland players were lost to injury, disaffection or free agency. Throughout baseball, opinion was unanimous that the Athletics were in for hard times until the team’s talent base could be rebuilt through player development.

Beane and DePodesta used the most basic sabermetric concepts, such as OBP, SP and RC, as their measures of productivity. Using publicly available information about player salaries, they calculated player productivities per dollar and discovered a startling number of players whose true productivity was undervalued by their current salaries. Methodically, they set out to rebuild the Oakland Athletics “on the cheap” by acquiring the best players their budget could afford through trade or purchase of contracts. They substantially remade the team using this approach. Despite a slow start, their rebuilt club eventually won an American League-record 20 straight games and won the Western Division championship in both 2002 and 2003. Author Michael Lewis told their story and chronicled the rise of sabermetrics in baseball in his 2003 best-selling book Moneyball, which later became a 2011 movie starring Brad Pitt that received six Academy Award nominations.
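A crude sketch of that kind of screen, with invented players, statistics and salaries, might look like the following; it ranks players by on-base percentage per salary dollar as a stand-in for the productivity-per-dollar calculations described above.

    # Rough sketch of the productivity-per-dollar screen described above.
    # Players, stats and salaries are all invented for illustration.

    players = [
        # (name, on-base percentage, annual salary in $ millions)
        ("Veteran Star",          0.370, 10.0),
        ("Overlooked Walker",     0.360,  0.8),
        ("Free-Swinging Slugger", 0.310,  6.0),
        ("Journeyman Catcher",    0.340,  1.2),
    ]

    # Productivity per dollar: points of OBP bought per million dollars of salary.
    ranked = sorted(players, key=lambda p: p[1] / p[2], reverse=True)

    for name, on_base, salary in ranked:
        print(f"{name:24s}  OBP {on_base:.3f}  ${salary:4.1f}M  OBP per $M: {on_base / salary:.3f}")
    # The cheap, high-OBP players float to the top -- the "undervalued" acquisitions
    # of the Moneyball era.

The point is not the particular numbers but the ordering: once output is measured meaningfully, the market’s mispricing of players becomes visible in a simple ranking.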

For the first time, baseball management had explicitly used an economic production function – marginal productivity theory with an operational definition of product or output – to maximize a meaningful objective function, namely “wins” by the team. And they succeeded brilliantly.

Money See, Money Do

In late 2002, new Boston Red Sox owner John Henry hired Bill James as a consultant to management, to put the theories of sabermetrics into practice in Boston. During 2001 and 2002, the team had lugged the second-highest payroll in major-league baseball to disappointing results. But in 2004, still trailing the Yankees in payroll, the Boston Red Sox laid the ghost of Babe Ruth by winning their first World Series since 1918. Over the succeeding decade, the Red Sox became the success story of baseball, winning the World Series twice more.

Was this a case of what Rocky’s manager Mickey would call “freak luck?” Not hardly. Thanks to the success of Oakland and Boston and Michael Lewis’s book, the tale of Bill James and sabermetrics traveled. Throughout baseball, sabermetrics ran wild and economics reigned triumphant. In 2003, the Detroit Tigers lost an American-League-record 119 games. In 2006, with only the 14th-highest payroll out of 30 major-league teams, the Tigers won the American League championship. In 2008 and 2009, the Washington Nationals were the worst team in baseball. In 2012, with baseball’s 20th-highest payroll, they had baseball’s best record. In 2010, the Pittsburgh Pirates lost 105 games. In 2013, with baseball’s 20th-highest payroll, they made the post-season playoffs. The Cleveland Indians rebounded from sub-.500 seasons to playoff finishes twice between 2006 and 2014, despite never ranking higher than 15th in the size of their payroll; usually, they ranked between 20th and 26th.

The crowning achievement was that of the perennial cellar-dwelling Tampa Bay Devil Rays. Cellar-dwelling, that is, in the size of their payroll, but not necessarily in the season standings. After years of dismal finishes, the 2008 Tampa Bay team – by then renamed simply the Rays – became American League champions despite ranking 29th (next to last!) in the size of their payroll. They have made the playoffs three times in the seasons since, but their payroll continues to languish at the bottom of the major-league rankings.

The New Frontier

Does this mean that the generalization about large-market teams getting the better players and enjoying the better results was and is a lie? No, it was and still is true. But like all economic propositions it is subject to qualification and careful statement.

First, it is a ceteris paribus proposition. It is true that “you can’t beat the stock market (averages)” but every year some people (particularly professional investors) do it. You can’t do it systematically by trading on the basis of publicly available information. The few people who succeed do it on the basis of (unsystematic) luck or by uncovering new information (legally) before it becomes generally known. The market for professional sports is not nearly this efficient; techniques of sports productivity evaluation are not nearly as refined and efficient as those of stock evaluation and trading, which leaves much more room for systematic exploitation by techniques like those of sabermetrics.

Second, the term “large market” is no longer limited by geography as it has been during the first century and a half of U.S. professional sports. Ted Turner’s promotion of the Atlanta Braves using his cable-TV stations blazed the trail for turning a local team into a national one, thereby increasing the value of the team’s broadcasting and product rights. Today, there is no inherent geographic limitation of the size of the market for any team – no reason, for example, why the Kansas City Royals or Chiefs could not become “the world’s team” and sit atop the largest market of all.

The Evolutionary Approach to Free Markets

The correct approach to economics is not the irrationalist view that has clouded our understanding of professional sports. Neither is it the perfectionist view of the Chicago School, which has oversold the virtues of free markets and damaged their credibility. It is certainly not the remedial view of the neo-Keynesian school, which has failed whenever and wherever tried and is now undergoing its latest serial failure.

The evolutionary approach of the true free-market school, so nobly outlined by Hayek and his disciples, fits the history of baseball like a batting glove. It is now in full flower. Taxpayers need no longer be violated by owners who promote false economic benefits of sports and hide the real ones. Fans no longer need languish in a limbo of psychological unfulfillment. Economics – not politicians, regulators or academic scribblers – has come to the rescue at last.