DRI-332 for week of 6-16-13: What Lies Ahead for Us?

An Access Advertising EconBrief:

What Lies Ahead for Us?

Last month, Federal Reserve Chairman Ben Bernanke announced that the Federal Open Market Committee is contemplating an end to the $85 billion-per-month program of bond purchases that has been dubbed “Quantitative Easing” (QE). The announcement was hedged with assurances that the wind-down would come only gradually, once the Fed was satisfied that general economic conditions had improved sufficiently to make QE unnecessary. Nonetheless, the announcement produced a flurry of speculation about the timing and pace of the Fed’s exit.

The Fed’s monetary policy since the financial crisis of 2008 and the stimulus package of 2009 is unique in U.S. economic history. Indeed, its repercussions have resounded throughout the world. Its motives and means are both poorly understood and hotly debated. Shedding light on these matters will help us face the future. A question-and-answer format seems appropriate to reflect the mood of uncertainty, anxiety and fear that pervades today’s climate.

What was the motivation for QE, anyway?

The stated motivation was to provide economic stimulus, though the nature of that stimulus was ambiguously defined. Sometimes it was to raise the rate of inflation, which was supposedly too low. Sometimes it was to stimulate investment by holding interest rates low. The idea here was that, since the Fed was buying bonds issued by the Treasury, it could take advantage of the inverse relationship between a bond’s price and its yield to maturity: bidding up T-bond prices automatically bids down their yields. Because $85 billion worth of Treasury bonds constitutes such a large monthly chunk of the overall bond market, this depresses bond yields well beyond the T-bond auction itself. Finally, the last stimulative feature of the policy was ostensibly to stimulate purchases of stocks indirectly by driving down the yields on fixed-income assets like bonds. With nowhere to go except stocks, investors would bid up stock prices, thus increasing the net worth of equity investors, who comprise some 40-50% of the population.
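The price-yield mechanics can be illustrated with a small numerical sketch. The bond figures below are hypothetical round numbers, not actual Treasury data; the point is only that bidding a bond’s price up mechanically forces its yield to maturity down:

```python
def bond_price(face, coupon_rate, years, ytm):
    """Present value of a fixed-coupon bond's cash flows at a given yield."""
    coupon = face * coupon_rate
    return (sum(coupon / (1 + ytm) ** t for t in range(1, years + 1))
            + face / (1 + ytm) ** years)

def yield_to_maturity(price, face, coupon_rate, years):
    """Find the yield that equates the bond's present value to its price.

    Uses bisection: price is a strictly decreasing function of yield,
    so a higher price must correspond to a lower yield.
    """
    lo, hi = 0.0, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if bond_price(face, coupon_rate, years, mid) > price:
            lo = mid  # bond is worth more than the price at this yield
        else:
            hi = mid
    return (lo + hi) / 2
```

For a hypothetical 10-year bond with a 3% coupon, a price of 100 implies a yield of 3%; bid the price up to 105 and the implied yield falls to roughly 2.4%.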

How was driving up stock prices supposed to stimulate the economy?

The ostensible idea was to make a large segment of Americans feel wealthier, which should cause them to spend more money. That sizable increase in overall expenditures should then cause secondary increases in income and employment through the economic process known as the “multiplier effect.” This would end the recession by reducing unemployment and luring Americans back into the labor force.
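The “multiplier effect” amounts to simple geometric-series arithmetic, which can be sketched as follows. The marginal propensity to consume (MPC) below is a hypothetical textbook value, not an empirical estimate:

```python
def total_spending(initial_injection, mpc, rounds=1000):
    """Sum successive rounds of re-spending: in each round, recipients
    spend the fraction `mpc` of what they received in the prior round."""
    total, injection = 0.0, float(initial_injection)
    for _ in range(rounds):
        total += injection
        injection *= mpc
    return total
```

With an MPC of 0.8, an initial $100 of new spending sums toward $100 / (1 − 0.8) = $500 of total expenditure, a multiplier of 5. The stimulus argument stands or falls on whether this re-spending chain actually materializes.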

How did the plan work out?

Inflation didn’t increase much, if at all. Neither did investment, particularly when viewed in net terms that exclude investments to replace deteriorated capital stock. Stock prices certainly rose, although the consumption increases that followed have remained modest.

So the plan was a failure?

That would be a reasonable assessment if, and only if, the stated goal(s) of QE was (were) the real goal(s). But that wasn’t true; the real goal of QE was to reinforce and magnify the Fed’s overall “zero interest-rate policy,” called ZIRP for short. As long as it accomplished that goal, any economic stimulus produced was a bonus. And on that score, QE succeeded very well indeed. That is why it was extended and why the Fed is stretching it out as long as it can.

Wait a minute – I thought you just said that even though interest rates have remained low, investment has not increased. Why, then, is the Fed so hot to keep interest rates low? I always heard that the whole idea behind Fed policies to peg interest rates at low levels was to stimulate investment. Why is the Fed busting our chops to follow this policy when it isn’t working?

You heard right. That was, and still is, the simple Keynesian model taught to freshman and sophomore economics students in college. The problem is that it never did work particularly well and now works worse than ever. In fact, that policy is actually the proximate cause of the business cycle as we have traditionally known it.

But even though the Fed gives lip service to this outdated textbook concept, the real reason it wants to keep interest rates low is financial. If the Fed allowed interest rates to rise – as they would certainly do if allowed to find their own level in a free capital market – the rise in market interest rates would force the federal government to finance its gargantuan current and future budget deficits by selling bonds that paid much higher interest rates to bondholders. And that would drive the percentage of the federal government budget devoted to interest payments through the roof. Little would be left for the other spending that funds big government as we know it – the many cabinet departments and myriad regulatory and welfare agencies.
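The fiscal arithmetic behind this fear is easy to sketch. The debt and budget figures below are round hypothetical numbers chosen only for illustration, not actual Treasury data:

```python
# Hypothetical round figures, for illustration only -- not actual budget data.
debt = 16e12      # outstanding federal debt, in dollars
budget = 3.5e12   # annual federal spending, in dollars

# Annual interest cost at two different average borrowing rates
for rate in (0.02, 0.06):
    interest = debt * rate
    share = interest / budget
    print(f"at {rate:.0%} average rate: interest = ${interest / 1e9:,.0f}B "
          f"({share:.0%} of the budget)")
```

On these stylized numbers, a rise in the government’s average borrowing cost from 2% to 6% triples annual interest payments from roughly $320 billion to $960 billion, swallowing over a quarter of the budget.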

Even if you don’t find this argument compelling – and you can bet it compels anybody who gets a paycheck from the federal government – it should be obvious to everybody that the Fed isn’t really trying that hard to apply traditional stimulative monetary policy. After all, stimulative monetary policy works by putting money in public hands – allowing banks to make loans and consumer spending to magnify the multiplier effects of the loan expenditures. But Bernanke lobbied for a change in the law that allowed the Fed to pay interest to banks on their excess reserves. When the Fed enforces ZIRP by buying bonds in the secondary market, it pays banks for them by crediting the banks’ reserve accounts at the Fed. The interest payments mean that the banks don’t have to risk making loans with that money; they can simply hold it as excess reserves and earn easy profits. That is why the Fed’s money creation has not caused runaway inflation, as government money creation always did in the past. You can’t have all or most prices rising at once unless the newly created money is actually chasing goods and services, which is not happening here.

But the mere fact that hyperinflation hasn’t struck doesn’t mean that the all-clear has been sounded. And it doesn’t mean that we’re not being gored by the horns of a debt dilemma. We certainly are.

Being gored by the horns of a dilemma is a pretty uncomfortable metaphor. You make it sound as though we have reached a critical economic crisis point.

We have. Every well-known civilizational collapse and revolution, from ancient Rome to the present day, has been preceded by a financial crisis resembling ours. The formula is familiar: the government overspends, resorts to money creation as a desperate expedient to finance itself, and papers over the problem while ultimately making it worse. For example, French support for the American colonies against Great Britain was the straw that broke the bank of the monarchy, fomenting the French Revolution. The Romanovs’ downfall occurred despite Russia’s rising rate of economic growth in the late 1800s; it came about through financial profligacy and war – two causes that should be familiar to us.

It sounds as though government can no longer use the tools of fiscal and monetary policy to stimulate the economy.

It never could. After all, the advent of Keynesian economics after World War II did not usher in unprecedented, uninterrupted world prosperity. We had recessions and depressions before Keynes wrote his General Theory in 1936 and have had them since then, too. And Keynes’s conclusions were anticipated by other economists, such as the American economists Foster and Catchings in the late 1920s. F.A. Hayek wrote a lengthy article refuting their arguments in 1929, and he later opposed Keynes throughout the 1930s and thereafter. The principles of his business-cycle theory were never better illustrated than by real-world events during the run-up to the recession and financial crisis of 2007-2008 and the later stimulus, ZIRP and QE.

It seems amazing, but Keynesian economists today justify government policies by claiming that the alternative would have been worse and by claiming responsibility for anything good that happens. Actually, the real force at work was described by the Governor of the Bank of England, Mervyn King, in the central bank’s February Inflation Report:

“We must recognize [sic], however, that there are limits to what can be achieved via general monetary stimulus – in any form – on its own. Monetary policy works, at least in part, by providing incentives to households and businesses to bring forward spending from the future to the present. But that reduces spending plans tomorrow. And when tomorrow arrives, an even larger stimulus is required to bring forward yet more spending from the future. As time passes, larger and larger doses of stimulus are required.”

King’s characterization of stimulus as borrowing spending from the future accurately describes the effects of textbook Keynesian economics and the new variant spawned by the Bernanke Fed. Keynesians themselves advertised fiscal policy’s advantage as the fact that government spends 100% of every available dollar, while private consumers allow part of the same dollar to leak into savings. This dovetails exactly with King’s account. The artificially low interest rates created by monetary policy have the same effect, turning saving into current consumption.

Today, we are experiencing a grotesque, nightmarish version of Keynesian economics. Ordinarily, artificially low interest rates would stimulate excessive investment – or rather, would drive investment capital into longer-maturing projects that later prove unprofitable, like the flood of money directed toward housing and real-estate investment in the first decade of this century. But our current interest rates are so absurdly low, so palpably phony, that businesses are not about to be suckered by them. After all, nobody can predict when rates might shoot up and squelch the profitability of their investment. So corporations have pulled up their financial drawbridges behind balance sheets heavy with cash. Consumers have pulled consumption forward from the future, since that is the only attractive alternative to the stock investments that only recently wrecked their net worth. This, too, validates King’s conclusions. Whether “successful” or not, Keynesian economics cannot last because the policy of borrowing from the future is self-limiting and self-defeating.

Didn’t I just read that our budget deficit is headed lower? Doesn’t this mean that we’ve turned the corner of both our budget crisis and our flagging recovery?

If you read carefully, you discovered that the improvement in the federal government’s fiscal posture is temporary, mostly an accounting artifact that occurs every April. Another contributing factor is the income corporations distributed at year-end 2012 to avoid taxation at this year’s higher rates, which is now being taxed at the individual level. Most of this constitutes a one-time increase in revenue that will not carry over into subsequent quarters. Even though the real economic benefits of this are illusory, it does serve to explain why Fed Chairman Bernanke has picked this moment to announce an impending “tapering off” of the QE program of Fed bond purchases.

How so?

The fact that federal deficits will be temporarily lower means that the federal government will be selling fewer bonds to finance its deficit. This, in turn, means that the Fed will perforce be buying fewer bonds whether it wants to or not. Even if there might technically be enough bonds sold for the Fed to continue buying at its current $85 billion level, it would be inadvisable for the Fed to buy all, or virtually all, of an entire issue while leaving nothing for private investors. After all, U.S. government bonds are still the world’s leading fixed-income financial instrument.

Since the Fed is going to be forced to reduce QE anyway, this gives Bernanke and company the chance to gauge public reaction to their announcement and to the actual reduction. Eventually, the Fed is going to have to end QE, and the more accurately they can predict the reaction to this, the better they can judge when to do that. So the Fed is simply making a virtue out of necessity.

You said something a while back that I can’t forget. You referred to the Keynesian policy of artificially lowering interest rates to stimulate investment as the “proximate” cause of the business cycle. Why is that true, and what is the qualifier doing there?

To illustrate the meaning, consider the Great Recession that began in 2007. There were many “causes,” if one defines a cause as an event or sequence of events that initiated, reinforced or accelerated the course of the recession. The housing bubble and ensuing collapse in housing prices was prominent among these. That bubble itself had various causes, including the adoption of restrictive land-use policies by many state and local jurisdictions across America, imprudent federal-government policies promoting home-ownership by relaxing credit standards, bank-regulation standards that positioned mortgage-related securities as essentially riskless and the creation and subsidy of government-sponsored agencies like Fannie Mae and Freddie Mac that implemented unwise policies and distorted longstanding principles of home purchase and finance. Another contributor to recession was the decline in the exchange-value of the U.S. dollar that led to a sharp upward spike in (dollar-denominated) crude oil prices.

But the reign of artificially low interest rates that allowed widespread access to housing-related capital and distorted investment incentives on both the demand and production sides of the market was the proximate cause of both the housing bubble and the recession. Low interest rates were the causal agent most closely linked to the bubble, and the recession would not have happened without the bubble. Not only that, the artificially low interest rates would have triggered a recession even without the other independent influences – albeit a much milder one. Another way to characterize the link is to say that the artificially low interest rates were both necessary and sufficient to produce the recession. The question is: Why?

For several centuries, an artificial lowering of interest rates accompanied an increase in the supply of money and/or credit. Prior to the 20th century, this was usually owing to increases in stocks of mined gold and/or silver, coupled with the metallic monetary standards then in use. Modern central banks have created credit while severing its linkage with government holdings of stocks of precious metals, thus imposing a regime of fiat money operating under the principles of fractional-reserve banking.

In both these institutional settings, the immediate reaction to the monetary change was lower interest rates. The effect was the same as if consumers had decided to save more money in order to consume less today and more in the future. The lower interest rates had complex effects on the total volume of investment because they affected investment through several channels. The lower rate of discount, by increasing the present value of future investment returns, greatly increased the attractiveness of some investments – namely, those in long-lived production processes where cash flows are realized in the relatively distant future. Housing is a classic example of one such process. Thus, a boom is created in the sector(s) to which resources are drawn by the low interest rates, like the one the U.S. enjoyed in the early 2000s.
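Why do low rates favor long-lived projects in particular? Stylized present-value arithmetic makes the asymmetry clear (the rates and horizons below are hypothetical, chosen only for illustration):

```python
def present_value(cashflow, years, rate):
    """Value today of a single payment received `years` from now."""
    return cashflow / (1 + rate) ** years

# How much does cutting the discount rate from 5% to 2% raise present values?
near_gain = present_value(100, 2, 0.02) / present_value(100, 2, 0.05)
far_gain = present_value(100, 30, 0.02) / present_value(100, 30, 0.05)
```

A payoff two years out gains only about 6% in present value; a payoff thirty years out more than doubles. That asymmetry is what pulls capital toward long-gestation sectors like housing when rates are pushed down.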

The increase in employment and income in those sectors causes an increase in the demand for current consumption goods. This bids up prices of labor and raw materials, provided either that full employment has been reached or that those resources are specialized to their particular sectors. This tends to reduce investment in shorter-term production processes, including those that produce goods and services for current consumption. Moreover, the original investments are starting to run into trouble for three reasons: first, because their costs are unexpectedly increasing; second, because the consumer demand that would ordinarily have accompanied an increase in saving is absent because it was monetary expansion, not saving, that produced the fall in interest rates; and third, because interest rates return to their (higher) natural level, making it difficult to complete or support the original investments.

Only an increase in the rate of monetary expansion will allow original investments to be refinanced or validated by an artificial shot of consumer demand. That is what often happened throughout the 20th century – central banks frantically doubled down on their original monetary policy when its results started to go sour. Of course, this merely repeated the whole process over again and increased the size and number of failed investments. The eventual outcome was widespread unemployment and recession. That is the story of the recent housing bubble. This mushrooming disaster couldn’t happen without central banking, which explains why 19th century business cycles were less severe than many modern ones.

I don’t recall reading this rather complicated explanation before or hearing it discussed on television or radio. Why not?

The preceding theory of business cycles was developed by F. A. Hayek in the late 1920s, based on monetary theory developed by his mentor, Ludwig von Mises, and the interest-rate theory of the Swedish economist, Knut Wicksell. Hayek used it to predict the onset of what became the Great Depression in 1929. (Von Mises was even more emphatic, foreseeing a “great crash” in a letter to his wife and refusing a prestigious appointment in his native Vienna to avoid being tarred by exposure to events.) Hayek’s theory earned him an appointment to the London School of Economics in 1931. It was cited by the Nobel committee that awarded him the prize for economic science in 1974.

But after 1931, Hayek engaged in several theoretical controversies with his fellow economists. The most famous of these was his long-running debate with John Maynard Keynes. One long-term consequence of that debate was the economics profession’s exile of capital theory from macroeconomics. Economists refused to contemplate the distinction between long-term and short-term production processes and capital goods. They treated capital as a homogeneous lump or mass rather than a delicate fabric of heterogeneous goods valued through an intricate structure of interest rates.

That is why Keynesian macroeconomics textbooks pretend that government can increase investment by creating money that lowers “the” interest rate. If government could really do this, of course, our lives would all be radically different from what they actually are. We would not experience recessions and depressions.

Public-service radio and television advertisements warn consumers to beware of investment scams that promise returns that are “too good to be true.” “If it sounds too good to be true,” the ad declares sententiously, “it probably is.” What we really need is a commercial warning us to apply this principle to the claims of government and government policymakers – and, for that matter, university professors who are dependent upon government for their livelihood.

It turns out to be surprisingly difficult to refute the claims of Keynesian economics without resorting to the annoyingly complicated precepts of capital theory. Ten years before Keynes published his theory, the American economists Foster and Catchings developed a theory of government intervention that embodied most of Keynes’s ideas. They published their ideas in two books and challenged the world to refute them, even offering a sizable cash prize to any successful challenger. Many prominent economists tried and failed to win the prize. What is more, as Hayek himself acknowledged, their failure was deserved, for the challengers’ analyses did not expose the fallacies inherent in the authors’ work.

Hayek wrote a lengthy refutation that was later published under the title of “The ‘Paradox’ of Saving.” Today, over 80 years later, it remains probably the most meticulous explanation of why government cannot artificially create and preserve prosperity merely by manipulating monetary variables like the quantity of money and interest rates.

There is nothing wrong with Hayek’s analysis. The main problem with his work is that it is not fashionable. The public has been lied to so long and so convincingly that it can hardly grasp the truth. The idea that government can and should create wealth out of thin air is so alluring and so reassuring – and the idea of its impossibility so painful and troubling – that fantasy seems preferable to reality. Besides, large numbers of people now make their living by pretending that government can do the impossible. Nothing short of social collapse will force them to believe otherwise.

The economics profession obsessively studied and researched Keynesian economics for over 40 years, so it has less excuse for its behavior nowadays. Keynes’s main contentions were refuted. Keynesianism was rejected by macroeconomists throughout the world. Even the leader of the British Labour Party, James Callaghan, bitterly denounced it in a famous speech in 1976. The Labour Party had used Keynesian economics as its key economic-policy tool during its installation of post-World War II socialism and nationalization in Great Britain, so Callaghan’s words should have driven a stake through Keynes’s heart forevermore.

Yet economists still found excuses to keep his doctrines alive. Instead of embracing Hayek, they developed “New Keynesian Economics” – which has nothing to do with the policies of Bernanke and Obama today. The advent of the financial crisis and the Great Recession brought the “return of the Master” (i.e., Keynes). This was apparently a default response by the economics profession. The Recession was not caused by free markets, nor was it solved by Keynesian economics. Keynesian economics hadn’t gotten any better or wiser since its demise, so there was no reason for it to reemerge like a zombie in a George Romero movie. Apparently, economists were reacting viscerally in “we can’t just sit here doing nothing” mode – even though that’s exactly what they should have done.

If QE and ZIRP are not the answer to our current economic malaise, what is?

In order to solve a problem, you first have to stop making it worse. That means ending the monetary madness embodied in QE and ZIRP. Don’t try to keep interest rates as low as possible; let them find their natural level. This means allowing interest rates to be determined by the savings supplied by the private sector and the investment demand generated by private businesses.

In turn, this means that housing prices will be determined by markets, not by the artificial actions of the Fed. This will undoubtedly reverse recent price increases recorded in some markets. As the example of Japan shows only too well, there is no substitute for free-market prices in housing. Keeping a massive economy in a state of suspended animation for two decades is no substitute for a functioning price system.

The course taken by U.S. economic history in the 20th century shows that there is no living with a central bank. Sooner or later, even a central bank that starts out small and innocuous turns into a raging tiger with taxpayers riding its back and looking for a way to get off. (The Wall Street Journal’s recent editorial “Bernanke Rides the Bull” seems to have misdirected the metaphor, since we are the ones riding the bull.) Instead of a Fed, we need free-market banks incapable of wangling bailouts from the government and a free market for money in which there are no compulsory requirements to accept government money and no barriers to entry by private firms anxious to supply reliable forms of money. Bitcoin is a promising development in this area.

What does all the talk about the Fed “unwinding” its actions refer to?

It refers to undoing previous actions; more specifically, to sales that cancel out previous purchases of U.S. Treasury bonds. The Fed has been buying government bonds in both primary and secondary bond markets pursuant to its QE and ZIRP policies, respectively. It now has massive quantities of those bonds on its balance sheet. Technically, that makes the Fed the world’s largest creditor of the U.S. government. (Since the Fed is owned by its member banks, the banks are really the owner/creditors.) That means that the Federal Reserve has monetized vast quantities of U.S. government debt.

There are two courses open to the Fed. One of them is hyperinflation, which is what will happen when the Fed stops buying, interest rates rise to normal levels and banks have no alternative but to use their reserves for normal, profit-oriented purposes that put money into circulation for spending. This has never before happened in peacetime in the U.S. The other is for the Fed to sell the bonds to the public, which will consist mostly of commercial banks. This will withdraw the money from circulation and end the threat of hyperinflation (assuming the Fed sterilizes it). But it will also drive bond prices into the ground, which means that interest rates will shoot skyward. This will create the aforementioned government budget/debt crisis of industrial strength – and the high interest rates won’t do much for the general business climate for awhile, either.

Since it is considered a public-relations sin for government to do anything that makes the general public uncomfortable and which can be directly traced to it, it is easy to see why the Fed doesn’t want to take any action at all. But doing nothing is not an option, either. Eventually, one of the two aforementioned scenarios will unfold, anyway, in spite of efforts to forestall them.

Uhhhh… That doesn’t sound good.

No spit, Spurlock. Yet, paradoxical as it might seem at first, either of these two scenarios will probably make people more receptive to solutions like free banking and free-market money – solutions that most people consider much too radical right now. There are times in life when things have to get worse before they can get better. Regrettably, this looks like one of those times.

DRI-306 for week of 6-2-13: What Is (Or Was) ‘American Exceptionalism’?

An Access Advertising EconBrief:

What Is (Or Was) ‘American Exceptionalism’?

Ever since the 1970s, but increasingly since the financial crisis of 2008 and ensuing Great Recession, eulogies have been read for American cultural and economic preeminence. If accurate, this valedictory chorus would mark one of the shortest reigns of any world power, albeit also the fastest rise to supremacy. Even while pronouncing last rites on American dominance, however, commentators unanimously acknowledge our uniqueness. They dub this quality “American exceptionalism.”

This makes sense, since you can’t very well declare America’s superpower status figuratively dead without having some idea of what gave it life in the first place. And by using the principles of political economy, we can identify the animating features of national greatness. This allows us to perform our own check of national vital signs, to find out if American exceptionalism is really dead or only in the emergency room.

Several key features of the American experience stand out.

Free Immigration

Immigration (in-migration) fueled the extraordinary growth in U.S. population throughout its history. Immigration was mostly uncontrolled until the 1920s. (The exception was Chinese immigration, which was subject to controls in the late 19th century.) Federal legislation in the 1920s introduced the concept of immigration quotas determined by nation of origin. These were eventually loosened in the 1960s.

From the beginning of European settlement in the English colonies, inhabitants came not only from the mother country but also from Scotland, Ireland, Wales, the Netherlands, Spain, France, Germany and Africa. Scandinavia soon contributed to the influx. Some of the earliest settlers were indentured servants; slaves were introduced in the middle of the 17th century.

Today it is widely assumed that immigrants withdraw value from the U.S. rather than enhancing it, but this could hardly have been true during colonial times when there was little or no developed economy to exploit. Immigrants originally provided the only source of labor and have continued to augment the native labor supply down to the present day. For most of American history, workers were drawn to this country by wages that were probably the highest in the world. This was due not just to labor’s relative scarcity but also to its productivity. Immigrants not only increased the supply of labor (in and of itself, tending to push wages down) but also complemented native labor and made it more productive (tending to push wages up). The steady improvements in technology during the Industrial Revolution drove up productivity and the demand for labor faster than the supply of labor increased, thereby increasing real wages and continually drawing new immigrants.

Economists have traditionally underrated the importance of entrepreneurship in economic development, but historians have noted the unusual role played by Scottish entrepreneurs like Andrew Carnegie in U.S. economic history. At the turn of the 20th century, the business that became the motion-picture industry was founded almost entirely by immigrants. Most of them were Jews from Eastern Europe who stepped on dry land in the U.S. with no income or assets. They built the movie business into the country’s leading export industry by the end of the century. In recent years, Asians and Hispanics have taken up the entrepreneurial slack left by the native population.

An inexplicably ignored chapter in U.S. economic history is the culinary (and gastronomic) tradition linked to immigration. Early American menus were heavily weighted with traditional English dishes like roast beef, breads and puddings. Soon, however, immigrants brought their native cuisines with them. At first, each ethnic enclave fed its own appetites. Then immigrants opened restaurants serving their compatriots. Gradually, these establishments attracted native customers. Over decades, immigrant dishes and menus were assimilated into the native U.S. diet.

Germans were perhaps the first immigrants to make a powerful impression on American cuisine. Many Germans fought on the American side in the Revolution. After independence was won, a large percentage of opposing Hessian mercenaries stayed on to make America their home. Large German populations inhabited Pennsylvania, Illinois and Missouri. The so-called Pennsylvania Dutch, whose cooking won lasting fame, were German (“Deutsch”).

In the 19th century, hundreds of thousands of Chinese laborers came to the U.S., many to work on western railroad construction. They formed Chinese enclaves, the largest located in San Francisco. Restaurants serving regional Chinese cuisines sprang up to serve these immigrants. When Americans displayed a taste for Chinese food, restaurateurs discovered that they had to tailor the cooking to American palates; these “Chinese restaurants” served Americanized Chinese food in the dining room and authentic Chinese food in the kitchen for immigrants. Today, the evolutionary cycle is complete: American Chinese restaurants proudly advertise authentic dishes specialized along Mandarin, Szechuan and Cantonese lines.

Meanwhile, back in the 1800s, Italians were emigrating to America. Italian food was also geographically specialized and subsequently modified for American tastes. Today, Italian food is as American as apple pie and as geographically authentic as its Chinese counterpart. The Irish brought with them a simple but satisfying mix of recipes for starches and stews. Although long restricted to cosmopolitan coastal centers, French cooking eventually made its way into the American diet.

Mexicans began crossing the Rio Grande into the U.S. during the Great Depression. Their numbers increased in the 1950s, and this coincided with the advent of Mexican food as the next great ethnic specialty. Beginning in the late 1960s and coinciding with the rise of franchising as the dominant form of food retailing, Mexican food took the U.S. palate by storm. It followed the familiar pattern, beginning with Americanized “Tex-Mex” and culminating with niche Mexican restaurants catering to authentic regional Mexican cuisines.

Today, restaurant dining in America is an exercise in gastronomic globe-trotting. Medium-size American cities offer restaurants flying the ethnic banners of a dozen, fifteen or twenty nations – not just Italian, Chinese and Mexican food, but the dishes of Spain, Ethiopia, Thailand, Vietnam, Ireland, India, Greece, Denmark, the Philippines, Germany and more.

Immigration was absolutely necessary to all this development. As any experienced cook can attest, simple copying of recipes could not have reproduced the true flavor of these dishes, nor could non-natives have accomplished the delicate task of modifying them for the American palate while keeping the original versions alive until they eventually found favor with the U.S. market.

It is ironic that so much debate focuses on the alleged need for immigrants to assimilate U.S. culture. This single example shows how America has assimilated immigrant culture to a far greater degree. Indeed, American culture didn’t exist prior to immigration and has been created by this very assimilation process. Now, apart from learning English, it is not clear how much is left for immigrants to assimilate. Thanks to U.S. exports, for example, consumer products like Coca-Cola and McDonald’s hamburgers are familiar to immigrants before they ever set foot here.

Cultural Heterogeneity

Many of the great powers of the past were trading civilizations, like the Phoenicians and the Egyptians. By trading in the goods and languages of many nations, they developed a cosmopolitan culture.

In contrast, physical trade formed a fairly modest fraction of economic activity in the U.S. until well into the 20th century. The U.S. achieved its cultural heterogeneity less through trade in goods and services than via trade in people. The knowledge and experience shared by immigrants with natives produced a similar result.

Economists have long known that these two forms of trade substitute for each other in useful ways. For example, efficient use of a production input – whether labor, raw material or machine – requires that its price in different locations be equal. Where prices are not equal, equalization can be accomplished directly by movements of the input from its low-priced location to its high-priced location, which tends to raise the input’s price in the former location and lower it in the latter location. Or, it can be accomplished indirectly by trade in goods produced using the input; since the good will tend to be cheaper in the input’s low-priced location, exports to the high-priced location will tend to raise the good’s price, the demand for the input and the input’s price in that location.
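
The equalization arithmetic described above can be made concrete with a toy simulation. This is purely illustrative – the wage function, the labor supplies and the one-worker-per-period migration rule are invented for the example, not drawn from any data.

```python
# Toy model of input-price (wage) equalization via migration of the input.
# Each region has a technology with diminishing returns, so the local wage
# falls as the local labor supply grows. All numbers are hypothetical.

def wage(labor, productivity=100.0):
    """Marginal product of labor under diminishing returns."""
    return productivity / labor ** 0.5

low, high = 400.0, 100.0  # labor supplies; starting wages are 5.0 and 10.0
step = 1.0                # workers migrating per period

# Workers leave the low-wage (labor-abundant) region for the high-wage one.
# Emigration raises the wage where workers leave; immigration lowers it
# where they arrive, until the differential is (nearly) eliminated.
while wage(low) < wage(high) - 0.01:
    low -= step
    high += step

print(round(wage(low), 2), round(wage(high), 2))  # both wages ~6.32
```

The indirect route works on the same differential: exporting goods made in the low-wage region raises the demand for (and price of) its labor, so trade in goods can substitute for the movement of people.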

Input-price equalization is a famous case of trade in goods obviating the necessity for trade in (movement of) people. Cultural heterogeneity is a much less well-known case of the reverse phenomenon – immigration substituting for trade in goods.

The importance of cultural heterogeneity has been almost completely overshadowed by the modern obsession with “diversity,” which might be concisely described as “difference for difference’s sake.” Unlike mindless diversity, cultural heterogeneity is rooted in economic logic. Migration is governed by the logic of productivity; people move from one place to another because they are more productive in their new location. Estimates indicate, for example, that some low-skilled Mexican workers are as much as five times more productive in the U.S. than in Mexico.

That is only the beginning of the benefits of migration. Because workers often complement the efforts of other workers, immigration raises the productivity (and wages) of native workers as well. And there is another type of benefit that is seldom, if ever, noticed.

The late, great Nobel laureate F.A. Hayek defined the “economic problem” more broadly than merely the efficient deployment of known inputs for given purposes. He recognized that all individuals are limited in their power to store, collate and analyze information. Consumers do not recognize all choices available to them; producers do not know all available resources, production technologies or consumer wants. The sum of available knowledge is not a given; it is locked up in the minds of billions of individuals. The economic problem is how to unlock it in usable form. That is what free markets do.

Our previous extended example involving immigration and the evolution of American cuisine illustrates exactly this market information process at work. The free market made it efficient and attractive for immigrants to come to the U.S. U.S. consumers became acquainted with a vast new storehouse of potential consumption opportunities – eventually, U.S. entrepreneurs could also mine this trove of opportunity. Immigrant producers became aware of a new source of demand and new inputs with which to meet it. And the resulting knowledge became embedded in the mosaic of American culture, making our cuisine the most cosmopolitan in the world.

The upshot is that, without consciously realizing it, Americans have had access to vast amounts of knowledge, expertise and experience. This store of culture has acted as a kind of pre-cybernetic Internet, the difference being that culture operates outside our conscious perception. At best, we can observe its residue without directly measuring its input. One way of appreciating its impact is to compare the progress of open societies like the U.S. with civilizations that were long closed to outside influence, like Japan and China. Isolation severely retarded economic development.

Status Mobility

In his recent book, Unintended Consequences, financial economist Edward Conard stresses the necessity of risk-taking entrepreneurial behavior as a source of economic growth. The risks must be organically generated by markets rather than artificially created by politicians; the latter were the source of the recent financial crisis and ensuing Great Recession.

According to Conard, it is the striving for status that drives entrepreneurs to run big risks in search of huge rewards that few will ultimately attain. Status may take various forms – social, occupational or economic. Its attraction derives from the human craving to distinguish oneself. It is this need for disproportionate reward – whether measured in esteem, dollars or professional recognition – that balances the high risk of failure associated with big-league entrepreneurship.

In the U.S., status striving has long been ridiculed by sociologists and psychologists. “Keeping up with the Joneses” has been stigmatized as a neurotic preoccupation. Yet the American version of status compares favorably with its ancient European ancestor.

England is famous for its class stratification. A half-century ago, its “angry young men” revolted against a stifling class system that defined status at birth and sharply limited upward mobility. Elsewhere in Europe, lingering remnants of the feudal system remained in place for centuries.

But the U.S. was comparatively classless. Economics defined its classes, and the economic categories embodied a high degree of mobility. Even those who started on the bottom rung usually climbed to the higher ones, where the rarefied climate proved difficult to endure for more than a generation or two.

The best feature of the status-striving U.S. class system has been the broad distribution of its benefits. The unimaginable fortunes acquired by titans of industry like Carnegie, Rockefeller, Gates, Buffett, et al. have made thousands of people rich while building a floor of real income under the nation. Our working lives and leisure have been defined by these men. The value created by a Bill Gates, say, is almost beyond enumeration.

Thus, it is not the striving for status per se that makes a national economy exceptional. It is the mobility that accompanies status. This will determine the form taken by the status striving process.

Before free markets rose to prominence, wealth was gained primarily through plunder. Seekers after status were warlords, kings or politicians. They gained their status at the expense of others. Today, plunder is the exception rather than the rule. Drug cartel bosses are the vestige of Prohibition; they profit purely from the illegalization of a good. Politicians are their counterpart in the straight world.

When status is accompanied by mobility, anybody can gain status. But they cannot have it without increasing the real incomes of large numbers of people. Ironically, the biggest complaint lodged against the American version of capitalism – that it promotes greed and income inequality – turns out to be dead wrong. Mobility is achieved through competition and free markets, which absolutely demand that in order to get rich the status-striver must satisfy the wants of other people en masse. And income inequality is the inevitable concomitant of risk-taking entrepreneurship – somebody must bear the risks of ferreting out the dispersed information about wants, resources and technologies lodged in billions of human brains. If we don’t reward the person who succeeds in doing the job, the billions of people who gain from the process don’t get their real-income gains.

Free Markets

You might suppose that bureaucracy was invented by the New Deal. In fact, Elizabethan England knew it well. Price controls date back at least to the Roman emperor Diocletian. Prior to Adam Smith’s lesson on the virtues of trade and David Ricardo’s demonstration of the principle of comparative advantage, the philosophy of mercantilism held that government must tightly regulate economic activity lest it burst its bonds. Thus, free markets are a historical rarity.

England’s abolition of the Corn Laws in the mid-1800s provides a brief historical window on a world of free international trade, but the U.S. prior to 1913 probably best approximates the case of a world power living under free markets. Immigration was uncontrolled and tariffs were low; both goods and people flowed freely across political boundary lines.

Prices coordinate the flow of goods and services in the “present;” that is, over short time spans. Production and consumption over time are coordinated by markets developed to handle the future delivery of goods (futures and forward markets) and by prices that modify the structure of production and consumption in accord with our needs and wants for consumption and saving in the present and the future. For the most part, these prices are called interest rates.

Interest rates reflect consumers’ desires to save for future consumption and producers’ desires to invest to augment productive capabilities for the future. Just as a price tends to equalize the amount of a good producers want to produce and consumers want to purchase in a spot market, an interest rate tends to equalize the flow of saving by consumers with the investment in productive capital by producers. Without interest rates, how would we know that the amounts of goods wanted by consumers in the future would correspond to what producers will have waiting for them? As it happens, we are now experiencing first hand the answer to that question under the Federal Reserve’s “zero-interest-rate policy,” which substitutes artificial Federal Reserve-determined interest rates for interest rates determined by the interaction of consumers and producers.
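
The coordinating role of the interest rate can be sketched with a deliberately simple loanable-funds example. The linear saving and investment schedules and every coefficient below are hypothetical, chosen only to make the mechanism visible.

```python
# A toy loanable-funds market: the interest rate r plays the same role for
# saving and investment that a spot price plays for supply and demand.
# Schedules and coefficients are hypothetical.

def saving(r):
    """Desired saving rises with the interest rate (the reward for waiting)."""
    return 100.0 + 2000.0 * r

def investment(r):
    """Desired investment falls with the interest rate (the cost of funds)."""
    return 300.0 - 2000.0 * r

# Market clearing: 100 + 2000r = 300 - 2000r  =>  r* = 200/4000 = 0.05
r_star = (300.0 - 100.0) / 4000.0
print(r_star)  # at a 5% rate, planned saving equals planned investment

# A rate pegged below r*, as under an artificially low-rate policy, leaves
# desired investment exceeding desired saving -- plans no longer mesh.
print(investment(0.01) - saving(0.01))  # a positive gap
```

The last line is the point of the paragraph above: when the rate is administered rather than market-determined, the signal that reconciles the future-oriented plans of savers and producers is suppressed.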

Without knowing what policies were followed, we can scrutinize development outcomes in countries like China, India, Southeast Asia and Africa and draw appropriate inferences about departures from free markets. High hopes and failure were associated with statism and market interference in China, India and Africa for over a half-century. Successful development has followed free markets like pigs follow truffles. But the obstacles to free markets are formidable, and no country has as yet found the recipe for keeping them in force over time.

What About Political Freedom?

Discussions of American exceptionalism invariably revolve around America’s unique political and constitutional history and its heritage of political freedom. Yet the preceding definition of exceptionalism has leaned heavily on economics. The world does not lack for political protestations and formal declarations of freedom and justice. Many of these are modeled on our own U.S. Constitution. History shows, though, that only a reasonably well-fed, prosperous population is willing to fight to preserve its political rights. Time and again, economic freedom has preceded political freedom.

When the level of economic development is not sufficient and free markets are not in place, the populace is not willing to sacrifice material real income to gain political freedom because it is too close to the subsistence level of existence already. And even in the exceptional case (usually in Africa or Latin America) in which a charismatic, status-striving leader heads a successful political movement, the leader will not surrender leadership status – even though the ostensible purpose of the independence movement was precisely to gain political freedom. Instead, he or she preserves that status by cementing political power for life. Why? Because there is no substitute status reward to fall back on; his or her economic and social status depends on wielding political power. This is the fault of the political Left, which has demanded that “mere” economic rights be subordinated to claims of equality, with the result that neither equality nor wealth has been realized.

Observation shows that when economic growth begins – but not before that – people begin to sacrifice consumption to control pollution and improve health. Similar considerations apply to political freedom. Expressing the relationship in the jargon of economics, we would say that political freedom is a normal good. This means that we “purchase” more of it as our real incomes increase. In this context, the word “purchase” does not imply acquisition with money as the medium of exchange; it means that we must sacrifice our time and effort to get political freedom, leaving less leisure time available for consumption purposes.

The U.S. was the exception because its economic freedom and real income was well advanced before the Revolution. Enough Americans were willing to oppose the British Crown to achieve independence because colonial America was living well above the subsistence level – at that, the ratio of rebels to Tories was close to even. George Washington was offered a crown rather than a Presidency, but he declined – and declined again when offered a third Presidential term. His Virginia plantation offered a substitute status reward; he did not need to hold office to maintain his economic status or social esteem. It is interesting to speculate about the content of the Constitution and the course of U.S. history had the U.S. lacked the firm economic foundation laid by its colonial history and favorable circumstances.

DRI-326 for week of 5-12-13: Paul Krugman Can’t Stand the Truth About Austerity

An Access Advertising EconBrief: 

Paul Krugman Can’t Stand the Truth About Austerity

The digital age has produced many unfortunate byproducts. One of these is the rise of shorthand communication. In journalism, this has produced an overreliance on buzzwords. The buzzword substitutes for definition, delineation, distinction and careful analysis. Its advantage is that it purports to say so much within the confines of one word – which is truly a magnificent economy of expression, as long as the word is telling the truth. Alas, all too often, the buzzword buzzsaws its way through its subject matter like a chain saw, leaving truth mutilated and amputated in its wake.

The leading government budgetary buzzword of the day is “austerity.” For several years, members of the European Union have either undergone austerity or been threatened with it – depending on whose version of events you accept. Now the word has crossed the Atlantic and awaits a visa for admission to this country. It has met a chilly reception.

In a recent (05/11/2013) column, economist Paul Krugman declares that “at this point, the economic case for austerity…has collapsed.” In order to appreciate the irony of the column, we must probe the history of the policy called “austerity.” Tracing that history back to the 1970s, we find that it was originated by Keynesian economists – ideological and theoretical soul mates of Paul Krugman. This revelation allows us to offer a theory about otherwise inexplicable comments by Krugman in his column.

The Origin of “Austerity”

The word “austerity” derives from the root word “austere,” which is used to denote something that is harsh, cold, severe, stern, somber or grave. When applied to a government policy, it must imply an intention to inflict pain and hardship. That is, the severity must be inherent in the policy chosen – it cannot be an invisible or unwitting byproduct of the policy. There may or may not be a compensating or overriding justification for the austerity, but it is the result of deliberation.

The word was first mated to policy during the debt crisis. No, this wasn’t our current federal government debt crisis or even the housing debt and foreclosure crisis that began in 2007. The original debt crisis was the 1970s struggle to deal with non-performing development loans made by Western banks to sovereign nations. At first, most of the debtor countries were low-income, less-developed countries in Africa and Latin America. Eventually, the contagion of bad loans and debt spread to middle-income countries like Mexico and Argentina. This episode was a rehearsal for the subprime-mortgage-loan defaults to follow decades later.

The original debt crisis was motivated by the same sort of “can’t miss” thinking that produced the housing mess. Sovereign nations were the perfect borrower, reasoned the big Wall Street banks of the 1970s, because a country can’t go broke the way a business can. After all, it has the power to tax its citizens, doesn’t it? Since it can’t go broke, it won’t default on its loan payments.

This line of reasoning – no, let’s call it “thinking” – found willing sets of ears on the heads of Keynesian economists, who had long been berating the West for its stinginess in funding development among less-developed countries. Agencies like the International Monetary Fund and the World Bank perked up their ears, too. The IMF was created at the end of World War II to administer a worldwide regime of fixed exchange rates. When this regime, named for the venue (Bretton Woods, New Hampshire) at which it was formally established, collapsed in 1971, the IMF was a great big international bureaucracy without a mandate. It was only too happy to switch its attention to economic development. By brokering development loans to poor countries in Africa, Central and South America, it could collect administrative fees coming and going – coming, by carving off a chunk of the original loan in the form of an origination fee and going, by either rolling over the original loan or reformulating the development plan completely when the loan went bust.

The reformulation was where the austerity came in. Standard operating procedure called for the loan to be repaid either with revenues from the development project(s) funded by the loan(s) or by tax revenues reaped from taxing the profits of the project(s). Of course, the problem was that development loans made by big bureaucratic banks to big bureaucratic governments in Third World nations were usually subverted to benefit leaders in the target countries or their cronies. This meant that there were usually no business revenues or tax revenues left from which to repay the loans.

Ordinarily, that would leave the originating banks high and dry, along with the developers of the failed investment projects. “Ordinarily” means “in the context of a free market, where lenders and borrowers must suffer the consequences of their own actions.” But the last thing Wall Street banks wanted was to get their just deserts. They influenced their colleagues at the IMF and the World Bank to act as their collection agents. The agencies took off their “economic development loan broker” hats and put on one of their other hats; namely, their “international economics expert advisor” hat. They advised the debtor country how to extricate itself from the mess that the non-performing loan – the same one that they had collected fees for arranging in the first place – had got it into. Does this sound like a conflict of interest? Remember that these agencies were making money coming and going, so they had a powerful incentive to maintain the process by keeping the banks happy – or at least solvent.

Clearly, the Third World debtor country would have to scare up additional revenue with which to pay the loan. One possible way would be to divert revenue from other spending. But the agency economists were Keynesians to the marrow of their bones. They believed that government spending was stimulative to the economy and increased real income and employment via the fabled “multiplier effect,” in which unused resources were employed by the projects on which the government funds were spent. So, the last thing they were willing to advise was a diversion of spending away from the government and into repayment of debt. On the other hand, they were willing to advise Third World countries to acquire money to spend through taxation. If government were to raise $X in taxes and spend those $X, the net effect would not be a wash – it would be to increase real income by $X. Why? Because taxation acquires money that private citizens would otherwise spend, but also money that they would otherwise save. When the entire amount of tax revenue is then spent by government, the net effect is to increase total spending – or so went the Keynesian thinking. One of Keynes’ most famous students, Nicholas Kaldor, later to become Lord Kaldor in Great Britain, complained in a famous 1950s article: “When will underdeveloped nations learn to tax?”
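
The fiscal arithmetic in this paragraph is the textbook “balanced-budget multiplier.” A minimal sketch, using exact fractions and a hypothetical marginal propensity to consume of 0.8:

```python
# Textbook Keynesian multiplier arithmetic (illustrative, not an endorsement).
from fractions import Fraction

mpc = Fraction(4, 5)  # hypothetical marginal propensity to consume (0.8)

spending_multiplier = 1 / (1 - mpc)    # income change per $1 of spending: 5
tax_multiplier = -mpc / (1 - mpc)      # income change per $1 of taxes: -4

# Raise $X in taxes and spend all of it: taxes cut private spending by only
# mpc*X (the rest would have been saved), while government spends the full X.
# The multipliers sum to 1 for any mpc strictly between 0 and 1, so measured
# income rises by $X rather than washing out -- the claim in the text.
balanced_budget_multiplier = spending_multiplier + tax_multiplier
print(balanced_budget_multiplier)  # 1
```

Whether this arithmetic describes the real world is, of course, exactly what the rest of the article disputes.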

Thus, the development agencies kept a clear conscience when they advised their Third World clients to raise taxes in order to repay the debt incurred to Western banks. Not surprisingly, this policy advice was not popular with the populations of those countries. That policy acquired the descriptive title of “austerity.” Viewing it from a microeconomic or individual perspective, it is not hard to see why. By definition, a tax is an involuntary exaction that reduces the current or future consumption of the vict-…, er, the taxpayer. The taxpayer gains from it if, and only if, the proceeds are spent so as to more-than-compensate for the loss of that consumption and/or saving. Well, in this case, Third World taxpayers were being asked to repay loans for projects that failed to produce valuable output in the first place and did not produce the advertised gains in employment either. A double whammy – no wonder they called it “austerity!”

How austere were these development-agency recommendations? In Wealth and Poverty (1981), George Gilder offers one contemporary snapshot. “The once-solid economy of Turkey, for example, by 1980 was struggling under a 55 percent [tax] rate applying at incomes of $1,600 and a 68 percent rate incurred at just under $14,000, while the International Monetary Fund (IMF) urged new ‘austerity’ programs of devaluation and taxes as a condition for further loans.” Note Gilder’s wording; the word “austerity” was deliberately chosen by the development-agency economists themselves.

“This problem is also widespread in Latin America,” noted Gilder. Indeed, as the 1970s stretched into the 80s and 90s, the problem worsened. “[Economic] growth in Africa, Latin America, Eastern Europe, the Middle East and North Africa went into reverse in the 1980s and 1990s,” onetime IMF economist William Easterly recounted sadly in The Elusive Quest for Growth (2001). “The 1983 World Development Report of the World Bank projected a ‘central case’ annual percent per-capita growth in the developing countries from 1982 to 1995,” but “the actual per-capita growth would turn out to be close to zero.”

Perhaps the best explanation of the effect of taxes on economic growth was provided by journalist Jude Wanniski in The Way the World Works (1978). A lengthy chapter is devoted to the Third World debt crisis and the austerity policies pushed by the development agencies.

Two key principles emerge from this historical example. First, today’s knee-jerk presumption that government spending is always good, always wealth enhancing, always productive of higher levels of employment depends critically on the validity of the multiplier principle. Second, the original definition of austerity was painful increases in taxation, not decreases in government spending. And it was left-wing Keynesians themselves who were its practitioners, and who ruled out government spending decreases in favor of tax increases.

Fast Forward

Fast forward to the present day. Since the 1970s, the worldwide experience with taxes has been so unfavorable – and the devotion to lower taxes has become so ingrained – that virtually nobody outside of Scandinavia will swallow a regime of higher taxes nowadays.

Keynesian economics, thoroughly discredited not only by its disastrous economic development policy failures but also by the runaway inflation it started but could not stop in the 1970s, has emerged from under the earth like a protagonist in a George Romero movie. Its devotees still preach the gospel of stimulative government spending and high taxes. But they stress the former and downplay the latter. And, instead of embracing their former program of austerity as the means of overcoming debt, they now accuse their political opponents of practicing it. They have effected this turnabout by redefining the concept of austerity. They now define it as “slashing government spending.”

The full quotation from the Paul Krugman column quoted earlier was: “At this point, the economic case for austerity – for slashing government spending even in the face of a weak economy – has collapsed.” Notice that Krugman says nothing about taxes even though that was a defining characteristic of austerity as pioneered by development-agency Keynesians of his youth. (Krugman does not neglect devaluation, the other linchpin, since he advocates printing many more trillions of dollars than even Ben Bernanke has done so far.)

When Krugman’s Keynesian colleagues originated the policy of austerity, they did it with malice aforethought – using the term themselves while fully recognizing that the high-tax policies would inflict pain on recipients. Now Krugman projects this same attitude on his political opponents by claiming that not only does reduced government spending have harmful effects on real income and employment, but that Republicans will it so. The Republicans, then, are both evil and stupid. Republicans are evil because they “have long followed a strategy of ‘starving the beast,’ slashing taxes so as to deprive the government of the revenue it needs to pay for popular programs.” They are stupid because their reluctance “to run deficits in times of economic crisis” is based on the premise that “politicians won’t do the right thing and pay down the debt in good times.” And, wouldn’t you know, the politicians who refuse to pay down the debt are the Republicans themselves. The Republicans are “a fiscal version of the classic definition of chutzpah…killing your parents, then demanding sympathy because you’re an orphan.”

But the real analytical point is that Krugman, and Democrats in general, are exhibiting the chutzpah. They have taken a policy term originated and openly embraced not merely by Democrats, but by Keynesian Democrats exactly like Krugman himself. They have imputed that policy to Republicans, who would never adopt this Democrat policy tool because its central tenet is excruciatingly high taxes. They have correctly accused Republicans of wanting to reduce government spending but wrongly associated that action with austerity in spite of the fact that their Keynesian Democrat forebears did not include it in the original austerity doctrine.

Why have they done this? For no better reason than that they oppose the Republicans politically. Psychology recognizes a behavior called “projection,” the imputing of a detested personal trait or characteristic to others. Having first developed the policy of austerity in the late 1970s and seen its disastrous consequences, Democrats now project its advocacy on their hated Republican opponents. In Krugman’s case, there are compelling reasons to suspect a psychological root cause for his behavior. His ancillary comments reveal an alarming propensity to ignore reality.

Paul Krugman’s Flight from Reality

In the quoted column alone, Krugman makes numerous factual claims that are so clearly and demonstrably untrue as to suggest a basis in abnormal psychology. Pending a full psychiatric review, we can only compare his statements with the factual record.

“In the United States, government spending programs designed to boost the economy are in fact rare – FDR’s New Deal and President Barack Obama’s much smaller recovery act are the only big examples.” Robert Samuelson’s recent book The Great Inflation and Its Aftermath (2008) covers in detail the growth and history of Keynesian economics in the U.S. During the Kennedy administration, Time Magazine featured Keynes on its cover to promote a story conjecturing that Keynesian economics had ended the business cycle. Samuelson followed Keynesian economics and such luminaries as Council of Economic Advisors Chairman Walter Heller, Nobel Laureates Paul Samuelson and James Tobin through the Kennedy, Johnson, Carter and Reagan administrations. One of his major theses was precisely that Keynesian economists produced the stagflation of the 1970s by refusing to stop deficit spending and excessive money creation – a view that helped to discredit Keynesianism in the 1980s. There can be no doubt that U.S. economic policy was dominated by Keynesian policies “designed to boost the economy” throughout the 1960s and 1970s.

Moreover, every macroeconomics textbook from the 1950s forward taught the concept of “automatic stabilizers” – government programs in which spending was designed to automatically increase when the level of economic activity declined. These certainly qualify as “big” in terms of their omnipresence, although since Krugman is an inflationist in every way he might deny their bigness in some quantitative sense. But they are certainly government spending programs, they are certainly designed to boost the economy and they are certainly continually operative – which makes Krugman’s statement still more bizarre.

“So the whole notion of perma-stimulus is a fantasy… Still, even if you don’t believe that stimulus is forever, Keynesian economics says not just that you should run deficits in bad times, but that you should pay down debt in good times.” The U.S. government has had one true budget surplus since 1961, bequeathed by the Johnson administration to President Nixon in 1969. (The accounting surpluses during the Clinton administration years of 1998-2001 are suspect due to borrowing from numerous off-budget government agencies like Social Security.) This amply supports the contention that politicians will not balance the budget cyclically, let alone annually. European economies are on the verge of collapse due to sovereign debt held by their banking systems and to the inexorable downward drift of productivity caused by their welfare-state spending. Krugman’s tone and tenor implies that “Keynesian economics” should be given the same weight as a doctor prescribing an antibiotic – a proven therapy backed by solid research and years of favorable results. Yet the history of Keynesian economics is that of a discredited theory whose repeated practical application has failed to live up to its billing. Now Krugman is in a positive snit because we don’t blindly take it on faith that the theory will work as advertised for the first time and that politicians will behave as advertised for the first time. If nothing else, one would expect a rational economist to display humility when arguing the Keynesian case – as Keynesians did when repenting their sins in favor of a greatly revised “New Keynesian Economics” during the mid-1980s.

“Unemployment benefits have fluctuated up and down with the business cycle and as a percentage of GDP they are barely half what they were at their recent peak.” Unemployment benefits have “fluctuated” up to 99 weeks during the Great Recession because Congress kept extending them. The rational Krugman knows that his fellow economists have debated whether these extensions have caused people to stop looking for work and instead rely on unemployment benefits. Robert Barro says they have, and finds that the extensions have added about two percentage points to the unemployment rate. Keynesian economists demur, claiming instead that the addition is more like 0.4%. In other words, the profession is not arguing about whether the extensions increase unemployment, only about how much. Meanwhile, Krugman is in his own world, pacing the pavement and mumbling “up and down, up and down – they’re only half what they were at their highest point when you measure them as a percentage of GDP!”

“Food stamp use is still rising thanks to a still-terrible labor market, but historical experience suggests that it too will fall sharply if and when the economy really recovers.” Food stamp (SNAP) use has steadily risen to nearly 48 million Americans. Even during the pre-recession years 2000-2008, food-stamp use rose by about 60%. Thus, the growth of the program has far outpaced growth in the rate of poverty. The Obama administration has bent over backward to liberalize criteria for qualification, allowing even high-wealth, low-income households into the program. This does not depict a temporary program whose enrollment fluctuates up and down with economic change, but rather a tightening vise of dependency.

Krugman’s picture of a “still-terrible labor market” cannot be reconciled with his claim that government spending is an effective counter-cyclical tool. If Krugman’s reaction to the anemic response to the Obama administration’s economic stimulus is a demand for much higher spending, he will presumably pull out that get-out-of-jail-free card no matter what the effects of a spending program are. Why would much higher spending work when the actual amount failed? Krugman makes no theoretical case and cites no historical examples to support his claim – presumably because there are none. Governments need no urging to spend money – European governments are collapsing like dominoes from doing exactly that. European unemployment has lingered in double digits for years despite heavy government spending, recent complaints about “austerity” to the contrary notwithstanding.

“The disastrous turn toward austerity has destroyed many jobs and ruined many lives. And it’s time for a U-turn.” Keep in mind that Krugman’s notion of “austerity” is reduced government spending but not higher taxes. This means that he is claiming that taxes have not gone up – when they have. And he is claiming that government spending has gone down, presumably by a lot since it has “destroyed many jobs and ruined many lives.” But government spending has not gone down; only a trivial reduction in the rate of growth of government spending has occurred during the first four and one-half months of 2013.

“Yet calls for a reversal of the destructive turn toward austerity are still having a hard time getting through.” Krugman’s rhetoric implies that Keynesian economics is a sound, sane voice that cannot be heard above the impenetrable din created by right-wing Republican voices. As a rational Krugman well knows, the mainstream news media has long been completely dominated by the Left wing. (It is the Right wing that should be complaining because the public is unfamiliar with the course of economic research over the last 40 years and the mainstream news media has done nothing to educate them on the subject.) Its day-to-day vocabulary is permeated with Keynesian jargon like “multiplier” and “automatic stabilizers.” The rhetorical advantage lies with Democrats and Keynesians. It is practical reality that has let them down. The economics profession conducted an unprecedented forty-five year research program on Keynesian economics. Its obsession with macroeconomics led to a serious neglect of microeconomics in university research throughout the 40s, 50s and 60s. By approximately 1980, the verdict was in. Keynesian economics was theoretically discredited, although its theoretical superstructure was retained in government and academia. Even textbooks were eventually revised to debunk the Keynesian debunking of Classical economics. Macroeconomic policy tools were retained not because free markets were inherently flawed but because policy was ostensibly a faster way to return to “full employment” than by relying on the slower adjustment processes of the market. The reaction to recent “stimulus” programs has demonstrated that even that modest macroeconomic aim is too ambitious.

Keynesian economics has had no trouble getting a hearing. It has had the longest, fairest hearing in the history of the social sciences. The verdict is in. And Krugman stands in the jury box, screaming that he has been framed by conservative Republicans as the bailiffs try to remove him from the courtroom.

Memory records no comparable flight from reality by a prominent economist.

DRI-334 for week of 5-5-13: Economics and Geography: The Case of Africa

An Access Advertising EconBrief:

Economics and Geography: The Case of Africa

Economics is the social science dealing with human choice. Geography is the physical science that deals with the above-ground features of the Earth’s surface. The two are seldom mentioned in the same breath. Yet they work hand-in-hand. The subject matter of geography forms the objective physical, structural parameters with which economics must cope. Geography’s brute facts can mold, shape and manhandle economics to a stunning degree. No better example could be cited than the effect of Africa’s geography on its economic history.

The Geographic Dimensions of Sub-Saharan Africa

The continent of Africa encompasses a geographic kaleidoscope. The “Africa” of popular imagination and special economic interest lies to the south of the Sahara – the world’s largest desert, whose land area exceeds that of the continental United States. Sub-Saharan Africa stretches south to the tip of the Cape of Good Hope, with the Antarctic Ocean beyond. It abuts the Indian Ocean to the east and the Atlantic Ocean to the west.

The adjacency of oceans on three sides suggests that African economic history should be a tale of international maritime trade. Just the opposite is true – although at least three important trade lanes developed between Africa and the rest of the world, international trade was not a tremendous engine of economic development and growth in Africa. The noted economist Thomas Sowell found the source of this apparent paradox in two geographic drawbacks. First, winds and ocean currents near the African coast were (and are) among the world’s trickiest and most variable. Throughout most of world history, the expertise necessary to cope with them was absent. Second, the African coastline was and is mostly smooth and shallow, thus unsuitable for harbors. Ships had to anchor offshore and transfer cargo by boat – a time-consuming, cumbersome and costly method. This gave rise to one of the great economic-historical ironies, noted by Sowell: As a large fraction of world international trade passed back and forth from Asia around the Cape of Good Hope to the New World, it passed within hailing distance of the African sub-continent – but seldom stopped.

It is hard to overrate the importance of these factors for Africa’s economic development. (And while for most other purposes we would need to analyze African economic development by separating the continent into its constituent nations, this geographic analysis is better conducted by treating the sub-continent as a unitary whole.) These days, many people treat foreigners and foreign trade as unwelcome intruders in economic life, but throughout human history international trade has been the key to a better life for most people and nations. Many of the great nations, from Rome to Greece to Carthage to Phoenicia to Egypt to the modern European nations, were either trading civilizations or encouraged trade beyond national borders. Trade allows nations to consume a broader range and larger volume of goods than they individually produce. It also increases the amount, scope and accuracy of human knowledge. When deliberately imposed from within, trade deprivation is a form of self-starvation.

In Africa’s case, the effects of geography are directly analogous to those of anthropogenic taxes and quotas. Those effects were not limited to the coastlines. They were even more pronounced in the interior. To appreciate their effects, we can compare them with the effects of geography on economic development here in the U.S.

From the arrival of Europeans in North America just prior to and after 1600, rivers were the transportation arteries of choice for north-south (and some east-west) travel. Barge, keelboat and canoe were media of transport. Cities sprang up at the confluence of rivers and at convenient landing points. Today, the history of the nation’s major cities is writ in their rivers. Despite the plethora of new transport media, ranging from planes to trains to automobiles, river transport is still an important secondary source of freight transportation for goods whose ratio of bulk to value is high.

Africa has always had even more and bigger rivers than North America. But they have been a much smaller boon to her economic development, which has been drastically curtailed compared to that of the New World. The problem has been that African rivers are often unnavigable. Hard-core movie fans are familiar with the 1951 film The African Queen, starring Humphrey Bogart and Katharine Hepburn. The movie follows the adventures of a hard-drinking, World War I-era ship’s captain and a spinster missionary who set out on a long river journey aimed at locating and sinking a German steamer in Central Africa. The two must traverse the length of the unnavigable Ulanga River (also called the Bora) to reach the Kenyan lake where the steamer resides. The formidable river hazards they surmount form the basis of the movie’s plot. These include rapids, plagues of swarming insects, river animals such as hippos, inclement weather and the river itself – which eventually becomes so choked with reeds and vegetation that they have to climb into the water and pull their weatherbeaten tub of a craft through the muck. Hollywood films are legendary for mangling the truth, but in this case the screenwriter (and the novelist C.S. Forester, on whose book the film was based) hit the nail on the head.

In addition to the above-mentioned hazards to navigation, African rivers suffer the drawback that the African geologic structure is often mesa-like – plateaus followed by sharp dropoffs that form falls. (Indeed, sharp changes in altitude hinder mobility on land as well as water over most of the continent.) The result is a navigational nightmare; traversing a falls is not merely awkward but downright dangerous. The Zaire River is 2,900 miles long and contains a volume of water second only to that of the legendary Amazon River. But the Zaire’s succession of falls and rapids prevents entering ships from getting very far. This is typical – according to Sowell, “no river in sub-Saharan Africa reaches from the open sea to deep into the interior.” We must travel up the Mediterranean coast to the Nile to find a river that stretches inland. While the 1,500-mile total length of navigable water in the Zaire is impressive, this is not one continuous stretch but many discontinuous ones. For several centuries, a map maker attempting to travel the river’s entire length would have had to make repeated portages across land to bypass the unnavigable parts of the river. This was typical of Africa’s waterways.

When water transport is unavailable or infeasible, animals are the historical second-best means of transporting people or goods. In the U.S., oxen and horses pulled wagons and carried travelers on the westward migrations beyond the original 13 colonies. African settlers made similar attempts to employ draft animals but were thwarted by the tsetse fly, an insect pest that carries disease that is deadly to animals. Animal populations were so ravaged that human beings often stepped in as beasts of burden. The stereotypes of African males as jungle bearers on safari and females carrying loads on their heads were born of this necessity. But the tropical climate was not much friendlier to humans. In the 20th century, some 90% of all deaths from malaria occurred in sub-Saharan Africa.

Bacteria tend to flourish in the tropics because of dampness. Oddly enough, the moisture content is favorable to disease organisms but less so to agriculture. Even though total rainfall seems adequate, the boom-or-bust pattern of rainfall – torrential rains alternating with sizable periods of drought – is hard on soils. Drought bakes and hardens soils, enabling heavy rains to wash them away. This destroys valuable nutrients, damaging agricultural prospects. The use of fertilizers was long hindered by the dearth of animals – yet another point of unfavorable contrast with North America, where animals were a plentiful source of fertilizer.

While it is true that not all regions of sub-Saharan Africa suffered all these deficiencies simultaneously, virtually all areas suffered at least one of them. Normally, when some areas produce some things but sorely lack others, trade can make up for this by allowing each area to specialize in its comparative advantage good and trade a surplus of its good for the other things it lacks. But when transport between areas is absent or highly costly, the value of trade is greatly reduced.

The effect of transport costs is exactly analogous to that of a specific tax. Suppose that it costs $10 to transport a good from point A to point B. This drives a wedge between the price paid by the buyer of the good (located at B) and the price received by the seller (at A). This holds true regardless of who pays the transport costs; in fact, the welfare of buyer and seller is unaffected by the identity of the taxpayer. Suppose, first, that the buyer is responsible for paying the tax and that the final market price is $90. That means that the buyer pays $100 (the $90 market price plus the $10 tax) and the seller receives the market price of $90. Alternatively, suppose that the seller is responsible for paying the tax. We previously established the buyer’s willingness to pay $100 for the same quantity of the good – this time the buyer pays it all to the seller instead of paying $90 to the seller and $10 to the government (collected at the sales counter by the seller). Now the seller receives a larger market price of $100, instead of the $90 market price in the first example. But the seller must subtract the $10 tax paid to the government, so the seller nets only the same $90 as before. In both cases, the buyer pays $100 net and the seller receives $90 net. In public finance, this is called the equivalence theorem. But the same logic applies to transport costs. Both tax and transport costs deter economic activity because they reduce the gain to the seller and increase the sacrifice made by the buyer.
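The arithmetic of the equivalence theorem can be sketched in a few lines of code. This is an illustrative sketch only, using the numbers from the example above (a $10 per-unit tax and a buyer willing to pay $100 all-in); the function name and structure are my own, not a standard formula.

```python
# Sketch of the equivalence theorem: a $10 per-unit tax drives the same
# wedge whether the buyer or the seller is legally responsible for it.

def incidence(willingness_to_pay, tax, levied_on):
    """Return (buyer's net outlay, seller's net receipt) for one unit."""
    if levied_on == "buyer":
        market_price = willingness_to_pay - tax   # buyer adds the tax on top
        buyer_pays = market_price + tax
        seller_nets = market_price
    else:  # levied on the seller
        market_price = willingness_to_pay         # buyer pays it all to the seller
        buyer_pays = market_price
        seller_nets = market_price - tax          # seller remits the tax
    return buyer_pays, seller_nets

print(incidence(100, 10, "buyer"))   # (100, 90)
print(incidence(100, 10, "seller"))  # (100, 90)
```

Either way, the buyer parts with $100 and the seller keeps $90; substituting a $10 transport cost for the $10 tax leaves the arithmetic unchanged.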

Transport costs are ubiquitous. But they loom especially large in Africa. In African economic history, transport costs have been a figurative cross borne on the shoulders of the African citizen. They have severely limited both international and intranational trade.

African Trade

West Africa, including what eventually became the countries of Nigeria and Ghana, was least disfavored by nature and accordingly hosted several relatively prosperous civilizations. Because prospects for interior trade with other African nations were so poor, these nations increased their real incomes by regional warfare and international trade. Their higher standard of living allowed them to produce weapons of war with which to subjugate their neighbors and reduce their citizens to slavery. The Niger River provided one of the few navigable water routes leading to the ocean, which facilitated the export of slaves to Europe and the West Indies, whence they were often re-exported to North America.

Slaves were one of the few African export items because a slave was a very valuable capital good, capable of earning a decades-long stream of income for its owner. This future stream of income could be estimated, discounted to a present value using an interest rate and sold for a purchase price that capitalized that future stream of income into a current capital value for the seller. This made it worthwhile to incur the sizable costs of transporting imprisoned slaves across a vast ocean. Consequently, the Nigerian coast acquired the appellation of the “slave coast.”
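The capitalization logic described above can be illustrated with a short sketch. The figures here (annual income, horizon and discount rate) are hypothetical, chosen only to show how a future income stream is discounted to a present value.

```python
# Sketch of capitalization: discount a constant annual income stream
# back to a single present value at a given interest rate.
# All numbers are hypothetical illustrations, not historical data.

def present_value(annual_income, years, rate):
    """Present value of `annual_income` received at the end of each year."""
    return sum(annual_income / (1 + rate) ** t for t in range(1, years + 1))

# An asset yielding 100 units a year for 30 years, discounted at 5%:
pv = present_value(100, 30, 0.05)
print(round(pv, 2))  # about 1537 -- far more than any single year's income
```

The point is that even modest annual earnings, capitalized over decades, produce a purchase price large enough to cover very high transport costs.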

Another profitable export of this region was gold, which was mined in Ghana. Gold was (and still is) used for limited industrial and decorative purposes, but its primary value lies in its scarcity and acceptability as a medium of exchange and store of value. Investors bid up its price whenever money loses its value in exchange. Even today, the world’s entire physical stock of gold would fit into a single sanitary landfill. Mined gold resembles dust. This means that gold has an extremely high value relative to its physical bulk – the perfect kind of good to overcome the barrier erected by high transport costs. It is not surprising, then, that the Ghanaian coast acquired the nickname of the “gold coast.”

The third of Africa’s famous “coasts” was its “ivory coast,” located to the west of Ghana in West Africa. The ivory was obtained from the tusks of African elephants, hunted to near extinction because private ownership of elephants was mistakenly forbidden. (In the late 20th century, those African nations that experimented with allowing private ownership of elephants saw dramatic increases in elephant populations and successful control of poaching.) Ivory was greatly prized for a myriad of uses and elephants were extant only in Africa and India. Thus, elephant tusks also attained a high value relative to their substantial bulk.

Overall, this represented an incredibly meager showing for one of the world’s largest continents and populations. At the most, it produced prosperity for small African regions for limited historical periods. The slave trade was outlawed in the 19th century and this edict was policed by the British navy. The ivory trade was a self-limiting business, plagued by its illegal status and the short-sightedness of officialdom. Gold mining is limited by the stinginess of nature and the expense of extracting gold from the ground.

Alternative Explanations for Africa’s Lagging Economic Development

Africa has long been the poster child for the failures of economic development in the Less Developed Countries, or what was formerly called the Third World. Most of the blame for this failure was placed on the fact that, for comparatively short historical time periods, many African nations were colonies of European countries.

On its face, this seems an odd position to take. The theory of colonial immiseration – if it can be called that – apparently assumes that colonizers gained by withdrawing resources from colonies in some way analogous to that in which, let’s say, an embezzler gains by withdrawing funds from a successful company. But that misconceives not only the basic nature of trade between nations but also the stylized relationship between colonizer and colonized.

There is a theory that colonizers gained by imposing unfavorable terms of trade on their colonies and by substituting less efficient trade relationships for those that colonies would otherwise have developed with the rest of the world. But even if we subscribe to this, it does not imply that colonizers wanted to prevent or retard economic development in their colonies. Presumably, just the opposite was true, since development would enable the colonies to produce more and better goods for the colonizer to acquire via biased trade. And in fact, colonizers expended vast sums of time and money on attempts to promote development in the African colonies. If they failed, their failures seem small in comparison to the spectacular failures achieved by Western economists and development agencies like the World Bank and the International Monetary Fund after World War II.

Another oft-cited villain in African non-development is authoritarian political institutions. Doubtless, the fact that Africans exchanged colonial masters for home-grown despots in case after case is not only ironic but tragic, in view of the appalling human toll taken by famine, executions and all-around misery. But the question here is: Is despotism per se responsible for the lack of economic development in Africa? There is a very well-established relationship between political freedom and economic freedom, but the causality seems to run mostly in one direction – from economics to politics, not vice-versa. It seems reasonable to think that more democratic institutions would lead to fewer executions and political imprisonments and less repression in Africa. But Western nations have proven that democracy is fully compatible with economic serfdom and penury.

One of the most objectionable theories of African economic development is racial. It ascribes Africa’s development failures to the genetic inferiority of a predominantly black population. Since this theory offends current sensibilities, it seldom receives serious discussion. A dispassionate examination would cite, among other objections, the economic and intellectual success that the same genetic strains have achieved elsewhere in the world. Indeed, one of the most compelling counterarguments was played out in southern Africa itself, where the successful competition of poor black workers forced dominant white minorities to impose apartheid in order to protect white incomes. The same scenario was played out in the American South under the Jim Crow laws. If blacks are inferior in some economically meaningful sense, why do whites so often need the law to enforce economic protection against black competition?

That last example should click on the light of realization in our minds. Africa seems to be an object lesson in how badly a free market is needed. The African continent is home to less than 10% of the world’s population but over one-third of its languages. This cultural indicator reeks of economic and cultural isolation. In an America blessed with plentiful natural resources, navigable rivers, hospitable climate and a century’s worth of relatively benign colonial stewardship, some sort of economic development was virtually inevitable. Our experience with free markets was a huge bonus that made us the world’s leading economic power. Scandinavia, with its added advantages including complete cultural homogeneity, needed free markets even less. But nothing less than free markets would have sufficed to bring economic development to the Dark Continent.

Free markets do not work miracles; they merely permit the best to be made from available opportunities at any particular point in time. They also provide the widest scope for innovation and technological advancement over time. When nature has dealt you an inferior hand of cards, you can only make the optimal draw, then play those cards for all they are worth. Freedom and free markets are that optimal strategy for economic development.

Today, there are stirrings of economic development in Africa, as there are in longtime development laggards like China and India. The Economist has reported on the ability of individual African fishermen to use cellphones to check the market prices of their daily catch. At long last, technology is beginning to improve the bad hand that Africans have been dealt. Technology has been working its wonders for a couple of centuries in the West. Now free markets are bringing them to the poorest of the poor in the heart of Africa.

DRI-336 for week of 4-7-13: ‘The Pattern of the Anointed’ Strikes Again

An Access Advertising EconBrief:

‘The Pattern of the Anointed’ Strikes Again

How many times have you seen it happen? The intelligentsia and the mainstream news media discover a crisis. It starts as a single news story or a documentary. Gradually it builds into the crisis of the week, or the month. Eventually, there is a consensus – this is a disaster, or an epidemic, or malaise, or an Armageddon in the making. The remedy is a program, or a national effort, or the moral equivalent of war. Only full-bore, full-speed-ahead action by the federal government can solve the problem.

Agencies are created and staffed. Programs are created, legislated and implemented. The federal government spends oceans of money. What happens? For a while – nothing. Then, slowly and almost imperceptibly at first, but soon clearly and gnawingly… the problem gets worse.

Ultimately, we find out that the problem was actually getting better all along – until the federal government acknowledged it and tried to solve it. This reversed the pattern of improvement and got things headed in the wrong direction. And they stayed that way.

After watching this sequence of events more than once, it probably occurred to you that it was more than just happenstance. You may even have contemplated announcing your observations to the world in the form of a theory about how government works – or fails.

Congratulations. You have undergone the scientific experience known as independent discovery. Unfortunately, you were too slow in getting your ideas down on paper, so you won’t be able to claim credit for them. That belongs to the great black economist, Thomas Sowell.

Sowell called his discovery “The Pattern of the Anointed.”

The Pattern of the Anointed

Sowell identified what he considered “the prevailing vision of our time” in social theory. The intelligentsia – consisting of leading figures in academia and the mainstream communications media – has been anointed, by itself and the political left, as the agenda-setter for big government. This visionary process has four stages.

In Stage 1, a Crisis is identified. “Some situation exists, whose negative aspects the anointed propose to eliminate…even though all human situations have negative aspects, and even though evidence is seldom asked or given to show how the situation…is either uniquely bad or threatening to get worse. Sometimes [it] has in fact already been getting better for years.”

In Stage 2 comes the Solution. “Policies to end the ‘crisis’ are advocated by the anointed, who say these policies will lead to beneficial result A. Critics say that these policies will lead to detrimental result Z. The anointed dismiss these latter claims as absurd and ‘simplistic,’ if not dishonest.”

Stage 3 brings the Results: “The policies are instituted and lead to detrimental result Z.”

Of course, this is not the end of the process. In Stage 4, we witness the Response: “[Predictors of] detrimental result Z …are dismissed as ‘simplistic’ for ignoring the ‘complexities’ involved, as ‘many factors’ went into determining the outcome. The burden of proof is put on the critics to demonstrate to a certainty that these policies alone were the only possible cause of [the result]. No burden of proof whatsoever is put on [proponents of the Solution]. Indeed, it is often asserted that things would have been even worse, were it not for the wonderful programs that mitigated the inevitable damage from other factors [emphasis added].”

Sowell observed that evidence for his theory was “abundant.” He cited three well-known examples.

The War on Poverty

The “War on Poverty” is associated with the “Great Society” programs of the Lyndon Johnson administration, but the enabling legislation dates back to 1962, during the Kennedy administration. From the beginning, it was advertised as a means to end or prevent dependence on federal-government welfare programs. The emphasis was on “prevention and rehabilitation,” not on recruiting more recipients for the dole. And a by-product of success would be reduction in federal spending on welfare programs. “Make taxpayers out of tax eaters” and “give a hand, not a handout” were two of the many slogans that flavored War propaganda.

As time went on, the War accumulated subsidiary themes like barnacles adhering to the hull of an aging ship. One such theme was the preemption of urban violence through alleviation of poverty. The racial disorder and riots of the late 1960s provided a convenient setting for “civil rights leaders” to demand more government programs to ward off a “long, hot summer” of violence.

A minority of right-wing critics, the most visible and insistent being Sen. Barry Goldwater, predicted that the War would be a losing venture. It would promote dependence on government and encourage violence by rewarding it. It would divide Americans into those benefitting from government subsidy and those paying the subsidies; i.e., net recipients and net payers. And veteran observers of government noted that since the cost of government only rose, never declined, the War was almost certain to produce higher, not lower, welfare expenditures.

In retrospect, Sowell pointed out, the most eye-opening analytical aspect of this debate was the complete failure to check carefully the actual state of poverty and its historical trend. As of Lyndon Johnson’s ascension to the Presidency in 1963, the number of people living in poverty had been declining since 1960. This was a residue of the end of the 1958 recession. From 1950 to 1965, the officially poor declined by approximately one-third – without taking into account any benefits they received from government.

Oops. So much for the crisis of poverty. How about the War itself? Did it succeed or fail in reducing poverty? What about the aim of reducing dependency on government?

The War picked up momentum with the addition of reinforcements in the form of a growing roster of federal programs. Even though the federal Office of Economic Opportunity, the War’s official HQ, disbanded in 1974, the War did not end. The number of people on public assistance doubled between 1960 and 1977. Expenditures on public housing increased by a factor of five; food-stamp expenditures increased by a factor of ten. In-kind government benefits increased by a factor of twenty. As a percentage of Gross National (now Domestic) Product, federal social-welfare spending rose from 8% to 16%.

The Nixon administration changed the approach by officially declaring victory in the War on Poverty and demobilizing by disbanding OEO in 1974. In fact, they simply adopted a new approach – federalism and block grants given to states. One salutary effect of the change was a decline in urban violence, which stopped completely in the Reagan administration.

By 1992, there were more people officially in poverty than there had been at the War’s start.

Today, of course, over forty million people receive food stamps. Over half of American households receive federal benefits of some kind. Dependency on government is at an all-time high.

This summary of results runs diametrically opposite to the predictions of the anointed and roughly consistent with the predictions of critics. Yet the vision itself remained “hermetically sealed off from the contaminating influence of facts,” in Sowell’s words. Rather than acknowledge this failure, proponents of the War shifted the terms of the debate by citing the number of those who receive (and continue to receive) benefits as the criterion of success. “The goal was redefined as reducing poverty by redistributing resources.” Present-day defenders of the War, like Sheldon Danziger of the Institute for Research on Poverty, claim that “I think we’d have poverty rates over 25%” if not for the agglomeration of federal anti-poverty programs.

“In short, no matter what happens, the vision of the anointed always succeeds, if not by the original criteria, then by criteria extemporized later,” Sowell ruefully concludes.

Sex Education

Another product of the crusade-happy decade of the 1960s was sex education. Because the founding intentions of this reform were so completely at odds with its results – and with its current intentions – it is necessary to revisit the origins of this staple of public education.

A 1968 article in Education Digest declared that “contraception education and counseling is now urgently needed to help prevent pregnancy and illegitimacy in high-school girls.” This reinforced prior Congressional testimony by the head of Planned Parenthood that the purpose of sex education was to “assist our young people in reducing the incidence of out-of-wedlock births and early marriage necessitated by pregnancy.” Reduction of venereal disease was another commonly cited rationale for sex education.

How would sex education accomplish these goals? From today’s perspective, the answers seem astonishingly vague. Boys “will find decreased need for casual, irresponsible and self-centered experimentation with sex,” was the prediction of one academic, a so-called “Professor of Family Life” (!). It was frequently reiterated that girls became pregnant through ignorance, panic and meek submission; these would be counteracted by sex education. Exactly how this would happen, though, remains mostly a mystery to this day. It is apparent that the stated intentions of the anointed were accepted as sufficient collateral to warrant the results of their proposed solutions – the same attitude that prevails today.

Critics feared for the moral health of the nation. They predicted that sex education divorced from moral instruction would produce effects opposite to those desired and predicted by proponents; that is, more pregnancy, illegitimacy and venereal disease. And for their pains, they were stigmatized and demonized as sexually phobic religious fundamentalists and fanatics – and worse.

Once again, apparently nobody thought to actually investigate the factual extent of this “crisis” before enlisting the federal government to alleviate it. As of 1968, the fertility rate of teenage girls had been declining for over a decade, since 1957. The rates of both syphilis and gonorrhea – the two main venereal diseases in those days – fell throughout the decade of the 1950s.

Yet once again, the solution to the non-existent crisis was massive federal-government spending and interference with the private sector and the federal system. This took two main forms: federal aid to public schools to fund sex-education curricula and federal aid to “family-planning” clinics. Even as early as 1968, sex education programs were found in almost half of all public schools. The concept grew like Topsy until it became omnipresent. Family-planning clinics grew in tandem.

The results of this anointed vision have been unqualified, unshirted disaster – perhaps the most ghastly of all the visionary fiascos foisted on the American public. Pregnancy rates among young girls rose by close to 30% in each of the first two decades after 1968 – despite a doubling of abortions during the same time period. Indeed, abortions soon outpaced live births among young girls. Surveys found a higher percentage of (unmarried) girls between 15 and 19 engaging in sex in 1976 than had been true when the big push for sex-ed began.

Sargent Shriver was a high priest of the anointed – the first head of the OEO and a leading sex-ed supporter. In an unusual mea culpa, he testified before Congress in 1978 that “just as venereal disease has skyrocketed 350% in the last 15 years when we had more clinics, more pills, and more sex education than ever in history, teenage pregnancy has risen.” Ensuing decades saw the implications of these trends worsen with the emergence of new sexually transmitted diseases like HIV and HPV.

Illegitimacy became epidemic. Rates that – in the 1960s – had driven Daniel Patrick Moynihan to fear for the health of the black population were now far exceeded among blacks, Latinos and whites alike as the new millennium dawned. Illegitimacy rates exceeding 70% for blacks, 50% for Hispanics and 25% for whites would have been considered unimaginable at the dawn of the “crisis.”

As if this weren’t scandalous enough, the response of the anointed rivals the results in its breathtaking horror. The rises in pregnancy, illegitimacy, abortion and venereal disease were broadly ignored. If mentioned at all, they were treated as prima facie evidence of the need for more spending and more programs. Critics were demonized even more strongly as opponents of democracy.

But when speaking to themselves, away from the glare of publicity, the anointed have shifted sex-ed’s focus away from controlling social pathologies and towards encouraging “healthy attitudes about sex and sexuality.” Of course, the definition of “healthy” is the exclusive province of the anointed. Thus, The Journal of School Health rephrases the goal of sex education as “an exciting opportunity to develop new norms.” Sowell correctly deduces that the only purpose behind beginning sex education in kindergarten must be to accomplish the longer, more tenuous goal of indoctrination rather than the more basic program of biological instruction.

Sowell provides an example of this indoctrination at work: a “popular sex instructional program for junior high-school students, aged 13 and 14,” which “shows film strips of four naked couples, two homosexual and two heterosexual, performing a variety of sexually explicit acts.” Not surprisingly, the accompanying teaching materials warn teachers not to show these films to parents or friends so as not to “evoke misunderstanding and difficulties.” What other curriculum prescribes a course of study for students that is not supposed to be available for review by parents? Yet parents who objected to these materials were demonized as “fundamentalists” and “right-wing extremists.” Sowell is quick to remind his readers that this episode, though not typical, is also not rare; similar ones have popped up throughout the U.S.

The response of the anointed is that the typical parent is “either uninformed or too bashful to share useful sexual information with his child.” The direction of these efforts is all too clear: the anointed wish to establish the State in loco parentis, as bearing primary child-rearing responsibility.

The Rights of Criminals and Criminal Suspects

The 1960s also saw a revolution in criminal justice. It is best characterized as an attitude toward crime and punishment. Sowell dissects this attitude with surgical skill, dubbing it the “therapeutic approach.” Several highly placed figures in the judicial system, among them Supreme Court Chief Justice Earl Warren, Attorney General Ramsey Clark and Chief Judge of the D.C. Circuit Court of Appeals David Bazelon, believed that crime was primarily the fault of the law-abiding population rather than criminals. The criminal is “like us, only somewhat weaker;” we imprison criminals out of a “highly irrational… need to punish,” which is a “primitive urge” motivated by “childish fear” (Bazelon).

Alas, the “dehumanizing process” of imprisonment only produces “social branding” and “social failure.” Instead, we need to “[turn] all jails …into hospitals or rehabilitation centers,” which employ “psychiatric treatment” using “new, more sophisticated techniques” (Bazelon). Long prison sentences “will not reduce crime” (Clark).

These men sought to solve this crisis through constitutional interpretation. In particular, they broadened the application of the Bill of Rights from federal law to state and local law and broadened its meaning from a charter of liberties or freedoms to a list of powers or immunities.

In Mapp vs. Ohio (1961), the U.S. Constitution’s 4th Amendment provision barring “unreasonable search and seizure” was broadened to apply to state law as well as federal law. It was interpreted to exclude illegally obtained evidence from trial – thus, the shorthand term “exclusionary rule,” which came to describe its defining point of law. After Mapp, ironclad evidence of guilt was ignored if it was (say) obtained via a warrantless search of a suspect’s premises.

In Gideon vs. Wainwright (1963), a criminal defendant’s right to representation at trial was made absolute, so that indigent defendants were guaranteed a right to state-appointed and compensated counsel.

Escobedo vs. Illinois (1964) invoked the 6th Amendment to broaden this right to apply during a suspect’s interrogation by the police. Thus, confessions obtained during a custodial interrogation (that is, when a suspect was held prior to indictment) were invalid if the suspect was not allowed to confer with his attorney.

Miranda vs. Arizona (1966) essentially superseded Escobedo by invoking the 5th Amendment provision against self-incrimination in place of the 6th Amendment to require that suspects be informed of their right to an attorney, the right to confer with the attorney and their right to avoid self-incrimination prior to interrogation; i.e., immediately upon arrest and detention. In order to confess to a crime, a suspect must first waive his Miranda rights. Any confession not complying with these stipulations was invalidated and inadmissible at trial.

Dissenters in each of these opinions, who comprised the first line of critics of this revolutionary approach to criminal justice, included distinguished jurists like Potter Stewart and Byron White. Apart from the various points of constitutional law, which centered on the departure of these decisions from the original intent of the Framers, the dissents stressed the highly adverse effects the decisions would have on the incidence of crime and violence and the administration of criminal justice.

The results in ensuing years amply bore out those fears. To quote Sowell: “Crime rates skyrocketed. Murder rates shot up until the murder rate in 1974 was more than twice as high as in 1961. Between 1960 and 1976, a citizen’s chances of becoming a victim of a major violent crime tripled. The number of policemen murdered also tripled during the decade of the 1960s. Young criminals, who had been especially favored by the new solicitude, became especially violent. The arrest rate of juveniles for murder more than tripled between 1965 and 1990, even allowing for changes in population size.”

One point not raised by Sowell that deserves mention was the virtual abolition of capital punishment by the Supreme Court in the early 1970s. Economists have studied capital punishment for decades and firmly disagree with the conventional thinking that it does not deter murder. The increase in murders in this time period closely followed the judicial moratorium on capital punishment.

The response of the anointed followed two lines. The first was to stress confounding factors like education. The second was to accuse critics of using a call for “law and order” as code language for racism; i.e., suppression of constitutional rights for blacks. The “thinking” behind this accusation was that blacks were disproportionately perpetrators of crime; therefore they would be disproportionately affected by procedures making it harder for criminals to escape punishment for their crimes.

Sowell found one response particularly worthy of notice. Chief Justice Warren found complaints about rising crime to be “self-righteous indignation” based on “oversimplification.” Rather than attribute the surge in crime to the new criminal-justice policies, Warren claimed that “all of us must assume a share of the responsibility” since “for decades we have swept under the rug” the environmental conditions that bred crime – slums, poverty, and the like.

The problem with this theory of crime causation, as Sowell pointed out, is that the U.S. murder rate fell steadily after 1934 (that is, after the repeal of Prohibition), throughout the remainder of the 1930s, the 40s and the 50s. In 1960, the murder rate was less than half of what it had been in 1934. Yet according to Warren, this was precisely the time period in which Americans were sweeping the ostensible behavioral causes of crime under the rug. Crime rates should have been exploding – if Warren’s hypothesis was correct.

Sowell was content to document the Pattern’s effects on criminal justice. We should carry the analysis further to understand why the Warren Court went wrong.

The left-wing judiciary viewed themselves as anointed spokesmen for freedom. Like the Left’s founding philosopher, John Dewey, they confused freedom with power. Freedom or liberty is the absence of external constraint; power is the ability to compel obedience or to command real resources. A valid right can be exercised without depriving anyone else of his rights. The Declaration of Independence and Constitution (including the Bill of Rights) are charters of liberty, not enumerations of powers. They limit the powers of government as a way to secure our freedom; they do not list our freedoms.

By conferring powers on criminals and suspects, the Warren Court judicial reforms perverted the Bill of Rights by reducing the powers of law-abiding citizens. By increasing the real incomes of criminal suspects via guaranteed representation, they reduced our real incomes and happiness. Instead of treating criminal justice as a process for determining guilt or innocence, they treated it as a game in which criminals and innocent suspects deserved the same chance of “winning;” i.e., escaping unscathed. The presumption of innocence in a criminal-justice sense was subtly altered to a presumption of innocence in a moral sense. The real income transferred to criminals inevitably came at the expense of law-abiding citizens; it could not be otherwise because that real income had to come from somewhere.

Other Examples of the Pattern

Thomas Sowell claimed that the Pattern of the Anointed was ubiquitous; examples were “abundant.” Without straining unduly, we can call others to mind.

In earlier works, Sowell himself marshaled evidence for the Pattern. He cited the landmark civil-rights case Brown vs. Topeka Board of Education, which reversed the longstanding presumption in favor of racial segregation in public schooling. Brown overturned the “separate but equal” doctrine that had previously ruled, making the argument that “separate is inherently unequal.” It has long been assumed that progress toward equality between blacks and whites dated from this decision. Subsequent federal programs such as “affirmative action” escalated the goals of federal policy from promoting equality to conferring special privileges on blacks.

Any discussion of equality should distinguish between ex ante equality (equality of opportunity) and ex post equality (equality of result). Indeed, early federal policy was oriented toward opportunity, emanating from bodies such as the Equal Employment Opportunity Commission (EEOC). But one of the most interesting outcomes of Sowell’s early research was the realization that free markets could produce equalizing results even if equal opportunity was formally lacking.

“As far back as the First World War,” Sowell discovered, “black soldiers from New York, Pennsylvania, Illinois and Ohio scored higher on mental tests than white soldiers from Georgia, Arkansas, Kentucky and Mississippi.” This was not a temporary aberration. “During the 1940s, black students in Harlem schools had test scores very similar to those of white working-class students on the lower east side of New York.” While segregation often produced black public schools that were grossly inferior to their white counterparts, other black schools like Dunbar High School in Washington, D.C., were among the country’s finest secondary schools. “As far back as 1899, [Dunbar] had higher test scores than any of the white schools in Washington, and its average IQ was eleven points above the national average in 1939 – fifteen years before the Supreme Court declared such things impossible.” The common factor behind all these results was that economic incentives and freedom of migration allowed blacks to migrate out of the American South and into the Northeast, thereby allowing them to profit from better schools and economic opportunities there.

Sowell showed that the trend toward equality between white and black incomes began well before Brown, let alone later civil rights legislation and affirmative-action legislation. Indeed, the rate of black advance slowed during the later civil-rights era, rather than speeding up. Once again, free markets rather than government proved to be the effective agent for beneficial social change and economic growth. Once again, we learned in retrospect that the alleged crisis justifying massive government intervention was, in reality, an improving situation before the government intervention – but it became worse afterward.

Coming Soon – The Pattern of the Anointed Strikes Again

With practice, we can learn to anticipate the Pattern. The next EconBrief will reveal the Pattern of the Anointed underway again today.

DRI-326 for week of 3-31-13: The Kansas City Star Meets Flexible Baseball-Ticket Pricing

An Access Advertising EconBrief:

The Kansas City Star Meets Flexible Baseball-Ticket Pricing

Economics is the formal logic of human choice. Newspapers report human affairs. Reporting the news affords endless scope for economics as a tool of explanation and analysis. Yet newspapers are notorious for their ignorance and mishandling of economics. Why?

One possible answer is deliberate misrepresentation and concealment of facts by the papers for ideological reasons. Another is simple error. The latter hearkens back to the old maxim, “Never ascribe to venality that which can be explained by mere stupidity.”

Whatever the cause, examples of this phenomenon abound. A recent front page of the Kansas City Star offers fresh evidence of it. The subject is the pricing of baseball tickets by the Kansas City Royals.

Major-League Baseball Meets “Dynamic Pricing”

“Get Set for Big Swings,” shouted the front-page headline of the Star on Sunday, March 31, 2013. An overhead explained: “Royals Ticket Prices: Like airfares and hotel rates, they will fluctuate.” The subhead continued with: “Dynamic pricing, a fixture in the travel industry and growing more common in the entertainment world, has come to Kauffman Stadium. Below are prices for the same outfield seat to see the Royals in their first week at home – as of now.” The graphic chart showed a $54 price for the sold-out home opener on April 8, followed by prices ranging from $23 to $31 to the identical seat for subsequent games that week.

The article underneath, written by veteran staffer Mike Hendricks, contrasts the age-old procedure of fixed seasonal pricing for Kansas City Royals’ baseball games with its successor. So-called “dynamic pricing” is familiar to contemporary shoppers for airline and hotel reservations. Prices can fluctuate from day to day instead of from one season to another. Moreover, these daily fluctuations are not uni-directional; they will move up and down. That is something new for baseball fans – for decades, the only changes in official ticket prices have been upward ratchets from one season to the next.

Economists will immediately recognize that the term “dynamic pricing” is a misnomer – probably owing to (bad) advertising psychology. The precise descriptive term is “flexible pricing.” It implies the actual state of affairs, in which prices are responsive to changes in consumer demand. Failure to recognize and report this misnomer is the first of many depredations committed by the author of this piece.

The headline – “Get Set for Big Swings” – embodies a longtime Kansas City Star tradition: promising revelations that the accompanying article does not deliver. This constitutes lying to the reader. It is reasonable to suppose that Star readers resent being lied to and that this has contributed to the precipitous declines in the paper’s circulation and consequent ad revenue. The only “big swing” in price cited in the article occurs between opening day and succeeding games. One of the safest predictions about any Royals season is that the opening-day game will sell out and that attendance will immediately plummet thereafter. Given flexible pricing, it is therefore axiomatic that opening day will command a high price and that the price will thereupon fall. Maybe there will be “big swings” later in the season, maybe not. But the author doesn’t say that and offers no evidence that it will happen.

By any reasonable standard of journalism, this article is off to a miserable start.

Flexible Pricing of Baseball Tickets

The author’s vagueness on future price fluctuations is not surprising because his grasp of the basis for pricing is demonstrably shaky. Although the phrase “supply and demand” appears once in the article, its underlying logic is left to the reader’s imagination.

The importance of consumer demand to pricing is never mentioned, let alone explained. In this case, the supply of tickets is fixed – limited by the seating capacity of Kauffman Stadium. Thus, the economic logic of baseball ticket pricing comes straight out of the textbook diagram marked “Very Short Run,” in which the supply curve is a vertical line and price is completely determined by its intersection with the downward-sloping demand curve. In the very short run, economists teach, price is “demand-determined.”
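This demand-determined logic can be illustrated with a minimal numerical sketch. All figures below are invented for illustration (the linear demand intercepts, the slope and the round capacity number are assumptions, not data from the article): with a vertical supply curve, price is simply read off the demand curve at fixed capacity.

```python
# Very-short-run pricing sketch: supply is a vertical line at stadium capacity,
# so price is whatever the (here, linear) demand curve gives at that quantity.

def demand_determined_price(capacity, intercept, slope):
    """Inverse linear demand P = intercept - slope * Q, evaluated at fixed capacity.
    Floored at zero, since a negative price is meaningless here."""
    return max(intercept - slope * capacity, 0.0)

# Hypothetical demand curves for two games at the same (hypothetical) capacity:
# strong opening-day demand vs. weak midweek demand for the identical seat.
opening_day = demand_determined_price(capacity=38_000, intercept=130.0, slope=0.002)
midweek = demand_determined_price(capacity=38_000, intercept=99.0, slope=0.002)
```

With these invented numbers the opening-day price comes out around $54 and the midweek price around $23 – mirroring the spread the Star reported. Nothing about capacity changed; only demand did.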

Thus, price changes are caused by changes in demand. These are given very short shrift indeed by the author. His marquee explanation for the Royals’ new pricing strategy is that “the hotel and airline industries have used variable pricing strategies for years as a way to encourage customers to make their reservations early.” It is true that hotels and airlines do have one thing in common with baseball teams; namely, a fixed capacity (seating or lodging) that offers the constant incentive to keep capacity utilization as high as possible.

Hotels and airlines, though, commonly suffer the peak-load problem. Their capacity is insufficient to handle demand at its very highest point(s), but too great to utilize efficiently much – perhaps most – of the time. Since the late 1980s, the Royals have suffered from inadequate capacity about one day each season – opening day. In recent years, they have had a hard time giving away tickets to late-season games – and that is not hyperbole. In any case, baseball teams simply do not suffer the kind of scheduling problems endemic to the airline and hotel industries. Business travelers or vacationers on strict timetables are key components of airline and hotel demand, but much less important to baseball teams. Even allowing for the Royals’ atypical status as a regional franchise, buying weeks or months in advance usually provides little value to fans and little convenience to the team.

Why Now? The Timing of the Shift to Flexible Pricing

Mel Brooks’ famous protagonist Maxwell Smart on the classic TV series Get Smart once responded to a villain’s derisive defense “You’re not going to try to convict me on that flimsy evidence, are you?” with the rejoinder “No, I’ve got some more flimsy evidence.” Similarly, the author buttresses his non-explanation of Royals’ ticket pricing with more flimsy evidence. “Of all professional sports, major-league baseball teams have the greatest challenge in selling tickets, given the number of seats [and] games played,” gravely declares a “market analyst” employed by a ticket reseller.

But when baseball was truly America’s national pastime, its long season and big edge in games played was not viewed as a disadvantage. On the contrary, it was cited as a leading factor in the economic advantage enjoyed by baseball. Pro football, basketball and hockey were second- and third-string sports, miles behind baseball in income and prestige. Owners envied baseball its long season, which provided a tremendous opportunity to generate revenue. Baseball’s only rival as a leisure-time activity was the movies, which were probably the true national pastime.

No, the long baseball season is only a drawback when the team is a poor attraction. The year 1985 marked the Royals’ last post-season playoff appearance – they won the World Series by overcoming 3-1 deficits in both post-season playoffs – and they have threatened to return only in 1989, 1994 and 2003. They are the deadbeats of major-league baseball. Their 27-year absence from the playoffs is by far the longest of any team in North American professional sports.

Of course, this raises the question of why the Royals have chosen to introduce flexible pricing now, at this particular point in their history. As it turns out, it is not pure happenstance. Flexible pricing is one of various types of pricing alternatives to single pricing. The common feature behind all these is motivation – the seller’s desire to increase total revenue and profit by charging multiple prices rather than just one.

That motivation stems from more than merely the desire to profit from multipart pricing. Conditions have to be right in order for the alternative scheme to work. The different prices must be designed to gain from differing characteristics of different buyers or different conditions existing among the same buyers at different times. Either way, the firm must have the ability not only to identify the differences but to act upon them. When it does that, it is engaging in price discrimination.

Baseball teams already strive to segment different groups of buyers and charge them different prices to watch the same baseball game. That is the purpose behind different seat categories such as general admission, reserve seats, box seats, field level, upper level, stadium boxes and luxury suites. Each seat category is geared to a different category of buyer and priced accordingly. The general admission tickets are geared toward low-income fans and students. Outfield general admission is the farthest away from the action and is also geared toward the low-income fans who might otherwise not attend games if not for the affordability of a low price. Luxury suites are reserved for corporate clients and millionaires who can afford to plunk down five figures to reserve a season ticket in relative luxury. Box and reserve seats are targeted toward upper-middle-class fans who want a good seat and can afford to pay a price slightly above general admission.

This system has long been in effect in baseball and other sports. It is familiar throughout the entertainment industry. The Star article cites the symphony – an art form whose legendary disdain for solvency seemingly places it above the vulgar domain of commerce and profit. Yet the time-honored seating divisions separating dress circle, orchestra, ground floor, loge or mezzanine and balcony represent the same price-discrimination segmentation of demand practiced by sporting events.

Flexible pricing takes the idea of differential demand in a different direction. Rather than focusing on demand differences among consumers at the same point in time, it considers fluctuations in demand that affect all categories of buyers – but at different points in time. For example, instead of targeting different groups of buyers, segmented by income, it targets different games that support a higher price. These are late-season games when pennant races and individual honors such as batting championships and pitching titles are at stake. These games should command premium prices, as long as the team can stand up under pressure. For over two decades, the Royals did not play such games because they were never in contention that late in the season. Consequently, there was little purpose in setting up flexible pricing because the team would not benefit that much from flexibility. There was little additional pricing strategy the Royals could use to enhance their revenue; all they could do was get what little they could from the standard price-discrimination techniques. The introduction of inter-league play did briefly inject some novelty into the schedule, particularly by adding an interstate rivalry with the St. Louis Cardinals, the Royals’ 1985 World Series opponent. This allowed the team to give flexible pricing a tryout last year in Cardinals’ games.

But prior to the 2013 season, the Royals beefed up their pitching staff. They acquired ace starter James Shields and starter/reliever Wade Davis from the Tampa Bay Rays and starter Ervin Santana in another trade. This transformed the league’s worst pitching staff into a potentially serviceable one while retaining the team’s current offensive strength, spearheaded by All-Stars Billy Butler and Alex Gordon. Shields is currently pictured on Sports Illustrated’s cover, highlighting the magazine’s baseball pre-season issue. For the first time in years, the team seems able to contend for a playoff berth.

At last there is a prospect that late-season games may be competitively meaningful. Opening day may not be the only sellout game on the schedule this year. Thus, an effort to milk more box-office revenue from those games makes sense, since there is more potential revenue to seek.

In theory, flexible pricing benefits teams whenever there are substantial fluctuations in demand from game to game. Various factors other than competitive performance might influence the amplitude of demand over the course of a season. Weather is the most obvious; Kansas City is subject to cool springs, hot summers and brisk falls. A spate of unseasonably bad weather might give the team a chance to head off bad attendance by offering offsetting discounts to fans. Games in which the announced starting pitcher is a marquee player will generate stronger demand.
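The interplay of these demand shifters can be sketched as a baseline price scaled by game-specific factors. This is a hypothetical model, not the Royals’ actual pricing formula: the baseline price and every factor value below are invented, and a real system would estimate such factors from sales data rather than assigning them by hand.

```python
# Hypothetical flexible-pricing sketch: one seat category's baseline price,
# scaled up or down by multiplicative game-specific demand factors.

BASE_PRICE = 27.0  # assumed midweek baseline for the seat category (invented)

def game_price(pennant_race=1.0, weather=1.0, marquee_pitcher=1.0, rivalry=1.0):
    """Scale the baseline by demand factors (1.0 = neutral).
    Floor at half the baseline so discounts never get absurd."""
    price = BASE_PRICE * pennant_race * weather * marquee_pitcher * rivalry
    return round(max(price, BASE_PRICE / 2), 2)

# A September pennant-race game with an ace starting, vs. a cold April weeknight.
september = game_price(pennant_race=1.4, marquee_pitcher=1.15)
april = game_price(weather=0.8)
```

Under these invented factors the September game prices at roughly $43 and the chilly April game at under $22 – the up-and-down movement, in both directions, that distinguishes flexible pricing from the old one-way seasonal ratchet.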

But these subsidiary factors will become more important when core demand for tickets is strong. The improvement in the Royals’ competitive position was clearly the driving factor in the team’s change in pricing policy.

Baseball, Politics and the Star

One would suppose that an above-the-fold, front-page article would command the full attention and premium resources of a metropolitan newspaper. Yet none of the real considerations found their way into the Star’s story on the Royals’ ticket-pricing change. Aside from simple incompetence, how can we explain this?

The Star is a left-wing newspaper. That encompasses more than merely a capsule summary of its editorial stance. Ideology infects every aspect of the newspaper’s operations, from coverage to reporting to editorials to op-eds to advertising. It permeates not only the editorial page but the front page as well. It infiltrates the sports pages, the entertainment section and even the comics. It also affects how the paper treats the Royals.

Sports teams have grown accustomed to public subsidies. These take various forms. Most commonly, they include stadia built and maintained at taxpayer expense – including periodic repairs, refurbishment and reconstruction. That does not mean there is no quid pro quo, though. It is tacitly understood that the team and its employees are to back the multifarious public projects launched by the local political establishment with endorsements and campaign cash.

The newspaper, as the establishment’s informal public-relations and promotion agency, treats the Royals with due deference. The team is viewed as a kind of quasi-public utility – an economic and psychological necessity that is not so much too big to fail as too important to fail. The newspaper sees the team’s economic interactions as gifted with remarkable generative powers – multiplier effects and such – that are really beyond the reach of any mortal business firm. But the Royals have a tacit left-wing seal of approval, which means that they are assumed to be above such vulgar considerations as profit. That is why the economic rationale for flexible and multipart pricing never reaches the tender ears of Star readers.

To the Star, the Royals are not so much a sports franchise as a political franchise and ideological asset. No information potentially damaging or embarrassing to that franchise – no matter how newsworthy – will pass unfiltered through the Star to the general public.

How has the new pricing regime been received by fans? “So far there hasn’t been much of an outcry here or anywhere else.” (21 of the 30 major-league baseball teams have now adopted some form of flexible pricing, the article discloses.) Why not? Again, the author’s lips are sealed on this matter. But the answer is clear. The rise of ticket brokers and a legal secondary market for tickets, cultivated by firms like StubHub, has prepared the ground for flexible pricing. In other words, the free market is way ahead of Royals’ management. The author, a faithful Star minion, holds no brief for freedom or free markets and saw no reason to enlighten readers on this point.

The Economics of Flexible Ticket Pricing

The point of the Star’s story is obscure. The headline promises “big swings” in ticket prices, but the article doesn’t provide any, nor does it suggest any real basis for them. It seems clear that something pretty new and different has come to baseball ticket pricing in particular and to professional sports in general, but the author either doesn’t know what it is or doesn’t want to reveal it. At this point, it is necessary for economic logic to take the tiller of the story in order to bring us to a coherent destination.

Will flexible pricing produce higher or lower prices than the old seasonally fixed pricing method? The short answer is: Both. But that’s not a satisfactory answer. The precise answer is that price will be closely attuned to demand on a game-by-game basis, rather than a yearly basis. (We should bear in mind that there are as many separate “demands” as there are ticket categories – that was true under the old system and remains so under flexible pricing.) From a fundamental economic perspective, that is a good thing.

The article is woefully ambiguous on this point. It first informs us (correctly) that “the prices…will fluctuate day to day, and across all sections based on supply and demand.” (This is the article’s only reference to supply and demand.) It then continues by revealing that “fewer than half the seats in your average ballpark are occupied by fans who have bought season tickets,” thereby setting a “challenge for baseball clubs…to attract casual fans who want to see a game or two during the year.” And “free bobbleheads and ‘buck nights’ only go so far in building attendance numbers.” So far, so good – flexible pricing’s raison d’être is improving ballpark-capacity utilization.

Sure enough, a company called Qcue, headed by entrepreneur Barry Kahn, sold the San Francisco Giants on the concept of flexible pricing on a trial basis in 2009. It yielded a 20% increase in sales of the seats in sections picked for the trial. Today, the company works with two-thirds of major-league clubs and has achieved revenue increases of between 5% and 30%. “That’s ticket-revenue dollars, not an increase in the number of tickets sold. However, that tends to go up, too. Dynamic pricing doesn’t necessarily make it more affordable to attend a ball game than before, but it can.”

This burbling incoherence is typical Star analysis. If attendance is increasing across the board and the only thing that’s changed is prices charged, then the prices must be falling on net balance. That’s the Law of Demand at work. The questions are: What makes them fall? When do they fall? Do they ever rise? When is the best time to buy? And – the $64,000 question – is flexible pricing a good thing overall for baseball fans and for the rest of us?
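The logic can be made concrete with a toy model. The sketch below uses entirely hypothetical numbers (linear per-game demand, a 38,000-seat park, a $25 season-long price) and is not drawn from any actual Royals or league data; it simply shows how replacing one fixed price with game-by-game revenue-maximizing prices can raise revenue and attendance at once: a higher price for a marquee game, a lower one for a midweek game.

```python
def best_price(a, b=1000, capacity=38000):
    """Revenue-maximizing price for one game with hypothetical linear
    demand Q = a - b*P, respecting a hypothetical 38,000-seat park."""
    p = a / (2 * b)                      # unconstrained revenue maximum
    p_fill = max((a - capacity) / b, 0)  # price that exactly fills the park
    return max(p, p_fill)

def outcome(price, a, b=1000, capacity=38000):
    """Attendance and ticket revenue at a given price."""
    q = min(max(a - b * price, 0), capacity)
    return q, q * price

# Two hypothetical games: a weekend marquee matchup and a midweek game
games = {"marquee": 60000, "midweek": 30000}

fixed = [outcome(25, a) for a in games.values()]              # one season-long price
dynamic = [outcome(best_price(a), a) for a in games.values()] # game-by-game prices

for label, res in [("fixed", fixed), ("dynamic", dynamic)]:
    att = sum(q for q, _ in res)
    rev = sum(r for _, r in res)
    print(f"{label}: attendance {att:,.0f}, revenue ${rev:,.0f}")
```

In this contrived example revenue rises about 12.5% – within the 5%-to-30% range the article reports – and total attendance rises too, because the price cut on the low-demand game attracts more fans than the marquee-game increase turns away.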

The article implies that midweek games will carry a lower price tag. It is certainly true that, all other things equal, the demand is greater on weekends when kids and working parents are less encumbered by obligation. But that is a comparatively minor factor in segmenting demand.

High-demand games are special occasions – opening day, marquee players or teams appearing – and pennant-race games. A computer algorithm will alert team officials to opportunities for price increases, which will be implemented electively. It is these games in which Royals’ sales director Steve Shiffman’s advice to “buy early, save money” makes sense. Not only will buying early get the best price, it will also avert the possibility of a shutout; i.e., failure to “score” a ticket at all due to unavailability.

The rest of the time, buying early benefits the team, not the fan. A baseball ticket, like a stock option, an airline seat or radio advertising time, is a wasting asset whose value expires when the game’s first pitch is thrown. (More precisely, it plummets dramatically, expiring completely at about the fourth or fifth inning.) As game time nears, the holder will likely accept successively lower prices rather than see it expire unused. This is particularly true of sports teams, who have a vested interest in filling seats to increase the incomes of concessionaires. The rise of ticket brokers has complicated pricing for team management, who are extremely reluctant to stimulate price wars that lower seat prices too much. Thus, the Royals advertise the season-ticket-holder’s discounted single-game price as their rock-bottom price. But from the fan’s standpoint, there is no point in transacting before this price is offered and no reason to rush once it is in place – for garden-variety, low-demand games.

Thus, the brave new world of flexible baseball-ticket pricing does demand that fans distinguish between high-demand and low-demand games, in order to get the best price. But this should not tax the capabilities of any experienced fan or intelligent non-fan. As a practical matter, it will not severely disadvantage even the most incapable consumer until and unless the Royals become contenders.

Is flexible pricing economically efficient? Flexible pricing brings the number of tickets fans wish to purchase in each seat category closer to equality with the number available, using price as the coordinating mechanism. This is another way of saying that the amount of alternative consumption fans are willing to sacrifice to get a ticket (their demand for it) is closer to the amount they have to sacrifice (determined by the ticket price). Equality between those two things constitutes the famous economic condition called “equality at the margin.” It is one good way of defining economic efficiency. Thus, the verdict on flexible pricing and economic efficiency is favorable.

This is good for everybody because we all have a stake in using what we have to make each other as well off as possible. It’s good for taxpayers, too, since baseball is publicly subsidized – though the subsidies themselves do nothing to strengthen the case. In fact, the subsidies are inefficient and should be ended – that would make things even better. (Sports meet none of the textbook criteria for subsidy and none of the claims to economic exceptionalism advanced in their behalf.)

If prices sometimes go down but sometimes go up, how can we claim that fans, per se, are better off? Prices go up when people value a ticket more than they value the alternative consumption that the ticket’s price embodies. Flexible pricing enables us to sort out the cases when this is true from the cases when it isn’t true. In the old days, we needed illegal ticket scalpers to do that. Now ticket brokers can do it, but not as well as when the team gets involved in the process, too.

If the Royals benefit from flexible pricing, doesn’t this mean that fans must lose? Both entities can’t benefit at the same time, can they? The left-wing, socialist concept of exchange as a power relation implies that trade is a zero-sum game in which the gains of one party are the losses of the other. In reality, voluntary exchange benefits both parties, and when the gains from trade are increased the gain can be divided to benefit both traders. This needn’t be true in every transition from inefficient to efficient conditions, but there is no reason to doubt its occurrence here.

Perhaps the most concrete way to drive home the importance of this principle is by stressing the fact that the benefits of sports teams are heavily location-dependent. If the Royals move away from Kansas City and operate elsewhere, most of the benefits created by the team will flow to sports fans in that new location. Allowing the Royals to maximize the benefits they earn from the value the team itself actually creates will maximize the chances that the Royals continue to operate in Kansas City. The current system strives to keep the team in town by giving them subsidies extracted from non-fans based on phony economic value not really created. Baseball fans deserve to get the value they want and are willing to pay for – not value extorted from unwilling third parties who gain nothing from the team’s presence.

DRI-313 for week of 3-24-13: The Power to Tax

An Access Advertising EconBrief:

The Power to Tax

The long-running economics news story of 2013 has been the budgetary battle between the Obama Administration and Congressional Republicans. The most recent skirmish featured a clash between Senate- and House-approved budgets – that is, between Democrat and Republican pretenses to reform.

Both sides are pretending because neither side really wants to abandon big government and out-of-control spending. The Republicans are harder pressed because they have long given lip service to concepts of limited government and budgetary control. But both sides want to make a political show of deficit reduction. The Democrats are wedded to their political constituencies unto death and must fund the spending that supports them.

The Republican approach is to cut spending in order to lower government expenditures closer to revenue. The Democrat philosophy is to raise taxes to raise revenue to meet expenditures. The failure of either side to change its position significantly is presumably what the public means when it charges individual legislators with deliberately promoting gridlock and refusing to compromise.

Republicans are adamant in their unwillingness to raise taxes. This attitude has won them a public reputation for being unwilling to compromise. In recent years, the Republican reaction to public disapproval has been to retreat in confusion and dismay. This time, though, they remain intractable. Why are they so unwilling to raise taxes? What is the overarching purpose of a tax, anyway? How do taxes affect economic welfare and growth?

Taxation

Taxation is as old as civilization. Before democratic government, monarchs used it to extract wealth and income from their subjects. It has taken numerous forms, but the underlying principle invariably requires an involuntary levy or exaction paid to government by the governed.

One traditional form is a tax on either the production or consumption of a good or service. This is called an excise tax. This tax may consist of a fixed amount per-unit (a specific tax) or an amount expressed as a percentage of the selling price (an ad valorem tax, where the Latin phrase means “to the value”). It is a good place to start looking at taxes because its simplicity gives us a good look at the general principles of taxation.

The basic economic effect of a tax is to drive a wedge between the price paid by the buyer of the good and the price received by the seller. The result applies regardless of whether the tax is levied on the buyer or the seller. In fact, the resulting market price and quantity are the same regardless of who bears the nominal impact of the tax. This is referred to as the equivalence theorem; it is a fundamental principle of Public Finance, the economic sub-discipline under which taxation is studied.

The words “nominal impact” imply that the people who pay the tax may not necessarily be the ones who bear its real economic burden. This is correct. The ultimate end-in-view behind all economic activity is consumption, now or in the future. Only human beings can consume in this meaningful economic sense. Only human beings can suffer a loss of current or future consumption (e.g., savings). While a non-human entity like a corporation – recognized by law as a “fictitious person” – may pay a tax in the legal sense, it cannot bear the true economic burden or incidence of the tax.

Because market demand and market supply are each a function of price in the short and medium run, an excise tax affects both the quantity buyers wish to purchase and the quantity producers wish to produce and sell. This means that the incidence of the tax is shared by consumers and business owners.

Consider first the case in which buyers pay the tax. If the tax is (say) $2 per unit of the good, the market price (net of tax) that buyers are willing to pay for every quantity of the good is now $2 less, since their total outgo will include the market price plus the tax. That is, their demand for the good will fall. This will lower the market price, forcing producers to produce and sell a lesser quantity. Alternatively, suppose that producers are liable for the tax. Now their costs will rise by $2 per unit, decreasing supply and increasing price. Consumers will pay a higher price for the decreased quantity.

In either case, consumers will pay more than before the tax – in the first case, a lower market price plus the tax; in the second case, a higher market price inclusive of the tax as reflected in producers’ costs. In either case, producers will receive less than before the tax – in the first case, a lower market price; in the second case, a higher market price whose value is reduced by their higher costs due to the tax they owe. The equivalence theorem states that the buyer and seller pay and receive, respectively, exactly the same in the two cases no matter who “pays” the tax.
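The equivalence theorem is easy to verify with a worked example. The sketch below assumes hypothetical linear demand and supply curves and a $2 specific tax; it computes the equilibrium both ways and shows that the price buyers pay, the price sellers keep and the quantity traded are identical whichever side remits the tax.

```python
def equilibrium(a, b, c, d, side, t=2.0):
    """Linear demand Qd = a - b*Pb and supply Qs = c + d*Ps, where Pb is
    the price buyers pay (tax included) and Ps the price sellers keep.
    A specific tax t means Pb = Ps + t no matter which side remits it."""
    if side == "buyers":
        # quoted market price is what sellers receive; buyers add t on top
        p_market = (a - c - b * t) / (b + d)
        p_buyer, p_seller = p_market + t, p_market
    else:
        # quoted market price is what buyers pay; sellers remit t out of it
        p_market = (a - c + d * t) / (b + d)
        p_buyer, p_seller = p_market, p_market - t
    q = a - b * p_buyer
    return round(p_buyer, 2), round(p_seller, 2), round(q, 2)

# Hypothetical curves: Qd = 100 - 5*Pb, Qs = -20 + 10*Ps, tax = $2
print(equilibrium(100, 5, -20, 10, "buyers"))
print(equilibrium(100, 5, -20, 10, "sellers"))
```

Both calls print the same triple – buyer price, seller price and quantity – which is the equivalence theorem in miniature.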

In the long run, there is sufficient time for business firms to enter and (in this case) leave the market. Exit of firms tends to increase price by the full amount of the tax and drive profit toward the so-called “normal” level, at which owners receive a return just equal to what they could earn in the best alternative investment of equal risk. Thus, the long-run incidence of the tax may be shared by consumers and suppliers of inputs to the industry, or it may be borne by consumers alone.

Our excise-tax example illustrates general principles applicable to all taxes. Taxes discourage economic activity. They harm people on both sides of the market. Over and above this harm, they distort the prices faced by buyers and sellers, creating what public-finance economists call the “excess burden” of a tax.
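For a linear market of this kind, the excess burden can be computed directly: it is the familiar deadweight-loss triangle, half the tax times the reduction in quantity traded. The demand and supply parameters below are hypothetical, chosen only to make the arithmetic visible.

```python
def excess_burden(a, b, c, d, t):
    """Deadweight loss of a specific tax t in a linear market with
    demand Qd = a - b*P and supply Qs = c + d*P: half the tax times
    the drop in quantity traded (the welfare triangle)."""
    q_free = a - b * (a - c) / (b + d)    # quantity with no tax
    p_seller = (a - c - b * t) / (b + d)  # price sellers keep under the tax
    q_taxed = c + d * p_seller            # quantity with the tax
    return 0.5 * t * (q_free - q_taxed)

# Hypothetical market: Qd = 100 - 5*P, Qs = -20 + 10*P, $2 tax
print(round(excess_burden(100, 5, -20, 10, 2.0), 2))  # prints 6.67
```

This loss is over and above the revenue the tax transfers to government: it is value that simply vanishes because mutually beneficial trades no longer occur.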

Because taxes have these adverse effects, economics textbooks deem them a tool of limited resort. Some goods and services, such as national defense, cannot be produced and sold in private markets. These “public goods” must be produced and administered by government. To finance this activity, taxes are considered expedient.

In practice, however, public goods are very few in number, while government is pervasive. Taxes are numerous and lucrative sources of government revenue. Instead of a necessary evil, taxes have become a threat to our well-being. America today has become a locus classicus of the aphorism “the power to tax is the power to destroy.”

In the United States, the most familiar excise taxes have long been specific taxes on gasoline, cigarettes and alcohol. At the federal level, gas tax proceeds are devoted to maintenance of federal highways. Or rather, that was the original intention; today, about 40% of proceeds are diverted to earmarked programs unrelated to highways. Meanwhile, our roads and (especially) bridges have deteriorated markedly.

Maintaining vital infrastructure with a funding mechanism that is both ineffective and harmful to growth and prosperity seems quixotic. Recently, some state legislatures have begun to make long-term lease contracts with private firms who operate and maintain roads in exchange for the right to charge tolls and book the revenue. The companies have the strongest possible incentive to keep the roads in good condition and maximize their use. Other countries have already seized this chance to improve their transportation network by relieving government of a responsibility it handles badly.

We now shift from general principles of taxation to evaluation of particular types of tax.

Excise vs. Ad-valorem Taxation

A longstanding source of periodic irritation to Americans is the retail price of gasoline. The usual focus of anger is “the oil companies,” who are popularly supposed to possess monopoly power with which they earn “obscene profits” – modified to read “windfall profits” whenever an increase in gasoline prices accompanies an oil-related event on the national or international scene or “record profits” whenever a quarterly release of income-statement data from Exxon Mobil reveals that the company’s total net income has exceeded its previous high.

The complete lack of cogency in these complaints has been demonstrated time and again. Another recurring gripe, however, bears on the issue of taxation. Talk-show callers often gripe that gasoline sellers are quick to raise prices but slow to lower them – even when this appears justified by events. If price increases are merely supply and demand at work, they inquire heatedly, why does the process only work in one direction? Shouldn’t prices be just as quick to fall when supply increases, when costs decrease, when Middle-East tensions dissolve, when demand goes slack?

Nearly fifty years ago, the distinguished specialist in international trade and industrial organization, Richard Caves, pointed out the role played by specific excise taxation in pricing. To modify his example using fictitious numbers for convenience, suppose that the retail price of gasoline is $2 per gallon and the excise tax is $1. Now ponder the effects of a 10-cent price reduction by a seller. Since sellers remit the tax, the seller’s net receipts fall by 10% (10 cents as a percentage of the $1 the seller keeps). But the price faced by buyers falls by only 5% (10 cents as a percentage of $2). Thus, the purchasing response to a price reduction will be depressed, compared to the case where taxation is absent.

Now consider the opposite case, where price is increased. A 10-cent price increase will increase the seller’s net receipts by 10% while increasing the price faced by buyers by only 5%. It is the opposite situation to the price-decrease case. Specific excise taxation increases the incentive to raise the price of the good while reducing the incentive to lower price. In other words, it tends to create just the sort of world complained of by gasoline consumers – one in which sellers are relatively quick to raise price but slow to lower it!

As Caves mentioned, this flaw could be remedied by changing the specific excise tax to an ad-valorem tax, in which the tax is comprised of a fixed percentage of the good’s selling price. But this hasn’t happened in the 49 years since Caves wrote.

The excise taxes on cigarettes and alcohol have created additional problems by encouraging smuggling and illegal production to avoid payment of the taxes. Unlike the fuel tax, those taxes do not have a clear-cut rationale other than the raising of revenue. Lip service is given to the goal of discouraging smoking, but a truly prohibitive tax would be set high enough to persuade all smokers to quit. Since the tax is set well below this point, its purpose is presumably to raise revenue instead. Another oft-stated goal is to use tax proceeds to defray medical expenses attributable to use of the products, such as medical bills of lung cancer sufferers. Again, this ambition has not been fulfilled. The only reasonable explanation for the persistence of these taxes is to support government – not for any productive or valuable purpose, but merely to provide income for officials and employees.

Income Taxation

This year, the federal income tax celebrates its centenary. From its minuscule beginnings, the federal income tax code has grown into a monstrosity fed and cared for by a huge federal bureaucracy, the Internal Revenue Service. The top marginal tax rate began at 7%, has grown as high as 92% and currently resides at 39.6%.

But the most destructive thing about income taxes is not their height but the indirect costs they impose on all of us. These stem from the difficulty of defining, verifying and collecting taxable income. The continual additions and modifications to the tax code have made it a byzantine nightmare for preparers; it is proverbial that even pre-eminent experts cannot warranty their interpretations of its provisions. Each year, Americans spend a measurable chunk of Gross Domestic Product on federal tax preparation. This calculation includes the time and effort devoted to tax avoidance.

The biggest irony associated with the income tax is that its central logic was developed by a free-market libertarian economist, Henry Simons of the University of Chicago, while working for the federal government during World War II. In particular, it was Simons who developed the definition of “income” that has made the tax code so baffling and infuriating to subsequent generations. In fact, Simons sought to make the income tax consistent with the concept of real income or utility as defined by economic theory. Alas, his efforts demonstrated that the precise theoretical categories beloved of economists all too often lack real-world referents.

The clearest demonstration of the damage done by income taxation may be the migration of high individual earners and businesses away from high income-tax-rate habitations. Over the years, some of the world’s wealthiest authors, movie stars, athletes and moguls have become tax exiles. Among the historical sufferers of this brain drain have been Great Britain (movie stars Anthony Hopkins and Michael Caine), Italy (movie mogul Carlo Ponti and star Sophia Loren), France (movie star Gerard Depardieu) and Sweden (tennis great Bjorn Borg).

Apart from revenue, the other claim made in behalf of income taxes is fairness. For over a century, the Left has maintained that progressive income-tax rates are necessary to ensure an equitable distribution of income. The most recent rhetorical recurrence accompanied the Occupy Wall Street movement. The counterarguments, marshaled concisely by authors Blum and Kalven in The Uneasy Case for Progressive Taxation, are convincing on a theoretical level. Empirically, the utter failure of regimes such as Soviet Russia and Communist China to achieve distributional equality suggests that government power is either inappropriate or insufficient for the task – even assuming it is worth doing.

Property Taxation

Property taxes have long been the primary source of income for local governments and schools in the United States. That constitutes a recommendation only to those employed by governments and schools. The assessments used to determine the property values to which the property-tax rates apply are notoriously inaccurate when compared to actual market values. For years, local politicians used rising property values and the prestige associated with education as levers to ratchet up property taxes and continually increase education funding.

In the late 1970s and early 80s, this gravy train came to a screeching halt. It became clear that continually rising taxes were funding an education system that was failing its customers. Despite a fivefold spending increase in real terms over the previous three decades, average test scores were flat or falling. California taxpayers felt so thoroughly victimized that they approved the landmark tax-limitation measure Proposition 13. Other state-level tax limitation measures, such as Missouri’s Hancock Amendment, accomplished the same goals through less direct means.

The concept of property taxation bears at least a family resemblance to the form of taxation most admired by economists who have studied the subject. Henry George’s “single tax” on land was based on the premise that land is the only resource in completely inelastic supply. Given this, a tax on land cannot discourage its supply. George became the most popular economist of the 19th century by promoting this program of public finance.

Unfortunately, his view was simplistic. While the physical supply of land is indeed fixed, the economically valuable and available supply of land is not. To achieve its goals, the single tax would have to fall only on the unimproved component of the value of developed land. It is seldom feasible to isolate this value, and property taxes in reality do not even make the attempt.

Sales Taxes

Just as water seeks its own level, taxation tends to follow the path of least resistance. In recent years, this has been traced out by the sales tax. A tax on retail commercial transactions is easy to implement, verify and collect. This gives it a big advantage over other forms of tax that can be avoided legally, evaded illegally and put off indefinitely.

Ironically, the sales tax has also become popular with the organized anti-tax movement, many of whose members have proffered it as a composite replacement for virtually all other forms of taxation. A flat sales tax of X%, where X might be some number between 10 and 25, could substitute for all other taxes by providing government with roughly the same amount of total revenue it currently collects, but without the tremendous costs of collection, verification and monitoring it now incurs. Similarly, citizens would be spared the tremendous burden of preparing, calculating and worrying over the taxes they now pay. And they could fight one single battle against future tax increases rather than having to fight on multiple fronts simultaneously.
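The arithmetic behind a revenue-neutral replacement rate is simple division: total revenue to be replaced over the retail base it would be levied on. The figures below are hypothetical round numbers chosen only to illustrate how a rate in the article’s 10-to-25 range could arise.

```python
def revenue_neutral_rate(current_revenue, retail_base):
    """Flat sales-tax rate (in percent) that would replace existing
    revenue -- both inputs here are hypothetical round figures."""
    return 100 * current_revenue / retail_base

# e.g., $3 trillion of replaced tax revenue on a $15 trillion retail base
print(f"{revenue_neutral_rate(3e12, 15e12):.0f}%")  # prints 20%
```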

One counterargument, perhaps the most telling, is that government cannot be trusted to first pass an omnibus sales tax, then repeal other taxes. We might well be stuck with a vastly higher sales tax on top of our current tax burden. From the Left comes the objection that sales taxes are highly regressive, falling much more heavily on low-income taxpayers whose annual retail transactions form a large part of their incomes and wealth.

Taxes and Economic Growth

In the late 1970s and early 80s, the economic philosophy of “supply-side economics” drew attention to the effect of taxes on economic incentives and growth. Federal tax-rate reductions in the U.S. and Great Britain, followed by the revival of growth and retreat of inflation in both countries, preceded tax-rate reductions in dozens of other countries around the world. To this day, economists argue about the effects of this revolution. The argument centers mainly on the sensitivity of households and business to tax-rate changes, with left-wing economists seeing little reaction and right-wingers finding great responsiveness.

One way to break this logjam would be to examine state-level U.S. data. Policy studies by think tanks like the Heartland Institute, American Legislative Exchange Council, Cato Institute and Heritage Foundation have all found in-migration toward, and higher rates of economic growth in, states with lower tax rates and downward tax-rate trends. These states have also tended to be the so-called “red states,” which have voted Republican in national elections.

The Power to Tax

There is simply no doubt that the incentives created by taxation are perverse; that is, they tend to discourage economic value, welfare and growth. The arguments for taxation are twofold – first, that its undesirable effects are quantitatively small; second, that it is necessary to support activities that would otherwise go begging and needs that would otherwise go unmet.

Both these arguments are remarkably weak. Given the omnipresence of taxes, their aggregate impact can hardly be small. The case for a tepid reaction by individuals to changes in tax rates does not accord with everyday life or historical experience. And the Left has done nothing at all to convince the public that government programs are necessary, successful and responsive to consumer wants.

It is no wonder that Republicans in Congress are drawing a line in the sand on taxation. The wonder is that they have waited so long. Doubtless their reluctance reflects their unwillingness to face the implications of this decision. The welfare state has come to a dead end. It survives in an artificial atmosphere oxygenated by spending pumped in by government. We can no longer borrow or print the money to spend. Opposition to taxes implies opposition to spending. And that requires a political will that Republicans have not had to summon for many decades.

DRI-276 for week of 3-10-13: Understanding Motivation in the Nutritional Regulation Debate

An Access Advertising EconBrief:

Understanding Motivation in the Nutritional Regulation Debate

What is the rationale for government to intervene in our lives – that is, to insert itself between us and the objects of our actions? It is either to prevent something bad from happening or to bring about something good that would otherwise not occur.

This would appear to be an obvious answer to a straightforward question. Yet by this simple standard recent declarations on nutrition by American authorities and government officials seem utterly incoherent.

To make any sense of their comments, we must ask ourselves: Who are they really talking to? What are their real motives, as opposed to their stated or apparent ones? And what is their underlying agenda?

That is where our understanding of economics comes in handy.

The First Lady’s “Business Case for Healthier Food Options”

In a widely publicized Wall Street Journal op-ed (2/28/2013), First Lady Michelle Obama made what she called “the business case for healthier food options.” Ms. Obama has seized upon the so-called “epidemic” of childhood obesity as her personal cause célèbre, much as Nancy Reagan urged kids to “just say no” to recreational drugs. “For years,” she recounts, the problem was viewed as “insurmountable” because “healthy food simply didn’t sell – the demand wasn’t there and higher profits were found elsewhere.”

No longer. “Today we are proving the conventional wisdom wrong… American companies are achieving greater and greater success by creating and selling healthy products.” Now, it seems, “what’s good for kids and good for family budgets can also be good for business.”

A herald of the dawn of a new age had better be able to point to sunlight on the horizon. Ms. Obama cites the example of Wal Mart, which has “cut the costs to its consumers of fruits and vegetables by $2.3 billion and reduced the amount of sugar in its products by 10%.” It has also “launched a labeling program that helps customers spot healthy items on the shelf.” Sales of these products have increased.

Disney has “eliminate[d] ads for junk foods from its children’s programming and improve[d] the food served in [its] theme parks.” Walgreens is adding fruits and vegetables to (selected) stores. Restaurants “are cutting calories, fat and sodium from menus and offering healthier kids’ meals.”

The First Lady refers to an opinion-survey finding that “82% of consumers feel that it’s important for companies to offer healthy products that fit family budgets.” She cites a Hudson Institute study that finds over 70% of sales growth in consumer-packaged goods comes from “healthier foods.” Moreover, in recent years the Institute found a direct correlation between percentage of healthy food sold and rate of return.

Ms. Obama closes her piece on a ringing note of patriotic boilerplate. American businesses are at last heeding the call to arms – they are “stepping up to invest in building a healthier future for our kids.” The bandwagon is rolling like an avalanche down a mountain and everybody is hopping aboard. Teachers, mayors, faith leaders, parents, leaders from every sector – why, even “Republicans and Democrats are working together in Congress” to improve school lunches, for goodness sake!

In Mississippi, obesity rates have fallen by 13% at the elementary-school level. Childhood obesity has fallen measurably in California and in cities like New York City and Philadelphia. Of course, we have “a long way to go” since “the problem is nowhere near being solved,” but she “has never been more optimistic.”

Fact Check: Mirabile Dictu, Ms. Obama Seems Factually and Substantively Accurate

It is never safe to take politicians at their word. Ms. Obama does not hold elective office, but First Ladies have long been as politically saturated as their husbands. Thus a fact check of Ms. Obama’s contentions is in order.

Lo and behold, there are indications that not only the sum but the substance of her remarks is accurate. Quoting from the HealthDay newswire earlier that month (2/7/2013): “A leaner menu may lead to a fatter wallet for those involved in the restaurant industry, research suggests.” The Robert Wood Johnson Foundation surveyed 21 of the nation’s largest restaurant chains for a 5-year (2006-11) period. According to an analyst from the Hudson Institute, “those [businesses] that increased the amount of low-calorie options they served had greater increases in customer traffic and stronger gains in total servings than those that didn’t.”

The researchers developed their own categories for “low-calorie” servings of main courses, side dishes, desserts and drinks. During the survey period, new items that met the low-calorie criteria outperformed others in 17 of the 21 participating chains. The servings classified as low-calorie increased their percentage of total sales in all 21 chains.

Among the chains surveyed were McDonald’s, Wendy’s, Burger King, Taco Bell, Applebee’s, Olive Garden, Chili’s and Outback Steakhouse. Not surprisingly, the chains included in the study accounted for 49% of the total revenue of the top 100 U.S. restaurant chains, or some $102 billion in annual sales.

Meanwhile, Back at the Regulatory Ranch…

A few days earlier (2/1/2013), the HealthDay newswire carried this story: “FDA Should Work to Cut Sugar Levels in Sodas, Experts Say.” The subheading was: “Petition by leading consumer-advocate group and academics urges artificial sweeteners be used instead.”

The “leading consumer-advocate group” was none other than the Center for Science in the Public Interest (CSPI), the left-wing group that can with justification be characterized as the nation’s leading scientific-scare-mongering activist organization. Draping the organization in the mantle of “nutrition experts and health agencies from a number of U.S. cities,” Director Michael Jacobson announced a petition urging the Food and Drug Administration (FDA) to set a “safe level” for high-fructose corn syrup (HFCS) and other sugars used to sweeten sodas and other drinks.

The petition claimed that 14 million Americans get over 1/3 of their daily calories from added sugars like these. They “are causing serious problems – obesity, diabetes and heart disease, among others,” according to Jacobson. But Jacobson’s statement accompanying the petition dialed down the relationship from causation to correlation, pointing out the “great deal of evidence linking sugar to [these] problems,” from which CSPI is now “concluding that much of the evidence centers on HFCS.”

Having first donned his scientist hat, Jacobson then doffed it for his lawyer hat by declaring that the FDA is legally obligated to act by – in effect – treating sugar as a toxic substance. Since the First Law of toxicology is “the dose makes the poison,” the FDA must determine the safe level of consumption for HFCS and other sugars in drinks. Then it should set “voluntary sugar targets” for manufacturers of other foods. Finally, Jacobson completes his expert-witness hat trick by posing as an expert on education. The FDA should “educate consumers” about the dangers of sugar, he concludes.

Just what, exactly, should manufacturers of America’s most popular drinks use to sweeten their products – assuming that they are permitted to go on producing them at all? “Artificial sweeteners” is CSPI’s papal verdict. Ironically, their blessing is exquisitely timed to coincide with a barrage of publicity suggesting that aspartame and alternative sweeteners are linked to diabetes and other adverse health outcomes. Yet here, Jacobson is mysteriously complacent. “The FDA considers all these sweeteners perfectly safe. We think the certain harm [from HFCS and sugars] greatly outweighs the speculative risk from artificial sweeteners.”

Jacobson’s position conclusively establishes CSPI as an irony-free zone. Two decades ago, CSPI waged a vocal public campaign to gain regulatory approval for trans fats as the “healthy alternative” to saturated fats in the American diet. Today, of course, trans fats are so firmly fixed in the bad graces of regulators that food manufacturers take care to note their absence on ingredient lists whenever possible.

The Economics of the Current Nutrition Debate

It is no accident, as old-time Marxists were fond of saying, that economics is routinely omitted from the public debate about nutrition and regulation. (Indeed, Michael Jacobson went so far as to demand that “economic issues shouldn’t figure in this” at all.) Economic logic reveals that – even when the principals make statements that are factually accurate – their underlying logic is completely awry and their motives have no relationship to their public utterances.

First, consider Michelle Obama’s “business case” for “healthy foods.” To whom is she speaking? And why? After all, her remarks appear in the Bible of American business, The Wall Street Journal. On their face, they appear to be addressed to food manufacturers in an effort to persuade them to produce more “healthy foods.”

And this is completely crazy, is it not? After all, the livelihoods and happiness of food manufacturers depend on their producing foods that people like and want to buy. Their sales and profits provide a clear-cut index of consumer wants. Their gaze is fixed on sales 24/7. This truism can hardly be news to left-wing commentators like the Obamas, since half the time the Left criticizes business for producing the wrong things and the other half it excoriates business for its maniacal pursuit of profit and inordinate success in attaining it. If there is one thing business does not need to be reminded to do, it is to produce more goods that consumers want in order to earn higher profits.

Ms. Obama spends a thousand words in the Journal telling American business that healthy foods are now profitable. Does she really believe this was unknown to them before she spilled the beans in print? When these foods are the sales leaders for 17 of 21 leading restaurant chains over the last five reporting years? Who is she kidding?

No. Even the First Lady cannot be this obtuse. She cannot believe that food manufacturers are utterly ignorant of their own business, nor can she expect them to take her advice on how to run their business. They are the experts on the food business. Even if they needed advice, they would never solicit hers. She has openly disdained business, advising young people to forgo careers in the business world.

No, Ms. Obama’s motive is not what it seems to be. She has another agenda.

The same thing is true of Michael Jacobson. In response to CSPI’s petition, the American Beverage Association released a statement pointing to the elephant in the room alongside CSPI and the self-appointed nutrition “experts.” 45% of all non-alcoholic beverages consumed in the U.S. today have zero calories. Average calories per beverage serving are down 23% since 1998. Calories from sugar in beverages are down 37% since 2000. In other words, the free market has beaten CSPI and its small army of would-be regulators to the punch.

If the FDA had set voluntary guidelines in 1998 for a changeover to artificial sweeteners, it would today hail the current state of the market as a great victory for regulation. (And then it would demand an increase in its budget on the grounds that much, so very much, remained to do in order to usher in soft-drink nirvana for American consumers.) But because our current status was achieved by free markets, without regulatory carrots or sticks, a crisis exists for the regulatory Left.

The obesity “epidemic” is not a crisis for the Left because it threatens the health of Americans. It is a crisis because it represents a wasting opportunity. Onetime Obama advisor Rahm Emmanuel gained fame by coining the slogan “Never let a crisis go to waste.” His point was that every crisis is a potential opportunity to expand the scope and power of the federal government. A political administration should seize that opportunity by creating more federal agencies, spending more money and enacting more regulations. Failure to do so sacrifices power – and power is all that matters in politics.

The history of the obesity and diabetes “epidemics” reveals why the Left is now sweating bullets on the issue of nutritional regulation.

The Real Cure for the Obesity “Epidemic” – and the Scramble to Get Back in Front of the V

Ms. Obama’s op-ed was not only right about the growing popularity of so-called healthy eating. She was also right about its previous lack of appeal to consumers. For many years, the Left hectored food producers to produce what consumers ought to eat instead of what they wanted to eat. And for years, consumers voted for tasty food over what the experts told them they ought to want.

The Left reacted to this by blaming the victim. Academics and public-health officials insisted that consumers were sluggards who refused to eat right and shunned exercise. In reality, consumers were only following the revised nutrition guidelines that advised them to make carbohydrates the chief source of energy and eschew fat in general and meat in particular. Weight loss, the experts held, was a mechanical process in which calories expended in energy exceeded those ingested in food. Counting calories was the necessary centerpiece of this process.

Food manufacturers did not refuse to produce low-calorie foods. But fat not only produces flavor in food, it also makes it filling – thus producing satiety. Food manufacturers discovered that cutting back on fat made foods tasteless and left consumers feeling hungry. They learned that by adding carbohydrates in the form of sugars, they could replace the taste with only a moderate increase in calories. (Simple carbohydrates are not calorie-dense.)

The problem with this program is that it did not work. Consumers would buy foods that replaced fat with simple carbohydrates, but these did not promote weight loss. In this regard, it is vital to appreciate the difference between expertise in the marketplace – as represented by food manufacturers – and in academia. Food manufacturers are experts because they have to be. If they fail, they go out of business and are experts no longer. But academic experts on nutrition and weight loss did not actually have to aid customers in losing weight or reaching optimal nutrition. They only had to surmount the hurdle of peer review. Consequently, they were able to mislead two generations of consumers.

The damage wasn’t limited to obesity. The emphasis on carbohydrates also caused a ghastly upward spike in the incidence of late-onset Type II diabetes. While the rush of insulin generated by the ingestion of carbohydrates was sufficient to prevent diabetic shock and coma, it did not prevent the damage caused by frequent, repetitive upward spikes in blood sugar. Because the insulin eventually returned blood sugar to normal and because standard medical practice has been to check blood glucose after a period of fasting, these spikes and the accompanying damage went undetected for many years. Eventually, the baby-boom generation began to suffer peripheral neuropathy and other symptoms of nerve damage resulting from Type II diabetes. The influx of television commercials offering treatments for this condition is an index of its prevalence.

It took a doctor on the fringes of scientific respectability named Robert Atkins to explain that the culprit in weight gain was not fat consumption per se. Instead, carbohydrates were at fault. When consumed undiluted, carbohydrates entered the bloodstream quickly and caused the body to release insulin to counteract the resulting upward spike in blood sugar. The insulin caused the body to store fat in the body’s cells rather than burning it as energy.

The key to weight loss was to make protein, rather than carbohydrates, the body’s leading energy source. Carbohydrates should be consumed only when their release into the bloodstream could be slowed by simultaneous consumption of fiber (as with whole apples), fat (as with butter), protein (as with meat) or acid (as with sourdough bread). Meat consumption is not problematic for weight gain or cholesterol accumulation because the body burns fat for energy.

The observed trend toward healthy eating is largely this revolution in blood sugar regulation, which modifies the original Atkins insight with a more precise scientific rationale developed by cardiologists like Arthur Agatston. The difficulty in finding unsweetened iced tea on store shelves – Rush Limbaugh was forced to add an unsweetened option to his lineup of sweet teas – is one indication of the power of this approach. The sudden ubiquity of broccoli, once the butt of standup comedy routines, is another. (Broccoli is high in fiber as well as phytonutrients.) The French fry was a mainstay of the 20th century American diet, but it now shares menu space with sweet-potato fries because sweet potatoes do not share the glycemic (blood-sugar related) drawbacks of white potatoes.

This is the healthy eating referred to by Ms. Obama and the Robert Wood Johnson Foundation study. It endures where its predecessors failed because it works. People actually lose weight without having to count calories. They eat until they are full, so they do not feel deprived, and they find it easy to put up with the loss of starches and desserts.

This approach was developed by the free marketplace, over the hysterical objections of the academic and regulatory nutrition authorities. The establishment labeled the Atkins diet dangerous and predicted it would kill its adherents. Instead, the descendants of Atkins’ program are killing off the nutritional competition. And this puts the Left wing in an untenable position today.

The late Nobel-prizewinning economist Milton Friedman compared political leaders to the leader in a V-formation of ducks. Every once in a while, Friedman said, the leader would look back and notice that he was flying alone. The ducks had deserted him, flying off in another direction. The leader was forced to scramble after them in order to get back in front of the V and resume his leadership position. In this case, the public realized that the government’s nutrition leadership was wrongheaded and disastrous. They left formation and flew off in pursuit of an approach that worked – the principles pioneered by Atkins and refined by newer, more scientific approaches like the South Beach diet.

Now the Left is scrambling to get back in front of the V. Ms. Obama knows that she needs to hurry. She doesn’t just want children to stop getting fat. She wants to be able to claim the credit for that result.

As it stands now, she can hardly claim credit for trends that began well before she even became First Lady. Her only hope is to associate herself in the public’s mind with the business trend away from the failed Establishment diet. That is why she chose the Wall Street Journal as the venue for her piece, because of its association with business. She knows her public will not actually read the op-ed. That is good; she wants them to hear about it through the filter of the mainstream media, which will spin its content in ways favorable to her. Her constituency will believe anything bad about business, so she will be seen as telling business what is good for it and for the public. And when the Administration eventually proposes rules telling food producers what they can and can’t produce and how to produce it, she can then be seen as benign – somebody who is merely helping business to do what is good for it as well as everybody else. And she can claim credit for outcomes attained before those rules ever went into place. The public will have forgotten, if it ever knew, the real story.

Michael Jacobson’s back is against the wall because the blood-sugar revolution is threatening to abort the CSPI’s cherished goal of federal regulation of the American diet. If free markets are allowed to cure the obesity “epidemic” unaided by FDA regulation, it will dawn on the public that the FDA is the equivalent of Edmond Rostand’s character “Chantecler” in the eponymous allegorical play. Chantecler was a rooster who lived his life convinced that his own crowing was responsible for the rising of the sun at dawn, only to suffer cruel disillusionment after a bout with laryngitis.

Jacobson is desperate to achieve FDA regulation in time to claim credit for lobbying in its favor. He, too, is scrambling to get back in front of the V. If the free-market stampede away from sugar and carbohydrates goes on, soon he will not be able to claim the existence of a crisis as grounds for an FDA takeover of American nutrition. CSPI’s raison d’être will be exposed as intellectual pretense.

Jacobson cannot appeal to consumers directly because they are not about to accept his self-appointed expertise as a substitute for their own. After all, they are the experts on their own bodies, their own tastes and preferences – not Michael Jacobson and CSPI. He wants his views enacted as regulatory law because consumers won’t be able to veto their adoption and will then be stuck with them, like it or not.

How (Not) to Cure Social Problems

The most striking aspect of the nutritional debate is its utter clarity when explained as above. So-called “epidemics” of obesity and diabetes were caused by failures of regulation and academic expertise. They are now being eradicated by the free market. As the lawyers say, these facts are not in dispute. Ms. Obama herself is at pains to establish them, although she does not say so in these words.

The Left only wants nutritional regulation for reasons of naked opportunism. There is no case for regulation to prevent obesity and diabetes because it is too late for that. There is no case for regulation to cure them because that is already happening, to whatever extent possible. There is no case for regulation to prevent future incidence because the free market’s program for prevention is already well under way.

Of course, one could always argue that the free market is taking too long to do its work. The contention that markets work slowly while regulatory government works quickly and expeditiously seems grotesque on its face, which is probably why we don’t hear it advanced by Ms. Obama or Jacobson. Of course, we will continue to be browbeaten with news reports about obesity and diabetes. It takes time for blood-sugar levels to normalize. Nerve damage caused by diabetes, even the Type II kind, is probably irreversible.

But truth is like a cat. Once it escapes confinement, it is hard to get back in the sack.

DRI-293 for week of 3-3-13: The Sequester: A Barack H. Obama Production

An Access Advertising EconBrief:

The Sequester: A Barack H. Obama Production

The appearance of First Lady Michelle Obama as presenter of the climactic Best Picture Academy Award at the recent Oscar ceremony is the latest sign of the symbiosis between American politics and Hollywood. The convergence between the political and entertainment industries is now so close that we can use the same economic model to analyze them.

Since both industries are popular and objects of public scrutiny, this model will have great practical value. Its first application will be to analyze the sequester, the current political-theater production now enjoying its first run on popular media throughout the nation.

The Model

The late Nobel-Prize-winning economist James Buchanan campaigned tirelessly against what he called the “romantic view” of government as the promoter of the “public interest.” Government is composed of particular individuals. In order to be operational, the concept of the “public interest” must be comprehensible to those people. If the activities of government were limited only to those whose net benefits were positive for everybody, it would be a minuscule fraction of its present size. Clearly, the actual purposes of government are redistributive. But what unique redistributive plan could possibly command unanimous support from the bureaucratic minions of government? The only conceivable answer is that bureaucrats serve their own interests, presumably having convinced themselves that their interest and the public interest coincide.

Government bureaucrats thus share a common goal with private business owners. But whereas private businesses produce goods and services in markets under the discipline of market competition, governments provide only executive, legislative and judicial services while contracting out for the production of any needed goods and services. Far from submitting to the discipline of competition, governments claim monopoly privileges for themselves and dispense them to others – often in exchange for political support.

From inception until the gradual disintegration of the studio system of moviemaking, Hollywood operated under the marketplace model of competition. When adverse antitrust decisions in the 1940s killed the long-term viability of the giant studios, movies changed their way of living. This drift away from competitive capitalism accelerated over the last two decades. Today, the approach of government and Hollywood to production is remarkably similar.

Private businesses produce goods and services in order to satisfy the demand of consumers. They satisfy consumer demand in order to earn profits and maximize the profit of their owners. Thus, both sides of the market strive to maximize their real income or utility through the consumption of goods and services. Consumers act directly when purchasing for their own consumption or saving for their future consumption. Producers act indirectly when producing for the consumption of others or directly when producing for themselves. Input suppliers act indirectly by supplying labor and raw materials to producers to facilitate production and consumption for others.

Governments cannot act as private businesses do because their bureaucrats are not spending their own money and taxpayers have no effective leverage over them. Bureaucrats serve their own interests – which are those of the politicians who control their fate. Politicians, in turn, most want to retain their hold on office. For the most part, this is accomplished by redistributing money in favor of those who vote for them. Since government has little or no power to increase the supply of goods and services but considerable power to reduce it, redistribution is accomplished predominantly by harming some people while purporting to help others.

We know that private production is beneficial because consumers voluntarily choose from among many competing products in a free marketplace in which producers can enter and leave at will. The existence of prices allows everybody to incrementally assess the value of every unit of input and output to ensure its net benefit before purchase. Profit directs the flow of resources to areas of greatest value to consumers.

None of these safeguards applies to political production. In government, the principle of coercion replaces voluntary choice. No profits exist to tell bureaucrats whether they have succeeded or erred. No prices direct the incremental flow of resources and no competition is allowed to provide an alternative to government provision of goods and services. Sure, voting does take place. But the notion that a one-time choice between a restricted field of two candidates can somehow take the place of millions of everyday choices made under vastly better marketplace circumstances is quaint, if not utterly ridiculous.

How Hollywood Has Come to Resemble Politics

More and more, Hollywood production has come to resemble political production. This evolution has accelerated during the last two decades.

Under the old studio system, motion-picture production often left the confines of Hollywood in favor of distant locations. This was sometimes motivated by concern for production values, as when director John Ford sought the scenic vistas of Monument Valley, Utah for his revival of the Western genre in the 1939 film Stagecoach. Increasingly, however, economics lay behind the decision of producers to abandon Hollywood in favor of locations in the eastern U.S., Canada, Mexico, Spain or elsewhere in Europe. Hollywood production was hamstrung by inefficient work rules established by Hollywood craft unions under the sway of organized crime. It became far cheaper to incur heavy travel costs to foreign locations than to bear the costs of a Hollywood shoot.

That was back in the day when Hollywood still operated under the rules of economics that govern private markets. Today, every state of the Union has a state-level “department of economic development.” These Orwellian entities are distinguished by their lack of adherence to economic principles. In particular, they offer subsidies to private businesses for locating and operating within the state. In the case of motion pictures, this takes the form of subsidies to production companies that shoot movies in-state. The rationale for this activity is almost always a purported “multiplier benefit” to the location’s “economy.”

A subsidy is the opposite number of a tax. Both drive a wedge between the price paid by the buyer and that received by the seller; both are inefficient actions with adverse effects on production and consumption. Whereas a tax causes too little of the taxed good to be produced and consumed, a subsidy causes too much production and consumption of the good affected and too little production and consumption of other things. State agencies justify their actions by ignoring their bad results in favor of the supposed good effects.

The most highly touted benefit of movie-location subsidies is “job creation.” Even under the studio system, it was standard operating procedure for casting directors to scour the rolls of local actors to play subordinate parts, rather than pay the travel expenses and higher salaries of Hollywood actors. The principal cast, whose work comprised the guts of the movie, was chosen on the basis of star power and acting ability. This was basic economics at work. Today, however, the pretense that subsidies are necessary to ensure work for locals and keep local industry alive is another way in which Hollywood has abandoned economics for politics. Movie subsidies are directly analogous to protective tariffs (taxes) levied on foreign goods to make their prices higher than local prices, thus protecting the jobs of local workers.

There is no economic value in creating or protecting jobs because the end-in-view in all economic activity is consumption, not production. The idea is to provide the best combination of output quantity and quality. The implication behind job creation – rarely stated outright but unmistakable – is that our goal should be to maximize the quantity of human labor employed in producing output, rather than to produce the most and best output. This suggests that the profession of economics should hold up ancient Egypt as its model state. The production of pyramids using slave labor may be the best means ever devised for maximizing the number of human beings doing work and eliminating unemployment. (The slave-labor camps in the old Soviet Union’s Gulag Archipelago are a legitimate contender for the title, but lose out on the grounds that they produced comparatively little tangible output and services.)

Since the general idea is to make people as happy as possible, though, we can rule out “job creation” as our lodestar. The reason it is such a popular political goal despite its economic drawbacks is that it concentrates benefits heavily on a group of easily identifiable people who can readily recognize and gauge their gains. The beneficiaries of a job-creation policy are a good bet to vote for their benefactor.

Another prime example of Hollywood’s shift from economic to political priorities is the re-ordering of the bottom line. The mainstream media still behaves as though the success or failure of a movie depends on its box-office receipts. This was certainly true throughout the 20th century, during the birth and development of the motion picture. But it is no longer true today.

Most movies today are conceived or at least approved by the “talent” – stars, writers, directors and their agents. Studios are coordinating and marketing vehicles. The astronomical fees commanded by the talent, together with high labor and insurance costs, make it prohibitively expensive to make most movies. The only way to turn a profit is by marketing ancillary products to young customers. Most movies lose money at the box office and are subsidized by ancillary revenues and (at the studio level) the occasional box-office blockbuster.

The shift in priorities away from the box office has allowed the talent to cater to their own tastes in choosing the subject matter of movies. Under the studio system, the preferences of the audience were worshipped by movie-studio moguls like Louis B. Mayer, Irving Thalberg, Harry Cohn and Darryl Zanuck. Many of the moguls were immigrants and Jews who had strong opinions and might have loved to indulge their own tastes. Instead, they ruthlessly pruned the esoteric and controversial output of their directors, writers and stars because their instincts sensed that public tastes would not embrace it. Now Hollywood’s implicit motto is “the public taste be damned” – an attitude it would condemn unhesitatingly were it adopted by a private industry producing hula hoops, automobiles or soap. This allows the talent to freely indulge their political preferences on screen.

Hollywood’s bias has long been to the Left. The Obama administration is now busily engaged in centralizing as much production as possible under the aegis of government – executive, legislative, judicial and regulatory. The case of Solyndra is a representative example of the results. Large subsidies were given for the production of an alternative energy facility. Market demand was unfavorably disposed toward the company’s output and it lost money hand over fist. But ancillary considerations – in this case, the ostensible necessity for the gestation of alternative energy production – outweighed the losses in the disposition of funds.

Losses, subsidies and the substitution of personal priorities for those of consumers have long characterized political production. But now they describe Hollywood, too.

How Politics Has Come to Resemble Hollywood

In the early 1950s, veteran actor and movie star Robert Montgomery was asked to tutor President Dwight D. Eisenhower on the fundamentals of spoken communication to improve Eisenhower’s performance in televised speeches and news conferences. Much was made of this intrusion of Hollywood into the pristine, public-spirited world of politics. The election of Ronald Reagan as Governor of California and U.S. President led to his subsequent anointing as the “Great Communicator” – a title that was given a pejorative cast by his critics on the Left. While these episodes may have painted the Oval Office with a show-business veneer, they hardly tell a story of Faustian corruption.

Today, however, candidates are chosen on the basis of qualities associated with movie stars rather than statesmen. Would a candidate as homely as Lyndon Johnson or with the profile of William Howard Taft even bother to register for the Presidential primaries? Journalists have expressed a public longing to sleep with Bill Clinton and Barack Obama, even though political scientists have never ranked amatory skill among the vital attributes of a Chief Executive. The candidacy of Mitt Romney was widely felt to be fatally handicapped by his biography, as if a Presidency were a movie that needed suitable first and second acts to set the stage for a dramatic finish.

Movies are an emotional medium rather than an intellectual one. Their narrative form is highly stylized, based on that of the theater. Movie scripts can be divided into first, second and third acts. There is a (preferably heroic) protagonist, who wages a conflict with one or more villains during the course of the movie. The protagonist undergoes a transformative experience and emerges better for it. There is a climactic resolution of the conflict.

Today, politicians structure campaigns and issues in this manner. They cast themselves as the hero. They demonize their political opponents as villains. And, most importantly, they appeal to the emotions of voters rather than to their intellect.

The timing of announcements, and sometimes even the substance of policies, is determined by “optics” – the snap judgments and emotive reactions of the public. The weight of issues is measured by their standing in opinion polls rather than by their impact on the real incomes of citizens. Like motion pictures, politics has become a purely emotional business in which objective truth is completely overshadowed by subjective perception.

The Sequester: A Barack H. Obama Production

Now we are engaged in a great civil debate on the issue of government spending. It will ultimately determine whether our nation – or any nation so constituted – can long endure. The opening volley in that debate has been fired by President Obama himself. But it has not been launched in the rhetorical tradition of intellectual inquiry and contention. Instead, it has been presented as a production of political theater – a Barack H. Obama production. Its title is: “The Sequester.”

In 2011, the Obama administration and Congressional Republicans fought a symbolic struggle over the raising of the debt limit. In order to orchestrate a victory over Republicans, the President crafted the sequester. The word “sequester” means “to set apart, segregate, or hand over (as to a trustee).” That refers to funds in the budget that were removed from consideration for spending purposes. In return for agreement to raise the debt limit, the President met Republicans halfway by agreeing to spending reductions in the form of sequestration.

Now, in 2013, when the time to follow up on his promise has come, President Obama has rewritten the script. He has recast himself as the hero and Republicans as villains in a melodrama in which spending reductions threaten hardship and economic setback. The original terms of the sequester called for $1.2 trillion in spending reductions spread over 10 years, averaging out to around $120 billion per year in reductions.

The actual reduction for 2013 would be about $85 billion. But there is more to the story. First, the cuts come only from so-called discretionary spending; entitlement programs like Social Security and Medicaid are unaffected. Second, the $85 billion figure reflects a reduction in budgetary authority – the statutory authorization to spend. Actual reduction in government outlays is projected to be only half the $85 billion total, or about $42 billion. The difference is accounted for by “baseline budgeting,” the notorious government budgetary practice that automatically increases expenditures every year. When the budget is ruled by the implicit logic that government spending is always good and a growing country will always need more of it from year to year, it is easy to grasp why the federal government is swimming in a sea of debt.
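The budget arithmetic in the two preceding paragraphs can be checked in a few lines. This is a minimal sketch using only the round figures cited above; the 50% outlay share is the projection quoted in the text, not an independently verified number.

```python
# Sequester arithmetic using the round figures cited in the text.
total_cuts = 1.2e12                      # $1.2 trillion over ten years
years = 10
annual_average = total_cuts / years      # ~$120 billion per year

authority_cut_2013 = 85e9                # 2013 cut in budgetary authority
outlay_share = 0.5                       # only about half projected to show up in actual outlays
outlay_cut_2013 = authority_cut_2013 * outlay_share   # ~$42 billion

print(f"Average annual cut: ${annual_average / 1e9:.0f} billion")
print(f"2013 outlay cut:    ${outlay_cut_2013 / 1e9:.1f} billion")
```

The point of the exercise is proportion: the disputed 2013 reduction in actual outlays is roughly one percent of a federal budget then running well over $3 trillion.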

The biggest chunk of sequestration (about half of the authorized total) is slated to come from military expenditures. The remainder is sprinkled more or less equally throughout the federal discretionary budget, with the proviso that it should be distributed to cause the most pain to the populace. Does that sound like a pejorative characterization? It is not; The Wall Street Journal cited a memo to precisely that effect. Perhaps the most telling index of the melodramatic nature of this Barack H. Obama production came from a White House memo announcing that free tours of the White House would be cancelled until further notice due to “staffing reductions” caused by the sequester. As various bloggers hastened to point out, the tours are conducted by volunteers.

The President’s exercise in political theater contained many other dramatic high points. A White House fact (!) sheet stated that federal programs like Meals On Wheels would serve 4 million fewer meals thanks to the sequester. The document also claimed that 70,000 youngsters “would be kicked off Head Start,” the subsidy program for pre-school education, thanks to the sequester – a claim backed up by Health and Human Services Secretary Kathleen Sebelius. White House Press Secretary Jay Carney expressed grave concern for federal-government janitors who would receive less overtime pay because of the sequester. Department of Education Secretary Arne Duncan made headlines by declaring that there are “literally now teachers who are getting pink slips,” a whopper so outrageous that he was forced to retract it within 24 hours. Not to be upstaged by his supporting cast, the President himself gravely warned that federal prosecutors “will have to let criminals go” if the sequester is allowed to proceed.

The public is accustomed to seeing movies tell lies in the service of dramatic effect. That is exactly what this Barack H. Obama production does. Like many popular movies, it has borrowed its storyline from other successes. For over three decades, state legislatures have faced laws – such as Missouri’s Hancock Amendment – limiting state-government spending. The standard legislative tactic of opponents is to concoct a fantasy wish-list of worst-case spending reductions designed to terrify voters into repealing the laws. In fact, the laws say nothing about specific spending cuts. They allow the legislators themselves the flexibility to choose which spending to cut. The legislators are supposed to cut the most wasteful, redundant spending and retain only vital programs – assuming there are any. Yet in practice, the legislators do just the opposite – they pick the most painful cuts in order to blackmail voters into spending ad infinitum.

That tactic, straight from the playbook of radical activist Saul Alinsky, is the plotline of “The Sequester.” It makes no sense. When air-traffic controllers went on strike in 1981, President Ronald Reagan protected consumers, who were otherwise helpless against the threat posed by a government monopoly. He fired the striking controllers and hired replacements. Are we confronted by angry restaurant owners who threaten to close up unless we spend more money dining out? Of course not; the restaurant industry is competitive. Strikers would simply lose business to competitors who would step up to serve consumers. But government monopoly employees can successfully hold taxpayers hostage unless the Executive branch fulfills its duty to protect the public. Instead, the Obama administration is siding with the blackmailers.

The Administration’s economic rationale for its actions is transparently absurd. Unofficial Administration economic advisor Paul Krugman hints darkly of 700,000 lost jobs and the CBO forecasts a loss of one-half point’s worth of economic growth – all due to a net reduction in discretionary spending of $42 billion. Yet the Administration absolutely demanded that the Bush tax cuts end on schedule, producing a much larger effect on “aggregate demand” by Keynesian economic lights. Krugman has consistently maintained that the 2009 stimulus of nearly $800 billion was not nearly large enough to produce marked effects, so how can he now bemoan this piddling spending reduction?

Movie plots are not supposed to make sense. They are structured for emotional impact only. Producers, directors and screenwriters are granted dramatic license to lie in order to manipulate our emotions. Their actors and actresses are expected to speak lines from a script in order to enact the drama.

This is what politics has become. It is political theater, dedicated to the proposition that government of itself, by itself and for itself, shall not perish from the Earth.

DRI-293 for week of 2-17-13: The Man Who Created Today’s Telecommunications Marketplace

An Access Advertising EconBrief:

The Man Who Created Today’s Telecommunications Marketplace

Today we live in a world enveloped by telecommunications. iPhones and smartphones provide not only voice communications but data and Internet transmission as well. Cell phones are ubiquitous. Television stations number in the hundreds; their signals are received by consumers in direct broadcast, cable and satellite transmission form. Both radio and TV broadcasts can be streamed over the Internet. The Internet itself is accessible not only using a desktop computer but also via laptops, Wi-Fi and mobile devices.

For anyone below the age of forty, it strains the imagination to envision a world without this all-encompassing marketplace. Yet older inhabitants of the planet can recall a starkly primitive telecommunications habitat. In the United States – the most technologically advanced nation on Earth – there was one telephone company for almost all residents in 1970. There was one satellite transmission provider. In the wildly competitive corner of telecommunications – broadcast television – there were three fiercely competing networks.

How did we get from there to here in forty short years? And can we entertain an alternate scenario in which we might not have made the journey at all? The answers to these questions are chilling, for they open up the possibility that were it not for the efforts of one man, the great revolution in telecommunications might not have happened.

The man who created the telecommunications marketplace of today was Clay “Tom” Whitehead. The unfamiliarity of that name is an index of why we should study the unfolding of competition in the market for telecommunications. Before we introduce the leading character in that drama, we first set the scene by describing the terrain of the market in 1970 – and what shaped it.

The Economic Doctrine of Natural Monopoly

In 1970, American Telephone and Telegraph – the corporate descendant of the Bell Telephone Company founded by Alexander Graham Bell – was the monopoly telephone service provider for virtually all of America. The rationale for this arrangement was provided by the doctrine of natural monopoly.

A natural monopoly was said to exist when a single firm was the most efficient supplier for the entire market. This was caused by the unique cost structure of that market, in which the average cost of production decreased as output increased. It is vital to visualize this as a static condition, not a dynamic one; it is not dependent on a succession of technological innovations of the sort for which Bell’s scientists were renowned. If Bell Labs had never developed a single invention, in other words, the company’s status as a natural monopoly would not have changed.

If decreasing average cost was not due to innovation, what did cause it? The most plausible explanation came from engineering. The 2/3 Rule related the cost of transmission through a pipe, or transportation via a container, to its capacity. The cost of a pipe or vessel rises with its surface area – the square of its linear dimension – while its capacity or throughput rises with its volume – the cube of that dimension. Total cost therefore grows only as roughly the 2/3 power of output, so average cost, the ratio of total cost to total output, continually fell as output increased because output (in the denominator) grew faster than cost (in the numerator).
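The arithmetic of the 2/3 Rule can be made concrete with a short numeric sketch. The scale constant k below is arbitrary; the point is only the shape of the curve: if total cost grows as output to the 2/3 power, average cost falls without limit as output rises.

```python
# Numeric sketch of the 2/3 Rule: capacity q scales with volume,
# cost scales with surface area, so total cost grows as q**(2/3)
# and average cost (cost / q) falls as q**(-1/3).

def total_cost(q, k=1.0):
    """Total cost of a pipe or vessel sized for throughput q (k is an arbitrary scale constant)."""
    return k * q ** (2.0 / 3.0)

def average_cost(q, k=1.0):
    """Cost per unit of throughput; strictly decreasing in q."""
    return total_cost(q, k) / q

for q in (1, 8, 64, 512):
    print(q, round(average_cost(q), 4))
```

Each eightfold increase in output halves average cost, which is the static "natural monopoly" cost structure the doctrine rested on: the largest producer can always undersell smaller rivals.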

Continually falling average cost meant that one firm could constantly lower its price while producing ever more output, while still covering all its costs. This would enable it to underprice and force out any and all competitors. Since monopoly was the eventual fate of the industry anyway, the reasoning went, better to accept it by designating a single monopolist while striving to mitigate the monopoly outcome.

In America, the mitigation was accomplished by profit regulation. The natural monopoly firm was allowed to earn a “normal” rate of return, sufficient to attract capital to the industry, but no higher. That normal rate of profit was identified by the public utility commission (PUC) based on hearings at which the company, regulators and various interest groups (notably those supposedly representing consumers) testified.

When outlined in textbooks and classrooms, this concept sounded surprisingly reasonable. When put into practice, though, it was a mess.

Perhaps the worst feature of PUC-regulation of so-called natural monopoly was the increasing chumminess between commissions and the monopoly firms they oversaw. This sounds like an accusation of collusion, but in reality it was the inevitable by-product of the system. Commissions lacked the technical expertise to regulate a high-tech business. While they possessed both the right and the ability to hire consultants to advise them, trouble and expense relegated this to rate-case hearings at which the profits and rates charged by the company were reviewed. On a day-to-day basis, the commission was forced to cooperate with and rely on the company’s employees to guarantee that the utility’s customers were served.

After all, the firm was a genuine, honest-to-goodness monopoly – not a phony, pseudo-monopoly like the oil companies, which faced scads of competition and any one of whose customers had lots of competitive alternatives to turn to. The oil companies were monopolies only for purposes of political theater, when politicians needed a scapegoat for their foolish energy policies. But if a public utility were threatened with insolvency or operational failure, then the lights might go out or the phones go dead for an entire city, metro area or region. So the PUC was regulating the utility and protecting it at the same time.

Regulation was probably an impossible task anyway, but this ambiguity made things hopeless. The result was that PUCs erred on the side of excessive rates of return and compliance with company wishes. Since high profits were out of the question anyway, public-utility executives took their “excess profits” in the form of perquisites and a quiet life, free from the stresses and strains of ordinary business. Public utilities became noted for lavish facilities, huge administrative budgets and large staffs – in the vernacular of the industry, this was called “gold-plating the rate base.” (The rate base was the agreed-upon list of expenses and investment the company was allowed to recover in rates charged to customers and upon which its rate of return was earned.)

Ordinary businesses feel constant pressure to hold down costs in order to maximize profit; cost-minimization is what helps insure that scarce economic resources are used efficiently to produce output. But public utilities were assured of their profit and coddled by regulators; thus, they faced no pressure to reduce costs or innovate. Indeed, the reverse was true – a cost innovation would theoretically call for new rate hearings to reduce the utility’s rates, since otherwise it would exceed its regulatory allowance of profit. Economists were so fed up with the sluggish pace of technological progress among public utilities in general, and the Bell system in particular, that most viewed the phenomenon of “regulatory lag” as a good thing. It was worth it, they reasoned, for the utility’s profits to exceed its limit in the short run as an inducement to effect cost reductions that would achieve long-run efficiency.

It would seem that PUCs would have faced public criticism for failure to hold down public-utility profits, since that was their primary raison d’être. Commissions sought to inoculate themselves from this criticism by a policy of offering artificially low prices to residential customers of public utilities. Since they had to raise enough total revenue to meet all utility costs plus an allowance for a fat profit, this subsidy to residential customers had to be recouped somewhere. In practice, it was regained by socking business users with onerous rates. The Bell phone companies, for example, charged notoriously high rates to business users of telephone service.

Commissions trotted out a legal rationale for this policy of price discrimination in favor of residential users and against business users. The policy furthered the goal of universal service, claimed commissioners proudly. Because public-utility products were goods like telephone service, electric power and gas service, commissions could plausibly depict them as necessary to public health and safety. Consequently, they justified subsidies to residential users by maintaining the necessity of assuring service to all, regardless of income, on the basis of need.

Of course, the economic logic behind the policy of universal service was non-existent. High rates levied on businesses were not paid by non-human entities called “businesses.” No business ever paid anything in the true economic sense because payment implies a sacrifice of alternative consumption and the utility or happiness delivered by it. Since a business cannot experience happiness – or lose it – a business cannot pay for anything. Those high business rates for phone service, for example, were paid in the long run by consumers of the business’s output in the form of higher prices and by suppliers of inputs to the business in the form of lower remuneration. But to the extent that the public were deceived by the rhetoric of the commission, they may have approved the wasteful doctrine of universal service. This is ironic, for the Bell system never succeeded in increasing the percentage of household subscriptions to phone service to the level of the percentage of households owning a television set. So much for the absolute necessity of telephone ownership!

Meanwhile, public utilities became public menaces when they spotted businesses threatening their turf. Cellular telephone technology was technically feasible as long ago as 1946 (!), but the Bell companies weren’t interested in developing it because they already had a highly profitable and completely secure fiefdom based on landline technology. And they weren’t about to stand idly by while other businesses moved in on their markets! Consequently, applicants for licenses to operate mobile phone businesses were either denied or hamstrung by red tape.

In 1956, the Justice Department was sufficiently fed up with Bell’s antics to launch an antitrust suit against the Bell system. In a sense, this was inherently contradictory since government had granted the monopolies under which the Bell companies operated. But Justice accurately realized that something had to be done to break up the cozy arrangement between Bell and the state and local politicians whose regulation was in fact serving as the barrier to competition in products ancillary to Bell’s landline phone service. It is one measure of the political influence wielded by the Bell empire that this lawsuit proved abortive and was dropped without result.

Another indicator of Bell’s power was the fact that the Bell companies annually issued more debt than did the federal government itself. When the federal antitrust action was revived in 1974, then-Treasury Secretary George Shultz (formerly a well-known labor economist at the free-market-oriented University of Chicago) reminded prosecutors of this fact and advised that the antitrust suit be quashed for fear of “roiling the bond markets” prior to an upcoming bond issue by the U.S. Treasury. This advisory outraged a relatively obscure White House official at the Office of Telecommunications Policy.

Tom Whitehead and the “Open Skies” Policy

In 1970, Clay T. “Tom” Whitehead was a young (32) graduate engineer whose life had taken a detour when he was introduced to economics. He followed up his Master’s in electrical engineering at MIT with a PhD in economics there, studying under noted scholar, theorist and consultant Paul MacAvoy. When the Nixon administration created the White House Office of Telecommunications Policy, Whitehead’s academic credentials and connection to MacAvoy earned him the post of Director. President Nixon viewed the subject of economics with ill-concealed disdain; his aides envisioned the job as a way of grabbing countervailing policymaking power away from the permanent regulatory bureaucracy that controlled the federal government and was dominated by Democratic appointees. Little did they know what kind of policymaker they were getting.

The moon landing in 1969 had achieved the objective of NASA’s space program, which was left with no immediate goal in sight. The Vietnam War had become a fiscal burden as well as a political one, and there was talk of enlisting the private sector to carry some of the financial freight by sponsoring a communications satellite. Up to that point, the satellite program (COMSAT) had been a de facto joint creature of the federal government and AT&T. NASA produced the satellites, the best-known being Telstar. AT&T owned a plurality of the stock shares and seats on the board of directors.

The chairman of the Federal Communications Commission (FCC), a Republican, drafted a proposal for a fully privatized company. It was to be a joint monopoly to be shared by NBC, ABC, CBS, RCA, GTE and AT&T. The presumption was that satellite communications was a natural monopoly like all other forms of communications – television and radio networks, telephone and telegraph. There was no point in promoting a competitive process that was bound to culminate in a monopoly.

Tom Whitehead begged to differ. He put forward a radically different proposal called the “Open Skies” policy. There was plenty of room in space for many satellites owned by many different private companies, each serving their own interests and customers. There was plenty of bandwidth available for satellites to utilize in receiving signals and transmitting them back to Earth. All that was necessary was to adjust orbits and frequencies to preclude collisions and confusion – something that all parties had an interest in doing.

Practically everybody thought Whitehead was crazy. The ones who didn’t doubt him feared him because he threatened their economic or political predominance. But he had the backing of the White House, not for ideological reasons but because he opposed the Establishment, which hated Richard Nixon. And he won his point.

One by one, private firms began sending up communications satellites into space. First came Western Union in 1974. Then came RCA in 1975, followed by Hughes and GTE. The first half-dozen were the pioneers. Eventually, the trickle became a deluge. And the modern age of telecommunications was born.

Privatization of satellite communications also stimulated competition in, and with, cable television. Cable TV had previously been strictly a local phenomenon, tied to AT&T by the need to lease coaxial cable facilities and rights of way. Whitehead supported the FCC’s 1972 policy loosening federal regulations on cable. In 1974, he chaired a committee whose report advocated federal deregulation of cable. This freed the industry to lease and own satellites and take its product national. Satellite communications allowed competing cable providers to uplink popular local and regional stations’ programming to satellite for national distribution. Later, satellite TV emerged as a leading competitor to cable TV, providing more channels, better reception and fewer problems.

More recently, satellite radio and TV have developed their own competitive niches. Satellites have become the transmission media of choice for telecommunications, offering a vantage point from which signals can be sent throughout the planet. This revolution was the brainchild of Tom Whitehead.

Tom Whitehead and the Breakup of AT&T

Tom Whitehead did not initiate the antitrust suit against AT&T, nor was he directly involved in prosecuting it. But he was a powerful influence behind it nonetheless.

His staff at OTP had independently reached the conclusion that the political power and economic inertia of the Bell system formed an insuperable obstacle to competition in telecommunications. When he urged them to approach the Department of Justice about reactivating its 1956 suit against Bell, they learned that DOJ was moving in that direction already.

Had the White House opposed this initiative, it would have stalled out like its predecessor. The Department of Defense claimed that the lawsuit was a threat to national security because the Bell system was a vital cog in the national defense. (Among other things, AT&T worked closely with DOD, the Pentagon and the FBI on civil defense, counter-espionage and domestic military exercises.) As noted above, AT&T even wielded financial clout in government circles because its capital-intensive production methods made it even more heavily reliant on debt finance than the federal government itself.

But Whitehead was adamantly in favor of the action. The American public complained about the absurdity of fixing a phone system that wasn’t broke and compared the suit to a parallel action against IBM. In fact, the two had nothing in common, since IBM wasn’t a monopoly while AT&T was a monopoly in the old-time, classical sense – it was not only a single seller of a good with no close substitutes, but entry into its market was legally barred by the government itself.

The regional Bell companies resisted the breakup tenaciously and to this day continue to fight harder against competition than they do commercially against their competitive rivals. After all, they were created as creatures of regulation, not competition, and don’t really know how to behave in a competitive market.

The result speaks for itself. Today, Americans have decisively rejected landline telephone service and embraced the new world of wireless and digitized telecommunications. They can obtain phone service via cell phones or more sophisticated mobile devices that perform multiple functions. They can combine phone service with data processing functions over the Internet. The last vestiges of the old monopoly remain standing alongside the dying Post Office in the form of mandatory service provided to remote and rural areas. Today, even the staunchest defenders of regulation and the old status quo cannot deny that Whitehead was the visionary and that they were the reactionaries.

Whitehead’s Subsequent Career

After leaving OTP in 1974, Tom Whitehead went first to a subdivision of Hughes Communications, where he started a private cable division. He thus became instrumental in the later development of satellite TV. Then he fomented his next revolution by moving to Luxembourg (!), where he started SES Astra, a satellite company that pioneered private television broadcasting in Europe. Before Whitehead, Europe had no private television broadcasters; all were state-owned.

Luxembourg was chosen because its minuscule size allowed Whitehead and company to chainsaw their way through its government bureaucracy relatively quickly. The nature of their opposition can be gauged by the fact that they faced their first lawsuit within 20 minutes of receiving their incorporation papers. Today, the company Whitehead founded is the world’s second-largest satellite provider, riding herd on more than 50 satellites that serve over 120 million customers.

After retiring, Whitehead taught at George Mason University, where he hosted the world’s leading figures in telecommunications at his seminar. He died in 2008. This year, the Library of Congress received his papers. The American Enterprise Institute commemorated the occasion by organizing a symposium of his friends and co-workers to highlight his role in shaping the world we inhabit.

The Economic Significance of Tom Whitehead

Tom Whitehead’s life starkly defines the importance of individuals to history and human welfare. Only a tiny handful of other human beings on the planet might have occupied his position and achieved the outcomes he did. And without those outcomes, the world would be a vastly different – and far worse – place.

Tom Whitehead was fought tooth and claw by the forces of government regulation. (The historical chain of coincidence that lined up DOJ against AT&T will be the subject of a future EconBrief.) This illustrates the fact that government regulation of business is not a useful supplement to marketplace competition, but rather an inferior substitute for it. The purported aims of regulators are in fact precisely the outcomes toward which competitive markets gravitate. If regulators knew better than businesspeople and consumers how to produce, sell and select appropriate numbers and kinds of goods and services, they would work in the private sector rather than in government. Their position in government places them poorly to run companies or industries, or to impose their will on consumers. In this case, if regulators had their way, we would still occupy the telecommunications equivalent of the Stone Age.

Whitehead’s life illustrated the difference between technological progress and economic progress. Communications satellites became technically possible in the late 1950s; cell phones in the mid-1940s; cable TV in the 1930s. But these did not become economically feasible until the 1970s. And economic feasibility, not technical or engineering feasibility, determines value to humanity.

Economic feasibility requires demand – a use must be found that delivers value to consumers. It requires supply – the technically-feasible product or process must be produced and sold at a sacrifice of alternative output that consumers can accept. Last, but not to be overlooked, the technically feasible product or process must be politically tolerated. Incredible as it might seem, this last hurdle is often the highest.

Tom Whitehead played a direct role in meeting two of these requirements for telecommunications and indirectly allowed the third to be met. He created the telecommunications market we enjoy today as surely as did Edison, Tesla and the technological pioneers of the past.

His name should not languish in obscurity.