DRI-191 for week of 3-15-15: More Ghastly than Beheadings! More Dangerous than Nuclear Proliferation! It's…Cheap Foreign Steel!

An Access Advertising EconBrief:

More Ghastly than Beheadings! More Dangerous than Nuclear Proliferation! It's…Cheap Foreign Steel!

The economic way to view news is as a product called information. Its value is enhanced by adding qualities that make it more desirable. One of these is danger. Humans react to threats and instinctively weigh the threat-potential of any problematic situation. That is why headlines of print newspapers, radio-news updates, TV evening-news broadcasts and Internet websites and blogs all focus disproportionately on dangers.

This obsession with danger does not jibe with the fact that human life expectancy has doubled over the last century and that violence has never been less threatening to mankind than today. Why do we suffer this cognitive dissonance? Our advanced state of knowledge allows us to identify and categorize threats that passed unrecognized for centuries. Today's degraded journalistic product, more poorly written, edited and produced than formerly, plays on our neuroscientific weaknesses.

Economists are acutely sensitive to this phenomenon. Our profession made its bones by exposing the bogey of “the evil other” – foreign trade, foreign goods, foreign labor and foreign investment as ipso facto evil and threatening. Yet in spite of the best efforts of economists from Adam Smith to Milton Friedman, there is no more dependable pejorative than “foreign” in public discourse. (The word “racist” is a contender for the title, but overuse has triggered a backlash among the public.)

Thus, we shouldn’t be surprised by this headline in The Wall Street Journal: “Ire Rises at China Over Glut of Steel” (03/16/2015, By Biman Mukerji in Hong Kong, John W. Miller in Pittsburgh and Chuin-Wei Yap in Beijing). Surprised, no; outraged, yes.

The Big Scare 

The alleged facts of the article seem deceptively straightforward. “China produces as much steel as the rest of the world combined – more than four times as much as the peak U.S. production in the 1970s.” Well, inasmuch as (a) the purpose of all economic activity is to produce goods for consumption; and (b) steel is a key input in producing countless consumption goods and capital goods, ranging from vehicles to buildings to weapons to cutlery to parts, this would seem to be cause for celebration rather than condemnation. Unfortunately…

“China’s massive steel-making engine, determined to keep humming as growth cools at home, is flooding the world with exports, spurring steel producers around the globe to seek government protection from falling prices. From the European Union to Korea and India, China’s excess metal supply is upending trade patterns and heating up turf battles among local steelmakers. In the U.S., the world’s second-biggest steel consumer, a fresh wave of layoffs is fueling appeals for tariffs. U.S. steel producers such as U.S. Steel Corp. and Nucor Corp. are starting to seek political support for trade action.”

Hmmm. Since this article occupies the place of honor on the world’s foremost financial publication, we expect it to be authoritative. China has a “massive steel-making engine” – well, that stands to reason, since it’s turning out as much steel as everybody else put together. It is “determined to keep humming.” The article’s three (!) authors characterize the Chinese steelmaking establishment as a machine, which seems apropos. They then endow the metaphoric machine with the human quality of determination – bad writing comes naturally to poor journalists.

This determination is linked with “cooling” growth. Well, the only cooling growth that Journal readers can be expected to infer at this point is the slowing of the Chinese government’s official rate of annual GDP growth from 7.5% to 7%. Leaving aside the fact that the rest of the industrialized world is pining for growth of this magnitude, the authors are not only mixing their metaphors but mixing their markets as well. The only growth directly relevant to the points raised here – exports by the Chinese and imports by the rest of the world – is growth in the steel market specifically. The status of the Chinese steel market is hardly common knowledge to the general public. (Later, the authors eventually get around to the steel market itself.)

So the determined machine is reacting to cooling growth by “flooding the world with exports,” throwing said world into turmoil. The authors don’t treat this as any sort of anomaly, so we’re apparently expected to nod our heads grimly at this unfolding danger. But why? What is credible about this story? And what is dangerous about it?

Those of us who remember the 1980s recall that the monster threatening the world economy then was Japan, the unstoppable industrial machine that was "flooding the world" with exports. (Yes, that's right – the same Japan whose economy has been lying comatose for twenty years.) The term of art was "export-led growth." Now these authors are telling us that massive exports are a reaction to weakness rather than a symptom of growth.

“Unstoppable” Japan suddenly stopped in its tracks. No country has ever ascended an economic throne based on its ability to subsidize the consumption of other nations. Nor has the world ever died of economic indigestion caused by too many imports produced by one country. The story told at the beginning of this article lacks any vestige of economic sense or credibility. It is pure journalistic scare-mongering. Nowhere do the authors employ the basic tools of international economic analysis. Instead, they employ the basic tools of scarifying yellow journalism.

The Oxymoron of “Dumping” 

The authors have set up their readers with a menacing specter described in threatening language. A menace must have victims. So the authors identify the victims. Victims must be saved, so the authors bring the savior into their story. Naturally, the savior is government.

The victims are “steel producers around the globe.” They are victimized by “falling prices.” The authors are well aware that they have a credibility problem here, since their readers are bound to wonder why they should view falling steel prices as a threat to them. As consumers, they see falling prices as a good thing. As prices fall, their real incomes rise. Falling prices allow consumers to buy more goods and services with their money incomes. Businesses buy steel. Falling steel prices allow businesses to buy more steel. So why are falling steel prices a threat?

Well, it turns out that falling steel prices are a threat to “chief executives of leading American steel producers,” who will “testify later this month at a Congressional Steel Caucus hearing.” This is “the prelude to launching at least one anti-dumping complaint with the International Trade Commission.” And what is “dumping?” “‘Dumping,’ or selling abroad below the cost of production to gain market share, is illegal under World Trade Organization law and is punishable with tariffs.”

After this operatic buildup, it turns out that the foreign threat to America spearheaded by a gigantic, menacing foreign power is… low prices. Really low prices. Visualize buying steel at Costco or Wal-Mart.

Oh, no! Not that. Head for the bomb shelters! Break out the bug-out bags! Get ready to live off the grid!

The inherent implication of dumping is oxymoronic because the end-in-view behind all economic activity is consumption. A seller who sells for an abnormally low price is enhancing the buyer's capability to consume, not damaging it. If anybody is "damaged" here, it is the seller, not the buyer. And that raises the question: why would a seller do something so foolish?

More often than not, proponents of the dumping thesis don’t take their case beyond the point of claiming damage to domestic import-competing firms. (The three Journal reporters make no attempt whatsoever to prove that the Chinese are selling below cost; they rely entirely on the allegation to pull their story’s freight.) Proponents rely on the economic ignorance of their audience. They paint an emotive picture of an economic world that functions like a giant Olympics. Each country is like a great big economic team, with its firms being the players. We are supposed to root for “our” firms, just as we root for our athletes in the Summer and Winter Olympics. After all, don’t those menacing firms threaten the jobs of “our” firms? Aren’t those jobs “ours?” Won’t that threaten “our” incomes, too?

This sports motif is way off base. U.S. producers and foreign producers have one thing in common – they both produce goods and services that we can consume, either now or in the future. And that gives them equal economic status as far as we are concerned. The ones “on our team” are the ones that produce the best products for our needs – period.

Wait a minute – what if the producers facing those low prices happen to be the ones employing us? Doesn’t that change the picture?

Yes, it does. In that case, we would be better off if our particular employer faced no foreign competition. But that doesn’t make a case for restricting or preventing foreign competition in general. Even people who lose their jobs owing to foreign competition faced by their employer may still gain more income from the lower prices brought by foreign competition in general than they lose by having to take another job at a lower income.

There's another pertinent reason for not treating foreign firms as antagonistic to consumer interests. Foreign firms can, and do, locate in America and employ Americans to produce their products here. Years ago, Toyota was viewed as an interloper for daring to compete successfully with the "Big 3" U.S. automakers. Now the majority of Toyota automobiles sold in the U.S. are assembled on American soil in Toyota plants located here.

Predatory Pricing in International Markets

Dumping proponents have a last-ditch argument that they haul out when pressed with the behavioral contradictions stressed above. Sure, those foreign prices may be low now, import-competing producers warn darkly, but just wait until those devious foreigners succeed in driving all their competitors out of business. Then watch those prices zoom sky-high! The foreigners will have us in their monopoly clutches.

That loud groan you heard from the sidelines came from veteran economists, who would no sooner believe this than ask a zookeeper where to find the unicorns. The thesis summarized in the preceding paragraph is known as the “predatory pricing” hypothesis. The behavior was notoriously ascribed to John D. Rockefeller by the muckraking journalist Ida Tarbell. It was famously disproved by the research of economist John McGee. And ever since, economists have stopped taking the concept seriously even in the limited market context of a single country.

But when propounded in the global context of international trade, the whole idea becomes truly laughable. Steel is a worldwide industry because its uses are so varied and numerous. A firm that employed this strategy would have to sacrifice trillions of dollars in order to reduce all its global rivals to insolvency. This would take years. These staggering losses would be accounted in current outflows. They would be weighed against putative gains that would begin sometime in the uncertain future – a fact that would make any lender blanch at the prospect of financing the venture.

As if the concept weren’t already absurd, what makes it completely ridiculous is the fact that even if it succeeded, it would still fail. The assets of all those firms wouldn’t vaporize; they could be bought up cheaply and held against the day when prices rose again. Firms like the American steel company Nucor have demonstrated the possibility of compact and efficient production, so competition would be sure to emerge whenever monopoly became a real prospect.

The likelihood of any commercial steel firm undertaking a global predatory-pricing scheme is nil. At this point, opponents of foreign trade are, in poker parlance, reduced to “a chip and a chair” in the debate. So they go all in on their last hand of cards.

How Do We Defend Against Government-Subsidized Foreign Trade?

Jiming Zou, analyst at Moody’s Investor Service, is the designated spokesman of last resort in the article. “Many Chinese steelmakers are government-owned or closely linked to local governments [and] major state-owned steelmakers continue to have their loans rolled over or refinanced.”

Ordinary commercial firms might cavil at the prospect of predatory pricing, but a government can’t go broke. After all, it can always print money. Or, in the case of the Chinese government, it can always “manipulate the currency” – another charge leveled against the Chinese with tiresome frequency. “The weakening renminbi was also a factor in encouraging exports,” contributed another Chinese analyst quoted by the Journal.

One would think that a government with the awesome powers attributed to China’s wouldn’t have to retrench in all the ways mentioned in the article – reduce spending, lower interest rates, and cut subsidies to state-owned firms including steel producers. Zou is doubtless correct that “given their important role as employers and providers of tax revenue, the mills are unlikely to close or cut production even if running losses,” but that cuts both ways. How can mills “provide tax revenue” if they’re running huge losses indefinitely?

There is no actual evidence that the Chinese government is behaving in the manner alleged; the evidence is all the other way. Indeed, the only actual recipients of long-term government subsidies to firms operating internationally are creatures of government like Airbus and Boeing – firms that produce most or all of their output for purchase by government and are quasi-public in nature, anyway. But that doesn’t silence the protectionist chorus. Government-subsidized foreign competition is their hole card and they’re playing it for all it’s worth.

The ultimate answer to the question “how do we defend against government-subsidized foreign trade?” is: We don’t. There’s no need to. If a foreign government is dead set on subsidizing American consumption, the only thing to do is let them.

If the Chinese government is enabling below-cost production and sale by its firms, it must be doing it with money. There are only three ways it can get money: taxation, borrowing or money creation. Taxation bleeds Chinese consumers directly; money creation does it indirectly via inflation. Borrowing does it, too, when the bill comes due at repayment time. So foreign exports to America subsidized by the foreign government benefit American consumers at the expense of foreign consumers. No government in the world can subsidize the world’s largest consumer nation for long. But the only thing more foolish than doing it is wasting money trying to prevent it.

What Does “Trade Protection” Accomplish?

Textbooks in international economics spell out in meticulous detail – using either carefully drawn diagrams or differential and integral calculus – the adverse effects of tariffs and quotas on consumers. Generally speaking, tariffs have the same effects on consumers as taxes in general – they drive a wedge between the price paid by the consumer and received by the seller, provide revenue to the government and create a “deadweight loss” of value that accrues to nobody. Quotas are, if anything, even more deleterious. (The relative harm depends on circumstances too complex to enumerate.)
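The textbook tariff analysis described above can be made concrete with a minimal numeric sketch. The demand and supply curves and all numbers below are hypothetical, chosen only to show how a tariff splits the consumer's loss into a producer gain, government revenue, and a residual deadweight loss that accrues to nobody.

```python
# Hypothetical small-country tariff sketch (illustrative numbers only).
# Demand: Qd = 100 - P; domestic supply: Qs = P; fixed world price Pw.
# A tariff t raises the domestic price from Pw to Pw + t.

def tariff_effects(pw, t):
    """Return consumer loss, producer gain, tariff revenue, and
    deadweight loss for the linear market above."""
    def qd(p): return 100 - p
    def qs(p): return p
    p0, p1 = pw, pw + t
    # Consumer surplus lost: trapezoid between the two prices under demand.
    consumer_loss = t * (qd(p0) + qd(p1)) / 2
    # Producer surplus gained: trapezoid between the two prices under supply.
    producer_gain = t * (qs(p0) + qs(p1)) / 2
    # Government revenue: tariff times post-tariff imports.
    revenue = t * (qd(p1) - qs(p1))
    # Whatever consumers lose beyond what producers and government gain
    # is the deadweight loss -- value that accrues to nobody.
    dwl = consumer_loss - producer_gain - revenue
    return consumer_loss, producer_gain, revenue, dwl

cl, pg, rev, dwl = tariff_effects(pw=30, t=10)
```

With these numbers, consumers lose 650, producers gain 350, the government collects 200, and 100 simply vanishes as deadweight loss: the "wedge" the textbooks diagram.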

This leads to a painfully obvious question: If tariffs hurt consumers in the import-competing country, why in the world do we penalize alleged misbehavior by exporters by imposing tariffs? This is analogous to imposing a fine on a convicted burglar along with a permanent tax on the victimized homeowner.

Viewed in this light, trade protection seems downright crazy. And in purely economic terms, it is. But in terms of political economy, we have left a crucial factor out of our reckoning. What about the import-competing producers? In the Wall Street Journal article, these are the complainants at the bar of the International Trade Commission. They are also the people economists have been observing ever since the days of Adam Smith in the late 18th century, bellied up at the government-subsidy bar.

In Smith’s day, the economic philosophy of Mercantilism reigned supreme. Specie – that is, gold and silver – was considered the repository of real wealth. By sending more goods abroad via export than returned in the form of imports, a nation could produce a net inflow of specie payments – or so the conventional thinking ran. This philosophy made it natural to favor local producers and inconvenience foreigners.

Today, the raison d'etre of the modern state is to take money from people in general and give it to particular blocs to create voting constituencies. This creates a ready-made case for trade protection. So what if it reduces the real wealth of the country – the goods and services available for consumption? It increases the electoral prospects of the politicians responsible and appears to increase the real wealth of the beneficiary blocs, which is sufficient for legislative purposes.

This is corruption, pure and simple. The authors of the Journal article present this corrupt process with a straight face because their aim is to present cheap Chinese steel as a danger to the American people. Thus, their aims dovetail perfectly with the corrupt aims of government.

And this explains the front-page article on the 03/16/2015 Wall Street Journal. It reflects the news value of posing a danger where none exists – that is, the corruption of journalism – combined with the corruption of the political process.

The “Effective Rate of Protection”

No doubt the more temperate readers will object to the harshness of this language. Surely “corruption” is too harsh a word to apply to the actions of legislators. They have a great big government to run. They must try to be fair to everybody. If everybody is not happy with their efforts, that is only to be expected, isn’t it? That doesn’t mean that legislators aren’t trying to be fair, does it?

Consider the economic concept known as the effective rate of protection. It is unknown to the general public, but it appears in every textbook on international economics. It arises from the conjunction of two facts: first, that a majority of goods and services are composed of raw materials, intermediate goods and final-stage (consumer) goods; and second, that governments have an irresistible impulse to levy taxes on goods that travel across international borders.

To keep things starkly simple and promote basic understanding, take the simplest kind of numerical example. Assume the existence of a fictional textile company. It takes a raw material, cotton, and spins, weaves and processes that cotton into a cloth that it sells commercially to its final consumers. This consumer cloth competes with the product of domestic producers as well as with cotton cloth produced by foreign textile producers. We assume that the prevailing world price of each unit of cloth is $1.00. We assume further that domestic producers obtain one textile unit's worth of cotton for $.50 and add a further $.50 worth of value by spinning, weaving and processing it into cloth.

We have a basic commodity being produced globally by multiple firms, indicating the presence of competitive conditions. But legislators, perhaps possessing some exalted concept of fairness denied to the rabble, decide to impose a tariff on the importation of cotton cloth. Not wishing to appear excessive or injudicious, the solons set this ad valorem tariff at 15%. Given the competitive nature of the industry, this will soon elevate the domestic price of textiles above the world price by the amount of the tariff; e.g., by $.15, to $1.15. Meanwhile, there is no tariff levied on cotton, the raw material. (Perhaps cotton is grown domestically and not imported into the country or, alternatively, perhaps cotton growers lack the political clout enjoyed by textile producers.)

The insight gained from the effective rate of protection begins with the realization that the net income of producers in general derives from the value they add to any raw materials and/or intermediate products they utilize in the production process. Initially, textile producers added $.50 worth of value for every unit of cotton cloth they produced. Imposition of the tariff allows the domestic textile price to rise from $1.00 to $1.15, which causes textile producers’ value added to rise from $.50 to $.65.

Legislators judiciously and benevolently decided that the proper amount of "protection" to give domestic textile producers from foreign competition was 15%. They announced this finding amid fanfare and solemnity. But it is wrong. The tariff has the explicit purpose of "protecting" the domestic industry, of giving it leeway it would not otherwise get under the supposedly harsh and unrelenting regime of global competition. But this tariff does not give domestic producers 15% worth of protection. $.15 divided by $.50 – that is, the increase in value added divided by the original value added – is .30, or 30%. The effective rate of protection is double the size of the "nominal" (statutory) level of protection. In general, think of the statutory tariff rate as the surface appearance and the effective rate as the underlying truth.
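The arithmetic of the textile example can be sketched in a few lines. The function below simply restates the definition: the effective rate of protection is the tariff-induced increase in domestic value added, divided by value added at free-trade prices. The numbers are those used in the example above.

```python
# Effective rate of protection (ERP): the tariff-induced rise in
# domestic value added, divided by free-trade value added.
# Example numbers: $1.00 world cloth price, $0.50 cotton input,
# a 15% tariff on the final good and none on the input.

def effective_rate(world_price, input_cost, tariff_final, tariff_input=0.0):
    value_added_free = world_price - input_cost
    value_added_tariff = (world_price * (1 + tariff_final)
                          - input_cost * (1 + tariff_input))
    return (value_added_tariff - value_added_free) / value_added_free

erp = effective_rate(world_price=1.00, input_cost=0.50, tariff_final=0.15)
# ($0.65 - $0.50) / $0.50 = 0.30: a 15% nominal tariff protects at 30%.
```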

Like oh-so-many economic principles, the effective rate of protection is a relatively simple concept that can be illustrated with simple examples, but that rapidly becomes complex in reality. Two complications need mention. When tariffs are also levied on raw materials and/or intermediate products, this affects the relationship between the effective and nominal rate of protection. The rule of thumb is that higher tariff rates on raw materials and intermediate goods relative to tariffs on final goods tend to lower effective rates of protection on the final goods – and vice-versa.

The other complication is the percentage of total value added comprised by the raw materials and intermediate goods prior to, and subsequent to, imposition of the tariff. This is a particularly knotty problem because tariffs affect prices faced by buyers, which in turn affect purchases, which in turn can change that percentage. When tariffs on final products exceed those on raw materials and intermediate goods – and this has usually been the case in American history – an increase in this percentage will increase the effective rate.
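The two complications above are captured in the standard textbook formula for the effective rate: if a is the input's share of the final good's value at free-trade prices, t_f the tariff on the final good and t_i the tariff on the input, then ERP = (t_f − a·t_i)/(1 − a). A short sketch (using the illustrative numbers from the textile example, with a = 0.5) shows the rule of thumb at work:

```python
# Standard textbook ERP formula: with free-trade input cost share a,
# final-good tariff t_f and input tariff t_i,
#   ERP = (t_f - a * t_i) / (1 - a)

def erp_formula(t_final, t_input, input_share):
    return (t_final - input_share * t_input) / (1 - input_share)

# Rule of thumb: raising the tariff on the raw material lowers the
# effective protection enjoyed by the final-good producer.
no_input_tariff = erp_formula(0.15, 0.00, 0.5)    # ~0.30, as in the example
with_input_tariff = erp_formula(0.15, 0.10, 0.5)  # falls to ~0.20
```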

But for our immediate purposes, it is sufficient to realize that appearance does not equal reality where tariff rates are concerned. And this is the smoking gun in our indictment of the motives of legislators who promote tariffs and restrictive foreign-trade legislation.


Corrupt Legislators and Self-Interested Reporting are the Real Danger to America

In the U.S., the Commercial Code includes thousands of tariffs of widely varying sizes. These not only allow legislators to pose as saviors of numerous business constituent classes. They also allow them to lie about the degree of protection being provided, the real locus of the benefits and the reasons behind them.

Legislators claim that the size of tariff protection being provided is modest, both in absolute and relative terms. This is a lie. Effective rates of protection are higher than they appear for the reasons explained above. They unceasingly claim that foreign competitors behave “unfairly.” This is also a lie, because there is no objective standard by which to judge fairness in this context – there is only the economic standard of efficiency. Legislators deliberately create bogus standards of fairness to give themselves the excuse to provide benefits to constituent blocs – benefits that take money from the rest of us. International trade bodies are created to further the ends of domestic governments in this ongoing deception.

Readers should ask themselves how many times they have read the term “effective rate of protection” in The Wall Street Journal, The Financial Times of London, Barron’s, Forbes or any of the major financial publications. That is an index of the honesty and reputability of financial journalism today. The term was nowhere to be found in the Journal piece of 03/16/2015.

Instead, the three Journal authors busied themselves flacking for a few American steel companies. They showed bar graphs of increasing Chinese steel production and steel exports. They criticized the Chinese because the country's steel production has "yet to slow in lockstep" with growth in demand for steel. They quoted self-styled experts on China's supposed "problem [with] hold[ing] down exports" – without ever explaining what rule or standard or economic principle of logic would require a nation to withhold exports from willing buyers. They cited year-over-year increases in exports for January of 2013, 2014 and 2015 as evidence of China's guilt, along with the fact that the Chinese were on pace to export more steel than any other country "in this century."

The reporters quoted the whining of a U.S. Steel vice-president that demonstrating damage from Chinese exports is just "too difficult" to satisfy trade commissioners. Not content with this, they threw in complaints by an Indian steel executive and South Koreans as well. They neglected to tell their readers that Chinese, Indian and South Korean steels tend to be lower grades – a datum that helps to explain their lower prices. U.S. and Japanese steels tend to be higher grade, and that helps to explain why companies like Nucor have been able to keep prices and profit margins high for years. The authors cited one layoff at U.S. Steel but forgot to cite the recent article in their own Wall Street Journal lauding the history of Nucor, which has never laid off an employee despite the pressure of Chinese competition.

That same article quoted complaints by steel buyers in this country about the "competitive disadvantage" imposed by the higher-priced U.S. steel. Why are the complaints about cheap Chinese exports front-page news while the complaints about high-priced American steel are buried in the back pages – and not even mentioned by a subsequent banner article boasting input by no fewer than three Journal reporters? Why did the reporters forget to cite the benefits accruing to American steel users from low prices for steel imports? Don't these reporters read their own newspaper? Or do they report only what comports with their own agenda?

DRI-241 for week of 11-9-14: The Birth of Public-Utility Regulation

An Access Advertising EconBrief:

The Birth of Public-Utility Regulation

Today’s news heralds the wish of President Obama that the Federal Communications Commission (FCC) pass strict rules ensuring that internet providers provide equal treatment to all customers. This is widely interpreted (as, for example, by The Wall Street Journal front-page article of 11/11/2014) as saying that “the Federal Communications Commission [would] declare broadband Internet service a public utility.”

More specifically, the Journal's unsigned editorial of the same day explains that the President wants the FCC to apply the common-carrier provisions of Title II of the Communications Act of 1934. Its "century-old telephone regulations [were] designed for public utilities." In fact, the wording was copied from the original federal regulatory legislation, the Interstate Commerce Act of 1887; the word "railroad" was stricken and "telephone" was added to "telegraph."

In other words, Mr. Obama wants to resurrect enabling regulatory legislation that is a century and a quarter old and apply it to the Internet.

We might be pardoned for assuming that the original legislation has been a rip-roaring success. After all, the Internet has revolutionized our lives and the conduct of business around the world. The Internet has become a way of life for young and old, from tribesmen in central Africa to dissidents from totalitarian regimes to practically everybody in developed economies. If we’re now going to entrust its fate to the tender mercies of Washington bureaucrats, the regulatory schema should presumably be both tried and true.

Public-utility regulation has been tried, that’s for sure. Was it true? And how did it come to be tried in the first place?

Natural Monopoly: The Party Line on Public-Utility Regulation

Public-utility regulation is a subset of the economic field known as industrial organization. Textbooks designed for courses in the subject commonly devote one or more chapters to utility regulation. Those texts rehearse the theory underlying regulation, which is the theory of natural monopoly. According to that theory, the reason we have (or had) regulated public utilities in areas like gas, electricity, telegraphs, telephones and water is that free competition cannot long persist. Regulated public utilities are greatly preferable to the alternative of a single unregulated monopoly provider in each of these fields.

The concept of natural monopoly rests on the principle of decreasing long-run average cost. In turn, this is based on the idea of economies of scale. Consider the production of various economic goods. All other things equal, we might suppose that as all inputs into the production process increase proportionately, the total monetary cost of production for each one might do so as well. Often it does – but not always. Sometimes total cost increases more-than-proportionately, usually because the industry to which the good belongs uses so much of a particular input that expansion bids up the input’s price, thereby increasing total cost more-than-proportionately.

The rarest case is the opposite one, in which total cost increases less-than-proportionately with the increase in output. Although at first thought this seems paradoxical, there are technical factors that occasionally operate to bring it about. One of these is the engineering principle known as the two-thirds rule. In certain applications, such as the thru-put in a pipeline or the contents of containers used by ocean-going freight vessels, the surface area of the surrounding enclosure varies as the two-thirds power of the enclosed volume. In other words, when the pipe grows larger and larger, the amount that can be transmitted through the pipe increases more-than-proportionately with the material needed to build it. When the container is made larger, the amount of freight the container can hold increases more-than-proportionately. The economic implication of this technical law is far-reaching, since the production cost is a function of the size of the pipe or the container (surface area) while the amount of output is a function of the thru-put of the pipe or amount of freight (volume). In other words, this exactly describes the condition called "economies of scale," in which output increases more-than-proportionately when all inputs are increased equally. Since average cost is the ratio of total cost to output, the fact that the denominator in the ratio increases more than the numerator causes the ratio to fall, thus producing decreasing average total cost.
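The scale arithmetic behind the two-thirds rule can be sketched in a few lines. The assumption, as in the text, is that construction cost tracks surface area while capacity tracks volume, so total cost grows as the two-thirds power of capacity and average cost per unit of capacity falls as scale rises. The unit-cost figure is purely illustrative.

```python
# Two-thirds rule sketch: cost ~ surface area, capacity ~ volume,
# so total cost ~ capacity ** (2/3) and average cost falls with scale.

def average_cost(capacity, unit_cost=1.0):
    total_cost = unit_cost * capacity ** (2.0 / 3.0)
    return total_cost / capacity

# Doubling capacity multiplies average cost by 2**(-1/3), roughly 0.79:
# about a 21% cost saving per unit, purely from geometry.
small = average_cost(1000.0)
large = average_cost(2000.0)
```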

Why does decreasing average cost create this condition of natural monopoly? Think of unit price as “average revenue.” Decreasing average cost allows a seller to lower price continuously as the scale of output increases. This is important because it suggests that the seller who achieves the largest scale of output – that is, grows faster than competitors – could undersell all others while still charging a viable price. The textbooks go on to claim that after driving all competitors from the field, the successful seller would then achieve an insurmountable monopoly and raise its price to the profit-maximizing point, dialing its output back to the level commensurate with consumer demand at that higher price. Rather than subjecting consumers to the agony of this pure monopoly outcome, better to compromise by settling on an intermediate price and output that allows the regulated monopolist a price just high enough to attract the financial capital it needs to build, expand and maintain its large infrastructure. That is the raison d’etre of public-utility regulation, which is accomplished in the U.S. by an administrative law process involving hearings and testimony before a commission consisting of political appointees. Various interest groups – consumers, the utility company, the commission itself – are legally represented in the hearings.

Why is the regulated price and output termed a “compromise”? The Public Utility Commission (PUC) forces the company to charge a price equal to its average cost, incorporating a rate of profit sufficient to attract investor capital. This regulatory result is intermediate between the outcomes under pure monopoly and pure competition. A profit-maximizing monopoly firm will always produce the rate of output at which marginal revenue equals marginal cost. The monopolist’s marginal revenue is less than its average revenue (price) because every change in price affects inframarginal units, either positively or negatively, and the monopolist is all too aware of its singular status and of the large number of inframarginal units affected by its pricing decisions. Under pure competition, each firm treats price as a parameter and neglects the tiny effect its supply decisions have on market price; hence price and marginal revenue are effectively equal. Thus, each competitive firm will produce a rate of output at which price equals marginal cost, and the total output resulting from these individual firm decisions is larger – and the resulting market price lower – than would be the case if a single monopoly firm were deciding on price and output for the whole market. The PUC does not attempt to duplicate this purely competitive price because, under decreasing average cost, marginal cost is less than average cost, and a price equal to marginal cost would not cover all the utility firm’s costs. Rather than subsidize the resulting losses out of public funds (as is commonly done outside of the U.S. and Canada), the PUC allows a higher price sufficient to cover all costs, including the opportunity cost of attracting financial capital.
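The three pricing outcomes just described can be made concrete with a small numeric sketch. All the numbers below are invented for illustration: a linear demand curve P = 100 − Q, constant marginal cost c = 10, and a large fixed cost F = 1000, which together yield the decreasing average cost AC = c + F/Q that the natural-monopoly argument assumes.

```python
# Hypothetical comparison of monopoly, regulated, and competitive pricing
# under decreasing average cost. Demand: P = a - b*Q. Costs: MC = c (constant),
# fixed cost F, so average cost AC = c + F/Q falls as Q rises.
import math

a, b = 100.0, 1.0    # demand intercept and slope (illustrative)
c, F = 10.0, 1000.0  # marginal cost and fixed cost (illustrative)

# Pure monopoly: set marginal revenue (a - 2*b*Q) equal to marginal cost c,
# then read the price off the demand curve.
q_mono = (a - c) / (2 * b)
p_mono = a - b * q_mono

# Pure competition: price is driven down to marginal cost; note this price
# fails to cover the fixed cost, which is exactly the PUC's rationale.
p_comp = c
q_comp = (a - c) / b

# Regulated (average-cost) pricing: P = AC, i.e. a - b*Q = c + F/Q.
# Rearranged: b*Q**2 - (a - c)*Q + F = 0; take the larger root (more output).
disc = (a - c) ** 2 - 4 * b * F
q_reg = ((a - c) + math.sqrt(disc)) / (2 * b)
p_reg = a - b * q_reg

print(f"monopoly:    Q = {q_mono:5.1f}, P = {p_mono:5.2f}")
print(f"regulated:   Q = {q_reg:5.1f}, P = {p_reg:5.2f}")
print(f"competitive: Q = {q_comp:5.1f}, P = {p_comp:5.2f}")
```

With these numbers the regulated price lands between the other two – below the monopoly price of 55 but above the marginal-cost price of 10, which would leave the firm’s fixed cost uncovered – which is precisely the sense in which regulation is a “compromise.”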

How well does this theoretical picture of natural monopoly fit industrial reality? Many public-utility industries possess at least some technical features in common with it. Electric and telephone transmission lines, natural-gas pipelines and water pipes all obey the two-thirds rule. This much of the natural-monopoly doctrine has a scientific basis. On the other hand, power generation (as opposed to transmission or transport) does not usually exhibit economies of scale. And there are plenty of industries that are not regulated public utilities despite showing clear scale economies – ocean-going cargo vessels are one obvious case. This is enough to provoke immediate suspicion of the natural-monopoly doctrine as a comprehensive explanation of public-utility regulation. Suffice it to say that scale economies seldom dominate the production functions even of public-utility goods.

The Myth of the Birth of Public-Utility Regulation – and the Reality

 

In his classic article “Hornswoggled! How Ma Bell and Chicago Ed Conned Our Grandparents and Stuck Us With the Bill” (Reason Magazine, February 1986, pp. 29-33), Marvin N. Olasky recounts the birth of public-utility regulation. When “angry consumers and other critics call for an end to [public-utility] monopolies, choruses of utility PR people and government regulators recite the same old story – once upon a time there was competition among utilities, but ‘the public’ got fed up and demanded regulation… Free enterprise in utilities lost in a fair fight.”

As Olasky reveals, “it makes a good story, but it’s not true.” It helps to superimpose the logic of natural monopoly theory on the scenario spun by the “fair fight” myth. If natural-monopoly logic held good, how would we expect the utility-competition scenario to deteriorate?

Well, the textbooks tell us that the condition of natural monopoly (decreasing long-run average total cost) allows one firm to undersell all others by growing faster. Then it drives rivals out of business, becomes a pure monopoly and gouges consumers with high prices and reduced output. So that’s what we would expect to find as our “fair-fight” scenario: dog-eat-dog competition resulting in the big dog devouring all rivals, then rounding on consumers, whose outraged howls produce the dog-catching regulators who kennel up the company as a regulated public utility. The problem with this scenario is that it never happened. It is nowhere to be found in the history books or contemporary accounts.

Oops.

Well, somebody must have said something about life before utility regulation. After all, it was only about a century ago, not buried in prehistory. If events didn’t unfold according to textbook theory, how did public-utility regulation happen?

Actually, conventional references to the pre-regulatory past are surprisingly sparse. More to the point, they are contradictory. Mostly, they can be grouped under the heading of “wasteful competition.” This is a very different story from the one told by natural-monopoly theory. It maintains that competitive utility provision was a prodigal fiasco: numerous firms vying for the same market by laying cable and pipe and building transmission lines. All this superfluous activity and expenditure drove costs – and, presumably, prices – through the roof. Eventually, a fed-up public put an end to the competitive nonsense by demanding relief from the government. This is the scenario commonly cited by utility PR people and regulators, who care little about theory and even less about logical consistency. All they want is an explanation that will play in Peoria, meeting whatever transitory necessity confronts them at the moment.

Fragmentary support for this explanation exists in the form of references to multiple suppliers of utility services in various markets. In New York City, for example, six different electricity franchises were granted by a single 1887 City Council resolution. But specific references to competitive chaos are hard to come by – which is not what we would expect if things were really as bad as they are portrayed.

Could such a situation have arisen and persisted for the 20-40 years that filled the gap between the development of commercial electricity and telephony and the ascendance of public-utility regulation in the decade of the 1920s? No, the thought of competitive firms chasing their tails up the cost curve and losing money for decades is implausible on its face. Anyway, we have gradually pieced together the true picture.

The Reality of Pre-Regulatory Utility Competition

 

Marvin Olasky pinpoints 1905 as a watershed year in the saga of public utilities in America. That year a merger took place between two of the nation’s largest electric companies, Chicago Edison and Commonwealth Electric. Olasky cites a 1938 monograph by economist Burton Behling, which declared that prior to 1905 the market for municipal electricity “was one of full and free competition.” Market structure bore a superficial resemblance to cable television today in that municipalities assigned franchise rights for service to corporate applicants, the significant difference being that “the common policy was to grant franchises to all who applied” and met minimum requirements. Olasky describes the resulting environment as follows: “Low prices and innovative developments resulted, along with some bankruptcies and occasional disruption of service.”

That qualification “some bankruptcies and occasional disruption of service” raises no red flags to economists; it is the tradeoff they expect to encounter for the benefits provided by low prices and innovation. But it is integral to the story we are telling here. The anecdotal tales of dislocation are the source of the historical scare stories told by later generations of economic historians, utility propagandists and left-wing opportunists. They also provided contemporaneous proponents of public-utility regulation with ammunition for their promotional salvos.

Who roamed the utility landscape during the competitive years? In 1902, American Bell Co. had about 1.3 million subscribers, while the independent companies that competed with it had over 2 million subscribers altogether. By 1905, Bell’s industry leadership was threatened sufficiently to inspire publication of a book entitled How the Bell Lost its Grip. In Toledo, OH, an independent company, Home Telephone Co., began competing with Bell in 1901. It charged rates half those of Bell. By 1906, it had 10,000 subscribers compared to 6,700 for the local Bell company. In the states of Nebraska and Iowa, independent-company subscribers outnumbered those of Bell by 260,000 to 80,000. Numerous cities held referenda on the issue of granting competitive franchises for telephone service. Competition usually won out. In Portland, OR, the vote was 12,213 to 560 in favor of granting the competitive franchise. In Omaha, NE, the independent franchise won by 7,653 to 3,625. A national survey polled 1,400 businessmen on the issue; 1,245 said that competition had produced or could produce better phone service in their community, and 982 said that competition had forced their Bell company to improve its service.

Obviously, one option open to the Bell (and Edison electric) companies was to cut prices to meet competition. But because Bell and Edison were normally the biggest companies in their city or region, with the most subscribers, a price cut was much more costly to them than to a smaller independent: the big company had so many inframarginal customers. Consequently, these leading companies looked around for alternative ways of dealing with pesky competitors. The great American rule of thumb in business is: If you can’t beat ’em, join ’em; if you can’t beat ’em or join ’em, bar ’em.

The Deadly Duo: Theodore Vail and Samuel Insull

 

Theodore Vail was a leading American business executive of the 19th century. He was President of American Bell from 1880 to 1886, and later rejoined the Bell system as an AT&T board member in 1902. Vail commissioned a city-by-city study of Bell’s competitive position. It persuaded him that Bell’s business strategy needed overhauling. Bell’s corporate position had been that monopoly was the only technically feasible arrangement because it enabled telephone users in different parts of a city and even different cities to converse. As a company insider conversant with the latest advances, Vail knew that this excuse was wearing thin because system interconnections were even then becoming possible. Competition was eating into Bell’s market share already, and with interconnection on the horizon Vail knew that Bell’s supremacy would vanish unless it was revitalized.

The idea Vail hit upon was based upon the strategy employed by the railroads about fifteen years earlier. In order to win public acceptance for the special government favors they had received, the roads commissioned puff pieces from free-lance writers and bribed newspaper and magazine editors to print them. Vail expanded this technique into what later came to be called “third-party” editorial services; he employed companies for the sole purpose of producing editorial matter glorifying the Bells. One firm earned over $100,000 from the Bell companies while simultaneously earning $84,000 per year to place some 13,000 favorable articles annually about electric utilities. (These usually appeared as what we would now call “advertorials” – unsigned editorials citing no source.) The companies did not formally acknowledge their link with utilities, although it was exposed in investigative works such as 1931’s The Public Pays by Ernest Gruening.

Vail combined this approach with another tactic borrowed from the railroads – the pre-emptive embrace of government regulation. Political scientist Gabriel Kolko provided documentation for his thesis that the original venture in federal-government regulation, the Interstate Commerce Commission Act of 1887, was sponsored by the railroads themselves as a means of cartelizing the industry and suppressing the troublesome competitive forces that had bankrupted one railroad after another by producing price wars and persistently low freight rates. The public uproar over differential rates for long hauls and short hauls gave both railroads and regulators the necessary excuse to claim that competition had failed and only regulation could provide “just and reasonable rates.” Not surprisingly, the regulatory solution was to impose fairness and equality by requiring railroads to raise the rates for long hauls to the level of short-haul rates, so that all shippers now paid equally high per-mile rates.

Vail was desperate to suppress competition from independent phone companies, but knew that he would then face the danger of lawsuits under the embryonic Sherman Antitrust Act, which contained a key section forbidding monopolization. The only kind of competition Vail approved of was “that kind which is rather ‘participation’ than ‘competition,’ and operates under agreement as to prices or territory.” That is, Vail explicitly endorsed cartelization over competition. Unfortunately, the Sherman Act also contained a section outlawing price collusion. Buying off the public was clearly not enough; Vail would have to stave off the federal government as well. So he sent AT&T lobbyists to Washington, where they successfully achieved passage of legislation placing interstate telephone and telegraph communications under the aegis of the ICC.

Vail feared competition, not government. He was confident that regulation could be molded and shaped to the benefit of the Bells. He knew that the general public and particularly his fellow businessmen would take a while to warm up to regulation. “Some corporations have as yet not quite got on to the new order of things,” he mused. By the time Vail died in 1920, that new order had largely been established thanks to the work of Vail’s contemporary, Samuel Insull.

Insull emigrated from England in 1881 to become Thomas Edison’s secretary. He rose rapidly to become Edison’s strategic planner and right-hand man. At Edison’s side, Insull saw firsthand the disruptive effects of innovation on markets when competition was allowed to function. Insull made a mental note not to let himself become the disruptee. With Edison’s blessing, Insull took the reins of Chicago Edison in 1892. His tenure gave him an education in the field of politics to complement the one Edison had given him in technology. In 1905, he merged Chicago Edison with Commonwealth Electric to create the nation’s leading municipal power monopoly.

Like Vail, Insull recognized the threat posed by marketplace competition. Like Vail, Insull saw government as an ally and a tool to suppress his competitors. Insull’s embrace of government was even warmer than Vail’s because he perceived its vital role to be placating and anesthetizing the public. As Olasky put it, “Insull argued that utility monopoly… could best be secured by the establishment of government commissions, which would present the appearance of popular control.”

The commission idea would be sold to the public as a democratic means of establishing fair utility rates. Sure, these rates might be lower than the highest rates utility owners could get on their own, but they would certainly be higher than those prevailing with competition. And the regulated rates would be stable, a sure thing, not the crap shoot offered by the competitive market. In a 1978 article in the prestigious Journal of Law and Economics, economic historian Gregg Jarrell documents that the first states to implement utility regulation saw rising prices and profits and falling utility output, while states that retained competitive utility markets had lower utility prices. Jarrell’s conclusion: “State regulation of electric utilities was primarily a pro-producer policy.”

Over the years, this trend continued, even though utility competition died off almost to the vanishing point. Yet it remained true that those few jurisdictions that allowed utility competition – usually phone, sometimes electric – benefitted from lower rates. This attracted virtually no public attention.

Insull realized that the popularity of competition was just as big an obstacle as its reality in the marketplace. So he slanted his public-relations efforts to heighten the public’s fear of socialism and promote utility regulation as the alternative to a government-owned, socialized power system. Insull foresaw that politicians and regulators would need to use the utility company as a whipping boy by pretending to discipline it severely and accusing it of cupidity and greed. This would allow government to assume the posture of a stern guardian of the public welfare and champion of the consumer – all the while catering to the utility’s welfare behind closed doors. Generations of economists became accustomed to seeing this charade performed at PUC hearings. Their cynicism was tempered by the fact that these same economists were earning handsome incomes as consultants to one or another of the interested parties at those hearings. Over the years, this iron quadrangle of interested parties – regulators, lawyers, economists and “consumer advocates” – became the staunchest and most reliable defender of the public-utility regulation process. Despite the fact that these people were in the best position to appreciate the endless waste and hypocrisy, their self-interest blinded them to it.

Insull enthusiastically adopted the promotional methods pioneered by the railroads and imitated by Theodore Vail. One of his third-party firms, the Illinois Committee on Public Utility Information, was led by Insull subordinate Bernard J. Mullaney. The Committee distributed 5 million pieces of pro-utility literature in the state in 1920 and 1921. Mullaney carefully cultivated the favor of editors by feeding them news and information of all kinds in order to earn a key quid pro quo – publication of his press releases. This favoritism went as far as providing the editors with free long-distance telephone service as an in-kind bribe. Not to be overlooked, of course, is that most traditional of all shady relationships in the newspaper business – buying ads in exchange for preferential treatment in the paper. Electric companies, like the Bells, were prodigious advertisers and took lavish advantage of it. In eventual hearings held by the Federal Trade Commission and the Federal Communications Commission, testimony and exhibits revealed that Bell executives had newspaper editors throughout the West and Midwest in their pockets.

Over the years, as public-utility regulation became a respected institution, the need for big-ticket PR support waned. But utilities never stopped cultivating political support. The Bell companies in particular bought legislators by the gross, rivaling teachers’ unions as the leading political force in statehouses across the nation. When the challenge of telecommunications deregulation loomed, the Bells were able to stave it off, delaying for U.S. consumers by a decade the benefits already enjoyed abroad.

Profit regulation left utilities with no profit motive to innovate or cut costs. This caused costs to inflate like a hot-air balloon. Sam Insull realized that he could make a healthy profit by guaranteeing his market, killing off his competition and writing his profit in stone through regulation. Then he could ratchet up real income by “gold-plating the rate base” – increasing salaries and other costs and forcing the ratepayers to pay for them. Ironically, he ended up going broke despite owning a big portfolio of utilities. He borrowed huge sums of money to buy them and expand their operations. When the Depression hit, he found that he couldn’t raise rates to service the debt he had run up. He was indicted, left the country, returned to win acquittal on criminal charges but died broke from a heart attack – just one more celebrated riches-to-rags Depression-era tale.

The lack of motivation made utilities a byword for inefficiency. Bell Labs invented the transistor, but AT&T was one of the last companies to use it because it still had vacuum tubes on hand and had no profit motivation to switch and no competitive motivation to serve its customers. An AT&T company made the first cell-phone call in 1946, but the technology withered on the vine for 40 years because the utility system had no profit motivation to deploy it. Touch-tone dialing was invented in 1941 but not rolled out until the 1970s. Bell Labs developed early high-speed computer modems but couldn’t test high-speed data transmission because regulators hadn’t approved tariffs (prices) for data transmission. The list goes on and on; in fact, the entire telecommunications revolution began by accident when a regulator became so fed up with AT&T’s inefficiency that he changed one regulation in the 1970s and allowed a company called MCI to compete with the Bells. (We owe Andy Kessler, longtime AT&T employee and current hedge-fund manager, for this litany of innovative ineptitude.)

What is Net Neutrality All About?

 

Today, the call for “net neutrality” by politicians like President Obama is a political pose, just as the call for public-utility regulation was a century ago. Robert Litan of the Brookings Institution has pointed out the irony that slapping a Title II common-carrier classification on broadband Internet providers would not even prevent them from practicing the paid prioritization that the President complained of in his speech! Indeed, for most of the 20th century, public utilities practiced price discrimination among different classes of buyers in order to redistribute income from business users to household users.

The Internet as we know it today is the result of an unimpeded succession of competitive innovations over the last three decades; i.e., the very “open and free Internet” that the New York Times claims President Obama will now bestow upon us. Net neutrality would bring all this to a screeching halt by imposing regulation on most of the Web and taxes on consumers. Today, the biggest chunk of the phone bill goes to a charge for “universal service,” a redistributive tax ostensibly intended to make sure everybody has phone service. Yet before the proliferation of cell phones, the percentage of the U.S. population owning televisions – which were unregulated and benefitted from no “universal service” tax – was several percentage points higher than the percentage owning and using telephones. In reality, the universal-service tax was used to perpetuate the regulatory process itself.

In summary, then, the balance sheet on public utilities shows that they were the product of a plot by would-be monopolists to stymie competition and enlist government and regulators as co-conspirators. The conspiracy stuck consumers with high prices, reduced output, mediocre service, high excise taxes and – worst of all – stagnant innovation for decade after decade. All this is balanced against the dubious benefit of stability – the sort of stability the U.S. economy has shown in the last five years.

A similar future awaits us if we treat the Internet’s imagined ills with the regulatory nostrum called net neutrality.

DRI-284 for week of 8-10-14: All Sides Go Off Half-Cocked in the Ferguson, MO Shooting

An Access Advertising EconBrief:

All Sides Go Off Half-Cocked in the Ferguson, MO Shooting

By now most of America must wonder secretly whether the door to race relations is marked “Abandon all hope, ye who enter here.” Blacks – mostly teenagers and young adults, except for those caught in the crossfire – are shot dead every day throughout the country by other blacks in private quarrels, drug deals gone bad and various attempted crimes. Murder is the leading cause of death for young black males in America. We are inured to this. But the comparatively rare case of a black youth killed by a white man causes all hell to break loose – purely on the basis of the racial identities of the principals.

The latest chilling proof of this racial theorem comes from Ferguson, MO, the St. Louis suburb where a policeman shot and killed an unarmed 18-year-old black man on Monday. The fact that the shooter is a policeman reinforces the need for careful investigation and unflinching analysis of the issues involved. The constant intrusion of racial identity is a mountainous obstacle to this process.

The Two Sides to the Story, As Originally Told

The shooting occurred on Saturday afternoon, August 9, 2014, in Ferguson, MO, where 14,000 of the 21,000 inhabitants are black and 50 of 53 assigned St. Louis County Police officers are white. The two sides of the story are summarized in an Associated Press story carrying the byline of Jim Suhr and carried on MSN News 08/13/2014. “Police have said the shooting happened after an [then-unnamed] officer encountered 18-year-old Michael Brown and another man on the street. They say one of the men pushed the officer into his squad car, then physically assaulted him in the vehicle and struggled with the officer over the officer’s weapon. At least one shot was fired inside the car. The struggle then spilled onto the street, where Brown was shot multiple times. In their initial news conference about the shooting, police didn’t specify whether Brown was the person who scuffled with the officer in the car and have refused to clarify their account.”

“Jackson said Wednesday that the officer involved sustained swelling facial injuries.”

“Dorian Johnson, who says he was with Brown when the shooting happened, has told a much different story. He has told media outlets that the officer ordered them out of the street, then tried to open his door so close to the men that it ‘ricocheted’ back, apparently upsetting the officer. Johnson says the officer grabbed his friend’s neck, then tried to pull him into the car before brandishing his weapon and firing. He says Brown started to run and the officer pursued him, firing multiple times. Johnson and another witness both say Brown was on the street with his hands raised when the officer fired at him repeatedly.”

The Reaction by Local Blacks: Protests and Violence

When a white citizen is shot by police under questionable circumstances – an occurrence that is happening with disturbing frequency – the incident is not ignored. But the consequent public alarm is subdued and contained within prescribed channels. Newspapers editorialize. Public figures express concern. Private citizens protest by writing or proclaiming their discontent.

The stylized reaction to a white-on-black incident like the one in Ferguson is quite different. Ever since the civil-rights era that began in the 1950s, these incidents are treated as presumptive civil-rights violations; that is, they are treated as crimes committed because the victim was black. Black “leaders” bemoan the continuing victim status of blacks, viewing the incident as more proof of same – the latest in an ongoing, presumably never-ending, saga of brutalization of blacks by whites. “Some civil-rights leaders have drawn comparisons between Brown’s death and that of 17-year-old Trayvon Martin.”

Rank-and-file blacks gather and march in protest, holding placards and chanting slogans tailored to the occasion. “Some protestors… raised their arms above their heads as they faced the police… The most popular chant has been ‘Hands up! Don’t shoot!'”

Most striking of all is the contrast struck by headlines like “Protests Turn Violent in St. Louis Suburb.” There is no non-black analogue to behavior like this: “Protests in the St. Louis suburb turned violent Wednesday night, with people lobbing Molotov cocktails at police, who responded with smoke bombs and tear gas to disperse the crowd.” This is a repetition of behavior begun in the 1960s, when massive riots set the urban ghettos of Harlem, Philadelphia and Detroit afire.

Joseph Epstein Weighs In

The critic and essayist Joseph Epstein belongs on the short list of the most trenchant thinkers and writers in the English language. His pellucid prose has illumined subjects ranging from American education to gossip to political correctness to Fred Astaire. The utter intractability of race in America is demonstrated irrefutably by the fact that the subject reduced Epstein to feeble pastiche.

In his Wall Street Journal op-ed “What’s Missing in Ferguson, MO.”(The Wall Street Journal, Wednesday, August 13, 2014), Epstein notes the stylized character of the episode: “…the inconsolable mother, the testimony of the dead teenager’s friends to his innocence, the aunts and cousins chiming in, the police chief’s promise of a thorough investigation… The same lawyer who represented the [Trayvon] Martin family, it was announced, is going to take this case.”

But according to Epstein, the big problem is that it isn’t stylized enough. “Missing… was the calming voice of a national civil-rights leader of the kind that was so impressive during the 1950s and ’60s. In those days there was Martin Luther King Jr…. Roy Wilkins… Whitney Young… Bayard Rustin…. – all solid, serious men, each impressive in different ways, who through dignified forbearance and strategic action, brought down a body of unequivocally immoral laws aimed at America’s black population.”

But they are long dead. “None has been replaced by men of anywhere near the same caliber. In their place today there is only Jesse Jackson and Al Sharpton…One of the small accomplishments of President Obama has been to keep both of these men from becoming associated with the White House.” Today, the overriding problem facing blacks is that “no black leader has come forth to set out a program for progress for the substantial part of the black population that has remained for generations in the slough of poverty, crime and despair.”

Wait just a minute here. What about President Obama? He is, after all, a black man himself. That was ostensibly the great, momentous breakthrough of his election – the elevation of a black man to the Presidency of the United States. This was supposed to break the racial logjam once and for all. If a black man occupying the Presidency couldn’t lead the black underclass to the Promised Land, who could?

No, according to Epstein, it turns out that “President Obama, as leader of all the people, is not well positioned for the job of leading the black population that finds itself mired in despond.” Oh. Why not? “Someone is needed who commands the respect of his or her people, and the admiration of that vast – I would argue preponderate [sic] – number of middle-class whites who understand that progress for blacks means progress for the entire country.”

To be sure, Epstein appreciates the surrealism of the status quo. “In Chicago, where I live, much of the murder and crime… is black-on-black, and cannot be chalked up to racism, except secondarily by blaming that old hobgoblin, ‘the system.’ People march with signs reading ‘Stop the Killing,’ but everyone knows that the marching and the signs and the sweet sentiments of local clergy aren’t likely to change anything. Better education… a longer school day… more and better jobs… get the guns off the street… the absence of [black] fathers – … the old dead analyses, the pretty panaceas, are paraded. Yet nothing new is up for discussion… when Bill Cosby, Thomas Sowell or Shelby Steele… have dared to speak up about the pathologies at work… these black figures are castigated.”

The Dead Hand of “Civil Rights Movement” Thinking

When no less an eminence than Joseph Epstein sinks under the waves of cliché and outmoded rhetoric, it is a sign of rhetorical emergency: we need to burn away the deadwood of habitual thinking.

Epstein is caught in a time warp, still living out the decline and fall of Jim Crow. But that system is long gone, along with the men who destroyed it and those who desperately sought to preserve it alike. The Kings and Youngs and Wilkinses and Rustins are gone, just as the Pattons and Rommels and Ridgways and MacArthurs and Montgomerys are gone. Leaders suit themselves to their times. Epstein is lamenting the fact that the generals of the last war are not around to fight this one.

Reflexively, Epstein hearkens back to the old days because they were days of triumph and progress. He is thinking about the Civil Rights Movement in exactly the same way that the political left thinks about World War II. What glorious days, when the federal government controlled every aspect of our lives and we had such a wonderful feeling of solidarity! Let’s recreate that feeling in peacetime! But those feelings were unique to wartime, when everybody subordinated their personal goals to the one common goal of winning the war. In peacetime, there is no such unitary goal because we all have our personal goals to fulfill. We may be willing to subordinate those goals temporarily to win a war but nobody wants to live that way perpetually. And the mechanisms of big government – unwieldy agencies, price and wage controls, tight security controls, etc. – may suffice to win a war against other big governments but cannot achieve prosperity and freedom in a normal peacetime environment.

In the days of Civil Rights, blacks were a collective, a clan, a tribe. This made practical, logistical sense because the Jim Crow laws treated blacks as a unit. It was a successful strategic move to close ranks in solidarity and choose leaders to speak for all. In effect, blacks were forming a political cartel to counter the political setbacks they had been dealt. That is to say, they were bargaining with government as a unit and consenting to be assigned rights as a collective (a “minority”) rather than as free individuals. In social science terms, they were what F. A. Hayek called a “social whole,” whose constituent individual parts were obliterated and amalgamated into the opaque unitary aggregate. This dangerous strategy has since come back to haunt them by obscuring the reality of black individualism.

Consider Epstein’s position. Indian tribes once sent their chief – one who earned respect as an elder, religious leader or military captain, what anthropologists called a “big man” – to Washington for meetings with the Great White Father. Now, Epstein wants to restore the Civil Rights days when black leaders analogously spoke out for their tribal flock. Traditionally, the fate of individuals in aboriginal societies is governed largely by the wishes of the “big man” or leader, not by their own independent actions. This would be unthinkable for (say) whites; when was the last time you heard a call for a George Washington, Henry Ford or Bill Gates to lead the white underclass out of its malaise?

In fact, this kind of thinking was already anachronistic in Epstein’s Golden Age, the heyday of Civil Rights. Many blacks recognized the trap they were headed towards, but took the path of least resistance because it seemed the shortest route to killing off Jim Crow. Now we can see the pitiful result of this sort of collective thinking.

An 18-year-old black male is killed by a police officer under highly suspicious circumstances. Is the focus on criminal justice, on the veracity of the police account, on the evidence of a crime? Is the inherent danger of a monopoly bureaucracy investigating itself and exercising military powers over its constituency highlighted? Not at all.

Instead, the same old racial demons are summoned from the closet using the same ritual incantations. Local blacks quickly turn a candlelight protest vigil into a violent riot. Uh oh – it looks like the natives are getting restless; too much firewater at the vigil, probably. Joseph Epstein bemoans the lack of a chieftain who can speak for them. No, wait – the Great Black Father in Washington has come forward to chastise the violent and exalt the meek and the humble. His lieutenant Nixon has sent a black chief to comfort his brothers. (On Thursday, Missouri Governor Jay Nixon sent Missouri Highway Patrol Captain Ron Johnson, a black man, heading a delegation of troopers to take over security duties in Ferguson.) The natives are mollified; the savage breast is soothed. “All the police did was look at us and shoot tear gas. Now we’re being treated with respect,” a native exults happily. “Now it’s up to us to ride that feeling,” another concludes. “The scene [after the Missouri Highway Patrol took over] was almost festive, with people celebrating and honking horns.” The black chief intones majestically: “We’re here to serve and protect… not to instill fear.” All is peaceful again in the village.

Is this the response Joseph Epstein was calling for? No, this is the phony-baloney, feel-good pretense that he decried, the same methods he recognized from his hometown of Chicago and now being deployed there by Obama confidant Rahm Emanuel. The restless natives got the attention they sought. Meanwhile, lost in the festive party atmosphere was the case of Michael Brown, which wasn’t nearly as important as the rioters’ egos that needed stroking.

But the Highway Patrol will go home and the St. Louis County Police will be back in charge and the Michael Brown case will have to be resolved. Some six days after the event, the police finally got around to revealing pertinent details of the case; namely, that Michael Brown was suspected of robbing a convenience store of $48.99 worth of boxed cigars earlier that day in a “strong-arm robbery.” Six-year veteran policeman Darren Wilson, now finally identified by authorities, was one of several officers dispatched to the scene.

Of course, the blacks in Ferguson, MO, and throughout America aren’t Indian tribesmen or rebellious children – they are nominally free American individuals with natural rights protected by the U.S. Constitution. But if they expect to be treated with respect 365 days a year they will have to stop acting like juvenile delinquents, stop delegating the protection of their rights to self-serving politicians and hustlers and start asserting the individuality they possess.

The irony of this particular case is that it affords them just that opportunity. But it demands that they shed what Epstein calls “the too-comfortable robes of victimhood.” And they will have to step out from behind the shield of the collective. The Michael Brown case is not important because “blacks” are affronted. It is important because Michael Brown was an individual American just like the whites who get shot down by police every year. If Dorian Johnson is telling the truth, Brown’s individual rights were violated just as surely whether he was black, white, yellow or chartreuse.

Policing in America Today – and the Michael Brown Case

For at least two decades, policing in America has followed two clearly discernible trends. The first is the deployment of paramilitary equipment, techniques and thinking. The second is a philosophy of placing the police officer’s well-being above all other considerations. Both trends place the welfare of police bureaucrats, employees and officers above that of their constituents in the public.

To an economist, this is a striking datum. Owners or managers of competitive firms cannot place their welfare above that of their customers; if they do, the firm will go bankrupt and cease to exist, depriving the owners of an asset (wealth) and real income and the managers of a job and real income. So what allows a police force (more specifically, the Chief of Police and his lieutenants) to do what a competitive firm cannot do? Answer: The police have a monopoly on the use of force to enforce the law. In the words of a well-known lawyer, the response to the generic question “Can the police do that?” is always “Sure they can. They have guns.”

All bureaucracies tend to be inefficient, even corrupt. But corporate bureaucracies must respond to the public and they must earn profits. So they cannot afford to ignore consumer demand. The only factor to which government bureaucracies respond is variations in their budget, which are functions of political rather than economic variables.

All of these truths are on display in this case. The police have chosen to release only a limited, self-serving account of the incident. Their version of the facts is dubious to say the least, although it could conceivably be correct. Their suppression of rioting protestors employed large, tank-like vehicles carrying officers armed with military gear, weapons and tear gas. Dorian Johnson’s account of the incident is redolent of the modern police philosophy of “self-protection first”: at the first hint of trouble, the officer’s focus is on downing anybody who might conceivably offer resistance, armed or not, dangerous or not.

What does all this have to do with the racial identities of the principals? Absolutely nothing. Oh, it’s barely possible that officer Wilson might have harbored some racial animosity toward Brown or blacks in general. But it’s really quite irrelevant because white-on-black, white-on-white and black-on-white police incidents have cropped up from sea to shining sea in recent years. Indeed, this is an issue that should unite the races rather than dividing them since police are not reluctant to dispatch whites (or Hispanics or Asians, for that matter). While some observers claim the apparent increase in frequency of these cases is only because of the prevalence of cell phones and video cameras, this is also irrelevant; the fact that we may be noticing more abuses now would not be a reason to decry the new technology. As always, the pertinent question is whether or not an abuse of power took place. And those interested in the answer to that question, which should be every American, will have to contend with the unpromising prospect of a police department – a monopoly bureaucracy – investigating itself.

That is the very real national problem festering in Ferguson, MO – not a civil-rights problem, but a civil-wrongs problem.

The Battle Lines

Traditionally, ever since the left-wing counterculture demonized police as “pigs” in the 1960s, the right wing has reflexively supported the police and opposed those who criticized them. Indeed, some of this opposition to the police has been politically tendentious. But the right wing’s general stance is wrongheaded for two powerful reasons.

First, support for law enforcement as such has become progressively detached from support for the Rule of Law. The number and scope of laws have become so large and excessive that support for the Rule of Law would actually require opposition to the existing body of statutory law.

Second, the monopoly status of the police has enabled them to become so abusive that they now threaten everybody, not merely the politically powerless. Considering the general decrease in crime rates driven by demographic factors, it is an open question whether most people are more threatened by criminals or by abusive police.

Even a bastion of neo-conservatism like The Wall Street Journal is becoming restive at the rampant exercise of monopoly power by police. Consider these excerpts from the unsigned editorial, “The Ferguson Exception,” on Friday, August 15, 2014: “One irony of Ferguson is that liberals have discovered an exercise of government power that they don’t support. Plenary police powers are vast, and law enforcement holds a public trust to use them in proportion to the threats. The Ferguson police must prevent rioting and looting and protect their own safety, though it is reasonable to wonder when law enforcement became a paramilitary operation [emphasis added]. The sniper rifles, black armored convoys and waves of tear gas deployed across Ferguson neighborhoods are jarring in a free society…Police contracts also build in bureaucratic privileges that would never be extended to other suspects. The Ferguson police department has refused to… supply basic information about the circumstances and status of the investigation [that], if it hasn’t been botched already, might help cool passions… how is anyone supposed to draw a conclusion one way or the other without any knowledge of what happened that afternoon?”

The Tunnel… and the Crack of Light at the End

The pair of editorial reactions in The Wall Street Journal typifies the alternatives open to those caught in the toils of America’s racial strife. We can play the same loop over and over again in such august company as Joseph Epstein. Or we can dunk ourselves in ice water, wake up and smell the coffee – and find ourselves rubbing shoulders with the Journal editors.

DRI-303 for week of 5-11-14: The Real ‘Stress Test’ is Still to Come

An Access Advertising EconBrief:

The Real ‘Stress Test’ is Still to Come

Timothy Geithner, former Treasury Secretary and former head of the New York Federal Reserve, is in the news. Like virtually every former policymaker, he has written a book about his experiences. He is currently flogging that book on the publicity circuit. Unlike many other such books, Geithner’s holds uncommon interest – not because he is a skillful writer or a keen analyst. Just the opposite.

Geithner is a man desperate to rationalize his past actions. Those actions have put us on a path to disaster. When that disaster strikes, we will be too stunned and too busy to think clearly about the past. Now is the time to view history coolly and rationally. We must see Geithner’s statements in their true light.

Power and the Need for Self-Justification

In his Wall Street Journal book review of Geithner’s book, Stress Test, James Freeman states that “Geithner makes a persuasive case that he is the man most responsible for the federal bailouts of 2008.” Mr. Freeman finds this claim surprising, but as we will see, it is integral to what Geithner sees as his legacy.

This issue of policy authorship is important to historians, whose job is getting the details right. But it is trivial to us. We want the policies to be right, regardless of their source. That is why we should be worried by Geithner’s need to secure his place in history.

Geithner and his colleagues, Federal Reserve Chairman Ben Bernanke and then-Treasury Secretary Henry Paulson, possessed powers whose exercise would have been unthinkable not that long ago. Nobody seems to have considered how the possession of such vast powers would distort their exercise.

Prior to assuming the Federal Reserve Chairmanship, Ben Bernanke wrote his dissertation on the causes of the Great Depression. Later, his academic reputation was built on his assessment of mistakes committed by Fed Board members during the 1920s and ’30s. When he joined the Board and became Chairman, he vowed not to repeat those mistakes. Thus, we should not have been surprised when he treated a financial crisis on his watch as though it were another Great Depression in the making. Bernanke was the living embodiment of the old saying, “Give a small boy a hammer and he will find that everything he encounters needs pounding.” His academic training had given him a hammer and he proceeded to use it to pound the first crisis he met.

In an interview with “Bloomberg News,” Geithner used the phrase “Great Depression” three times. First, he likened the financial crisis of 2008 to the Great Depression, calling it “classic” and comparing it to the bank runs of the Great Depression. Later, he claimed that we had avoided another Great Depression by following his policies. For Geithner, the Great Depression isn’t so much an actual historical episode or an analytical benchmark as it is an emotional button he presses whenever he needs justification for his actions.

When we give vast power to individuals, we virtually guarantee that they will view events through the lens of their own ego rather than objectively. Bernanke was bound to view his decisions in this light: either apply principles he himself had espoused and built his career upon or run the risk of going down in history as exactly the kind of man he had made his name criticizing – the man who stood by and allowed the Great Depression to happen. Faced with those alternatives, policy activism was the inevitable choice.

Geithner had tremendous power in his advisory capacity as President of the New York Federal Reserve. His choices were: use it or not. Not using it ran the risk of being Hooverized by future generations; that is, being labeled as unwitting, uncaring or worse. Using it at least showed that he cared, even if he failed. The only people who would criticize him would be some far-out, laissez-faire types. Thus, he had everything to gain and little to lose by advising policy activism.

Now, after the fact, the incentive to seek the truth is even weaker than it is in the moment. Now Bernanke, Geithner et al are stuck with their decisions. They cannot change their actions, but they can change anything else – their motivations, those of others, even the truths of history and analysis. If they can achieve by lying or dissembling what they could not achieve with their actions at the time, then dishonesty is a small price to pay. Being honest with yourself can be difficult under the best of circumstances. When somebody is on the borderline between being considered the nation’s savior and its scourge, it is well-nigh impossible.

And a person who begins by lying to himself cannot end up being truthful with the world. No, memoirs like Stress Test are not the place to look for a documentary account of the financial crisis told by an insider. The pressures of power do not shape men like Paulson, Bernanke and Geithner into diamonds, but rather into gargoyles.

We cannot take their words at face value. We must put them under the fluoroscope.

“We Were Three Days Away From Americans Not Being Able to Get Money from ATMs”

Not only are Geithner’s actions under scrutiny, but his timing is also criticized. Many people, perhaps most prominently David Stockman, have insisted that the actual situation faced by the U.S. economy wasn’t nearly dire enough to justify the drastic actions urged by Geithner, et al.

Geithner’s stock reply, found in his book and repeated in numerous interviews, is that the emergency facing the nation left no time for observance of legal niceties or economic precedent. He resuscitates the old quote: “We were three days away from Americans not being able to get money from their ATMs.”

This is an effective reply because its psychological shock value tends to stun the listener into submission. But meek silence is the wrong posture with which to receive a response like this from a self-interested party like Paulson, Bernanke or Geithner. Instead, it demands minute examination.

First, let us ask ourselves: is this a figure of speech or literal truth? That is, what precise significance attaches to the words “three days”?

Recall that Bernanke and Paulson have told us that they realized the magnitude of the emergency facing the country and determined that they must (a) violate protocol by going directly to Congress; and (b) act in secret to prevent public panic. Remember also that Paulson told Congress that if it did not pass bailout legislation by the weekend, Armageddon would ensue. And remember also that, true to form, Congress did not act within the deadline specified. It waited ten days before passing the bailout deal. And the prophesied disaster did not unfold.

In other words, Paulson, Bernanke, et al were exaggerating for effect. How much they were exaggerating can be debated.

That leads to the next logical point. What about the ATM reference itself? Was it specific, meaningful? Or was it just hooey? To paraphrase the line used in courtroom interrogation by litigators (“Are you lying now or were you lying then?”), is Geithner exaggerating now just as Paulson and Bernanke exaggerated then?

Well, Geithner is apparently serious in using this reference. In the same interviews, Geithner calls the financial crisis “a classic financial panic, similar to the bank runs in the Great Depression.” In the 1930s, U.S. banks faced “runs” by depositors who withdrew deposits in cash when they questioned the solvency of banks. Under fractional-reserve banking, banks then (as now) kept only a tiny ratio of deposit liabilities on hand in the form of cash and liquid assets. The runs produced a rash of bank failures, leading to widespread closures and the eventual “bank holiday” proclaimed by newly elected President Franklin Delano Roosevelt. So Geithner’s borrowing of the ATM comment as an index of our distress seems to be clearly intended to suggest an impending crisis of bank liquidity.

There is an obvious problem with this interpretation, the problem being that it is obvious nonsense. Virtually every commentator and reviewer has treated Geithner’s backwards predictions of a “Great Depression” with some throat-clearing version of “well, as we all know, we can’t know what would have happened, we’ll never know, we can’t replay history, history only happens once,” and so forth. But that clearly doesn’t apply to the ATM case. We know – as incontrovertibly as we can know anything in life – what would have happened had bank runs and bank illiquidity a la 1930s so much as threatened in 2008.

Somebody would have stepped to a computer at the Federal Reserve and started creating money. We know this because that’s exactly what did happen in 2010 when the Fed initiated its “Quantitative Easing” program of monetary increase. The overwhelming bulk of the QE money found its way to bank reserve accounts at the Fed where it has been quietly drawing interest ever since. We also know that the usual formalities and intermediaries involving money creation by the Fed could and would have been dispensed with in that sort of emergency. As Fed Chairman, Ben Bernanke was known as “Helicopter Ben” because he was fond of quoting Milton Friedman’s remark that the Fed could get money in public hands by dropping it from helicopters in an emergency, if necessary. Bernanke would not have stood on ceremony in the case of a general bank run; he would have funneled money directly to banks by the speediest means.

In other words, the ATM comment was and is the purest hooey. It has no substantive significance or meaning. It was made, and revived by Geithner, for shock effect only. This is very revealing. It implies a man desperate to achieve his effect, which means his words should be received with utmost caution.

“The Paradox of Financial Crises”

Geithner’s flagship appearance on the promotion circuit was his op-ed in The Wall Street Journal (5/13/2014), “The Paradox of Financial Crises.” The thesis of this op-ed – the “paradox” of the title – is that “the more aggressive the government is in designing a rescue plan, the easier it is to force more restructuring in the financial sector, and the better the chances of leaving the surviving system stronger and less dependent on the taxpayer.” Alas, Geithner complains, “Americans don’t give their presidents much in the way of emergency authority to fight” financial crises. As evidence of the need for this emergency authority, Geithner cites the loss of 16% of U.S. household net worth in 2008, “several times as large as the losses at the start of the Great Depression.”

No doubt eyebrows were raised throughout the U.S. when Geithner bemoaned the lack of emergency authority for a President who has appointed dozens of economic and regulatory “czars,” single-handedly suspended execution of legislation and generally behaved high-handedly. Geithner’s thesis – a generous description of what might reasonably be called a desperate attempt at self-justification – apparently consists of three components: (1) the presumption that financial crises are uniquely powerful and destructive; (2) the claim that, nevertheless, a financial crisis can be counteracted by sufficiently forceful action, taken with sufficient dispatch; and (3) the further claim that he knows what actions to take.

The power of financial crises is a trendy idea given currency by a popular scholarly work by two economists named Rogoff and Reinhart, who surveyed recessions featuring financial panics going back several centuries and ostensibly discovered that their recoveries tended to be slow. How much merit their ideas have is really irrelevant to Geithner’s thesis because Geithner’s interest in financial crises is entirely opportunistic. It began in 2008 with Geithner’s improvisations when faced with the impending failure of Bear Stearns, Lehman Brothers, et al. It perseveres only because Geithner’s legacy is now tied to the success of those machinations – which, unlikely as it might have seemed six years ago, is still in dispute.

Geithner’s theory of financial crises is not the Rogoff/Reinhart theory. It is the Geithner theory, which is: financial crises are uniquely powerful because Geithner needs them to be uniquely powerful in order to justify his unprecedented recommendations for unilateral executive actions. In his book and interviews, Geithner peddles various vague, vacuous generalities about financial crises. In order for these to make sense, they must be based on historical observation and/or statistical regularities. But they cannot jibe with the sentiments expressed above in the Journal. Geithner claims to be enunciating a general theory of financial crisis and rescue. But he is really telling a story of what he did to this particular financial system in the particular financial crisis of 2008.

And no wonder, since the financial system existing in the U.S. in 2008 was and still is like no financial system that existed previously. Instead of “banks” as we previously knew them, the failing financial institutions in 2008 were diversified financial institutions – nominally investment banks, although that activity had by then assumed a minor part of their work – some of whose liabilities would once have been called “near monies.” Meanwhile, the true banks were also diversified into securities and investment banking, and the larger ones controlled the overwhelming bulk of deposit liabilities in the U.S. This historically unprecedented configuration accounted for the determination of Paulson, Bernanke, and Geithner to bail them out at all costs. But they weren’t drawing upon a general theory of crises, because no previous society ever had a financial structure like ours.

Geithner stresses the need to “force more restructuring in the financial sector,” as though every financial crisis were caused by corporate elephantiasis and cured by astute government pruning of financial firms. This is not only historically wrong but logically deficient, since past government pruning couldn’t have been very astute if crises kept recurring. Indeed, that is the obvious shortcoming of the second component. There are no precedents – none, zero, nada – for the idea that government policy can either forestall or cure recessions, whether financial or otherwise. This is not for want of trying. If there is one thing governments love to do, it is spend money. If there is another thing governments love to do, it is throw their weight around. Neither has solved the problem of recession so far.

What leads us to believe that Timothy Geithner was and is well qualified to pronounce on the subject of financial crises? Only one thing – his claims that “we did do the essential thing, which was to prevent another Great Depression, with its decade of shantytowns and bread lines. We put out the financial fire…because we wanted to prevent mass unemployment.”

Incredible as it seems now, Timothy Geithner had even fewer economic credentials for his post as President of the New York Federal Reserve than Ben Bernanke had for his as Chairman of the Federal Reserve Board of Governors. Geithner had only one economics course as a Dartmouth undergraduate (he found it “dreary”). His master’s degree at Johns Hopkins was split between international economics and Far Eastern studies. (He speaks Japanese, among other foreign languages.) He put in a three-year stint as a consultant with Henry Kissinger’s consulting firm before graduating to the Treasury, where he spent 13 years before moving to the International Monetary Fund, then becoming President of the New York Fed at age 42. As Freeman observed in his book review, Geithner “never worked in finance or in any type of business” save Kissinger’s consulting firm.

This isn’t exactly a resume of recommendation for a man taking the tiller during a financial typhoon. Maybe it explains what Freeman called Geithner’s “difficulty in understanding the health of large financial firms.”

When asked by interviewers whether he has any regrets about his tenure, Geithner says he regrets not foreseeing the crisis in time to act sooner. This certainly contradicts his theory of crises and his claim of special knowledge – if he was the man with a plan and the man of the moment, why did he fail to foresee the crisis and have to go begging for emergency authorization for Presidential action at the 11th hour? Why should we now eagerly devour the words of a man who claims responsibility for saving the nation while simultaneously admitting that he “didn’t see the crisis coming and didn’t grasp the severity of the problems when it appeared”? He now boasts a special understanding of financial crises, but “didn’t require the banks he was overseeing to raise more capital” at the time of the crisis. In fact, as Freeman discloses, the minutes of the Federal Reserve show that Geithner denied that the banking system in general was undercapitalized even while other Fed governors were proposing that banks meet a capital call.

Geithner offers no particular reason why we should believe anything he says and ample reasons for doubt.

“The Government and the Central Bank Have to Step In and Take Risks”

Geithner’s book and publicity tour are a public-relations exercise designed to change his image. Ironically, this involves a tradeoff. He had image problems with both the right wing and the left wing, so gains on one side stand to lose him support on the other. The Wall Street Journal piece shows that he wants to burnish his left profile. He closes by lamenting that “we were not able to do all that was important or desirable. …Long-term unemployment remains alarmingly high. There are very high levels of poverty and appalling inequality, not just in income and wealth, but in the opportunities Americans have for a quality education or economic mobility.” Having spent the bulk of the op-ed apologizing for not allowing undeserving Wall Street bankers to go broke, he now nods frantically to every left-wing preoccupation. None of this has anything to do with a financial crisis or emergency authorizations or stress tests, of course – it is just Geithner stroking his left-wing critics.

The real sign that Geithner’s allegiance is with the left is his renunciation of the concept of “moral hazard.” Oh, he gives lip service to the fact that when the government bails out business and subsidizes failure, this will encourage subsequent businessmen to take excessive risks on a “heads I win, tails the government bails me out” expectation. But he savagely criticizes the moral hazard approach as “Old Testament” thinking. (The fact that “Old Testament” is now a pejorative is significant in itself; one wonders what significance “New Testament” would have.) “What one has to do in a panic is the opposite of what seems fair and just. In a financial crisis, the natural instinct is to let creditors suffer losses, let firms fail, and protect taxpayers from any risk of loss. But in a financial panic, a strategy based on those instincts will lead to depression-level unemployment. Instead, the government and the central bank have to step in and take risks on a scale that the private sector can’t and won’t… reduce the incentive for investors, lenders and depositors to run…raise the confidence of businesses and individuals… breaking a vicious cycle in which the fear of a financial-system collapse and a deep recession feed on each other and become self-fulfilling.”

This is surely the clearest sign that Geithner is engaging in ex post rationalization and improvisation. For centuries, economists have debated the question of whether recessions are real or monetary in origin and substance. Now Geithner emerges with the secret: they are psychological. Keynes, it seems, was the second-most momentous thinker of the 1930s, behind Sigmund Freud. All we have to do is overcome our “natural instinct” and rid ourselves of those awful “Old Testament” morals and bail out the right people – creditors – instead of the wrong people – taxpayers.

Once again, commentators have glossed over the most striking contradictions in this tale. For five years, we have listened ad nauseam to scathing denunciations of bankers, real-estate brokers, developers, investment bankers, house flippers and plain old home buyers who went wild and crazy, taking risks right and left with reckless abandon. But now Geithner is telling us that the problem is that “the private sector can’t and won’t …take risks on a scale” sufficient to save us from depression! So government and the central bank (!) must gird their loins, step in and do the job.

But this is a tale left unfinished.  Geithner says plainly that his actions saved us from a Great Depression. He also says that salvation occurred because government and the Fed assumed risks on a massive scale. What happened to those risks? Did they vanish somewhere in a puff of smoke or cloud of dust? If not, they must still be borne. And if the risks are still active, that means that we have not, after all, been saved from the Great Depression; it has merely been postponed.

It is not too hard to figure out what Geithner is saying between the lines. He wants to justify massive Federal Reserve purchases of toxic bank assets and the greatest splurge of money creation in U.S. history – without having to mention that these put us all on a hook where we remain to this day.

In this sense, Timothy Geithner’s book was well titled. Unfortunately, he omitted to mention that the most stressful test is yet to come.

DRI-265 for week of 2-23-14: False Confession Under Torture: The So-Called Re-Evaluation of the Minimum Wage

An Access Advertising EconBrief:

False Confession Under Torture: The So-Called Re-Evaluation of the Minimum Wage

For many years, the public pictured an economist as a vacillator. That image dated back to President Harry Truman’s quoted wish for a “one-armed economist,” unable to hedge every utterance with “on the one hand…on the other hand.”

Surveys of economists belied this perception. The profession has remained predominantly left-wing in political orientation, but its support for the fundamental logic of markets has been strong. Economists have backed free international trade overwhelmingly. They have opposed rent control – which socialist economist Assar Lindbeck deemed the second-best way to destroy a city, ranking behind only bombing. And economists have denounced the minimum wage with only slightly less force.

Now, for the first time, this united front has begun to break up. Recently a gaggle of some 600 economists, including seven Nobel Laureates, has spoken up in favor of a 40% increase in the minimum wage. The minimum wage has always retained public support. But what could possibly account for this seeming about-face by the economics profession?

The CBO Study

This week, the Congressional Budget Office (CBO) released a study that was hailed by both proponents and opponents of the minimum wage. The CBO study tried to estimate the effects of raising the current minimum of $7.25 per hour to $9 and $10.10, respectively. It provided an interval estimate of the job loss resulting from President Obama’s State of the Union suggestion of a $10.10 minimum wage. The interval stretched from roughly zero to one million. It took the midpoint of this interval – 500,000 jobs – as “the” estimate of job loss because… because…well, because 500,000 is halfway between zero and 1,000,000, that’s why. Averages seem to have a mystical attraction for statisticians as well as for the general public.

Economists looking for signs of orthodox economic logic in the CBO study could find them. “Some jobs for low-wage workers would probably be eliminated, the income of most workers who became jobless would fall substantially, and the share of low-wage workers who were employed would probably fall slightly.” The minimum wage is a poorly-targeted means of increasing the incomes of the poor because “many low-income workers are not members of low-income families.” And when an employer chooses which low-wage workers to retain and which to cut loose after a minimum-wage hike, he will likely retain the upper-class employee with good education and social skills and lay off the first-time entrant into the labor force who is poor in income, wealth and human capital. These are traditional sentiments.

On the other hand, the Obama administration’s hired gun at the Council of Economic Advisers (CEA), Chairman Jason Furman, looked inside the glass surrounding the minimum wage and found it half-full. He characterized the CBO’s job-loss conclusion as a “0.3% decrease in employment” that “could be essentially zero.” Furman cited the CBO estimate that 16.5 million workers would receive an increase in income as a result of the minimum-wage increase. Net benefits to those whose incomes currently fall below the so-called poverty line are estimated at $5 billion. The overall effect on real income – what economists would call the general equilibrium result of the change – is estimated to be a $2 billion increase in real income.

The petitioning economists, the CBO and the CEA clearly are all not viewing the minimum wage through the traditional textbook prism. What caused this new outlook?

The “New Learning” and the Old-Time Religion on the Minimum Wage

The impetus to this eye-opening change has ostensibly been new research. Bloomberg Businessweek devoted a lead article to the supposed re-evaluation of the minimum wage. Author Peter Coy declares that “the argument that a wage floor kills jobs has been weakened by careful research over the past 20 years.” Not surprisingly, Coy locates the watershed event as the Card-Krueger comparative study of fast-food restaurants in New Jersey and Pennsylvania in 1994. This study not only made names for its authors, it began the campaign to make the minimum wage respectable in academic economic circles.

“The Card-Krueger study touched off an econometric arms race as labor economists on opposite sides of the argument topped one another with increasingly sophisticated analyses,” Coy relates. “The net result has been to soften the economics profession’s traditional skepticism about minimum wages.” If true, this would be a sign of softening brains, not skepticism. The arguments advanced by the re-evaluation of the minimum wage have been around for decades. Peter Coy is saying that, somehow, new studies done in the last 20 years have produced different results than those done for the previous fifty years, and those different results justify a turnabout by the economics profession.

That stance is, quite simply, hooey. Traditional economic opposition to the minimum wage was never based on empirical research. It was based on the economic logic of choice in markets, which argues unequivocally against the minimum wage. Setting a wage above the market-determined wage will create a surplus of low-skilled labor; i.e., unemployment. Thus, any gains accruing to the workers who retain their jobs will come at the expense of workers who lose their jobs. The public supports the minimum wage on the misapprehension that the gains come at the expense of employers. This is true only transitorily, during the period in which some firms go out of business, prices rise and workers are laid off. During this short-run transition period, the gains of still-employed workers come at the expense of business owners and laid-off workers. But once the adjustments occur, the business owners who survive the transition are once again earning a “normal” (competitive) rate of profit, as they were before the minimum wage went up. Now, and indefinitely going forward, the gains of still-employed workers come at the expense of laid-off workers and consumers who pay higher prices for the smaller supply of goods and services produced by low-skilled workers.
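The textbook logic of a price floor can be made concrete with a deliberately simple numerical sketch. All the numbers below are hypothetical, chosen only to display the mechanism; they are not estimates of the actual low-skilled labor market.

```python
# A hypothetical linear market for low-skilled labor (illustrative numbers only).

def demand(wage):
    """Millions of workers employers wish to hire at a given hourly wage."""
    return 30 - 2 * wage

def supply(wage):
    """Millions of workers seeking jobs at a given hourly wage."""
    return -6 + 3 * wage

# Market-clearing wage: 30 - 2w = -6 + 3w  ->  w = 7.2, with about 15.6 million
# employed and no involuntary unemployment at that wage.
market_wage = 7.2

# Now impose a wage floor above the market wage, e.g. $10.10:
floor = 10.10
employed = demand(floor)            # roughly 9.8 million jobs remain
job_seekers = supply(floor)         # roughly 24.3 million now want those jobs
surplus = job_seekers - employed    # roughly 14.5 million: the surplus (unemployment)
```

The point of the sketch is structural, not quantitative: any floor set above the market-clearing wage simultaneously shrinks the number of jobs and swells the number of job seekers, and the gap between the two is the surplus the text describes.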

The still-employed workers are by no means all “poor,” despite the fact that they earn the minimum wage. Some are teenagers in middle- or upper-class households, whose good educations and social skills preserved their jobs after the minimum-wage hike. Some are older workers whose superior discipline and work skills made them irreplaceable. The workers who rate to lose their jobs are the poorest and least able to cope – namely, first-time job holders and those with the fewest cognitive and social skills. The minimum wage transfers income from the poor to the non-poor. What a victory for social justice! That is why even left-wing economists like Alan Blinder formerly pooh-poohed the minimum wage as a means of helping the poor. (While he was Chairman of the CEA under President Clinton, Blinder was embarrassed when the arguments against the minimum wage in his economics textbook were juxtaposed alongside the administration’s support of a minimum-wage increase.)

This does not complete the roster of the minimum wage’s defects. Government price-setting has mirror-image effects on both above-market prices and below-market prices. By creating a surplus of low-skilled labor, the minimum wage makes it costless for employers to discriminate against a class of workers they find objectionable – black, female, politically or theologically incorrect, etc. Black-market employment of illegal workers – immigrants or off-the-books employees – can now gain a foothold. Business owners are encouraged to substitute machines for workers and have done so throughout the history of the minimum wage. In cases such as elevator operators, this has caused whole categories of workers to vanish. This expanded range of drawbacks somehow never finds its way into popular discussions of the minimum wage, which are invariably confined to the effects on employment and income distribution.

“If there are negative effects on total employment, the most recent studies show, they appear to be small,” according to Bloomberg Businessweek. The trouble is that the focus of the minimum wage is not properly on total employment. The minimum wage itself applies only to the market for low-skilled labor, comprising roughly 20 million Americans. There are certainly effects on other labor and product markets. But it is difficult enough to estimate the quantitative effect of the minimum wage on the one market directly affected, let alone to gauge the secondary impact on the other markets comprising the remaining 300 million people. The Obama administration, the vocal economists, Bloomberg Businessweek and the political Left are ostensibly concerned with the poor. Why, then, do they insist on couching employment effects only in total terms?

It is clear that the same reasons why economists have traditionally chosen not to confuse the issue by dragging in total employment are also the reasons why economists now choose precisely to do so. They want to confuse the issue, to disguise the full magnitude of the adverse effects on low-skilled workers by hiding them inside the much smaller percentage effect on total employment. That is what allows CEA Chairman Jason Furman to brag that the “CBO’s central estimate…leads to a 0.3% decrease in employment… [that] could be essentially zero.” 500,000 is not 0.3% of 20 million (that would be 60,000) but rather 0.3% of the larger total work force of around 170 million. 0.3% sounds like such a small number. That’s almost zero, isn’t it? Surely that isn’t such a high price to pay for paying people what they’re worth – or what a bunch of economists think they’re worth, anyway.
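The sleight of hand is easy to verify with the round numbers quoted above (500,000 lost jobs, a 20-million-worker low-skilled market, a total work force of about 170 million):

```python
# Checking the percentage claims in the text with its own figures.
job_loss = 500_000
low_skilled_market = 20_000_000     # workers directly covered by the minimum wage
total_work_force = 170_000_000      # the base the quoted 0.3% actually uses

share_of_low_skilled = job_loss / low_skilled_market * 100   # 2.5 percent
share_of_total = job_loss / total_work_force * 100           # about 0.29 percent
hypothetical_03pct = 0.003 * low_skilled_market              # 60,000: what 0.3% of
                                                             # 20 million would be
```

Measured against the market actually affected, the job loss is 2.5%, more than eight times the headline 0.3%; the smaller figure is produced purely by choosing the larger denominator.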

But we digress. Just what is it that causes those “apparently small” effects on total employment, anyway? “Higher wages reduce turnover by increasing job satisfaction, so at any given moment there are fewer unfilled openings. Within reasonable ranges of a minimum wage, the churn-reducing effect seems to offset whatever staff reductions occur because of higher labor costs. Also, some businesses manage to pass along the costs to customers without harming sales.”

This is mostly warmed-over sociology, imported by economists for cosmetic purposes. American industry is pockmarked with industries plagued by high turnover, such as trucking. If higher wages were a panacea for this problem, it would have been solved long since. Today, we have a minimum wage. We also have a gigantic mismatch of unfilled jobs and discouraged workers. The shibboleth of businesses “passing along” costs to consumers with impunity was a cherished figment imagined in books by John Kenneth Galbraith in the 1950s and 60s, but neither Galbraith nor today’s economists can explain what hypnotic power businesses exert over consumers to accomplish this feat.

The magic word never mentioned by Peter Coy or the 600 economists or Jason Furman is productivity. Competitive markets enforce a strict link between market wages and productivity – specifically, between the wage and the discounted marginal value product of the marginal worker’s labor. Once that link is severed, the tether to economic logic has been cut and the discussion drifts along in never-never land. The political Left maunders on about the “dignity of human labor” and “a living wage” and “the worth of a human being” – nebulous concepts that have no objective meaning but allow the user to attach their own without fear of being proven wrong.

Bloomberg Businessweek‘s cover features a young baggage handler holding a sign identifying his job and duties, with a caption reading “How Much Is He Worth?” Inside the magazine, a page is taken up with workers posing for pictures showing their jobs and their own estimation of their “worth.” These emotive exercises may or may not sell magazines, but they prove and solve nothing. Asking a low-skilled worker to evaluate their own worth is like asking a cancer victim what caused their disease. Broadcast journalists do it all the time, but if that were really valuable, we would have cured cancer long ago. If a low-skilled worker were an expert on valuing labor, he or she would qualify as an entrepreneur – and would be set up to make some real money.

A Fine-Tuned Minimum Wage

Into the valley of brain death rode the 600 economists who supported a minimum wage of $10.10 per hour. Their ammunition consisted of fine-tuning based on econometrics. Let us hear from Paul Osterman, labor economist of MIT. “To jump from $7.25 to $15 would be a long haul. That would in my view be a shock to the system.” Mr. Osterman, exercising his finely-honed powers of insight denied to the rabble, is able to peer into the econometric mists and discern that $10.10 would be …somehow… just right – barely felt by 320 million people generating $16 trillion in goods and services, but $15 – no, that would shock the system. In other words, that first 40% increase would be hardly a tickle, but the subsequent 48% would be a bridge too far.

In any other context, it would be quite a surprise to the economics profession to discover that the study of econometrics had advanced this far. (The phrase “science of econometrics” was avoided advisedly.) For decades, graduate students in economics were taught a form of logical positivism originally outlined by John Neville Keynes (father of John Maynard Keynes) and developed by Milton Friedman. Economic theory was advanced by developing hypotheses couched in the form of conditional predictions. These were then tested in order to evaluate their worth. The tests ranged from simple observation to more complex tests of statistical inference. Hypotheses meeting the tests were retained; those failing to do so were discarded.

Simple and attractive though that may sound, this philosophy has failed utterly in practice. The tests have failed to convince anybody; it is axiomatic that no economic theory was ever accepted or rejected on the basis of econometric evidence. And the econometric tools themselves have been the subject of increasing skepticism by economists themselves as well as the outside world. One of the ablest and most respected practitioners, Edward Leamer, titled a famous 1983 article, “Let’s Take the Con Out of Econometrics.”

The time period pictured by Peter Coy as an “econometric arms race” employing “increasingly sophisticated” tools and models overlapped with a steadily growing scandal enveloping the practice of econometrics – or, more precisely, statistical practice across both the natural and social sciences. Within economics alone, it concerned the continuing failure of the leading economists and economic journals to correctly enforce the proper interpretation of the term “statistical significance.” This failure has placed the quantitative value of most of the econometric work done in the last 30 years in question.

The general public’s exposure to the term has encouraged it to regard a “statistically significant” variable or event as one that is quantitatively large or important. In fact, that might or might not be true; there is no necessary connection between statistical significance and quantitative importance. The statistician needs to take measures apart from ascertaining statistical significance in order to gauge quantitative importance, such as calculating a loss function. In practice, this has been honored more in the breach than the observance. Two leading economic historians, Deirdre McCloskey and Steven Ziliak, have conducted a two-decade crusade to reform the statistical practice of their fellow scientists. Their story is not unlike that of the legendary Dr. Semmelweis, who sacrificed his career in order to wipe out childbed fever among women by establishing doctors’ failure to wash their hands as the transmitter of the disease.
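The gap between statistical significance and quantitative importance can be seen in a stylized example (all numbers hypothetical): a minuscule effect, estimated from a large enough sample, passes any conventional significance test without becoming one whit more important.

```python
import math

# Hypothetical: an estimated average effect of 0.01 (a penny, say), where the
# underlying data have a standard deviation of 1, measured on a huge sample.
effect = 0.01
std_dev = 1.0
n = 1_000_000

standard_error = std_dev / math.sqrt(n)   # 0.001: shrinks as the sample grows
t_statistic = effect / standard_error     # 10: "significant" at any usual level

# The large t-statistic says only that 0.01 is precisely estimated -- not that
# 0.01 is big. Judging whether the effect matters requires a separate step,
# such as the loss function McCloskey and Ziliak advocate.
```

This is exactly the confusion the text describes: as samples grow, nearly every effect becomes “significant,” so significance alone can certify the precision of a trivial number while saying nothing about its economic weight.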

This scandal could not be more relevant to the current rehabilitation of the minimum wage. The entire basis for that rehabilitation is supposedly the new, improved econometric work done beginning in 1994 – the very time when the misuse and overemphasis of statistical significance was in full swing. The culprits included many of the leading economists in the profession – including Drs. Card and Krueger and their famous 1994 study, which was one of dozens of offending econometric studies identified by McCloskey and Ziliak. And the claim made by today’s minimum-wage proponents is that their superior command of econometrics allows them to gauge the quantitative effects of different minimum wages so well that they can fine-tune the choice of a minimum wage, picking one that will benefit the poor without causing much loss of jobs and real income. But judging the quantitative effect of explanatory variables is exactly what econometrics has done badly from the 1980s to the present, owing to its preoccupation with statistical significance. The last thing in the world that the lay public should do is take the quantitative pretensions of these economists on faith.

This doesn’t sound like a profession possessing the tools and professional integrity necessary to fine-tune a minimum wage to maximize social justice – whatever that might mean. In fact, there is no reason to take recent pronouncements by economists on the minimum wage at face value. This is not professional judgment talking. It is political partisanship masquerading as analytical economics.

The Wall Street Journal pointed out that the $2 billion net gain in real income projected by the CBO if the minimum wage were to rise to $10.10 is a minute percentage gain compared to the size of a $16 trillion GDP. (It is about 0.0125%.) The notion of risking a job loss of one million for a gain of that size is quixotic. Even more to the point, the belief that economists can predict gains or losses of that tiny magnitude in a general equilibrium context using econometrics is absurd. The CEA and the CBO are allowing themselves to be used for political purposes and, in the process, allowing the discipline of economics to be prostituted.

The increasing politicization of economics is beginning to produce the same effects that subservience to political orthodoxy produced on Russian science under Stalin. The Russian scientist Lysenko became immortal not because of his scientific achievements but because of his willingness to distort science to comport with Communist doctrine. The late, great economist Ronald Coase once characterized the economics profession’s obsession with econometrics as a determination to “torture the data until it confesses.” Those confessions are now taking on the hue of Soviet-style confessions from the 1930s, exacted under torture from political dissidents who wouldn’t previously knuckle under to the regime. Today, politically partisan economists torture recalcitrant data on the minimum wage in order to extract results favorable to their cause.

The CBO and the CEA should have new stationery printed. Its logo should be an image of Lubyanka Prison in old Soviet Russia.

DRI-259 for week of 2-2-14: Kristallnacht for the Rich: Not Far-Fetched

An Access Advertising EconBrief:

Kristallnacht for the Rich: Not Far-Fetched

Periodically, the intellectual class aptly termed “the commentariat” by The Wall Street Journal works itself into frenzy. The issue may be a world event, a policy proposal or something somebody wrote or said. The latest cause célèbre is a submission to the Journal’s letters column by a partner in one of the nation’s leading venture-capital firms. The letter ignited a firestorm; the editors subsequently declared that Tom Perkins of Kleiner Perkins Caufield & Byers “may have written the most-read letter to the editor in the history of The Wall Street Journal.”

What could have inspired the famously reserved editors to break into temporal superlatives? The letter’s rhetoric was both penetrating and provocative. It called up an episode in the 20th century’s most infamous political regime. And the response it triggered was rabid.

“Progressive Kristallnacht Coming?”

“…I would call attention to the parallels of fascist Nazi Germany to its war on its ‘one percent,’ namely its Jews, to the progressive war on the American one percent, namely ‘the rich.’” With this ice breaker, Tom Perkins made himself a rhetorical target for most of the nation’s commentators. Even those who agreed with his thesis felt that Perkins had no business using the Nazis in an analogy. The Wall Street Journal editors said “the comparison was unfortunate, albeit provocative.” They recommended reserving Nazis only for rarefied comparisons to tyrants like Stalin.

On the political Left, the reaction was less measured. The Anti-Defamation League accused Perkins of insensitivity. Bloomberg View characterized his letter as an “unhinged Nazi rant.”

No, this bore no traces of an irrational diatribe. Perkins had a thesis in mind when he drew an analogy between Nazism and Progressivism. “From the Occupy movement to the demonization of the rich, I perceive a rising tide of hatred of the successful one percent.” Perkins cited the abuse heaped on workers traveling Google buses from the cities to the California peninsula. Their high wages allowed them to bid up real-estate prices, thereby earning the resentment of the Left. Perkins’ ex-wife Danielle Steel placed herself in the crosshairs of the class warriors by amassing a fortune writing popular novels. Millions of dollars in charitable contributions did not spare her from criticism for belonging to the one percent.

“This is a very dangerous drift in our American thinking,” Perkins concluded. “Kristallnacht was unthinkable in 1930; is its descendant ‘progressive’ radicalism unthinkable now?” Perkins’ point is unmistakable; his letter is a cautionary warning, not a comparison of two actual societies. History doesn’t repeat itself, but it does rhyme. Kristallnacht and Nazi Germany belong to history. If we don’t mend our ways, something similar and unpleasant may lie in our future.

A Short Refresher Course in Early Nazi Persecution of the Jews

Since the current debate revolves around the analogy between Nazism and Progressivism, we should refresh our memories about Kristallnacht. The name itself translates loosely into “Night of Broken Glass.” It refers to the shards of broken window glass littering the streets of cities in Germany and Austria on the night and morning of November 9-10, 1938. The windows belonged to houses, hospitals, schools and businesses owned and operated by Jews. These buildings were first looted, then smashed by elements of the German paramilitary SA (the Brownshirts) and SS (security police), led by the Gauleiters (regional leaders).

In 1933, Adolf Hitler was elevated to the German chancellorship after the Nazi Party won a plurality of votes in the national election. Almost immediately, laws placing Jews at a disadvantage were passed and enforced throughout Germany. The laws were the official expression of the philosophy of German anti-Semitism that dated back to the 1870s, the time when German socialism began evolving from the authoritarian roots of Otto von Bismarck’s rule. Nazi officialdom awaited a pretext on which to crack down on Germany’s sizable Jewish population.

The pretext was provided by the assassination of German official Ernst vom Rath on Nov. 7, 1938 by Herschel Grynszpan, a 17-year-old Polish Jewish youth living in Paris. The boy was apparently upset by German policies expelling his parents from the country. Ironically, vom Rath’s sentiments were anti-Nazi and opposed to the persecution of Jews. Vom Rath’s death on Nov. 9 was the signal for release of Nazi paramilitary forces on a reign of terror and abduction against German and Austrian Jews. Police were instructed to stand by and not interfere with the SA and SS as long as only Jews were targeted.

According to official reports, 91 deaths were attributed directly to Kristallnacht. Some 30,000 Jews were spirited off to jails and concentration camps, where they were treated brutally before finally winning release some three months later. In the interim, though, some 2,000-2,500 Jews died in the camps. Over 7,000 Jewish-owned or operated businesses were damaged. Over 1,000 synagogues in Germany and Austria were burned.

The purpose of Kristallnacht was not only wanton destruction. The assets and property of Jews were seized to enhance the wealth of the paramilitary groups.

Today we regard Kristallnacht as the opening round of Hitler’s Final Solution – the policy that produced the Holocaust. This strategic primacy is doubtless why Tom Perkins invoked it. Yet this furious controversy will just fade away, merely another media preoccupation du jour, unless we retain its enduring significance. Obviously, Tom Perkins was not saying that the Progressive Left’s treatment of the rich is now comparable to Nazi Germany’s treatment of the Jews. The Left is not interning the rich in concentration camps. It is not seizing the assets of the rich outright – at least not on a wholesale basis, anyway. It is not reducing the homes and businesses of the rich to rubble – not here in the U.S., anyway. It is not passing laws to discriminate systematically against the rich – at least, not against the rich as a class.

Tom Perkins was issuing a cautionary warning against the demonization of wealth and success. This is a political strategy closely associated with the philosophy of anti-Semitism; that is why his invocation of Kristallnacht is apropos.

The Rise of Modern Anti-Semitism

Despite the politically correct horror expressed by the Anti-Defamation League toward Tom Perkins’ letter, reaction to it among Jews has not been uniformly hostile. Ruth Wisse, professor of Yiddish and comparative literature at Harvard University, wrote an op-ed for The Wall Street Journal (02/04/2014) defending Perkins.

Wisse traced the modern philosophy of anti-Semitism to the philosopher Wilhelm Marr, whose heyday was the 1870s. Marr “charged Jews with using their skills ‘to conquer Germany from within.’” Marr was careful to distinguish his philosophy of anti-Semitism from prior philosophies of anti-Judaism. Jews “were taking unfair advantage of the emerging democratic order in Europe with its promise of individual rights and open competition in order to dominate the fields of finance, culture and social ideas.”

Wisse declared that “anti-Semitism channel[ed] grievance and blame against highly visible beneficiaries of freedom and opportunity.” “Are you unemployed? The Jews have your jobs. Is your family mired in poverty? The Rothschilds have your money. Do you feel more secure in the city than you did on the land? The Jews are trapping you in the factories and charging you exorbitant rents.”

The Jews were undermining Christianity. They were subtly perverting the legal system. They were overrunning the arts and monopolizing the press. They spread Communism, yet practiced rapacious capitalism!

This modern German philosophy of anti-Semitism long predated Nazism. It accompanied the growth of the German welfare state and German socialism. The authoritarian political roots of Nazism took hold under Otto von Bismarck’s conservative socialism, and so did Nazism’s anti-Semitic cultural roots as well. The anti-Semitic conspiracy theories ascribing Germany’s every ill to the Jews were not the invention of Hitler, but of Wilhelm Marr over half a century before Hitler took power.

The Link Between the Nazis and the Progressives: the War on Success

As Wisse notes, the key difference between modern anti-Semitism and its ancestor – what Wilhelm Marr called “anti-Judaism” – is that the latter abhorred the religion of the Jews while the former resented the disproportionate success enjoyed by Jews much more than their religious observances. The modern anti-Semitic conspiracy theorist pointed darkly to the predominance of Jews in high finance, in the press, in the arts and running movie studios and asked rhetorically: How do we account for the coincidence of our poverty and their wealth, if not through the medium of conspiracy and malefaction? The case against the Jews is portrayed as prima facie and morphs into per se through repetition.

Today, the Progressive Left operates in exactly the same way. “Corporation” is a pejorative. “Wall Street” is the antonym of “Main Street.” The very presence of wealth and high income is itself damning; “inequality” is the reigning evil and is tacitly assigned a pecuniary connotation. Of course, this tactic runs counter to the longtime left-wing insistence that capitalism is inherently evil because it forces us to adopt a materialistic perspective. Indeed, environmentalism embraces anti-materialism to this day while continuing to bunk in with its progressive bedfellows.

We must interrupt with an ironic correction. Economists – according to conventional thinking the high priests of materialism – know that it is human happiness and not pecuniary gain that is the ultimate desideratum. Yet the constant carping about “inequality” looks no further than money income in its supposed solicitude for our well-being. Thus, the “income-inequality” progressives – seemingly obsessed with economics and materialism – are really anti-economic. Economists, supposedly green-eyeshade devotees of numbers and models, are the ones focusing on human happiness rather than ideological goals.

German socialism metamorphosed into fascism. American Progressivism is morphing from liberalism to socialism and – ever more clearly – homing in on its own version of fascism. Both employed the technique of demonization and conspiracy to transform the mutual benefit of free voluntary exchange into the zero-sum result of plunder and theft. How else could productive effort be made to seem fruitless? How else could success be made over into failure? This is the cautionary warning Perkins was sounding.

The Great Exemplar

The great Cassandra of political economy was F.A. Hayek. Early in 1929, he predicted that Federal Reserve policies earlier in the decade would soon bear poisoned fruit in the form of a reduction in economic activity. (His mentor, Ludwig von Mises, was even more emphatic, foreseeing “a great crash” and refusing a prestigious financial post for fear of association with the coming disaster.) He predicted that the Soviet economy would fail owing to lack of a functional price system; in particular, missing capital markets and interest rates. He predicted that Keynesian policies begun in the 1950s would culminate in accelerating inflation. All these came true, some of them within months and some after a lapse of years.

Hayek’s greatest prediction was really a cautionary warning, in the same vein as Tom Perkins’ letter but much more detailed. The 1944 book The Road to Serfdom made the case that centralized economic planning could operate only at the cost of the free institutions that distinguished democratic capitalism. Socialism was really another form of totalitarianism.

The reaction to Hayek’s book was much the same as the reaction to Perkins’ letter. Many commentators who should have known better accused both men of fascism. They also accused both of describing a current state of affairs when each was really trying to avoid a dystopia.

The flak Hayek took was especially ironic because his book actually served to prevent the outcome he feared. But instead of winning him the acclaim of millions, this earned him the scorn of intellectuals. The intelligentsia insisted that Hayek had predicted the inevitable succession of totalitarianism after the imposition of a welfare state. When welfare states in Great Britain, Scandinavia, and South America failed to produce barbed wire, concentration camps and German Shepherd dogs, the Left advertised this as proof of Hayek’s “exaggerations” and “paranoia.”

In actual fact, Great Britain underwent many of the changes Hayek had feared and warned against. The notorious “Rules of Engagement,” for instance, were an attempt by a Labour government to centrally control the English labor market – to specify an individual’s work and wage rather than allowing free choice in an impersonal market to do the job. The attempt failed just as dismally as Hayek and other free-market economists had foreseen it would. In the 1980s, it was Hayek’s arguments, wielded by Prime Minister Margaret Thatcher, that paved the way for the rolling back of British socialism and the taming of inflation. It’s bizarre to charge the prophet of doom with inaccuracy when his prophecy is the savior, but that’s what the Left did to Hayek.

Now they are working the same familiar con on Tom Perkins. They begin by misconstruing the nature of his argument. Later, if his warnings are successful, they will use that against him by claiming that his “predictions” were false.

Enriching Perkins’ Argument

This is not to say that Perkins’ argument is perfect. He has instinctively fingered the source of the threat to our liberties. Perkins himself may be rich, but his argument isn’t; it is threadbare and skeletal. It could use some enriching.

The war on the wealthy has been raging for decades. The opening battle is lost to history, but we can recall some early skirmishes and some epic brawls prior to Perkins.

In Europe, the war on wealth used anti-Semitism as its spearhead. In the U.S., however, the popularity of Progressives in academia and government made antitrust policy a more convenient wedge for their populist initiatives against success. Antitrust policy was a crown jewel of the Progressive movement in the early 1900s; Presidents Theodore Roosevelt and William Howard Taft cultivated reputations as “trust busters.”

The history of antitrust policy exhibits two pronounced tendencies: the use of the laws to restrict competition for the benefit of incumbent competitors and the use of the laws by the government to punish successful companies for various political reasons. The sobering research of Dominick Armentano shows that antitrust policy has consistently harmed consumer welfare and economic efficiency. The early antitrust prosecution of Standard Oil, for example, broke up a company that had consistently increased its output and lowered prices to consumers over long time spans. The Orwellian rhetoric accompanying the judgment against ALCOA in the 1940s reinforces the notion that punishment, not efficiency or consumer welfare, was behind the judgment. The famous prosecutions of IBM and AT&T in the 1970s and 80s each spawned book-length investigations showing the perversity of the government’s claims. More recently, Microsoft became the latest successful firm to reap the government’s wrath for having the temerity to revolutionize industry and reward consumers throughout the world.

The rise of the regulatory state in the 1970s gave agencies and federal prosecutors nearly unlimited, unsupervised power to work their will on the public. Progressive ideology combined with self-interest to create a powerful engine for the demonization of success. Prosecutors could not only pursue their personal agendas but also climb the career ladder by making high-profile cases against celebrities. The prosecution of Michael Milken of Drexel Burnham Lambert is a classic case of persecution in the guise of prosecution. Milken virtually created the junk-bond market, thereby originating an asset class that has enhanced the wealth of investors by untold billions or trillions of dollars. For his pains, Milken was sent to jail.

Martha Stewart is a high-profile celebrity who was, in effect, convicted of the crime of being famous. She was charged with and convicted of lying to investigators about a case in which the only crime could have been the offense of insider trading. But she was the trader, and she was not charged with insider trading. The utter triviality of the matter and the absence of any damage to consumers or society at large make it clear that she was targeted because of her celebrity; i.e., her success.

Today, the impetus for pursuing successful individuals and companies comes primarily from the federal level. Harvey Silverglate (author of Three Felonies a Day) has shown that virtually nobody is safe from the depredations of prosecutors out to advance their careers by racking up convictions at the expense of justice.

Government is the institution charged with making and enforcing law, yet government has now become the chief threat to law. At the state and local level, governments hand out special favors and tax benefits to favored recipients – typically those unable to attain success on their own efforts – while making up the revenue from the earned income of taxpayers at large. At the federal level, Congress fails in its fundamental duty and ignores the law by refusing to pass budgets. The President appoints czars to make regulatory law, while choosing at discretion to obey the provisions of some laws and disregard others. In this, he fails his fundamental executive duty to execute the laws faithfully. Judges treat the Constitution as a backdrop for the expression of their own views rather than as a subject for textual fidelity. All parties interpret the Constitution to suit their own convenience. The overarching irony here is that the least successful institution in America has united in a common purpose against the successful achievers in society.

The most recent Presidential campaign was conducted largely as a jihad against the rich and successful in business. Mitt Romney was forced to defend himself against the charge of succeeding too well in his chosen profession, as well as the corollary accusation that his success came at the expense of the companies and workers in which his private-equity firm invested. Either his success was undeserved or it was really failure. There was no escape from the double bind against which he struggled.

It is clear, then, that the “progressivism” decried by Tom Perkins dates back over a century and that it has waged a war on wealth and success from the outset. The tide of battle has flowed – during the rampage of the Bull Moose, the Depression and New Deal and the recent Great Recession and financial crisis – and ebbed – under Eisenhower and Reagan. Now the forces of freedom have their backs to the sea.

It is this much richer context that forms the backdrop for Tom Perkins’ warning. Viewed in this panoramic light, Perkins’ letter looks more like the battle cry of a counter-revolution than the crazed rant of an isolated one-percenter.