DRI-172 for week of 1-18-15: Consumer Behavior, Risk and Government Regulation

An Access Advertising EconBrief: 

Consumer Behavior, Risk and Government Regulation

The Obama administration has drenched the U.S. economy in a torrent of regulation. It is a mixture of new rules formulated by new regulatory bodies (such as the Consumer Financial Protection Bureau), new rules levied by preexisting federal agencies (such as those slapped on bank lending by the Federal Reserve) and old rules newly imposed or enforced with new stringency (such as those emanating from the Department of Transportation and bedeviling the trucking industry).

Some within the business community are pleased by these regulations, but it is fair to say that most are not. Yet the President and his subordinates have been unyielding in their insistence that the rules are not merely desirable but necessary to the health, well-being, vitality and economic growth of America.

Are the people affected by the regulations bad? Do the regulations make them good, or merely constrain their bad behavior? What entitles the particular people designing and implementing the regulations to perform in this capacity – is it their superior motivations or their superior knowledge? That is, are they better people or merely smarter people than those they regulate? The answer can’t be democratic election, since regulators are not elected directly. We are certainly entitled to ask how a President could suppose that some small group of people can effectively regulate an economy of over 300 million people. If they are merely better people, how do we know that their regulatory machinations will succeed, however well-intentioned they are? If they are merely smarter people, how do we know their actions will be directed toward the common good (whatever in the world that might be) and not toward their own betterment, to the exclusion of all else? Apparently, the President must select regulators who are both better people and smarter people than their constituents. Yet government regulators are typically plucked from comparative anonymity rather than from the firmament of public visibility.

Of all American research organizations, the Cato Institute has the longest history of examining government regulation. Recent Cato publications help rebut the longstanding presumptions in favor of regulation.

The FDA Graciously Unchains the American Consumer

In “The Rise of the Empowered Consumer” (Regulation, Winter 2014-2015, pp. 34-41, Cato Institute), author Lewis A. Grossman recounts the Food and Drug Administration’s (FDA) policy evolution beginning in the mid-1960s. He notes that “Jane, a [hypothetical] typical consumer in 1966… had relatively few choices” across a wide range of food products like “milk, cheese, bread and jam” because the FDA’s “identity standards allowed little variation.” In other words, the government determined what kinds of products producers could legally produce and sell to consumers. “Food labels contained barely any useful information. There were no ‘Nutrition Facts’ panels. The labeling of many foods did not even include a statement of ingredients. Nutrient content descriptors were rare; indeed, the FDA prohibited any reference whatever to cholesterol. Claims regarding foods’ usefulness in preventing disease were also virtually absent from labels; the FDA considered any such statement to render the product an unapproved – and thus illegal – drug.”

Younger readers will find the quoted passage startling; they have probably assumed that ingredient and nutrient-content labels were forced on sellers over their strenuous objections by noble and altruistic government regulators.

Similar constraints would have bound Jane had she felt curious about vitamins, minerals or health supplements. The types and composition of such products were severely limited, and their claims and advertising were restricted even more severely by the FDA. Over-the-counter medications were equally limited – few in number and puny in their effectiveness against such infirmities as “seasonal allergies… acid indigestion…yeast infection[s] or severe diarrhea.” Her primary alternative for treatment was a doctor’s visit to obtain a prescription, which included directions for use but no further enlightening information about the therapeutic agent. Not only was there no Internet; copies of the Physicians’ Desk Reference were unavailable in bookstores. Advertising of prescription medicines was strictly forbidden by the FDA outside of professional publications like the Journal of the American Medical Association.

Food substances and drugs required FDA approval. The approval process might as well have been conducted in Los Alamos under FBI guard as far as Jane was concerned. Even terminally ill patients were hardly ever allowed access to experimental drugs and treatments.

From today’s perspective, it appears that the position of consumers vis-à-vis the federal government in these markets was that of a citizen in a totalitarian state. The government controlled production and sale; it controlled the flow of information; it even controlled the life-and-death choices of the citizenry, albeit with benevolent intent. (But what dictatorship – even the most savage in history – has failed to reaffirm the benevolence of its intentions?) What led to this situation in a country often advertised as the freest on earth?

In the late 19th and early 20th centuries, various incidents of alleged consumer fraud – and the publicity given them by muckraking authors – prompted the Progressive administrations of Theodore Roosevelt, William Howard Taft and Woodrow Wilson to launch federal-government consumer regulation. The FDA was the flagship creation of this movement, the outcome of what Grossman calls a “war against quackery.”

Students of regulation observe this common denominator. Behind every regulatory agency there is a regulatory movement; behind every movement there is an “origin story;” behind every story there are incidents of abuse. And upon investigation, these abuses invariably prove either false or wildly exaggerated. But even had they been meticulously documented, they would still not substantiate the claims made for them, nor justify the regulatory actions taken in response.

Fraud was illegal throughout the 19th and 20th centuries, and earlier. Competitive markets punish producers who fail to satisfy consumers by driving those producers out of business. Limiting the choices of producers and consumers harms consumers without providing compensating benefits. The only justification for FDA regulation of the type practiced during the first half of the 20th century was that government regulators were omniscient, noble and efficient while consumers were dumbbells. That is putting it baldly, but it is hardly an overstatement. After all, consider the situation that exists today.

Plentiful varieties of products exist for consumers to pick from. They exist because consumers want them to exist, not because the FDA decreed their existence. Over-the-counter medications are plentiful and effective. The FDA tries to regulate their uses, as it does for prescription medications, but thankfully doctors can choose from a plethora of “off-label” uses. Nutrient and ingredient labels inform the consumer’s quest to self-medicate such widespread ailments as Type II diabetes, which spread to near-epidemic status but is now being controlled thanks to rejection of the diet that the government promoted for decades and embrace of a diet that the government condemned as unsafe. Doctors and pharmacists discuss medications and supplements with patients and provide information about ingredients, side effects and drug interactions. And patients are finally rising in rebellion against the tyranny of FDA drug approval and the pretense of compassion exhibited by the agency’s “compassionate use” drug-approval policy for patients facing life-threatening diseases.

Grossman contrasts the totalitarian policies of yesteryear with the comparative freedom of today in polite academic language. “The FDA treated Jane’s… cohort…as passive, trusting and ignorant consumers. By comparison, [today’s consumer] has unmediated [Grossman means free] access to many more products and to much more information about those products. Moreover, modern consumers have acquired significant influence over the regulation of food and drugs and have generally exercised that influence in ways calculated to maximize their choice.”

Similarly, he explains the transition away from totalitarianism to today’s freedom in hedged terms. To be sure, the FDA gave up much of its power over producers and consumers kicking and screaming; consumers had to wrest all the things listed above from the agency rather than receive them as the gifts of a generous FDA. Nevertheless, Grossman insists that consumers’ distrust of the word “corporation” is so profound that they believe that the FDA exerts some sort of countervailing authority to ensure “the basic safety of products and the accuracy and completeness of labeling and advertising.” This concerning an agency that fought labeling and advertising tooth and claw! As to safety, Grossman adds the further caveat that consumers “prefer that government allow consumers to make their own decisions regarding what to put in their bodies…except in cases in which risk very clearly outweighs benefit” [emphasis added]. That implies that consumers believe that the FDA has some special competence to assess risks and benefits to individuals, which completely contradicts the principle that individuals should be free to make their own choices.

Since Grossman clearly treats consumer safety and risk as a special case of some sort, it is worth investigating this issue at special length. We do so below.

Government Regulation of Cigarette Smoking

For many years, individual cigarette smokers sued cigarette companies under the product-liability laws. They claimed that cigarettes “gave them cancer,” that the cigarette companies knew it and consumers didn’t, and that the companies were therefore liable for selling dangerous products to the public.

The consumers got nowhere.

To this day, an urban legend persists that the companies’ run of legal success was owed to deep financial pockets and fancy legal footwork. That is nonsense. As the leading economic expert on risk (and the longtime cigarette controversy), W. Kip Viscusi, concluded in Smoke-Filled Rooms: A Postmortem on the Tobacco Deal, “the basic fact is that when cases reached the jury, the jurors consistently concluded that the risks of cigarettes were well-known and voluntarily incurred.”

In the early 1990s, all this changed. States sued the tobacco companies for medical costs incurred by government due to cigarette smoking. The suits never reached trial. The tobacco companies settled with four states; a Master Settlement Agreement applied to remaining states. The aggregate settlement amount was $243 billion, which in the days before the Great Recession, the Obama administration and the Bernanke Federal Reserve was a lot of money. (To be sure, a chunk of this money was gobbled up by legal fees; the usual product-liability portion is one-third of the settlement, but gag orders have hampered complete release of information on lawyers’ fees in these cases.)

However, the states were not satisfied with this product-liability bonanza. They increased existing excise taxes on cigarettes. In “Cigarette Taxes and Smoking,” Regulation (Winter 2014-2015, pp. 42-46, Cato Institute), authors Kevin Callison and Robert Kaestner ascribe these tax increases to “the hypothesis… that higher cigarette taxes save a substantial number of lives and reduce health-care costs by reducing smoking, [which] is central to the argument in support of regulatory control of cigarettes through higher cigarette taxes.”

Callison and Kaestner cite research from anti-smoking organizations and comments to the FDA that purport to find price elasticities of demand for cigarettes of between -0.3 and -0.7, with the lower figure applying to adults and the higher to adolescents. (The words “lower” and “higher” refer to the absolute, not algebraic, value of the elasticities.) Price elasticity of demand is defined as the percentage change in quantity demanded associated with a 1 percent change in price. Thus, according to these estimates, a 1% increase in price would cause quantity demanded to fall by between 0.3% and 0.7%.
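
In symbols (a standard textbook definition, not specific to Callison and Kaestner):

$$
\varepsilon = \frac{\%\Delta Q}{\%\Delta P}, \qquad \varepsilon = -0.3 \;\Rightarrow\; \%\Delta Q = -0.3 \times \%\Delta P
$$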

The problem with these estimates is that they were based on research done decades ago, when smoking rates were much higher. The authors estimate that today’s smokers are mostly the young and the poorly educated, whose price elasticities are very low. Higher cigarette taxes consequently have only a minuscule effect on consumption of cigarettes. They do not reduce smoking to any significant extent; thus, they do not save on health-care costs.

They serve only to fatten the coffers of state governments. Cigarette taxes today play the role played by the infamous tax on salt levied by French kings before the French Revolution. When the tax goes up, the effective price paid by the consumer goes up. When consumption falls by a much smaller percentage than the price increase, tax revenues rise. Both the cigarette-tax increases of today and the salt-tax increases of the 17th and 18th centuries were big revenue-raisers.
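
A back-of-the-envelope sketch makes the revenue arithmetic concrete; the numbers below are purely illustrative, not Callison and Kaestner’s:

```python
# Illustrative sketch: with inelastic demand, a tax-driven price increase
# raises revenue because quantity falls proportionally less than price rises.
def pct_revenue_change(pct_price_rise: float, elasticity: float) -> float:
    """Approximate percentage change in total spending (and thus tax take)."""
    new_price = 1 + pct_price_rise
    new_quantity = 1 + elasticity * pct_price_rise  # elasticity is negative
    return new_price * new_quantity - 1

print(pct_revenue_change(0.10, -0.3))   # ~ +0.067: 10% price rise, revenue up ~6.7%
print(pct_revenue_change(0.10, -1.5))   # ~ -0.065: elastic demand would cut revenue
```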

In the 1990s, tobacco companies were excoriated as devils. Today, though, several of the lawyers who sued the tobacco companies are either in jail for fraud, under criminal accusation or dead under questionable circumstances. And the state governments that “regulate” the tobacco companies by taxing them stand revealed as merely in it for the money. They have no interest in discouraging smoking, since their revenues would suffer if smoking fell too much. State governments want cigarette demand to remain price-inelastic so that they can continue to raise more revenue by raising taxes on cigarettes.

Can Good Intentions Really Be All That Bad? The Cost of Federal-Government Regulation

The old saying “You can’t blame me for trying” suggests that there is no harm in trying to make things better. The economic principle of opportunity cost reminds us that the use of resources for one purpose – in this case, the various ostensibly benevolent and beneficent purposes of regulation – denies the benefits of using them for something else. So how costly is that?

In “A Slow-Motion Collapse” (Regulation, Winter 2014-2015, pp. 12-15, Cato Institute), author Pierre Lemieux cites several studies that attempted to quantify the costs of government regulation. The most comprehensive of these was by academic economists John Dawson and John Seater, who used variations in the annual Code of Federal Regulations as their index for regulatory change. In 1949, the CFR had 19,335 pages; by 2005, the total had risen to 134,261 pages, a seven-fold increase in six-plus decades. (Remember, this counts federal regulation only, excluding state and local government regulation, which might triple that total.)

Naturally, proponents of regulation blandly assert that the growth of real income (also roughly seven-fold over the same period) requires larger government, hence more regulation, to keep pace. This nebulous generalization collapses upon close scrutiny. Freedom and free markets naturally result in more complex forms of goods, services and social interactions, but if regulatory constraints “keep pace,” they will restrain the very benefits that freedom creates. The very purpose of freedom itself will be vitiated. We are back at square one, asking the question: What gives regulators the right and the competence to make that sort of decision?

Dawson and Seater developed an econometric model to estimate the size of the bite taken by regulation from economic growth. Their estimate was that regulation has reduced economic growth on average by about 2 percentage points per year. This is a huge reduction. Applied to 2011, the arithmetic works as follows: had all regulation adopted since 1949 not happened, 2011 GDP would have been about $39 trillion higher – roughly $54 trillion rather than the actual $15 trillion. As Lemieux put it: “The average American (man, woman and child) would now have about $125,000 more per year to spend, which amounts to more than three times [current] GDP per capita. If this is not an economic collapse, what is?”
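
A quick compound-growth check (a sketch; the $15 trillion figure for actual 2011 GDP is an approximation, not taken from the article) roughly reproduces these magnitudes:

```python
# Sketch: compound the ~2 percentage points of growth forgone each year, 1949-2011.
ACTUAL_GDP_2011 = 15.0            # trillions of dollars, approximate
years = 2011 - 1949               # 62 years
multiplier = 1.02 ** years        # cumulative effect of the lost growth
counterfactual = ACTUAL_GDP_2011 * multiplier
print(round(multiplier, 2))       # 3.41
print(round(counterfactual, 1))   # ~51 trillion, near the ~$54 trillion quoted above
```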

Lemieux points out that, while this estimate may strain the credulity of some, it may actually incorporate the effects of state and local regulation, even though the model’s index did not include them. That is because it is reasonable to expect a statistical correlation among the three levels of regulation: when federal regulation rises, it often does so in ways that require matching or complementary state and local actions. Thus, those forms of regulation are implicitly captured by the model to a considerable degree.

Lemieux also points to Europe, where regulation is even more onerous than in the U.S. – and growth has been even more constipated. We can take this reasoning further by bringing in the recent example of less-developed countries. The Asian Tigers experienced rapid growth when they espoused market-oriented economics; did their relative lack of regulation contribute to this economic-development success story? India and mainland China turned their economies around when they turned away from socialism and Communism, respectively; regulation still hamstrings India, while China is dichotomized into a relatively autonomous small-scale competitive sector and a heavily regulated, centrally planned, government-controlled big-business sector. Signs point to a recent Chinese growth dip tied to the bursting of a bubble created by easy money and credit granted to the regulated sector.

The price tag for regulation is eye-popping. It is long past time to ask ourselves why we are stuck with this lemon.

Government Regulation as Wish-Fulfillment

For millennia, children have cherished fantasies of magical figures who make their wishes come true. These fantasies apparently satisfy a deep-seated longing for security and fulfillment. Freud referred to this need as “wish fulfillment.” Although Freudian psychology has long since been discredited, the term retains its usefulness.

When we grow into adulthood, we do not shed our childish longings; they merely change form. In the 20th century, motion pictures became the dominant art form in the Western world because they served as fairy tales for adults by providing alternative versions of reality that were preferable to daily life.

When asked by pollsters to list or confirm the functions regulation should perform, citizens repeatedly compose “wish lists” that are either platitudes or duplicates of the functions actually approximated by competitive markets. It seems even more significant that researchers and policymakers do exactly the same thing. Returning to Lewis Grossman’s evaluation of the public’s view of the FDA: “Americans’ distrust of major institutions has led them to the following position: On the one hand, they believe the FDA has an important role to play in ensuring the basic safety of products and the accuracy and completeness of labeling and advertising. On the other hand, they generally do not want the FDA to inhibit the transmission of truthful information from manufacturers to consumers, and – except in cases in which risk very clearly outweighs benefit – they prefer that the government allow consumers to make their own decisions regarding what to put in their own bodies.”

This is a masterpiece of self-contradiction. Just exactly what is an “important role to play,” anyway? Allowing an agency that previously denied the right to label and advertise to play any role is playing with fire; it means that genuine consumer advocates have to fight a constant battle with the government to hold onto the territory they have won. If consumers really don’t want the FDA to “inhibit the transmission of truthful information from manufacturers to consumers,” they should abolish the FDA, because free markets do the job consumers want done by definition and the laws already prohibit fraud and deception.

The real whopper in Grossman’s summary is the caveat about risk and benefit. Government agencies in general and the FDA in particular have traditionally shunned cost/benefit and risk/benefit analysis like the plague; when they have attempted it they have done it badly. Just exactly who is going to decide when risk “very clearly” outweighs benefit in a regulatory context, then? Grossman, a professional policy analyst who should know better, is treating the FDA exactly as the general public does. He is assuming that a government agency is a wish-fulfillment entity that will do exactly what he wants done – or, in this case, what he claims the public wants done – rather than what it actually does.

Every member of the general public would scornfully deny that he or she believes in a man called Santa Claus who lives at the North Pole and flies around the world on Christmas Eve distributing presents to children. But for an apparent majority of the public, government in general and regulation in particular play a similar role, because people ascribe quasi-magical powers to them to fulfill psychological needs. For these people, it might be more apropos to view government as “Mommy” or “Daddy” because of the strength and dependency of the relationship.

Can Government Control Consumer Risk? The Emerging Scientific Answer: No 

The comments of Grossman, assorted researchers and countless other commentators and onlookers over the years imply that government regulation is supposed to act as a sort of stern but benevolent parent, protecting us from our worst impulses by regulating the risks we take. This is reflected not only in cigarette taxes but also in the draconian warnings on cigarette packages and in numerous other measures taken by regulators. Mandatory seat-belt laws, adopted by 49 state legislatures since the mid-1980s at the urging of the federal government, promised the near-elimination of automobile fatalities. Government bureaucracies like the Occupational Safety and Health Administration have covered the workplace with a raft of safety regulations. The Consumer Product Safety Commission presides with an eagle eye over the safety of the products that fill our market baskets.

In 1975, University of Chicago economist Sam Peltzman published a landmark study in the Journal of Political Economy. In it, Peltzman revealed that the various devices and measures mandated by government and introduced by the big auto companies in the 1960s had not actually produced statistically significant improvements in safety, as measured by auto fatalities and injuries. In particular, use of the new three-point seat belts seemed to show a slight reduction in driver fatalities that was more than offset by a rise in fatalities to others – pedestrians, cyclists and possibly occupants of victim vehicles. Over the years, subsequent research confirmed Peltzman’s results so repeatedly that former Chairman of the Council of Economic Advisors N. Gregory Mankiw dubbed this the “Peltzman Effect.”

A similar kind of result emerged throughout the social sciences. Innovations in safety continually failed to produce the kind of safety results that experts anticipated and predicted, often failing to provide any improved safety performance at all. It seems that people respond to improved safety by taking more risk, thwarting the expectations of the experts. Needless to say, this same logic applies also to rules passed by government to force people to behave more safely. People simply thwart the rules by finding ways to take risk outside the rules. When forced to wear seat belts, for example, they drive less carefully. Instead of endangering only themselves by going beltless, now they endanger others, too.

Today, this principle is well established in scientific circles. It is called risk compensation. The idea that people strive to maintain, or “purchase,” a particular level of risk and hold it constant in the face of outside efforts to change it is called risk homeostasis.

These concepts make the entire project of government regulation of consumer risk absurd and counterproductive. Previously it was merely wrong in principle, an abuse of human freedom. Now it is also wrong in practice because it cannot possibly work.

Dropping the Façade: The Reality of Government Regulation

If the results of government regulation do not comport with its stated purposes, what are its actual purposes? Are the politicians, bureaucrats and employees who comprise the legislative and executive branches and the regulatory establishment really unconscious of the effects of regulation? No, for the most part the beneficiaries of regulation are all too cynically aware of the façade that covers it.

Politicians support regulation to court votes from the government-dependent segment of the voting public and to avoid being pilloried as killers and haters or – worst of all – a “tool of the big corporations.” Bureaucrats tacitly do the bidding of politicians in their role as administrators. In return, politicians do the bidding of bureaucrats by increasing their budgets and staffs. Employees vote for politicians who support regulation; in return, politicians vote to increase budgets. Employees follow the orders of bureaucrats; in return, bureaucrats hire bigger staffs that earn them bigger salaries.

This self-reinforcing and self-supporting network constitutes the metastatic cancer of big government. The purpose of regulation is not to benefit the public. It is to milk the public for the benefit of politicians, bureaucrats and government employees. Regulation drains resources away from and hamstrings the productive private economy.

Even now, as we speak, this process – aided, abetted and drastically accelerated by rapid money creation – is bringing down the economies of the Western world around our ears by simultaneously wreaking havoc on the monetary order with easy money, burdening the financial sector with debt and eviscerating the real economy with regulations that steadily erode its productive potential.

DRI-135 for week of 1-4-15: Flexible Wages and Prices: Economic Shock Absorbers

An Access Advertising EconBrief:

Flexible Wages and Prices: Economic Shock Absorbers

At the same time that free markets are becoming an endangered species in our daily lives, they enjoy a lively literary existence. The latest stimulating exercise in free-market thought is The Forgotten Depression: 1921 – The Crash That Cured Itself. The author is James Grant, well-known in financial circles as editor/publisher of “Grant’s Interest Rate Observer.” For over thirty years, Grant has cast a skeptical eye on the monetary manipulations of governments and central banks. Now he casts his gimlet gaze backward on economic history. The result is electrifying.

The Recession/Depression of 1920-1921

The U.S. recession of 1920-1921 is familiar to students of business cycles and few others. It was a legacy of World War I. Back then, governments tended to finance wars through money creation. Invariably this led to inflation. In the U.S., the last days of the war and its immediate aftermath were boom times. As usual – when the boom was the artifact of money creation – the boom went bust.

Grant recounts the bust in harrowing detail. In 1921, industrial production fell by 31.6%, a staggering datum when we recall that the U.S. was becoming the world’s leading manufacturer. (The President’s Conference on Unemployment reported in 1929 that 1921 was the only year after 1899 in which industrial production had declined.) Gross national product (today we would cite gross domestic product; neither statistic was actually calculated at that time) fell about 24% between 1920 and 1921 in nominal dollars, or 9% when account is taken of price changes. (Grant compares this to the figures for the “Great Recession” of 2007-2009, which were 2.4% and 4.3%, respectively.) Corporate profits nosedived commensurately. Stocks plummeted; the Dow Jones Industrial Average fell by 46.6% between the cyclical peak of November, 1919 and the trough of August, 1921. According to Grant, “the U.S. suffered the steepest plunge in wholesale prices in its history (not even eclipsed by the Great Depression),” over 36% within 12 months. Unemployment rose dramatically to a level of some 4,270,000 in 1921 – and included even the President of General Motors, Billy Durant. (As the price of GM’s shares fell, he augmented his already-sizable shareholdings by buying on margin – ending up flat broke and out of a job.) Although the Department of Labor did not calculate an “unemployment rate” at that time, Grant estimates the nonfarm labor force at 27,989,000, which would have made the simplest measure of the unemployment rate 15.3%. (That count undoubtedly included labor-force dropouts and part-time workers who preferred full-time employment.)

A telling indicator of the dark mood enveloping the nation was passage of the Quota Act, the first step on the road to systematic federal limitation of foreign immigration into the U.S. The quota was fixed at 3% of foreign nationals present in each of the 48 states as of 1910. That year evidently reflected nostalgia for pre-war conditions, since the then-popular agricultural agitation for farm-price “parity” sought to peg prices to their levels of that same year.

In the Great Recession and the accompanying financial panic of 2008 and after, we had global warming and tsunamis in Japan and Indonesia to distract us. In 1920-1921, Prohibition had already shut down the legal liquor business, shuttering bars and nightclubs. A worldwide flu pandemic had killed hundreds of thousands. The Black Sox had thrown the 1919 World Series at the behest of gamblers.

The foregoing seems to make a strong prima facie case that the recession of 1920 turned into the depression of 1921. That was the judgment of the general public and contemporary commentators. Herbert Hoover, Secretary of Commerce under Republican President Warren G. Harding, who followed wartime President Woodrow Wilson in 1920, compiled many of the statistics Grant cites while chairman of the President’s Conference on Unemployment. He concurred with that judgment. So did the founder of the study of business cycles, the famous institutional economist Wesley C. Mitchell, who influenced colleagues as various and eminent as Thorstein Veblen, Milton Friedman, F. A. Hayek and John Kenneth Galbraith. Mitchell referred to “…the boom of 1919, the crisis of 1920 and the depression of 1921 [that] followed the patterns of earlier cycles.”

By today’s lights, the stage was set for a gigantic wave of federal-government intervention, a gargantuan stimulus program. Failing that, economists would have us believe, the economy would sink like a stone into a pit of economic depression from which it would likely never emerge.

What actually happened in 1921, however, was entirely different.

The Depression That Didn’t Materialize

We may well wonder what might have happened if the Democrats had retained control of the White House and Congress. Woodrow Wilson and his advisors (notably his personal secretary, Joseph Tumulty) had greatly advanced the project of big government begun by Progressive Republicans Theodore Roosevelt and William Howard Taft. During World War I, the Wilson administration seized control of the railroads, the telephone companies and the telegraph companies. It levied wage and price controls. The spirit of the Wilson administration’s efforts is best characterized by the statement of the Chief Price Controller of the War Industries Board, Robert Brookings: “I would rather pay a dollar a pound for [gun]powder for the United States in a state of war if there was no profit in it than pay the DuPont Company 50 cents a pound if they had 10 cents profit in it.” Of course, Mr. Brookings was not actually himself buying the gunpowder; the government was only representing the taxpayers (of whom Mr. Brookings was presumably one). And their attitude toward taxpayers was displayed by the administration’s transformation of an income tax initiated at insignificant levels in 1913 into one with a marginal rate of 77% (!!) on incomes exceeding $1 million.

But Wilson’s obsession with the League of Nations and his 14 points for international governance had not only ruined his health, it had ruined his party’s standing with the electorate. In 1920, Republican Warren G. Harding was elected President. (The Republicans had already gained substantial Congressional majorities in the off-year elections of 1918.) Except for Hoover, the Harding circle of advisors was comprised largely of policy skeptics – people who felt there was nothing to be done in the face of an economic downturn but wait it out. After all, the U.S. had endured exactly this same phenomenon of economic boom, financial panic and economic bust before in 1812, 1818, 1825, 1837, 1847, 1857, 1873, 1884, 1890, 1893, 1903, 1907, 1910 and 1913. The U.S. economy had not remained mired in depression; it had emerged from all these recessions – or, in the case of 1873, a depression. If the 19th-century system of free markets were to be faulted, it would not be for failure to lift itself out of recession or depression, but for repeatedly re-entering the cycle of boom and bust.

The Federal Reserve did not flood the economy with liquidity, peg interest rates at artificially low levels or institute a “zero interest-rate policy.” Indeed, the rules of the gold-standard “game” called for the Federal Reserve to raise interest rates to stem the inflation that still raged in the aftermath of World War I; had it not done so, a gold outflow might theoretically have drained the U.S. dry. The Fed did just that, and interest rates hovered around 8% for the duration. Deliberate deficit spending as an economic corrective would have been viewed as madness. As Grant put it, “laissez faire had its last hurrah in 1921.”

What was the result?

In individual industries, prices, wages and output fell like a stone. Auto production fell by 23%. General Motors, as previously noted, was particularly hard hit. Its sales went from 52,000 vehicles per month to 13,000, then to 6,150, in the space of seven months. Some $85 million in inventory was eventually written off in losses.

Hourly manufacturing wages fell by 22%. Average disposable income in agriculture, which comprised just under 20% of the economy, fell by over 55%. Bankruptcies overall tripled to nearly 20,000 over the two years ending in 1921. In Kansas City, MO, a haberdashery shop run by Harry Truman and Eddie Jacobson held out through 1920 before finally folding in 1921. The resulting personal bankruptcy and debt plagued the partners for years. Truman evaded it by taking a job as judge of the Jackson County Court, where his salary was secure against liens. But his bank accounts were periodically raided by bill collectors for years until 1935, when he was able to buy up the remaining debt at a devalued price.

In late 1920, Ford Motor Co. cut the price of its Model T by 25%. GM at first resisted price cuts but eventually followed suit. Farmers, who as individuals had no control over the price of their products, had little choice but to cut costs and increase productivity – increasing output was an individual’s only way to increase income. When all or most farmers succeeded, this produced lower prices. How much lower? Grant: “In the second half of [1920], the average price of 10 leading crops fell by 57 percent.” But how much more food can humans eat; how many more clothes can they wear? Since the price- and income-elasticities of demand for agricultural goods were less than one, this meant that agricultural revenue and incomes fell.
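
A stylized calculation shows why revenue fell; the elasticity below is assumed for illustration, and only the 57% price decline comes from Grant:

```python
# Stylized: with price-inelastic demand, the quantity gain cannot offset
# a price collapse, so farm revenue (price x quantity) shrinks.
elasticity = -0.5                 # assumed for illustration; |e| < 1
pct_price_change = -0.57          # the 57% crop-price fall cited above
pct_qty_change = elasticity * pct_price_change                # +28.5%
revenue_ratio = (1 + pct_price_change) * (1 + pct_qty_change)
print(round(revenue_ratio, 2))    # 0.55: revenue falls about 45%
```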

As noted by Wesley Mitchell, the U.S. slump was not unique but rather part of a global depression that began as a series of commodity-price crashes in Japan, the U.K., France, Italy, Germany, India, Canada, Sweden, the Netherlands and Australia. It encompassed commodities including pig iron, beef, hemlock, Portland cement, bricks, coal, crude oil and cotton.

Banks that had speculative commodity positions were caught short. Among these was the largest bank in the U.S., National City Bank, which had loaned extensively to finance the sugar industry in Cuba. Sugar prices were brought down in the commodity crash and brought the bank down with them. That is, the bank would have failed had it not received sweetheart loans from the Federal Reserve.

Today, the crash of prices would be called “deflation.” So it was called then and with much more precision. Today, deflation can mean anything from the kind of nosediving general price level seen in 1920-1921 to relatively stable prices to mild inflation – in short, any general level of prices that does not rise fast enough to suit a commentator.

But there was apparently general acknowledgment that deflation was occurring in the depression of 1921. Yet few people apart from economists found that ominous – and for good reason. After some 18 months of panic, recession and depression, the U.S. economy recovered, just as it had done 14 times previously.

It didn’t merely recover. It roared back to life. President Harding died suddenly in 1923, but under President Coolidge the U.S. economy experienced the “Roaring 20s.” This was an economic boom fueled by low tax rates and high productivity, the likes of which would not be seen again until the 1980s. It was characterized by innovation and investment. Unfortunately, in the latter stages, the Federal Reserve forgot the lessons of 1921 and increased the money supply to “keep the price level stable” and prevent deflation in the face of the wave of innovation and productivity increases. This helped to usher in the Great Depression, along with numerous policy errors by the Hoover and Roosevelt administrations.

Economists like Keynes, Irving Fisher and Gustav Cassel were dumbfounded. They had expected deflation to flatten the U.S. economy like a pancake, increasing the real value of debts owed by debtor classes and discouraging consumers from spending in the expectation that prices would fall in the future. Not.

There was no economic stimulus. No TARP, no ZIRP, no QE. No wartime controls. No meddlesome regulation a la Theodore Roosevelt, Taft and Wilson. The Harding administration and the Fed left the economy alone to readjust and – mirabile dictu – it readjusted. In spite of the massive deflation or, much more likely, because of it.

The (Forgotten) Classical Theory of Flexible Wages and Prices

James Grant wants us to believe that this outcome was no accident. The book jacket for The Forgotten Depression bills it as “a free-market rejoinder to Bush’s and Obama’s Keynesian stimulus applied to the 2007-9 recession,” which “proposes ‘less is more’ with respect to federal intervention.”

His argument is almost entirely empirical and very heavily oriented to the 1920-1921 depression. That is deliberate; he cites the 14 previous cyclical contractions but focuses on this one for obvious reasons. It was the last time that free markets were given the opportunity to cure a depression; both Herbert Hoover and Franklin Roosevelt supervised heavy, continual interference with markets from 1929 through 1941. And we have much better data on the 1920-21 episode than on, say, the 1873 depression.

Readers may wonder, though, whether there is underlying logical support for the result achieved by the deflation of 1921. Can the chorus of economists advocating stimulative policy today really be wrong?

Prior to 1936, the policy chorus was even louder. Amazing as it now seems, it advocated the stance taken by Harding et al. Classical economists propounded the theory of flexible wages and prices as an antidote to recession and depression. And, without stating it in rigorous fashion, that is the theory that Grant is following in his book.

Using the language of modern macroeconomics, the problem posed by cyclical downturns is unemployment due to a sudden decline in aggregate (effective) demand for goods and services. The decline in aggregate demand causes declines in demand for all or most goods; the decline in demand for goods causes declines in demand for all or most types of labor. As a first approximation, this produces surpluses of goods and labor. The surplus of labor is defined as unemployment.

The classical economists pointed out that, while the shock of a decline in aggregate demand could cause temporary dislocations such as unsold goods and unemployment, this was not a permanent condition. Flexible wages and prices could, like the shock absorbers on an automobile, absorb the shock of the decline in aggregate demand and return the economy to stability.

Any surplus creates an incentive for sellers to lower price and for buyers to increase purchases. As long as the surplus persists, the downward pressure on price will remain. And as the price (or wage) falls toward the new market-clearing point, the amount produced and sold (or the amount of labor offered and purchased) will increase once more.

Flexibility of wages and prices is really a two-part process. Part one works to clear the surpluses created by the initial decline in aggregate demand. In labor markets, this serves to preserve the incomes of workers who remain willing to work at the now-lower market wage. If they were unemployed, they would earn no wage; working at a lower wage gives them a lower nominal income than before. That is only part of the process, though. Prices in product markets are decreasing alongside the declining wages. In principle, fully flexible prices and wages would mean that even though workers’ nominal incomes decline, their real incomes are restored by the decline of all prices in equal proportion. If your wage falls by (say) 20%, a 20% decline in all prices should leave you able to purchase the same quantities of goods and services as before.
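
In symbols (a standard textbook formulation rather than Grant’s own), with nominal wage $w$ and price level $P$:

$$
\text{real wage} = \frac{w}{P}, \qquad \frac{0.8\,w}{0.8\,P} = \frac{w}{P}
$$

Equiproportional declines in wages and prices thus leave purchasing power unchanged.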

The emphasis on real rather than nominal magnitudes gives rise to the name of the second part of this process: the real-balance effect. It was named by the classical economist A. C. Pigou and refined by the later macroeconomist Don Patinkin.

When John Maynard Keynes wrote his General Theory of Employment, Interest and Money in 1936, he attacked classical economists by attacking the concepts of flexible wages and prices. First, he attacked their feasibility. Then, he attacked their desirability.

Flexible wages were not observed in reality because workers would not consent to downward revisions in wages, Keynes maintained. Did Keynes really believe that workers preferred to be unemployed and earn zero wages at a relatively high market wage rather than work and earn a lower market wage? Well, he said that workers oriented their thinking toward the nominal wage rather than the real wage and thus did not perceive that they had regained their former position with lower prices and a lower wage. (This became known as the fallacy of money illusion.) His followers spent decades trying to explain what he really meant, revising his words or simply ignoring them. (It should be noted, however, that Keynes was English, and trade unions exerted vastly greater influence on prevailing wage levels in England than they did in the U.S. for at least the first three-quarters of the 20th century. This may well have biased Keynes’ thinking.)

Keynes also decried the assumption of flexible prices for various reasons, some of which continue to sway economists today. The upshot is that macroeconomics has lost touch with the principles of price flexibility. Even though Keynes’ criticisms of the classical economists and the price system were discredited in strict theory, they were accepted de facto by macroeconomists because it was felt that flexible wages and prices would take too long to work, while macroeconomic policy could be formulated and deployed relatively quickly. Why make people undergo the misery of unemployment and insolvency when we can relieve their anxiety quickly and compassionately by passing laws drafted by macroeconomists on the President’s Council of Economic Advisors?

Let’s Compare

Thanks to James Grant, we now have an empirical basis for comparison between policy regimes. In 1920-1921, the old-fashioned classical medicine of deflation, flexible wages and prices and the real-balance effect took 18 months to turn a panic, recession and depression into a rip-roaring recovery that lasted 8 years.

Fast forward to December, 2007. The recession has begun. Unfortunately, it is not detected until September, 2008, when the financial panic begins. The stimulus package is not passed until February, 2009 – barely in time for the official end of the recession in June, 2009. Whoops – unemployment is still around 10% and remains stubbornly high until 2013. Moreover, it declines only because Americans have left the labor force in numbers not seen for over thirty years. The recovery, such as it is, is so anemic as to hardly merit the name – and it is now over 7 years since the onset of recession in December, 2007.

It is no good complaining that the stimulus package was not large enough, because we are comparing it with a case in which the authorities did nothing – or rather, did nothing stimulative, since their interest-rate increase should properly be termed contractionary. That is exactly what macroeconomists call it when referring to Federal Reserve policy in the 1930s, during the Great Depression, when they blame Fed policy and high interest rates for prolonging the Depression. Shouldn’t they instead be blaming the continual series of government interventions by the Fed and the federal government under Herbert Hoover and Franklin Roosevelt? And we didn’t even count the stimulus package introduced by the Bush administration, which came and went without making a ripple in terms of economic effect.

Economists Are Lousy Accident Investigators 

For nearly a century, the economics profession has accused free markets of possessing faulty shock absorbers; namely, inflexible wages and prices. When it comes to economic history, economists are obviously lousy accident investigators. They have never developed a theory of business cycles but have instead assumed a decline in aggregate demand without asking why it occurred. In figurative terms, they have assumed the cause of the “accident” (the recession or the depression). Then they have made a further assumption that the failure of the “vehicle’s” (the economy’s) automatic guidance system to prevent (or mitigate) the accident was due to “faulty shock absorbers” (inflexible wages and prices).

Would an accident investigator fail to visit the scene of the accident? The economics profession has largely failed to investigate the flexibility of wages and prices even in the Great Depression, let alone the thirty-odd other economic contractions chronicled by the National Bureau of Economic Research. The work of researchers like Murray Rothbard, Richard Vedder and Lowell Gallaway, Benjamin Anderson and Harris Warren overturns the mainstream presumption of free-market failure.

The biggest empirical failure of all is one ignored by Grant; namely, the failure to demonstrate policy success. If macroeconomic policy worked as advertised, then we would not have recessions in the first place and could reliably end them once they began. In fact, we still have cyclical downturns, we cannot use policy to end them, and macroeconomists can point to no policy successes to bolster their case.

Now we have this case study by James Grant that provides meticulous proof that deflation – full-blooded, deep-throated, hell-for-leather deflation in no uncertain terms – put a prompt, efficacious end to what must be called an economic depression.

Combine this with the 40-year-long research project conducted on Keynesian theory, culminating in its final discrediting by the early 1980s. Throw in the existence of the Austrian Business Cycle Theory, which combines the monetary theory of Ludwig von Mises and interest-rate theory of Knut Wicksell with the dynamic synthesis developed by F. A. Hayek. This theory cannot be called complete because it lacks a fully worked out capital theory to complete the integration of monetary and value theory. (We might think of this as the economic version of the Unified Field Theory in the natural sciences.) But an incomplete valid theory beats a discredited theory every time.

In other words, free-market economics has an explanation for why the accident repeatedly happens and why its effects can be mitigated by the economy’s automatic guidance mechanism without the need for policy action by government. It also explains why the policy actions are ineffective at both remedial and preventive action in the field of accidents.

James Grant’s book will take its place in the pantheon of economic history as the outstanding case study to date of a self-curing depression.

DRI-259 for week of 2-2-14: Kristallnacht for the Rich: Not Far-Fetched

An Access Advertising EconBrief:

Kristallnacht for the Rich: Not Far-Fetched

Periodically, the intellectual class aptly termed “the commentariat” by The Wall Street Journal works itself into frenzy. The issue may be a world event, a policy proposal or something somebody wrote or said. The latest cause célèbre is a submission to the Journal’s letters column by a partner in one of the nation’s leading venture-capital firms. The letter ignited a firestorm; the editors subsequently declared that Tom Perkins of Kleiner Perkins Caufield & Byers “may have written the most-read letter to the editor in the history of The Wall Street Journal.”

What could have inspired the famously reserved editors to break into temporal superlatives? The letter’s rhetoric was both penetrating and provocative. It called up an episode in the 20th century’s most infamous political regime. And the response it triggered was rabid.

“Progressive Kristallnacht Coming?”

“…I would call attention to the parallels of fascist Nazi Germany to its war on its ‘one percent,’ namely its Jews, to the progressive war on the American one percent, namely ‘the rich.’” With this icebreaker, Tom Perkins made himself a rhetorical target for most of the nation’s commentators. Even those who agreed with his thesis felt that Perkins had no business using the Nazis in an analogy. The Wall Street Journal editors said “the comparison was unfortunate, albeit provocative.” They recommended reserving Nazi comparisons for tyrants like Stalin.

On the political Left, the reaction was less measured. The Anti-Defamation League accused Perkins of insensitivity. Bloomberg View characterized his letter as an “unhinged Nazi rant.”

No, this bore no traces of an irrational diatribe. Perkins had a thesis in mind when he drew an analogy between Nazism and Progressivism. “From the Occupy movement to the demonization of the rich, I perceive a rising tide of hatred of the successful one percent.” Perkins cited the abuse heaped on workers traveling Google buses from the cities to the California peninsula. Their high wages allowed them to bid up real-estate prices, thereby earning the resentment of the Left. Perkins’ ex-wife Danielle Steel placed herself in the crosshairs of the class warriors by amassing a fortune writing popular novels. Millions of dollars in charitable contributions did not spare her from criticism for belonging to the one percent.

“This is a very dangerous drift in our American thinking,” Perkins concluded. “Kristallnacht was unthinkable in 1930; is its descendant ‘progressive’ radicalism unthinkable now?” Perkins’ point is unmistakable; his letter is a cautionary warning, not a comparison of two actual societies. History doesn’t repeat itself, but it does rhyme. Kristallnacht and Nazi Germany belong to history. If we don’t mend our ways, something similar and unpleasant may lie in our future.

A Short Refresher Course in Early Nazi Persecution of the Jews

Since the current debate revolves around the analogy between Nazism and Progressivism, we should refresh our memories about Kristallnacht. The name itself translates loosely into “Night of Broken Glass.” It refers to the shards of broken window glass littering the streets of cities in Germany and Austria on the night and morning of November 9-10, 1938. The windows belonged to houses, hospitals, schools and businesses owned and operated by Jews. These buildings were first looted, then smashed by elements of the German paramilitary SA (the Brownshirts) and SS (security police), led by the Gauleiters (regional leaders).

In 1933, Adolf Hitler was elevated to the German chancellorship after the Nazi Party won a plurality of votes in the national election. Almost immediately, laws placing Jews at a disadvantage were passed and enforced throughout Germany. The laws were the official expression of the philosophy of German anti-Semitism that dated back to the 1870s, the time when German socialism began evolving from the authoritarian roots of Otto von Bismarck’s rule. Nazi officialdom awaited a pretext on which to crack down on Germany’s sizable Jewish population.

The pretext was provided by the assassination of German diplomat Ernst vom Rath on Nov. 7, 1938 by a 17-year-old Polish-Jewish youth named Herschel Grynszpan. The boy was apparently upset by German policies expelling his parents from the country. Ironically, vom Rath’s sentiments were anti-Nazi and opposed to the persecution of Jews. Vom Rath’s death on Nov. 9 was the signal for the release of Nazi paramilitary forces on a reign of terror and abduction against German and Austrian Jews. Police were instructed to stand by and not interfere with the SA and SS as long as only Jews were targeted.

According to official reports, 91 deaths were attributed directly to Kristallnacht. Some 30,000 Jews were spirited off to jails and concentration camps, where they were treated brutally before finally winning release some three months later. In the interim, though, some 2,000-2,500 Jews died in the camps. Over 7,000 Jewish-owned or operated businesses were damaged. Over 1,000 synagogues in Germany and Austria were burned.

The purpose of Kristallnacht was not only wanton destruction. The assets and property of Jews were seized to enhance the wealth of the paramilitary groups.

Today we regard Kristallnacht as the opening round of Hitler’s Final Solution – the policy that produced the Holocaust. This strategic primacy is doubtless why Tom Perkins invoked it. Yet this furious controversy will just fade away, merely another media preoccupation du jour, unless we retain its enduring significance. Obviously, Tom Perkins was not saying that the Progressive Left’s treatment of the rich is now comparable to Nazi Germany’s treatment of the Jews. The Left is not interning the rich in concentration camps. It is not seizing the assets of the rich outright – at least not on a wholesale basis, anyway. It is not reducing the homes and businesses of the rich to rubble – not here in the U.S., anyway. It is not passing laws to discriminate systematically against the rich – at least, not against the rich as a class.

Tom Perkins was issuing a cautionary warning against the demonization of wealth and success. This is a political strategy closely associated with the philosophy of anti-Semitism; that is why his invocation of Kristallnacht is apropos.

The Rise of Modern Anti-Semitism

Despite the politically correct horror expressed by the Anti-Defamation League toward Tom Perkins’ letter, reaction to it among Jews has not been uniformly hostile. Ruth Wisse, professor of Yiddish and comparative literature at Harvard University, wrote an op-ed for The Wall Street Journal (02/04/2014) defending Perkins.

Wisse traced the modern philosophy of anti-Semitism to the philosopher Wilhelm Marr, whose heyday was the 1870s. Marr “charged Jews with using their skills ‘to conquer Germany from within.’” He was careful to distinguish his philosophy of anti-Semitism from prior philosophies of anti-Judaism. Jews “were taking unfair advantage of the emerging democratic order in Europe with its promise of individual rights and open competition in order to dominate the fields of finance, culture and social ideas.”

Wisse declared that “anti-Semitism channel[ed] grievance and blame against highly visible beneficiaries of freedom and opportunity.” “Are you unemployed? The Jews have your jobs. Is your family mired in poverty? The Rothschilds have your money. Do you feel more secure in the city than you did on the land? The Jews are trapping you in the factories and charging you exorbitant rents.”

The Jews were undermining Christianity. They were subtly perverting the legal system. They were overrunning the arts and monopolizing the press. They spread Communism, yet practiced rapacious capitalism!

This modern German philosophy of anti-Semitism long predated Nazism. It accompanied the growth of the German welfare state and German socialism. Nazism’s authoritarian political roots took hold under Otto von Bismarck’s conservative socialism, and so did its anti-Semitic cultural roots. The anti-Semitic conspiracy theories ascribing Germany’s every ill to the Jews were not the invention of Hitler, but of Wilhelm Marr more than half a century before Hitler took power.

The Link Between the Nazis and the Progressives: the War on Success

As Wisse notes, the key difference between modern anti-Semitism and its ancestor – what Wilhelm Marr called “anti-Judaism” – is that the latter abhorred the religion of the Jews, while the former resented the disproportionate success enjoyed by Jews far more than their religious observances. The modern anti-Semitic conspiracy theorist pointed darkly to the predominance of Jews in high finance, in the press, in the arts and in the movie studios, and asked rhetorically: how do we account for the coincidence of our poverty and their wealth, if not through the medium of conspiracy and malefaction? The case against the Jews is portrayed as prima facie and morphs into per se through repetition.

Today, the Progressive Left operates in exactly the same way. “Corporation” is a pejorative. “Wall Street” is the antonym of “Main Street.” The very presence of wealth and high income is itself damning; “inequality” is the reigning evil and is tacitly assigned a pecuniary connotation. Of course, this tactic runs counter to the longtime left-wing insistence that capitalism is inherently evil because it forces us to adopt a materialistic perspective. Indeed, environmentalism embraces anti-materialism to this day while continuing to bunk in with its progressive bedfellows.

We must interrupt with an ironic correction. Economists – according to conventional thinking the high priests of materialism – know that it is human happiness and not pecuniary gain that is the ultimate desideratum. Yet the constant carping about “inequality” looks no further than money income in its supposed solicitude for our well-being. Thus, the “income-inequality” progressives – seemingly obsessed with economics and materialism – are really anti-economic. Economists, supposedly green-eyeshade devotees of numbers and models, are the ones focusing on human happiness rather than ideological goals.

German socialism metamorphosed into fascism. American Progressivism is morphing from liberalism to socialism and – ever more clearly – homing in on its own version of fascism. Both movements employed the technique of demonization and conspiracy to transform the mutual benefit of free, voluntary exchange into the zero-sum result of plunder and theft. How else could productive effort be made to seem fruitless? How else could success be made over into failure? This is the cautionary warning Perkins was sounding.

The Great Exemplar

The great Cassandra of political economy was F.A. Hayek. Early in 1929, he predicted that Federal Reserve policies earlier in the decade would soon bear poisoned fruit in the form of a reduction in economic activity. (His mentor, Ludwig von Mises, was even more emphatic, foreseeing “a great crash” and refusing a prestigious financial post for fear of association with the coming disaster.) He predicted that the Soviet economy would fail owing to lack of a functional price system; in particular, missing capital markets and interest rates. He predicted that Keynesian policies begun in the 1950s would culminate in accelerating inflation. All these came true, some of them within months and some after a lapse of years.

Hayek’s greatest prediction was really a cautionary warning, in the same vein as Tom Perkins’ letter but much more detailed. The 1944 book The Road to Serfdom made the case that centralized economic planning could operate only at the cost of the free institutions that distinguish democratic capitalism. Socialism was really another form of totalitarianism.

The reaction to Hayek’s book was much the same as the reaction to Perkins’ letter. Many commentators who should have known better accused both men of fascism, and of describing a current state of affairs when each was really trying to warn against a dystopia.

The flak Hayek took was especially ironic because his book actually served to prevent the outcome he feared. But instead of winning him the acclaim of millions, this earned him the scorn of intellectuals. The intelligentsia insisted that Hayek had predicted the inevitable descent into totalitarianism following the imposition of a welfare state. When welfare states in Great Britain, Scandinavia and South America failed to produce barbed wire, concentration camps and German Shepherd dogs, the Left advertised this as proof of Hayek’s “exaggerations” and “paranoia.”

In actual fact, Great Britain underwent many of the changes Hayek had feared and warned against. The Labour government’s notorious Control of Engagements Order, for instance, was an attempt to centrally control the British labor market – to specify an individual’s work and wage rather than allowing free choice in an impersonal market to do the job. The attempt failed just as dismally as Hayek and other free-market economists had foreseen it would. In the 1980s, it was Hayek’s arguments, wielded by Prime Minister Margaret Thatcher, that paved the way for the rolling back of British socialism and the taming of inflation. It’s bizarre to charge the prophet of doom with inaccuracy when his prophecy is what averted the doom, but that’s what the Left did to Hayek.

Now they are working the same familiar con on Tom Perkins. They begin by misconstruing the nature of his argument. Later, if his warnings succeed, they will use that very success against him by claiming that his “predictions” were false.

Enriching Perkins’ Argument

This is not to say that Perkins’ argument is perfect. He has instinctively fingered the source of the threat to our liberties. But while Perkins himself may be rich, his argument isn’t; it is threadbare and skeletal. It could use some enriching.

The war on the wealthy has been raging for decades. The opening battle is lost to history, but we can recall some early skirmishes and some epic brawls prior to Perkins.

In Europe, the war on wealth used anti-Semitism as its spearhead. In the U.S., however, the popularity of Progressives in academia and government made antitrust policy a more convenient wedge for their populist initiatives against success. Antitrust policy was a crown jewel of the Progressive movement in the early 1900s; Presidents Theodore Roosevelt and William Howard Taft cultivated reputations as “trust busters.”

The history of antitrust policy exhibits two pronounced tendencies: the use of the laws to restrict competition for the benefit of incumbent competitors, and the use of the laws by the government to punish successful companies for various political reasons. The sobering research of Dominick Armentano shows that antitrust policy has consistently harmed consumer welfare and economic efficiency. The early antitrust prosecution of Standard Oil, for example, broke up a company that had consistently increased its output and lowered prices to consumers over long time spans. The Orwellian rhetoric accompanying the judgment against ALCOA in the 1940s reinforces the notion that punishment, not efficiency or consumer welfare, lay behind the decision. The famous prosecutions of IBM and AT&T in the 1970s and ’80s each spawned book-length investigations showing the perversity of the government’s claims. More recently, Microsoft became the latest successful firm to reap the government’s wrath for having the temerity to revolutionize its industry and reward consumers throughout the world.

The rise of the regulatory state in the 1970s gave agencies and federal prosecutors nearly unlimited, unsupervised power to work their will on the public. Progressive ideology combined with self-interest to create a powerful engine for the demonization of success. Prosecutors could not only pursue their personal agendas but also climb the career ladder by making high-profile cases against celebrities. The prosecution of Michael Milken of Drexel Burnham Lambert is a classic case of persecution in the guise of prosecution. Milken virtually created the junk-bond market, thereby originating an asset class that has enhanced the wealth of investors by untold billions of dollars. For his pains, Milken was sent to jail.

Martha Stewart is a high-profile celebrity who was, in effect, convicted of the crime of being famous. She was charged with and convicted of lying to federal investigators in a case in which the only underlying crime could have been insider trading. But she was the trader, and she was not charged with insider trading. The utter triviality of the offense and the absence of any damage to consumers or society at large make it clear that she was targeted because of her celebrity; that is, her success.

Today, the impetus for pursuing successful individuals and companies comes primarily from the federal level. Harvey Silverglate (author of Three Felonies a Day) has shown that virtually nobody is safe from the depredations of prosecutors out to advance their careers by racking up convictions at the expense of justice.

Government is the institution charged with making and enforcing law, yet government has now become the chief threat to law. At the state and local level, governments hand out special favors and tax benefits to favored recipients – typically those unable to attain success through their own efforts – while making up the revenue from the earned income of taxpayers at large. At the federal level, Congress fails in its fundamental duty and ignores the law by refusing to pass budgets. The President appoints czars to make regulatory law, while choosing at his discretion to obey the provisions of some laws and disregard others; in this, he fails his fundamental executive duty to execute the laws faithfully. Judges treat the Constitution as a backdrop for the expression of their own views rather than as a subject for textual fidelity. All parties interpret the Constitution to suit their own convenience. The overarching irony is that the least successful institution in America has united in a common purpose against the successful achievers in society.

The most recent Presidential campaign was conducted largely as a jihad against the rich and successful in business. Mitt Romney was forced to defend himself against the charge of succeeding too well in his chosen profession, as well as the corollary accusation that his success came at the expense of the companies and workers in which his private-equity firm invested. Either his success was undeserved or it was really failure. There was no escape from the double bind against which he struggled.

It is clear, then, that the “progressivism” decried by Tom Perkins dates back over a century and that it has waged a war on wealth and success from the outset. The tide of battle has flowed – during the rampage of the Bull Moose, the Depression and New Deal, and the recent Great Recession and financial crisis – and ebbed – under Eisenhower and Reagan. Now the forces of freedom have their backs to the sea.

It is this much richer context that forms the backdrop for Tom Perkins’ warning. Viewed in this panoramic light, his letter looks less like the crazed rant of an isolated one-percenter and more like the battle cry of a counter-revolution.