DRI-183 for week of 3-1-15: George Orwell, Call Your Office – The FCC Curtails Internet Freedom In Order to Save It

An Access Advertising EconBrief:

George Orwell, Call Your Office – The FCC Curtails Internet Freedom In Order to Save It

February 26, 2015 is a date that will live in regulatory infamy. That assertion is subject to revision by the courts, as is nearly everything undertaken these days by the Obama administration. As this is written, the Supreme Court is hearing yet another challenge to “ObamaCare,” the Affordable Care Act. President Obama’s initiative to achieve a single-payer system of national health care in the U.S. is rife with Orwellian irony, since it cannot help but make health care unaffordable for everybody by further insulating consumers from any exposure to its price. Similarly, the latest administration initiative is the February 26 approval by the Federal Communications Commission (FCC) of the so-called “Net Neutrality” doctrine in regulatory form. Commission Chairman Tom Wheeler’s summary of his regulatory proposal – consisting of 332 pages that were withheld from the public – has been widely characterized as a proposal to “regulate the Internet like a public utility.”

This episode is shot through with a totalitarian irony that only George Orwell could fully savor. The FCC is ostensibly an independent regulatory body, free of political control. In fact, Chairman Wheeler long resisted the “net neutrality” doctrine (hereinafter, shortened to “NN” for convenience). The FCC’s decision was a response to pressure from President Obama, which made a mockery of the agency’s independence. The alleged necessity for NN arises from the “local monopoly” over “high-speed” broadband exerted by Internet service providers (again, hereinafter abbreviated as “ISPs”) – but a “public utility” was, and is, by definition a regulated monopoly. Since the alleged local monopoly held by ISPs is itself fictitious, the FCC is in fact proposing to replace competition with monopoly.

To be sure, the particulars of Chairman Wheeler’s proposal are still open to conjecture. And the enterprise is wildly illogical on its face. The idea of “regulating the Internet like a public utility” treats those two things as equivalent entities. A public utility is a business firm. But the Internet is not a single business firm; indeed, it is not a single entity at all in the concrete sense. In the business sense, “the Internet” is shorthand for an infinite number of existing and potential business firms serving the world’s consumers in countless ways. The clause “regulate the Internet like a public utility” is quite literally meaningless – laughably indefinite, overweening in its hubris, frightening in its totalitarian implications.

It falls to an economist, former FCC Chief Economist Thomas Hazlett of Clemson University, to sculpt this philosophy into its practical form. He defines NN as “a set of rules… regulating the business model of your local ISP.” In short, it is a political proposal that uses economic language to prettify and conceal its real intentions. NN websites are emblazoned with rhetoric about “protecting the Open Internet” – but the Internet has thrived on openness for over 20 years under the benign neglect of government regulators. This proposal would end that era.

There is no way on God’s green earth to equate a regulated Internet with an open Internet; the very word “regulated” is the antithesis of “open.” NN proponents paint scary scenarios about ISPs “blocking or interfering with traffic on the Internet,” but their language is always conditional and hypothetical. They are posing scenarios that might happen in the future, not ones that threaten us today. Why? Because competition and innovation protected consumers up to now and continue to do so. NN will make its proponents’ scary predictions more likely, not less, because it will restrict competition. That is what regulation does in general; that is what public-utility regulation specifically does. For over a century, public-utility regulation has installed a single firm as a regulated monopoly in a particular market and has forcefully suppressed all attempts to compete with that firm.

Of course, that is not what President Obama, Chairman Wheeler and NN proponents want us to envision when we hear the words “regulate the Internet like a public utility.” They want us to envision a lovely, healthy flock of sheep grazing peacefully in a beautiful meadow, supervised by a benevolent, powerful Shepherd with a herd of well-trained, affectionate shepherd dogs at his command. Soothing music is piped down from heaven and love and tranquility reign. At the far edges of the meadow, there is a forest. Hungry wolves dwell within, eyeing the sheep covetously. But they dare not approach, for they fear the power of the Shepherd and his dogs.

In other words, the Obama administration is trying to manipulate the emotions of the electorate by creating an imaginary vision of public-utility regulation. The reality of public-utility regulation was, and is, entirely different.

The Natural-Monopoly Theory of Public-Utility Regulation

The history of public-utility regulation is almost, but not quite, co-synchronous with that of government regulation of business in the United States. Regulation began at the state level with Munn v. Illinois, which paved the way for state regulation of the grain business in the 1870s. The Interstate Commerce Commission’s inaugural voyage with railroad regulation followed in the late 1880s. With the commercial introduction of electric lighting and the telephone came business firms tailored to those ends. And in their wake came the theory of natural monopoly.

Both electric power and telephones came to be known as “natural monopoly” industries; that is, industries in which both economic efficiency and commercial viability dictated that one single firm serve the entire market. This was the outgrowth of economies of scale in production, owing to decreasing long-run average cost of production. This decidedly unusual state of affairs is a technological anomaly. Engineers recognize it in conjunction with the “two-thirds rule.” There are certain cases in which cost increases as the two-thirds power of output, which implies that average cost decreases steadily as output rises. (The throughput of pipes and cables and the capacity of cargo holds are examples.) In turn, this implies that the firm that grows the fastest will undersell all others while still covering all its costs. The further implication is that consumers will receive the most output at the lowest price if one monopoly firm serves everybody – if, and only if, the firm’s price can be constrained equal to its long-run average cost at the rate of output necessary to meet market demand. An unconstrained monopoly would produce less than this optimal rate of output and charge a higher price, in order to maximize its profit. But the theoretical outcome under regulated monopoly equates price with long-run average cost, which provides the utility with a rate of return equal to what it could get in the best alternative use for its financial capital, given its business risk.
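The cost arithmetic behind the two-thirds rule can be sketched numerically. The cost function and constant below are hypothetical, invented only to illustrate the claim that total cost rising as the two-thirds power of output implies ever-falling average cost:

```python
# Hypothetical illustration of the "two-thirds rule": if total cost is
# C(q) = k * q**(2/3), total cost rises with output while average cost
# C(q)/q = k * q**(-1/3) falls steadily -- the natural-monopoly signature.

def total_cost(q, k=100.0):
    return k * q ** (2.0 / 3.0)

def average_cost(q, k=100.0):
    return total_cost(q, k) / q  # equals k * q**(-1/3)

for q in (1_000, 8_000, 27_000):
    print(f"output {q:>6}: average cost per unit = {average_cost(q):.2f}")
# output   1000: average cost per unit = 10.00
# output   8000: average cost per unit = 5.00
# output  27000: average cost per unit = 3.33
```

The larger producer always undersells the smaller one while covering its costs, which is the economic logic behind serving the whole market with one firm.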

In the U.S. and Canada, this regulated outcome is sought via periodic rate hearings staged by a public-utility commission (PUC for short). The utility is privately owned by shareholders. In Europe, utilities are typically government-owned; their prices are (in principle) set equal to long-run marginal cost, which lies below average cost and thus produces a loss in accounting terms. Taxpayers subsidize this loss – these subsidies are the alternative to the profits earned by regulated public-utility firms in the U.S. and Canada.
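The accounting loss under marginal-cost pricing can be illustrated with the same kind of hypothetical two-thirds-rule cost function, C(q) = k·q^(2/3); the constant k and the output level below are invented. With that cost function, marginal cost is always two-thirds of average cost, so a price set at marginal cost necessarily falls short of covering total cost:

```python
# Why marginal-cost pricing loses money under decreasing average cost.
# For the hypothetical C(q) = k * q**(2/3):
#   average cost   AC(q) = k * q**(-1/3)
#   marginal cost  MC(q) = (2/3) * k * q**(-1/3)  -- always below AC

def average_cost(q, k=100.0):
    return k * q ** (-1.0 / 3.0)

def marginal_cost(q, k=100.0):
    return (2.0 / 3.0) * k * q ** (-1.0 / 3.0)

q = 8_000
price = marginal_cost(q)              # European-style pricing rule
loss = (average_cost(q) - price) * q  # shortfall covered by taxpayers
print(f"price = {price:.2f}, average cost = {average_cost(q):.2f}, loss = {loss:.0f}")
# price = 3.33, average cost = 5.00, loss = 13333
```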

These regulatory schemes represent the epitome of what the Nobel laureate Ronald Coase called “blackboard economics” – economists micro-managing reality as if they possessed all the information and control over reality that they do when drawing diagrams on a classroom blackboard. In practice, things did not work out as neatly as the foregoing summary would lead us to believe. Not even remotely close, in fact.

The Myriad Slips Twixt Theoretical Cup and Regulatory Lip

What went wrong with this theoretical set-up, seemingly so pat when viewed in a textbook or on a classroom blackboard? Just about everything, to some degree or other. Today, we assume that the institution of regulated monopoly came in response to market monopolies achieved and abuses perpetrated by electric and telephone companies. What mostly happened, though, was different. There were multiple providers of electricity and telephone service in the early days. In exchange for submitting to rate-of-return regulation, though, one firm was extended a grant of monopoly and other firms were excluded. Only in very rare cases did competition exist for local electric service – and curiously, this rate competition actually produced lower electric rates than did public-utility regulation.

This result was not the anomaly it seemed, since the supposed economies of scale were present only in the distribution of electric power, not in power generation. So the cost superiority of a single firm producing for the whole market turned out to be not the slam-dunk that was advertised. That was just one of many cracks in the façade of public-utility regulation. Over the course of the 20th century, the evolution of public-utility regulation in telecommunications proved to be paradigmatic for the failures and inherent shortcomings of the form.

Throughout the country, the Bell System was handed a monopoly on the provision of local service. Its local service companies – the analogues to today’s ISPs – gradually acquired reputations as the heaviest political hitters in state-government politics. The high rates paid by consumers bought lobbyists and legislators by the gross, and they obediently safeguarded the monopoly franchise and kept the public-utility commissions staffed with tame members. That money also paid the bill for a steady diet of publicity designed to mislead the public about the essence of public-utility regulation.

We were assured by the press that the PUC was a vigilant watchdog whose noble motives kept the greedy utility executives from turning the rate screws on a helpless public. At each rate hearing, self-styled consumer advocacy groups paraded their compassion for consumers by demanding low rates for the poor and high rates on business – as if it were really possible for some non-human entity called “business” to bear rates in the true sense, any more than it could bear taxes. PUCs ostentatiously required the utility to enumerate its costs and pretended to laboriously calculate “just and reasonable” rates – as if a commission possessed juridical powers denied to the world’s greatest philosophers and moralists.

Behind the scenes, after the press had filed their poker-faced stories on the latest hearings, increasingly jaded and cynical reporters, editors and industry consultants rolled their eyes and snorted at the absurdity of it all. Utilities quickly learned that they wouldn’t be allowed to earn big “profits,” because this would be cosmetically bad for the PUC, the consumer advocates, the politicians and just about everybody involved in this process. So executives, middle-level managers and employees figured out that they had to make their money differently than they would if working for an ordinary business in the private sector. Instead of working efficiently and productively and striving to maximize profit, they would strive to maximize cost instead. Why? Because they could make money from higher costs in the form of higher salaries, higher wages, larger staffs and bigger budgets. What about the shareholders, who would ordinarily be shafted by this sort of behavior? Shareholders couldn’t lose because the PUC was committed to giving them a rate of return sufficient to attract financial capital to the industry. (And the shareholders couldn’t gain from extra diligence and work effort put forward by the company because of the limitation on profits.) That is, the Commission would simply ratchet up rates commensurate with any increase in costs – accompanied by whatever throat-clearing, phony displays of concern for the poor and cost-shifting shell games were necessary to make the numbers work. In the final analysis, the name of the game was inefficiency and consumers always paid for it – because there was nobody else who could pay.

So much for the vaunted institution of public-utility regulation in the public interest. Over fifty years ago, a famous left-wing economist named Gardiner Means proposed subjecting every corporation in the U.S. to rate-of-return regulation by the federal government. This held the record for most preposterous policy program advanced by a mainstream commentator – until Thomas Wheeler announced that henceforth the Internet would be regulated as if it were a public utility. Now every American will get a taste of life as Ivan Denisovich, consigned to the Gulag Archipelago of regulatory bureaucracy.

Of particular significance to us in today’s climate is the effect of this regime on innovation. Outside of totalitarian economies such as the Soviet Union and Communist China, public-utility regulation is the most stultifying climate for innovation ever devised by man. The idea behind innovation is to find ways to produce more goods using the same amount of inputs or (equivalently) the same amount of goods using fewer inputs. Doing this lowers costs – which increases profits. But why go to the trouble if you can’t enjoy the increase in profits? Of course, utilities were willing to spend money on research, provided they could get it in the rate base and earn a rate of return on the investment. But they had no incentive to actually implement any cost-saving innovations. The Bell System was legendary for its unwillingness to lower its costs; the economic literature is replete with jaw-dropping examples of local Bell companies lagging years and even decades behind the private sector in technology adoption – even spurning advances developed in Bell’s own research labs!

Any reader who suspects this writer of exaggeration is invited to peruse the literature of industrial organization and regulation. One nagging question should be dealt with forthwith. If the demerits of public-utility regulation were well recognized by insiders, how were they so well concealed from the public? The answer is not mysterious. All of those insiders had a vested interest in not blowing the whistle on the process because they were making money from ongoing public-utility regulation. Commission employees, consultants, expert witnesses, public-interest lawyers and consumer advocates all testified at rate hearings or helped prepare or research testimony. They either worked full-time or traveled the country as contractors earning lucrative hourly pay. If any one of them was crazy enough to launch an exposé of the public-utility scam, he or she would be blackballed from the business while accomplishing nothing – the institutional inertia in favor of the system was so enormous that it would have taken mass revolt to effect change. So they just shrugged, took the money and grew more cynical by the year.

In retrospect, it seems miraculous that anything did change. In the 1960s, local Bell companies were undercharging for local service to consumers and compensating by soaking business and long-distance customers with high prices. The high long-distance rates eventually attracted the interest of would-be competitors. One government regulator grew so fed up with the inefficiency of the Bell system that he granted the competitive petition of a small company called MCI, which sought to compete only in the area of long-distance telecommunications. MCI was soon joined by other firms. The door to competition had been cracked slightly ajar.

In the 1980s, it was kicked wide open. A federal antitrust lawsuit against AT&T led to the breakup of the firm. At the time, the public was dubious about the idea that competition was possible in telecommunications. The 1990s soon showed that regulators were the only ones standing between the American public and a revolution unlike anything we had seen in a century. After vainly trying to protect the local Bells against competition, regulators finally succumbed to the inevitable – or rather, they were overrun by the competitive hordes. When the public got used to cell phones and the Internet, they ditched good old Ma Bell and land-line phones.

This, then, is public-utility regulation. The only reason we have smart phones and mobile Internet access today is that public-utility regulation in telecommunications was overrun by competition despite regulatory opposition in the 1990s. But public-utility regulation is the wonderful fate to which Barack Obama, Thomas Wheeler and the FCC propose to consign the Internet. What is the justification for their verdict?

The Case for Net Neutrality – Debunked

As we have seen, public-utility regulation was based on a premise that certain industries were “natural monopolies.” But nobody has suggested that the Internet is a natural monopoly – which makes sense, since it isn’t an industry. Nobody has suggested that all or even some of the industries that utilize the Internet are natural monopolies – which makes sense, since they aren’t. So why in God’s name should we subject them to public-utility regulation – especially since public-utility regulation didn’t even work well in the industries for which it was ideally suited? We shouldn’t.

The phrase “net neutrality” is designed to achieve an emotional effect through alliteration and a carefully calculated play on the word “neutral.” In this case, the word is intended to appeal to egalitarian sympathies among hearers. It’s only fair, we are urged to think, that ISPs, the “gatekeepers” of the Internet, be scrupulously fair or “neutral” in letting everybody in on the same terms. And, as with so many other issues in economics, the case for “fairness” becomes just so much sludge upon closer examination.

The use of the term “gatekeepers” suggests that God handed Moses a stone tablet for the operation of the Internet, on which ISPs were assigned the role of “gatekeepers.” Even as hyperbolic metaphor, this bears no relation to reality. Today, cable companies are ISPs. But they began life as monopoly-killers. In the early 1960s, Americans chose among just three VHF-TV networks, broadcast by ABC, NBC and CBS. Gradually, local UHF stations started to season the diet of content-starved viewers. When cable-TV came along, it was like manna from heaven to a public fed up with commercials and ravenous for sports and movies. But government regulators didn’t allow cable-TV to compete with VHF and UHF in the top 100 media markets of the U.S. for over two decades. As usual, regulators were zealously protecting government-created monopoly, restricting competition and harming consumers.

Eventually, cable companies succeeded in tunneling their way into most local markets. They did it by bribing local government literally and figuratively – the latter by splitting their profits via investment in pet political projects of local politicians as part of their contracts. In return, they were guaranteed various degrees of exclusivity. But this “monopoly” didn’t last because they eventually faced competition from telecommunication firms who wanted to get into their business and whose business the cable companies wanted to invade. And today, the old structural definitions of monopoly simply don’t apply to the interindustry forms of competition that prevail.

Take the Kansas City market. Originally, Time Warner had a monopoly franchise. But eventually a new cable company called Everest invaded the metro area across the state line in Johnson County, KS. Overland Park is contiguous with Kansas City, MO, and consumers were anxious to escape the toils of Time Warner. Eventually, Everest prevailed upon KC, MO to gain entry to the Missouri side. Now even the cable-TV market was competitive. Then Google selected Kansas City, KS as the venue for its new high-speed service. Soon KC, MO was included in that package, too – now there were three local ISPs! (Everest has morphed into two successive incarnations, one of which still serves the area.)

Although this is not typical, it does not exhaust the competitive alternatives. This is only the picture for fixed service. Americans are now turning to mobile forms of access to the Internet, such as smart phones. Smart watches are on the horizon. For mobile access, the ISP is a wireless company like AT&T, Verizon, Sprint or T-Mobile.

The NN websites stridently maintain that “most Americans have only a single ISP.” This is nonsense; a charitable interpretation would be that most of us have only a single cable-TV provider in our local market. But there is no necessary one-to-one correlation between “cable-TV provider” and “ISP.” Besides, the state of affairs today is ephemeral – different from what it was a few years ago and from what it will be a few years from now. It is only under public-utility regulation that technology gets stuck in one place, because under public-utility regulation there is no incentive to innovate.

More specifically, the FCC’s own data suggest that 80% of Americans have two or more ISPs offering 10Mbps downstream speeds, and 96% have two or more ISPs offering 6Mbps downstream and 1.5Mbps upstream speeds. (Until quite recently, the FCC’s own criterion for “high-speed” Internet was 4Mbps or more.) This simply does not comport with any reasonable structural concept of monopoly.

The current flap over “blocking and interfering with traffic on the Internet” is the residue of disputes between Netflix and ISPs over charges for transmission of the former’s streaming services. In general, there is movement toward higher charges for data transmission than for voice transmission. But the huge volumes of traffic generated by Netflix cause congestion, and the free-market method for handling congestion is a higher price, or the functional equivalent. That is what economists have recommended for dealing with road congestion during rush hours and congested demand for air-conditioning and heating services at peak times of day and during peak seasons. Redirecting demand to the off-peak is not a monopoly response; it is an efficient market response. Competitive bar and restaurant owners do it with their pricing methods; competitive movie theater owners also do it (or used to).
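The peak-load logic can be made concrete with a toy linear demand schedule. Every number below is invented for illustration; the point is only that a higher peak price rations scarce capacity by choice, while a lower off-peak price redirects traffic to slack hours:

```python
# Toy sketch of peak-load pricing as a congestion remedy (all numbers invented).
CAPACITY = 100  # hypothetical network capacity per period

def demand(price, base):
    """Invented linear demand: each $1 of price deters 10 units of traffic."""
    return max(base - 10 * price, 0)

peak_base, offpeak_base = 180, 80

# One flat price for all hours: peak demand overwhelms capacity (congestion).
print("flat $4 price, peak load:   ", demand(4, peak_base))   # 140 > CAPACITY

# Differentiated prices: peak price rises until demand just fits capacity,
# while a lower off-peak price fills otherwise idle hours.
print("$8 peak price, peak load:   ", demand(8, peak_base))   # 100 == CAPACITY
print("$2 off-peak price, load:    ", demand(2, offpeak_base))  # 60
```

This is the same reasoning economists apply to rush-hour road tolls and time-of-day electricity rates, not a symptom of monopoly.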

Similar logic applies to other forms of hypothetically objectionable behavior by ISPs. The prioritization of traffic, creation of “fast” and “slow” lanes, blocking of content – these and other behaviors are neither inherently good nor bad. They are subject to the constraints of competition. If they are beneficial on net balance, they will be vindicated by the market. That is why we have markets. If a government had to vet every action by every business for moral worthiness in advance, it would paralyze life as we know it. The only sensible course is to allow free markets and competition to police the activities of competitors.

Just as there is nothing wrong or untoward with price differentials based on usage, there is nothing virtuous about government-enforced pricing equality. Forcing unequals to be treated equally is not meritorious. NN proponents insist that the public has to be “protected” from that kind of treatment. But this is exactly what PUCs did for decades when they subsidized residential consumers inefficiently by soaking business and long-distance users with higher rates. Back then, the regulatory mantra wasn’t “net neutrality,” it was “universal service.” Ironically, regulators never succeeded in achieving rates of household telephone subscription that exceeded the rate of household television service. Consumers actually needed – but didn’t get – protection from the public-utility monopoly imposed upon them. Today, consumers don’t need protection because there is no monopoly, nor is there any prospect of one absent regulatory intervention. The only remaining vestige of monopoly is that remaining from the grants of local cable-TV monopoly given by municipal governments. Compensating for past mistakes by local government is no excuse for making a bigger mistake by granting monopoly power to FCC regulators.

Forbearance? 

The late, great economist Frank Knight once remarked that he had heard do-gooders utter the equivalent words to “I want power to do good” so many times for so long that he automatically filtered out the last three words, leaving only “I want power.” Federal-government regulators want the maximum amount of power with the minimum number of restrictions, leaving them the maximum amount of flexibility in the exercise of their power. To get that, they have learned to write excuses into their mandates. In the case of NN and Internet regulation, the operative excuse is “forbearance.”

Forbearance is the wave of the hand with which they will dismiss all the objections raised in this essay. The word appears in the original Title II regulations. It means that regulators aren’t required to enforce the regulations if they don’t want to; they can “forbear.” “Hey, don’t worry – be happy. We won’t do the bad stuff, just the good stuff – you know, the ‘neutrality’ stuff, the ‘equality’ stuff.” Chairman Wheeler is encouraging NN proponents to fill the empty vessel of Internet regulation with their own individual wish-fulfillment fantasies of what they dream a “public utility” should be, not what the ugly historical reality tells us public-utility regulation actually was. For example, he has implied that forbearance will rule out things like rate-of-return regulation.

This just begs the questions raised by “regulating the Internet like a public utility.” The very elements from which Wheeler proposes to forbear are part and parcel of public-utility regulation as we have known it. If these are forborne, we have no basis for knowing what to expect from the concept of Internet public-utility regulation at all. If they are not, after all, forborne – then we are back to square one, facing the utterly dismal prospect of replaying 20th-century public-utility regulation in all its cynical inefficiency.

Forbearance is a good idea, all right – so good that we should apply it to the whole concept of Internet regulation by the federal government. We should forbear completely.

DRI-315 for week of 4-20-14: Is GDP NDG in the Digital Age?

An Access Advertising EconBrief:

Is GDP NDG in the Digital Age?

For years, we have heard the story of stagnant American wages, of the supposed stasis in which the real incomes of the middle and lower classes are locked while the rich get richer. Various sophisticated refutations of this hypothesis have appeared. Households have been getting smaller, so the fact that “household income” is falling mainly reflects the fact that fewer people are earning the income that comprises it. “Wages” do not include the (largely untaxed) benefits that have made up a steadily larger share of workers’ real incomes ever since World War II.

But there is something else going on, something more visceral than statistics, that leads us to reject this declinism. It is the evidence of our own senses, our eyes and ears. As we go about our daily lives, neither we nor the people around us exhibit the symptoms of a people growing materially worse off.

For over thirty years, we have been forsaking the old broadcast trinity of network television stations, at first in favor of cable television and recently for a broadening array of alternative media. For over twenty years, desktop computers have revolutionized our working and personal lives. For over ten years, an amazing profusion of digital products has taken over the way we live. Cell phones, smart phones, tablets, pads and other space-age electronic wonders have shot us out of a consumer cannon into a new world.

Can it really, truly be that we are worse off than we were before all this happened? As the late John Wayne would say if he were here to witness this phenomenon: “Not hardly.”

The pace of this technological revolution has been too fast for most of us to stay abreast of. It has also left many of our 20th-century institutions blinking in the dust and gasping for breath. Mainstream economic theory and national income accounting, in particular, are trying to gauge the impact of a 21st-century revolution using the logic and measurement tools they developed in the first half of the 20th century.

The Case Study of Music

Music was one of the great consumer success stories of the 20th century. Thomas Edison’s invention of the phonograph paved the way for recorded sound of every kind, from live artistic performances to studio sessions by musicians and singers to motion-picture sound tracks. The recordings themselves were contained on physical media that ranged from metal discs to vinyl to plastic. At first, these “records” were sold to consumers and played on phonographs. Sales ran into the hundreds of millions. Artists included some of the century’s most visible and talented individuals. The monetary value of these sales grew into billions of dollars.

Since recordings were consumer goods rather than capital goods, sales of records were recorded in the national income and product accounts – or rather, the value added at the final, or retail, transaction was included. The value-added style of accounting was developed with the inauguration of the accounts in the late 1930s and early 1940s in order to do three things: (1) show activity at the various stages of production; (2) highlight the new production of consumption goods each year, reflecting the fact that the end-in-view behind all economic activity is consumption; and (3) include only the additional value created at each stage, so as to avoid double-counting.
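The double-counting point can be made concrete with a toy supply chain for a record album. The stage names and dollar figures below are invented; the point is that summing each stage's value added reproduces the final retail price, whereas naively summing the gross transactions counts the early stages repeatedly:

```python
# Toy value-added accounting for a hypothetical record album.
# Each tuple: (stage, price at which that stage sells its output).
stages = [
    ("pressing plant",        2.00),
    ("wholesale distributor", 5.00),
    ("record store",          8.00),  # final retail price
]

prev_price = 0.0
value_added = []
for name, sale_price in stages:
    value_added.append((name, sale_price - prev_price))  # this stage's contribution
    prev_price = sale_price

sum_value_added = sum(v for _, v in value_added)  # equals the retail price
sum_transactions = sum(p for _, p in stages)      # double counts early stages

print("sum of value added:", sum_value_added)   # 8.0
print("naive sum of sales:", sum_transactions)  # 15.0
```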

As the 20th century came to a close, however, record albums were replaced by small audio discs that could be played on more compact devices. And these were soon supplanted by computers – that is, the playing medium became a computer and the music itself was housed within a computer file rather than a substantial physical object. As technology advanced, in other words, the media grew smaller and less substantial. But the message itself was unaffected; indeed, it was even improved.

How do we know that the value people derive from music has not been adversely affected by this transition to digitization? In The Second Machine Age, authors Erik Brynjolfsson and Andrew McAfee consider the question at length. In terms of physical units, sales of music have fallen off the table. Just in the years 2004-2008, they fell from roughly 800 million units to less than 400 million units – a decline of over 50% in four years! And the total revenue from sales of music fell 40% from $12.3 billion to $7.4 billion over the same period. By the standards we usually apply to business, this sounds like an industry in freefall.
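The two decline figures quoted from Brynjolfsson and McAfee are easy to verify from the stated totals:

```python
# Checking the quoted 2004-2008 music-industry declines
# (unit and revenue figures as cited in the text).
units_2004, units_2008 = 800e6, 400e6  # physical units sold (approx.)
rev_2004, rev_2008 = 12.3e9, 7.4e9     # total sales revenue, dollars

unit_decline = (units_2004 - units_2008) / units_2004
rev_decline = (rev_2004 - rev_2008) / rev_2004

print(f"unit decline:    {unit_decline:.0%}")   # 50%
print(f"revenue decline: {rev_decline:.0%}")    # 40%
```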

In this case, though, those standards are misleading. During that same time span, the total unit-volume of music purchased still grew when purchases of digitized music were factored in. And acquisitions of music free of charge by various means swelled the total much further. One of the things economists are best at is analyzing non-traditional markets, which is why Joel Waldfogel of the University of Minnesota was able to infer that the quality of music available to consumers has actually increased in the digital era. Today, anybody with a smartphone can access some 20 million songs via services like Spotify and Rhapsody. For those of us who recall the days of LPs and phonograph needles, the transition to today has been dizzying.

But the economics of the digital age have driven prices through the floor. As Brynjolfsson and McAfee observe, it is the same process that has driven the newspaper business to the wall and its readers online; the same one that has driven classified-advertising from newspapers to Craigslist; the same one that impels us to share photos on Facebook rather than buying prints for friends and family. “Analog dollars,” they conclude, “are becoming digital pennies.”

This creates an unprecedented marketplace anomaly. Measured by the value it creates for human beings, which is how economists want to measure it, the music industry is booming. But measured in dollars’ worth of marketplace transactions, which is how economists are currently able to measure it, the music industry is declining rapidly.

GDP RIP?

If the music industry were a singularity, we might treat it as a mere curiosity. It is not, of course; the gap between price/quantity product and value created yawns wide across the spectrum of industry. “By now, the number of pages of digital text and images on the Web is estimated to exceed one trillion…children with smartphones today have access to more information in real time via the mobile web than the President of the United States had twenty years ago. [!] Wikipedia alone claims to have over fifty times as much information as Encyclopedia Britannica, the premier compilation of knowledge for most of the twentieth century.”

“…Bits are created at virtually zero cost and transmitted almost instantaneously worldwide. What’s more, a copy of a digital good is exactly identical to the original… Because they have zero price, these services are virtually invisible in the official statistics. They add value to the economy, but not dollars to GDP… When a business traveler calls home to talk to her children via Skype, that may add zero to GDP, but it’s hardly worthless. Even the wealthiest robber baron would have been unable to buy this service [in the 19th century]. How do we measure the benefits of free goods or services that were unavailable at any price in previous eras?”

This understates the case. As Brynjolfsson and McAfee acknowledge, most of the new digital services substitute for existing services whose sales contribute to GDP. Thus, the digital bonanza actually lowers measured GDP at the same time that our well-being rises. In economic jargon, the effect on GDP’s function as an index of national welfare is perverse.
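A stylized illustration of this perverse effect, using invented numbers (the $100 figure is purely hypothetical, not drawn from the text):

```python
# Hypothetical: a free digital service (e.g., a Skype call) replaces
# a paid one (a long-distance phone bill). All numbers are invented.
paid_calls_revenue = 100.0   # dollars of long-distance calls, counted in GDP
free_substitute_price = 0.0  # the digital substitute is free

gdp_before = paid_calls_revenue
gdp_after = free_substitute_price  # the paid transaction vanishes from GDP

# Consumers still make the calls (and likely value them at least as much),
# yet measured output falls by the full amount of the old transaction.
print(gdp_before - gdp_after)  # 100.0
```

The point is simply that substitution by free goods subtracts dollars from GDP without subtracting any value from consumers.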

This leads many people, including these authors, to the conclusion that GDP is no longer an adequate measure of national output. If this is true, it makes our monthly, quarterly and annual preoccupations with the growth rate of GDP seem pretty silly. The government agency whose task is the compilation of economic statistics is the U.S. Bureau of Economic Analysis. Its definition of the economy’s “information sector” aggregates sales of software, publishing, movies, audio recordings, broadcasting, telecommunications, and data processing and information services. These sales account for about 4% of measured GDP today. Yet we are commonly understood to be chest-deep in a new “economy of information” that is replacing the economy of tangible goods and services. Either this perception or that 4% metric is wrong; the latter seems vastly more probable.

What’s more, the irrelevance of GDP increases by the nanosecond.

New Products

Of course, not all digital products and services are substitutes for existing counterparts. Some of them are genuinely new. If these are similarly hard to incorporate in GDP, the distortion may be only half as great as that described above. But the digital revolution has displayed a propensity for creating things that were unknown heretofore but that soon became necessary accoutrements of daily life.

Longtime macroeconomist and textbook author Robert Gordon estimated the value of new goods and services added but missed by GDP at about 0.4% of GDP. That may not sound like much, but since the long-term average annual rate of productivity growth is around 2%, it would mean that we are overlooking 20% of annual productivity growth.
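The 20% figure is just the ratio of the two rates cited above:

```python
# Arithmetic behind the "20% of annual productivity growth" claim,
# using the two rates cited in the text.
missed_share_of_gdp = 0.004    # Gordon's estimate: ~0.4% of GDP missed
productivity_growth = 0.02     # long-term average annual rate, ~2%

overlooked_fraction = missed_share_of_gdp / productivity_growth
print(f"{overlooked_fraction:.0%}")  # 20%
```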

GDP and Investment: The Bad News Gets Worse

GDP is failing because it neglects to measure the tremendous increases in consumption and well-being conferred by the digital age. But GDP also measures investment, or purports to. Are its failings on the consumption side mitigated by its performance with investment?

No, they are magnified. The production of digital goods and services is heavily dependent on intangible assets rather than the familiar plant and equipment that are the focus of traditional investment. Brynjolfsson and McAfee identify four categories of these intangibles: intellectual property, organizational capital, user-generated content and human capital. It comes as no surprise to find that the measurement of these assets largely eludes GDP as well.

Intellectual property encompasses any creation of the human mind to which legal ownership can be attached. Patents and copyrights form the backbone of this category. A great deal of spending on research and development (R&D) constitutes investment in intellectual property.

Yet R&D has long been recognized as almost impossible to measure accurately because only its cost is transparent, while the value (i.e., the capital) it creates often escapes measurement.

Organizational capital is an even broader concept intended to capture the value inhering in brands, processes, techniques and conceptual structures owned by particular businesses. This category long predates the digital age but is idealized by companies like Apple, whose brand and unique corporate style complement its portfolio of intellectual property to create perhaps the world’s most productive company. Accountants have long sought to put a price tag on things like “good will” and “brand name.” We have observed that the transition to a computer-savvy work force has necessitated investment in procedures and processes far greater than the initial spending on the computer hardware and software – spending that doesn’t show up in the national income accounts as investment.

User-generated content is a true digital innovation. Facebook, Twitter, YouTube, Pinterest, Instagram, Yelp and countless other websites are largely created by their users. The value of this approach is both undeniable and subjective, as anybody who has ever previewed a restaurant on Yelp or planned a vacation with TripAdvisor can testify. The feedback generated on these sites provides an object lesson in the generation of information – the kind of information that economists had to assume that people already knew because we didn’t know how markets could make it available to them. Now we do.

Human capital was a concept invented and popularized by economists Theodore Schultz and Gary Becker decades before the Internet existed. The talents, skills and training that we receive make us better productive “machines,” which inspired the analogy with physical capital.

How important are these intangible assets in the modern economy? Nobody knows with certainty, but – as always – economists have made educated guesses. Brynjolfsson and McAfee estimate the value of organizational assets as some $2 trillion. The preeminent theorist of investment, Dale Jorgenson, estimated that human capital is worth 5-10 times as much as the stock of all physical capital in the U.S. Investment in R&D has been estimated at roughly 3% of GDP in recent decades.

The degree of distortion in GDP numbers – specifically in measures of productivity, which compare growth in inputs and outputs – is harder to gauge in this case than in the consumption example. Some intangible assets, like R&D and human capital, are longtime thorns in the sides of statisticians; their measurement has always been bad and may be no worse now than before. In some cases, the distortions in investment may offset those in consumption, so that the measure of productivity may be accurate even though the numerator and denominator of the ratio are inaccurate. But the elements most closely associated with the digital revolution, such as user-generated content, impart a huge downward bias to measured productivity in the national income accounts.

A New, Improved GDP?

Economists and other commentators have done a good job of diagnosing the havoc wreaked on GDP by the digital revolution. Alas, they have rested on those laurels. In the “solutions and policy proposals” section of their work, they have fallen back on the tried and trite. GDP was a sibling of macroeconomics; the economic logic underlying the two is the same, with the operative word being “lying.” Macroeconomists are loath to repudiate their birthright, so their reflex is to cast about for ways to mend the measurement holes in GDP rather than abandon it as a bad job. Hence the rosy glow cast by Brynjolfsson and McAfee over nebulous concoctions like the “Social Progress Index” and the “Gallup-Healthways Well-Being Index.” As for the touted “Gross National Happiness” index of Bhutan, the less said about this laughable fantasy (treated in a previous EconBrief), the better.

The authors cite the comments of Joseph Stiglitz, whom they call “Joe” to profit by the implied familiarity with a Nobel laureate: “…Changes in society and the economy may have heightened the problems at the same time that advances in economics and statistical techniques may have provided opportunities to improve our metrics.” The “improvements” don’t seem to have included the ability to stop the scandalous misuse of the concept of “statistical significance” that has plagued the profession for many decades.

In fact, GDP has been known to be a failure almost since inception. Introductory economics textbooks routinely instruct students in the shortcomings of GDP as a “welfare index” by listing a roster of flaws that predate the digital age, the Internet and computers. It has ignored the value of household services (predominantly provided by women), ignored the value created by secondary transactions of all kinds of used goods, undervalued services and thrown up its figurative hands when confronted by non-market transactions of all kinds. Its continued use has been a grim tribute to Lord Kelvin’s dubious dictum that “science is measurement,” the implication being that measuring badly must be better than not measuring at all.

What’s more, the blame cannot be laid at the feet of economic theory. It is certainly true that the digital age has brought with it a veritable flood of “free” goods – seemingly in contradiction with Milton Friedman’s famous aphorism that “there is no such thing as a free lunch.” Hearken back to Brynjolfsson and McAfee’s words that “bits are created at virtually zero cost.” A fundamental principle – perhaps the fundamental principle – of neoclassical microeconomics is that price should equal marginal cost, so that the value placed on an additional unit of something by consumers should equal its (opportunity) cost of production. When marginal cost equals zero, there is nothing inherently perverse about a price approaching zero. No, the laws of economics have not been suspended on the Internet.

Careful comparison of the age-old flaws of GDP and its current failure to cope with the challenges posed by digital innovation reveal a common denominator. Both evince a neglect of real factors for lack of a monetary nexus. The source of this insistence upon monetary provenance is the Keynesian economic theory to which the national income accounts owe their origin. Keynesian theory dropped the classical theory of interest in favor of a superficial monetary theory of liquidity preference. That is now proving bogus, as witness the failure of Federal Reserve interest-rate policies since the 1960s. Keynesian theory gives spending the pride of place among economic activity and relegates saving and assets to a subordinate role. Indeed, the so-called “paradox of thrift” declares saving bad and spending good. No wonder, then, that the national income accounts fail to account for assets and capital formation in a satisfactory manner.

Instead of tinkering around the margins with new statistical techniques and gimmicks when they have not even mastered basic statistical inference, economists should instead rip out the rotting growth root and branch. Reform of macroeconomics and reform of the national income accounts go hand in hand.

End the Reign of GDP

The digital age has merely exposed the inherent flaws of GDP and widened its internal contradictions to the breaking point. It is time to dump it. The next measure of national output must avoid making the same mistakes as did the founders of the national income accounts nearly 80 years ago.

The next EconBrief will outline one new proposal for reform of the national income accounts and explain both its improvements and shortcomings.

DRI-259 for week of 2-2-14: Kristallnacht for the Rich: Not Far-Fetched

An Access Advertising EconBrief:

Kristallnacht for the Rich: Not Far-Fetched

Periodically, the intellectual class aptly termed “the commentariat” by The Wall Street Journal works itself into a frenzy. The issue may be a world event, a policy proposal or something somebody wrote or said. The latest cause célèbre is a submission to the Journal’s letters column by a partner in one of the nation’s leading venture-capital firms. The letter ignited a firestorm; the editors subsequently declared that Tom Perkins of Kleiner Perkins Caufield & Byers “may have written the most-read letter to the editor in the history of The Wall Street Journal.”

What could have inspired the famously reserved editors to break into temporal superlatives? The letter’s rhetoric was both penetrating and provocative. It called up an episode in the 20th century’s most infamous political regime. And the response it triggered was rabid.

“Progressive Kristallnacht Coming?”

“…I would call attention to the parallels of fascist Nazi Germany to its war on its ‘one percent,’ namely its Jews, to the progressive war on the American one percent, namely ‘the rich.’” With this ice breaker, Tom Perkins made himself a rhetorical target for most of the nation’s commentators. Even those who agreed with his thesis felt that Perkins had no business using the Nazis in an analogy. The Wall Street Journal editors said “the comparison was unfortunate, albeit provocative.” They recommended reserving Nazis only for rarefied comparisons to tyrants like Stalin.

On the political Left, the reaction was less measured. The Anti-Defamation League accused Perkins of insensitivity. Bloomberg View characterized his letter as an “unhinged Nazi rant.”

No, this bore no traces of an irrational diatribe. Perkins had a thesis in mind when he drew an analogy between Nazism and Progressivism. “From the Occupy movement to the demonization of the rich, I perceive a rising tide of hatred of the successful one percent.” Perkins cited the abuse heaped on workers traveling Google buses from the cities to the California peninsula. Their high wages allowed them to bid up real-estate prices, thereby earning the resentment of the Left. Perkins’ ex-wife Danielle Steel placed herself in the crosshairs of the class warriors by amassing a fortune writing popular novels. Millions of dollars in charitable contributions did not spare her from criticism for belonging to the one percent.

“This is a very dangerous drift in our American thinking,” Perkins concluded. “Kristallnacht was unthinkable in 1930; is its descendant ‘progressive’ radicalism unthinkable now?” Perkins’ point is unmistakable; his letter is a cautionary warning, not a comparison of two actual societies. History doesn’t repeat itself, but it does rhyme. Kristallnacht and Nazi Germany belong to history. If we don’t mend our ways, something similar and unpleasant may lie in our future.

A Short Refresher Course in Early Nazi Persecution of the Jews

Since the current debate revolves around the analogy between Nazism and Progressivism, we should refresh our memories about Kristallnacht. The name itself translates loosely into “Night of Broken Glass.” It refers to the shards of broken window glass littering the streets of cities in Germany and Austria on the night and morning of November 9-10, 1938. The windows belonged to houses, hospitals, schools and businesses owned and operated by Jews. These buildings were first looted, then smashed by elements of the German paramilitary SA (the Brownshirts) and SS (security police), led by the Gauleiters (regional leaders).

In 1933, Adolf Hitler was elevated to the German chancellorship after the Nazi Party won a plurality of votes in the national election. Almost immediately, laws placing Jews at a disadvantage were passed and enforced throughout Germany. The laws were the official expression of the philosophy of German anti-Semitism that dated back to the 1870s, the time when German socialism began evolving from the authoritarian roots of Otto von Bismarck’s rule. Nazi officialdom awaited a pretext on which to crack down on Germany’s sizable Jewish population.

The pretext was provided by the assassination of German official Ernst vom Rath on Nov. 7, 1938 by a 17-year-old Polish-Jewish youth named Herschel Grynszpan. The boy was apparently upset by German policies expelling his parents from the country. Ironically, vom Rath’s sentiments were anti-Nazi and opposed to the persecution of Jews. Vom Rath’s death on Nov. 9 was the signal for release of Nazi paramilitary forces on a reign of terror and abduction against German and Austrian Jews. Police were instructed to stand by and not interfere with the SA and SS as long as only Jews were targeted.

According to official reports, 91 deaths were attributed directly to Kristallnacht. Some 30,000 Jews were spirited off to jails and concentration camps, where they were treated brutally before finally winning release some three months later. In the interim, though, some 2,000-2,500 Jews died in the camps. Over 7,000 Jewish-owned or operated businesses were damaged. Over 1,000 synagogues in Germany and Austria were burned.

The purpose of Kristallnacht was not only wanton destruction. The assets and property of Jews were seized to enhance the wealth of the paramilitary groups.

Today we regard Kristallnacht as the opening round of Hitler’s Final Solution – the policy that produced the Holocaust. This strategic primacy is doubtless why Tom Perkins invoked it. Yet this furious controversy will just fade away, merely another media preoccupation du jour, unless we retain its enduring significance. Obviously, Tom Perkins was not saying that the Progressive Left’s treatment of the rich is now comparable to Nazi Germany’s treatment of the Jews. The Left is not interning the rich in concentration camps. It is not seizing the assets of the rich outright – at least not on a wholesale basis, anyway. It is not reducing the homes and businesses of the rich to rubble – not here in the U.S., anyway. It is not passing laws to discriminate systematically against the rich – at least, not against the rich as a class.

Tom Perkins was issuing a cautionary warning against the demonization of wealth and success. This is a political strategy closely associated with the philosophy of anti-Semitism; that is why his invocation of Kristallnacht is apropos.

The Rise of Modern Anti-Semitism

Despite the politically correct horror expressed by the Anti-Defamation League toward Tom Perkins’ letter, reaction to it among Jews has not been uniformly hostile. Ruth Wisse, professor of Yiddish and comparative literature at Harvard University, wrote an op-ed for The Wall Street Journal (02/04/2014) defending Perkins.

Wisse traced the modern philosophy of anti-Semitism to the philosopher Wilhelm Marr, whose heyday was the 1870s. Marr “charged Jews with using their skills ‘to conquer Germany from within.’” Marr was careful to distinguish his philosophy of anti-Semitism from prior philosophies of anti-Judaism. Jews “were taking unfair advantage of the emerging democratic order in Europe with its promise of individual rights and open competition in order to dominate the fields of finance, culture and social ideas.”

Wisse declared that “anti-Semitism channel[ed] grievance and blame against highly visible beneficiaries of freedom and opportunity.” “Are you unemployed? The Jews have your jobs. Is your family mired in poverty? The Rothschilds have your money. Do you feel more secure in the city than you did on the land? The Jews are trapping you in the factories and charging you exorbitant rents.”

The Jews were undermining Christianity. They were subtly perverting the legal system. They were overrunning the arts and monopolizing the press. They spread Communism, yet practiced rapacious capitalism!

This modern German philosophy of anti-Semitism long predated Nazism. It accompanied the growth of the German welfare state and German socialism. The authoritarian political roots of Nazism took hold under Otto von Bismarck’s conservative socialism, and so did Nazism’s anti-Semitic cultural roots as well. The anti-Semitic conspiracy theories ascribing Germany’s every ill to the Jews were not the invention of Hitler, but of Wilhelm Marr over half a century before Hitler took power.

The Link Between the Nazis and the Progressives: the War on Success

As Wisse notes, the key difference between modern anti-Semitism and its ancestor – what Wilhelm Marr called “anti-Judaism” – is that the latter abhorred the religion of the Jews while the former resented the disproportionate success enjoyed by Jews much more than their religious observances. The modern anti-Semitic conspiracy theorist pointed darkly to the predominance of Jews in high finance, in the press, in the arts and running movie studios and asked rhetorically: How do we account for the coincidence of our poverty and their wealth, if not through the medium of conspiracy and malefaction? The case against the Jews is portrayed as prima facie and morphs into per se through repetition.

Today, the Progressive Left operates in exactly the same way. “Corporation” is a pejorative. “Wall Street” is the antonym of “Main Street.” The very presence of wealth and high income is itself damning; “inequality” is the reigning evil and is tacitly assigned a pecuniary connotation. Of course, this tactic runs counter to the longtime left-wing insistence that capitalism is inherently evil because it forces us to adopt a materialistic perspective. Indeed, environmentalism embraces anti-materialism to this day while continuing to bunk in with its progressive bedfellows.

We must interrupt with an ironic correction. Economists – according to conventional thinking the high priests of materialism – know that it is human happiness and not pecuniary gain that is the ultimate desideratum. Yet the constant carping about “inequality” looks no further than money income in its supposed solicitude for our well-being. Thus, the “income-inequality” progressives – seemingly obsessed with economics and materialism – are really anti-economic. Economists, supposedly green-eyeshade devotees of numbers and models, are the ones focusing on human happiness rather than ideological goals.

German socialism metamorphosed into fascism. American Progressivism is morphing from liberalism to socialism and – ever more clearly – homing in on its own version of fascism. Both employed the technique of demonization and conspiracy to transform the mutual benefit of free voluntary exchange into the zero-sum result of plunder and theft. How else could productive effort be made to seem fruitless? How else could success be made over into failure? This is the cautionary warning Perkins was sounding.

The Great Exemplar

The great Cassandra of political economy was F.A. Hayek. Early in 1929, he predicted that Federal Reserve policies earlier in the decade would soon bear poisoned fruit in the form of a reduction in economic activity. (His mentor, Ludwig von Mises, was even more emphatic, foreseeing “a great crash” and refusing a prestigious financial post for fear of association with the coming disaster.) He predicted that the Soviet economy would fail owing to lack of a functional price system; in particular, missing capital markets and interest rates. He predicted that Keynesian policies begun in the 1950s would culminate in accelerating inflation. All these came true, some of them within months and some after a lapse of years.

Hayek’s greatest prediction was really a cautionary warning, in the same vein as Tom Perkins’ letter but much more detailed. The 1944 book The Road to Serfdom made the case that centralized economic planning could operate only at the cost of the free institutions that distinguished democratic capitalism. Socialism was really another form of totalitarianism.

The reaction to Hayek’s book was much the same as reaction to Perkins’ letter. Many commentators who should have known better accused both of them of fascism. They also accused both men of describing a current state of affairs when both were really trying to avoid a dystopia.

The flak Hayek took was especially ironic because his book actually served to prevent the outcome he feared. But instead of winning the acclaim of millions, this earned him the scorn of intellectuals. The intelligentsia insisted that Hayek predicted the inevitable succession of totalitarianism after the imposition of a welfare state. When welfare states in Great Britain, Scandinavia, and South America failed to produce barbed wire, concentration camps and German Shepherd dogs, the Left advertised this as proof of Hayek’s “exaggerations” and “paranoia.”

In actual fact, Great Britain underwent many of the changes Hayek had feared and warned against. The notorious “Rules of Engagements,” for instance, were an attempt by a Labour government to centrally control the English labor market – to specify an individual’s work and wage rather than allowing free choice in an impersonal market to do the job. The attempt failed just as dismally as Hayek and other free-market economists had foreseen it would. In the 1980s, it was Hayek’s arguments, wielded by Prime Minister Margaret Thatcher, which paved the way for the rolling back of British socialism and the taming of inflation. It’s bizarre to charge the prophet of doom with inaccuracy when his prophecy is the savior, but that’s what the Left did to Hayek.

Now they are working the same familiar con on Tom Perkins. They begin by misconstruing the nature of his argument. Later, if his warnings are successful, they will use that against him by claiming that his “predictions” were false.

Enriching Perkins’ Argument

This is not to say that Perkins’ argument is perfect. He has instinctively fingered the source of the threat to our liberties. Perkins himself may be rich, but his argument isn’t; it is threadbare and skeletal. It could use some enriching.

The war on the wealthy has been raging for decades. The opening battle is lost to history, but we can recall some early skirmishes and some epic brawls prior to Perkins.

In Europe, the war on wealth used anti-Semitism as its spearhead. In the U.S., however, the popularity of Progressives in academia and government made antitrust policy a more convenient wedge for their populist initiatives against success. Antitrust policy was a crown jewel of the Progressive movement in the early 1900s; Presidents Theodore Roosevelt and William Howard Taft cultivated reputations as “trust busters.”

The history of antitrust policy exhibits two pronounced tendencies: the use of the laws to restrict competition for the benefit of incumbent competitors and the use of the laws by the government to punish successful companies for various political reasons. The sobering research of Dominick Armentano shows that antitrust policy has consistently harmed consumer welfare and economic efficiency. The early antitrust prosecution of Standard Oil, for example, broke up a company that had consistently increased its output and lowered prices to consumers over long time spans. The Orwellian rhetoric accompanying the judgment against ALCOA in the 1940s reinforces the notion that punishment, not efficiency or consumer welfare, was behind the judgment. The famous prosecutions of IBM and AT&T in the 1970s and 80s each spawned book-length investigations showing the perversity of the government’s claims. More recently, Microsoft became the latest successful firm to reap the government’s wrath for having the temerity to revolutionize industry and reward consumers throughout the world.

The rise of the regulatory state in the 1970s gave agencies and federal prosecutors nearly unlimited, unsupervised power to work their will on the public. Progressive ideology combined with self-interest to create a powerful engine for the demonization of success. Prosecutors could not only pursue their personal agenda but also climb the career ladder by making high-profile cases against celebrities. The prosecution of Michael Milken of Drexel Burnham Lambert is a classic case of persecution in the guise of prosecution. Milken virtually created the junk-bond market, thereby originating an asset class that has enhanced the wealth of investors by untold billions or trillions of dollars. For his pains, Milken was sent to jail.

Martha Stewart is a high-profile celebrity who was, in effect, convicted of the crime of being famous. She was charged and convicted of lying to federal investigators about a case in which the only crime could have been the offense of insider-trading. But she was the trader and she was not charged with insider-trading. The utter triviality and absence of any damage to consumers or society at large make it clear that she was targeted because of her celebrity; i.e., her success.

Today, the impetus for pursuing successful individuals and companies comes primarily from the federal level. Harvey Silverglate (author of Three Felonies a Day) has shown that virtually nobody is safe from the depredations of prosecutors out to advance their careers by racking up convictions at the expense of justice.

Government is the institution charged with making and enforcing law, yet government has now become the chief threat to law. At the state and local level, governments hand out special favors and tax benefits to favored recipients – typically those unable to attain success on their own efforts – while making up the revenue from the earned income of taxpayers at large. At the federal level, Congress fails in its fundamental duty and ignores the law by refusing to pass budgets. The President appoints czars to make regulatory law, while choosing at discretion to obey the provisions of some laws and disregard others. In this, he fails his fundamental executive duty to execute the laws faithfully. Judges treat the Constitution as a backdrop for the expression of their own views rather than as a subject for textual fidelity. All parties interpret the Constitution to suit their own convenience. The overarching irony here is that the least successful institution in America has united in a common purpose against the successful achievers in society.

The most recent Presidential campaign was conducted largely as a jihad against the rich and successful in business. Mitt Romney was forced to defend himself against the charge of succeeding too well in his chosen profession, as well as the corollary accusation that his success came at the expense of the companies and workers in which his private-equity firm invested. Either his success was undeserved or it was really failure. There was no escape from the double bind against which he struggled.

It is clear, then, that the “progressivism” decried by Tom Perkins dates back over a century and that it has waged a war on wealth and success from the outset. The tide of battle has flowed – during the rampage of the Bull Moose, the Depression and New Deal and the recent Great Recession and financial crisis – and ebbed – under Eisenhower and Reagan. Now the forces of freedom have their backs to the sea.

It is this much richer context that forms the backdrop for Tom Perkins’ warning. Viewed in this panoramic light, Perkins’ letter looks more like the battle cry of a counter-revolution than the crazed rant of an isolated one-percenter.

DRI-248 for week of 1-26-14: Economics as Movie ‘Spoiler’: Some Famous Cases

An Access Advertising EconBrief:

Economics as Movie ‘Spoiler’: Some Famous Cases

Motion pictures evolved into the great popular art form of the 20th century. In the 21st century, many popular cultural references derive from movies. One of these is the “spoiler” – prematurely revealing the ending of a book, play, movie or presentation of any kind.

Economists sometimes experience a slightly different sort of “spoiler.” Their specialized understanding often defeats the internal logic of a presentation, completely spoiling the author’s intended effect. Movies are especially vulnerable to this.

The casual perception is that our attitude toward movies is distorted by the high quotient of improbably beautiful and talented people who populate them. While it is true that physical beauty has always been highly prized by Hollywood, it is also true that plain or even ugly people like Wallace Beery, Marie Dressler, Jean Gabin and Rodney Dangerfield have become champions of the movie box office. The locus of unreality in movies has actually been the stories told.

Movies are best regarded as fairy tales for adults. They over-emphasize dramatic conflict and exaggerate the moral divide between protagonist and antagonist. It is difficult to find a real-world referent to the “happy ending” that resolves the typical movie. Protagonists are all too often “heroes” whose actions exceed the normal bounds of human conduct. In recent years, this tendency has escalated; veteran screenwriter William Goldman has complained that movie protagonists are now not heroes but “gods” whose actions exceed the bounds of physics and other natural laws.

In this context, it is hardly surprising that movie plots have sometimes ignored the laws of economics in order to achieve the stylized dramatic effects demanded by the medium. Since public knowledge of economics is, if anything, less well developed than knowledge of natural science, these transgressions have generally gone unremarked. Indeed, the offending movies are often praised for their realism and power. Thus, it is worthwhile to correct the mistaken economic impressions left by the movies, some of which have found their way into popular folklore.

In each of the following movies, the major plot point – the movie’s resolution – rests on an obvious fallacy or failure to apply economic logic.

Scrooge (U.S. title: A Christmas Carol) (1951)

We know the plot of this most classic of all Christmas tales by heart. Victorian businessman Ebenezer Scrooge, famed miser and misanthrope, abhors the spirit of Christmas. He is visited by three ghosts, emblematic of his youthful past, his empty present life and the lonely, friendless end that awaits him in the future. Their guidance awakens him to the waste of his single-minded pursuit of material gain and rejection of personal affection and warmth. He realizes the cruelty he has visited upon his clerk, the good-hearted family man, Bob Cratchit. Most of all, he keenly regrets the fate of Cratchit’s crippled son, Tiny Tim, who seems doomed by Cratchit’s poverty.

Having witnessed Scrooge’s emotional reformation, the audience is now primed for the crowning culmination. On the day after Christmas, Bob Cratchit shows up at Scrooge’s office, a bit late and encumbered by holiday festivities. Fearfully, he tiptoes to his desk, only to be brought up short by Scrooge’s thunderous greeting. Expecting a verbal pink slip, Cratchit receives instead the news that Scrooge is doubling his wage – and that their working relationship will be hereafter cordial. Tiny Tim’s future is redeemed, and the audience has experienced one of the most cathartic moments on film.

Unless, that is, the viewer happens to be an economist – in which case, the reaction will be a double take accompanied by an involuntary blurt like “I beg your pardon?” For this is a resolution that simply makes no sense. In order to understand why, the first thing to realize is that the scriptwriter (translating Charles Dickens’ timeless story to the screen) is asking us to believe that Bob Cratchit has heretofore been working for half of what Scrooge is now proposing to pay him.

In the 19th century, historical novelists like Charles Dickens played the role that filmmakers would play in the 20th. They brought history alive to their audiences. Ideally, they stimulated further study of their subject matter – indeed, many famous historians have confessed that their initial stimulus came from great storytellers such as Dickens and Dumas. But many readers searched no further than the stories told by these authors for explanations of the course taken by events. Dickens was an exponent of what the great black economist Thomas Sowell called “volitional economics.” In this case, for example, the wage paid by Scrooge and received by Cratchit ostensibly depended on Scrooge’s will or volition, and nothing else. No role existed for a labor market. Cratchit was not a partisan in his own cause, but rather a passive pawn of fate.

This is not a theory likely to commend itself to an economist. Scrooge and Cratchit are working to produce services purchased by their customers. Who are these? Well might you ask, for neither Dickens nor the filmmakers chose to clutter up the narrative with such extraneous considerations. Yet it is this consumer demand that governs the demand for Scrooge’s output, which in turn values the productivity of Cratchit’s work. In a competitive labor market, the market wage will gravitate toward the marginal value product of labor; i.e., the value of Cratchit’s product at the margin translated into money with the aid of the market price for Scrooge’s services. And in crowded London, there is no doubt about the competitive demand for the low-skilled labor provided by Bob Cratchit. That is what attracted the Bob Cratchits of the world to London in the first place during the Industrial Revolution.
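The marginal-value-product logic above can be put in numbers. The following sketch uses entirely hypothetical figures (neither Dickens nor the film supplies any); it only illustrates why a competitive labor market pulls the wage toward the value of the worker's output:

```python
# Hypothetical figures -- a minimal sketch of the marginal-value-product
# argument. None of these numbers come from Dickens or the film.

def marginal_value_product(marginal_output, output_price):
    """Money value of the last unit of output attributable to labor."""
    return marginal_output * output_price

# Suppose Cratchit's work adds 10 units of clerical service per week and
# each unit sells for 3 shillings:
mvp = marginal_value_product(10, 3)   # 30 shillings per week

# If Scrooge pays only half of that, any rival employer can profitably
# hire Cratchit away at anything between 15 and 30 shillings:
scrooge_wage = mvp / 2                # 15 shillings
rival_offer = mvp * 0.9               # 27 shillings -- still profitable to the rival

assert scrooge_wage < rival_offer < mvp
```

Competition among employers is what closes the gap: as long as the wage sits below the marginal value product, bidding Cratchit away is pure profit for a rival, so the underpayment cannot persist.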

Two possibilities suggest themselves. Either Bob Cratchit was working for half of his marginal value product previously and is only now being elevated to that level, or Scrooge is now proposing to pay Cratchit a wage equal to twice Cratchit’s marginal value product. The first possibility requires us to believe not only that Cratchit was and is a complete idiot, but that he – not Scrooge, as Dickens clearly implies – is responsible for Tiny Tim’s tenuous medical situation. After all, all Cratchit had to do was step outside Scrooge’s firm and wander off a block or two in order to better his circumstances dramatically and pay Tiny Tim’s medical tab without having to bank on Scrooge’s miraculous reformation. Cratchit was guaranteed a job at slightly less than double his then-current wage by simply underbidding the market wage slightly. But he inexplicably continued to work for Scrooge at half the wage his own productivity commanded.

Alternatively, consider possibility number two. Scrooge is now going to pay Cratchit a wage equal to twice his (Cratchit’s) marginal value product. If Scrooge insists on raising his price commensurate with this wage hike, he will go out of business. If he keeps his price the same, he will now be working for much less net income than all the other business owners in his position. (See below for the implications of this.)

There is no third possibility here. Either Cratchit was (is) crazy or Scrooge is. And either way, it completely upsets Dickens’ cozy suggestion that all’s right with the world, that Scrooge has restored the natural order of things and that everybody lived happily ever after.

Of course, Scrooge may have accumulated considerable assets over the course of his life and business career. He may choose to make an ongoing gift to Cratchit in the form of a wage increase, as opposed to a bonus or an outright transfer of cash. But it is important to note that this is not what Dickens or the filmmakers imply. The tone and tenor of Dickens’ original story and subsequent films adapted from it unambiguously suggest that Scrooge has righted a wrong. He has not committed a random act of generosity. In other words, Dickens implies – absurd as it now clearly seems – that possibility number one above was his intention.

It is clear to an economist that Dickens has not provided a general solution to the problem of poverty in 19th century England. What if Scrooge were the one with the sick child – would his acquisitive ways then be excusable? Dickens makes it clear that Scrooge’s wealth flows directly from his miserliness. But if miserliness produces wealth and good-heartedness promotes poverty, economic growth and happiness are simply mutually exclusive. After all, the message of the movie is that Scrooge promises to reform year-round, not just one day per year. Henceforward, when approached by collectors for charity, he will refuse not out of meanness but out of genuine poverty, his transformation having stripped him of the earning power necessary to contribute to charity.

In actual fact, of course, Scrooge never existed. Neither did Cratchit. And they are not reasonable approximations of actual 19th-century employers or workers, either. But these figments of Dickens’ imagination have been tragically influential in shaping opinions about the economic history of Victorian England.

The Man in the White Suit (1951)

This comedy from England’s famed Ealing Studios (the world’s oldest movie studio) is justly famous, but for the wrong reasons. It highlights the inefficiency of British socialism and the growing welfare state, but its fame derives from its plot highlight. Inventor Alec Guinness worms his way into the R&D division of a local textile business, where he develops a fabric so durable that it will never wear out. Instead of gaining him the wealth and immortality he craves, it gains the opprobrium of the textile owners, who fear that the fabric will ruin them by cutting replacement sales to zero. They block his efforts at production and the film ends when his formula is revealed to contain a flaw – which he may or may not ever get the chance to de-bug, since he is now a pariah in the industry.

The film is often cited as an example of how big business prevents new technology from empowering consumers – that is, it is cited as if it were a factual case study rather than a fictional movie. Actually, it is a classic example of the failure to deploy economic logic.

Would a textile firm find it profitable to produce an “indestructible” fabric of the sort depicted in the film? Certainly. The firm would achieve a monopoly in the supply of fabric and could obtain finance to expand its operations as necessary to meet the immediate demand. In practice, of course, such a fabric would not really be indestructible in the same sense as, say, Superman’s costume. It would be impervious to normal wear but would suffer damage from tearing, fire, water and other extreme sources. Changes in fashion would also necessitate replacement production. Nevertheless, we can safely grant the premise that the invention would drastically reduce the replacement demand for fabric. But that would not deter an individual firm from developing the invention – far from it.

The film depicts textile firms striving in combination to buy out the inventor. Perhaps overtures of that kind might be made in reality. They would be doomed to failure, though, because in order to afford to pay the inventor’s price the firms would have to compensate the inventor for the discounted present value of the monopoly profits available in prospect. But in order to raise an amount of money equal to those monopoly profits, the firms would themselves have to be monopolists willing to mortgage their future monopoly profits. Textile companies may enjoy legislative protection from foreign competition in the form of tariffs and/or quotas, but they will still not possess the kind of market power enabling them to do this, even if they were so predisposed. Thus, both of the movie’s key plot points are undermined by economic logic.
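The present-value arithmetic behind that buyout argument can be sketched directly. The figures below are invented for illustration (the film supplies none); the point is only the structure of the comparison: firms earning a competitive return cannot mortgage enough value to match the price of a monopoly profit stream.

```python
# Hypothetical numbers -- a minimal sketch of the buyout arithmetic,
# not figures from the film.

def present_value(cashflows, rate):
    """Discounted present value of a stream of future cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

rate = 0.08  # assumed discount rate

# The inventor's asking price is the discounted value of the monopoly
# profits his fabric makes possible -- say 100 per year for 15 years:
inventor_price = present_value([100] * 15, rate)

# Each of, say, five incumbent firms earns only a competitive return of
# 10 per year. Even mortgaging every firm's entire value falls short:
industry_war_chest = 5 * present_value([10] * 15, rate)

assert industry_war_chest < inventor_price
```

The inequality is not an accident of the chosen numbers: by definition, the combined competitive profits of the incumbents are smaller than the monopoly profits the invention would generate, so the "buy and suppress" strategy cannot be financed out of competitive earnings.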

This reasoning explains why there is so little proof for longstanding allegations that large corporations buy off innovators. While it will often be profitable to acquire competitors, it will normally be prohibitively expensive to buy and suppress revolutionary inventions. The value of a competitive firm reflects its competitive rate of return. The value of a revolutionary innovation reflects the value of a (temporary) monopoly, heavily weighted toward the relatively near future.

The Formula (1980)

The Formula was one of the most eagerly awaited movies of its day because it starred two of the most legendary stage and screen actors of all time, Marlon Brando and George C. Scott. It also boasted a topical plot describing a conspiracy to suppress a secret formula for producing synthetic gasoline. Who was behind the conspiracy? None other than “the big oil companies” – in the 1970s and 80s, as today, the oil companies were periodically trotted out as public whipping boys for the adverse effects of public policies on energy prices.

The film begins during World War II with the escape into Switzerland of a German military officer carrying secret documents. In the present day, Scott plays a homicide policeman investigating the grisly murder of his former supervisor. The decedent was working abroad for a large oil company at the time of his death, and his boss (Brando) reveals that his duties included making payoffs to Middle Eastern officials. Scott’s character also learns about the existence of a formula for conversion of coal into petroleum, supposedly developed secretly by German scientists during World War II and used by the Nazis to fuel their war machine.

Scott’s character seeks the killer and the formula for the remainder of the film. Each successive information source is murdered mysteriously after speaking with him. Eventually he learns the formula from its originator, who tells him that the oil companies plan to suppress it until its value is enormously enhanced by the extinction of remaining petroleum reserves. Brando’s character blackmails Scott’s character into relinquishing the formula and the film ends with the understanding that it will be suppressed indefinitely. The world is denied its chance at plentiful oil and the oil companies enforce an artificial oil shortage.

Novelist Steve Shagan also wrote the screenplay, but it should be noted that the version of the film released to theaters was the result of a conflict with director John G. Avildsen. Although no claim was advanced about the veracity of events depicted or information presented, the audience is clearly invited to take the film’s thesis seriously. Alas, history and economics preclude this.

The film makes much of the fact that Germany was able to conduct military operations around the world for a decade despite having no internal source of petroleum and only tenuous external sources. Germany must have had the ability to manufacture synthetic fuels, we think; otherwise, how could she have waged war so long and effectively?

The premise is sound enough. Germany’s oil refineries in the Ruhr Valley were perhaps the leading military target of Allied bombings; both crude and refined oil were in critically short supply throughout the 1940s. And there really was a “formula” for synthetic fuel – or, more precisely, a chemical process. But the film’s conclusion is all wrong, almost banally so.

The Fischer-Tropsch process was invented by two German scientists – not in World War II, but in 1925. It was not secret, but rather a matter of public knowledge. German companies used it openly in the 1930s. During World War II, when Germany had little or no petroleum or refining capability, the process provided about 25% of the country’s auto fuels and a significant share of other fuels as well. After the war, the process traveled to the U.S. and several plants experimented with it. In fact, it is still used sparingly today. Possible feedstocks for conversion into petroleum are coal, natural gas and biomass.

The reason that few people know about it is that it is too expensive for widespread use. Biomass plants using it have gone broke. Natural gas is too valuable for direct use by consumers to waste on indirect conversion into petroleum. And coal conversion wavers on the edge of commercial practicality; just about the time it begins to seem feasible, something changes unfavorably.

In real life – as opposed to reel life – the problem is not that secret formulas for synthetic fuels are being hidden by the all-powerful oil cartel. It is that the open and above-board chemical processes for conversion to synthetic fuel are just too darned expensive to be economically feasible under current conditions.

Erin Brockovich (2000)

Erin Brockovich is the film that sealed the motion-picture stardom of Julia Roberts by earning her an Academy Award for Best Actress. It was based on events in the life of its title character. Erin Brockovich was an unemployed single mother of three who met liability attorney Ed Masry when he unsuccessfully represented her in her suit for damages in a traffic accident. She took a job with his firm interviewing plaintiffs in a real-estate settlement against Pacific Gas & Electric.

In the course of her interviews, Brockovich claimed (and the film portrayed) that she unearthed a laundry list of diseases and ailments suffered by the 634 plaintiffs, who were residents of Hinkley, CA. These included at least five different forms of cancer, asthma and various other complaints. Brockovich was surprised to learn that PG&E had paid the medical expenses of these residents because of the presence of chromium in the drinking water, despite having assured the residents that the water was safe to drink. Eventually, Brockovich interviewed a company employee who claimed that corporate officials at PG&E were aware of the presence of “hexavalent chromium” (i.e., chromium in its +6 oxidation state) in the drinking water and told employees in Hinkley to hide this information from residents. The whistleblower had been told to destroy incriminating documents but kept them instead and supplied them to Brockovich.

The film does everything but accuse the company of murder in so many words. It reports the jury verdict that awarded the Hinkley residents $333 million in damages. (The standard contingency fee to the law firm is 33%.) Brockovich received a $2 million bonus from her delighted boss. The film received a flock of award nominations in addition to Roberts’s Oscar, made a pile of money and got excellent reviews.

However, a few dissenting voices were raised in the scientific community. Scathing op-eds were published in The Wall Street Journal and The New York Times by scientists who pointed out that little or no science backed up the movie’s claims – or, for that matter, the legal case on which the movie was based.

It seems that the only scientific black mark against hexavalent chromium was lung cancer suffered by industrial workers who inhaled the stuff in large quantities. In contrast, the hexavalent chromium in Hinkley was ingested in trace amounts in drinking water. The first law of toxicology (the science of toxicity) is “the dose makes the poison.” Ingestion allows a substance to be attacked by digestive acids and eliminated via excretion; inhalation would permit it to be absorbed by organs like the lungs. Ironically, lung cancer wasn’t among the varieties identified by Brockovich.

What about the lengthy list of cancers grimly recited in the movie? Doesn’t that constitute a prima facie case of wrongdoing by somebody? No – just the reverse. As the scientists pointed out, biological or industrial agents are normally targeted in their effects; after all, they were usually created for some very specific purpose in the first place. So the likelihood of one agent, like hexavalent chromium, being the proximate cause of various diverse cancers is very remote. In any town or city, a medical census covering a reasonable time span will produce a laundry list of diseases like the one Brockovich compiled.

Economics provides equal grounds for skepticism of the movie’s conclusions. The movie imputes both wrongdoing and evil motives to a company. Somewhere within that company, human beings must have harbored the motives and committed the wrongs. But why? The standard motivation behind corporate wrongdoing is always money. The monetary category involved is normally profit. Presumably the imputed rationale would run somewhere along these lines: “Corporate executives feared that admitting the truth would result in adverse publicity and judgments against the company, costing the company profits and costing them their jobs.” But that motivation can’t possibly have applied to this particular case, because PG&E was a profit-regulated public utility.

Public-utility profits are determined by public-utility commissions in hearings. If a utility earns too much profit, its rates are adjusted downward. If it earns too little, its rates are adjusted upward. For over a century, economists have tried but failed to think up ways to get utility managers to behave efficiently by cutting costs. Economists have even argued for allowing utilities to keep profits earned between rate hearings, hoping this would give managers an incentive to cut costs.
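The mechanics of rate-of-return regulation can be sketched in a few lines. The numbers below are purely hypothetical; the sketch shows only the structural point: under this regime, cost savings flow to ratepayers at the next hearing rather than to shareholders, which is why the incentive to cut costs is so weak.

```python
# A stylized sketch of rate-of-return regulation, with hypothetical numbers.

def allowed_revenue(operating_cost, rate_base, allowed_return):
    """Revenue requirement: recover operating costs plus a regulated
    return on the capital invested (the 'rate base')."""
    return operating_cost + rate_base * allowed_return

# Suppose the commission allows an 8% return on a rate base of 1,000:
before = allowed_revenue(operating_cost=500, rate_base=1000, allowed_return=0.08)

# If managers cut operating costs by 100, the next rate hearing simply
# lowers the revenue requirement by the same amount -- the saving passes
# to ratepayers, not shareholders:
after = allowed_revenue(operating_cost=400, rate_base=1000, allowed_return=0.08)

assert before - after == 100  # the whole saving is rated away
```

This is the background for the article's point: whatever profits PG&E's executives might have "protected" by deception would have been regulated away in any case.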

But here, according to the filmmakers, PG&E executives were so fanatically dedicated to safeguarding profits – profits the company could not have kept anyway – that they were willing to knowingly poison their customers. They were willing to risk losing their jobs and going to jail (if their deception was uncovered) to guard against losing their jobs over profits that were never going to be gained or lost in the first place. No economist will swallow this.

If the filmmakers had an explanation for this otherwise insane behavior, they didn’t offer it in the movie. And without a scientific case or an economic motive, it is impossible to accept the film’s scenario of corporate conspiracy at face value. Instead, the likely motivational scenario is that PG&E executives didn’t confess their crimes and beg forgiveness because they had absolutely no scientific reason to think they had committed any crimes. They didn’t warn Hinkley residents about “known dangers” because they didn’t know about any dangers. They didn’t need to admit the presence of chromium in the drinking water because everybody already knew there were trace amounts of chromium in the drinking water. But they certainly weren’t going to advertise the presence of non-existent dangers for fear that somebody would seize the opportunity to make a legal case where none really existed.

Movies are Fairy Tales for Adults

The moral to these cases is that movies are fairy tales for adults. Given that, the absence of economic logic in the movies is not hard to fathom. How much economic logic did we learn from the fairy tales we heard in childhood?

This is not to indict movies – or fairy tales, either. We need them for the emotional sustenance they provide. Fairy tales help cushion our childhood introduction to reality. Movies help us cope with the wear and tear of daily life by recharging our emotional batteries.

But we must never confuse the fairy-tale world of movies with the rational world in which we live. Our ultimate progress as a species depends on our reliance on markets, rational choice and free institutions. Of necessity, movies operate according to the visual logic of dramatic action. We expect reel life to liberate us from the conventions of real life, and this is why movies seldom make economic sense.