DRI-326 for week of 5-12-13: Paul Krugman Can’t Stand the Truth About Austerity

An Access Advertising EconBrief: 

Paul Krugman Can’t Stand the Truth About Austerity

The digital age has produced many unfortunate byproducts. One of these is the rise of shorthand communication. In journalism, this has produced an overreliance on buzzwords. The buzzword substitutes for definition, delineation, distinction and careful analysis. Its advantage is that it purports to say so much within the confines of one word – which is truly a magnificent economy of expression, as long as the word is telling the truth. Alas, all too often, the buzzword buzzsaws its way through its subject matter like a chain saw, leaving truth mutilated and amputated in its wake.

The leading government budgetary buzzword of the day is “austerity.” For several years, members of the European Union have either undergone austerity or been threatened with it – depending on whose version of events you accept. Now the word has crossed the Atlantic and awaits a visa for admission to this country. It has met a chilly reception.

In a recent (05/11/2013) column, economist Paul Krugman declares that “at this point, the economic case for austerity…has collapsed.” In order to appreciate the irony of the column, we must probe the history of the policy called “austerity.” Tracing that history back to the 1970s, we find that it was originated by Keynesian economists – ideological and theoretical soul mates of Paul Krugman. This revelation allows us to offer a theory about otherwise inexplicable comments by Krugman in his column.

The Origin of “Austerity”

The word “austerity” derives from the root word “austere,” which is used to denote something that is harsh, cold, severe, stern, somber or grave. When applied to a government policy, it must imply an intention to inflict pain and hardship. That is, the severity must be inherent in the policy chosen – it cannot be an invisible or unwitting byproduct of the policy. There may or may not be a compensating or overriding justification for the austerity, but it is the result of deliberation.

The word was first mated to policy during the debt crisis. No, this wasn’t our current federal government debt crisis or even the housing debt and foreclosure crisis that began in 2007. The original debt crisis was the 1970s struggle to deal with non-performing development loans made by Western banks to sovereign nations. At first, most of the debtor countries were low-income, less-developed countries in Africa and Latin America. Eventually, the contagion of bad loans and debt spread to middle-income countries like Mexico and Argentina. This episode was a rehearsal for the subprime-mortgage-loan defaults to follow decades later.

The original debt crisis was motivated by the same sort of “can’t miss” thinking that produced the housing mess. Sovereign nations were the perfect borrower, reasoned the big Wall Street banks of the 1970s, because a country can’t go broke the way a business can. After all, it has the power to tax its citizens, doesn’t it? Since it can’t go broke, it won’t default on its loan payments.

This line of reasoning – no, let’s call it “thinking” – found willing sets of ears on the heads of Keynesian economists, who had long been berating the West for its stinginess in funding development among less-developed countries. Agencies like the International Monetary Fund and the World Bank perked up their ears, too. The IMF was created at the end of World War II to administer a worldwide regime of fixed exchange rates. When this regime, named for the venue (Bretton Woods, New Hampshire) at which it was formally established, collapsed in 1971, the IMF was a great big international bureaucracy without a mandate. It was charmed to switch its attention to economic development. By brokering development loans to poor countries in Africa, Central and South America, it could collect administrative fees coming and going – coming, by carving off a chunk of the original loan in the form of an origination fee and going, by either rolling over the original loan or reformulating the development plan completely when the loan went bust.

The reformulation was where the austerity came in. Standard operating procedure called for the loan to be repaid either with revenues from the development project(s) funded by the loan(s) or by tax revenues reaped from taxing the profits of the project(s). Of course, the problem was that development loans made by big bureaucratic banks to big bureaucratic governments in Third World nations were usually subverted to benefit leaders in the target countries or their cronies. This meant that there were usually no business revenues or tax revenues left from which to repay the loans.

Ordinarily, that would leave the originating banks high and dry, along with the developers of the failed investment projects. “Ordinarily” means “in the context of a free market, where lenders and borrowers must suffer the consequences of their own actions.” But the last thing Wall Street banks wanted was to get their just deserts. They influenced their colleagues at the IMF and the World Bank to act as their collection agents. The agencies took off their “economic development loan broker” hats and put on one of their other hats; namely, their “international economics expert advisor” hat. They advised the debtor country how to extricate itself from the mess that the non-performing loan – the same one that they had collected fees for arranging in the first place – had got it into. Does this sound like a conflict of interest? Remember that these agencies were making money coming and going, so they had a powerful incentive to maintain the process by keeping the banks happy – or at least solvent.

Clearly, the Third World debtor country would have to scare up additional revenue with which to pay the loan. One possible way would be to divert revenue from other spending. But the agency economists were Keynesians to the marrow of their bones. They believed that government spending was stimulative to the economy and increased real income and employment via the fabled “multiplier effect,” in which unused resources were employed by the projects on which the government funds were spent. So, the last thing they were willing to advise was a diversion of spending away from the government and into repayment of debt. On the other hand, they were willing to advise Third World countries to acquire money to spend through taxation. If government were to raise $X in taxes and spend those $X, the net effect would not be a wash – it would be to increase real income by $X. Why? Because taxation takes not only money that private citizens would otherwise spend but also money that they would otherwise save, so private spending falls by less than $X; when the government then spends the entire $X of tax revenue, total spending rises on net – or so went the Keynesian thinking. One of Keynes’ most famous students, Nicholas Kaldor, later to become Lord Kaldor in Great Britain, complained in a famous 1950s article: “When will underdeveloped nations learn to tax?”
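The arithmetic behind that Keynesian claim is the textbook “balanced-budget multiplier.” A minimal sketch, using the standard simple consumption function (this is the generic classroom model, not anything specific to the development-agency episode):

```latex
% Simple Keynesian income model with marginal propensity to consume 0 < c < 1:
%   Y = C + I + G,   where   C = c\,(Y - T)
% Raise spending and taxes by the same amount, \Delta G = \Delta T = X:
\Delta Y
  = \underbrace{\frac{1}{1-c}\,X}_{\text{spending multiplier}}
  \;-\; \underbrace{\frac{c}{1-c}\,X}_{\text{tax multiplier}}
  = \frac{1-c}{1-c}\,X
  = X
```

Because only the fraction $c$ of each taxed dollar would have been spent privately, the tax drag is smaller than the spending boost, and the two multipliers net out to exactly $X$ in this model. That is the reasoning the agency economists relied on when they ruled out spending cuts and prescribed tax increases.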

Thus, the development agencies kept a clear conscience when they advised their Third World clients to raise taxes in order to repay the debt incurred to Western banks. Not surprisingly, this policy advice was not popular with the populations of those countries. That policy acquired the descriptive title of “austerity.” Viewing it from a microeconomic or individual perspective, it is not hard to see why. By definition, a tax is an involuntary exaction that reduces the current or future consumption of the vict-…, er, the taxpayer. The taxpayer gains from it if, and only if, the proceeds are spent so as to more-than-compensate for the loss of that consumption and/or saving. Well, in this case, Third World taxpayers were being asked to repay loans for projects that failed to produce valuable output in the first place and did not produce the advertised gains in employment either. A double whammy – no wonder they called it “austerity!”

How austere were these development-agency recommendations? In Wealth and Poverty (1981), George Gilder offers one contemporary snapshot. “The once-solid economy of Turkey, for example, by 1980 was struggling under a 55 percent [tax] rate applying at incomes of $1,600 and a 68 percent rate incurred at just under $14,000, while the International Monetary Fund (IMF) urged new ‘austerity’ programs of devaluation and taxes as a condition for further loans.” Note Gilder’s wording; the word “austerity” was deliberately chosen by the development-agency economists themselves.

“This problem is also widespread in Latin America,” noted Gilder. Indeed, as the 1970s stretched into the 80s and 90s, the problem worsened. “[Economic] growth in Africa, Latin America, Eastern Europe, the Middle East and North Africa went into reverse in the 1980s and 1990s,” onetime IMF economist William Easterly recounted sadly in The Elusive Quest for Growth (2001). “The 1983 World Development Report of the World Bank projected a ‘central case’ annual percentage per-capita growth in the developing countries from 1982 to 1995,” but “the actual per-capita growth would turn out to be close to zero.”

Perhaps the best explanation of the effect of taxes on economic growth was provided by journalist Jude Wanniski in The Way the World Works (1978). A lengthy chapter is devoted to the Third World debt crisis and the austerity policies pushed by the development agencies.

Two key principles emerge from this historical example. First, today’s knee-jerk presumption that government spending is always good, always wealth enhancing, always productive of higher levels of employment depends critically on the validity of the multiplier principle. Second, the original definition of austerity was painful increases in taxation, not decreases in government spending. And it was left-wing Keynesians themselves who were its practitioners, and who ruled out government spending decreases in favor of tax increases.

Fast Forward

Fast forward to the present day. Since the 1970s, the worldwide experience with taxes has been so unfavorable – and the devotion to lower taxes has become so ingrained – that virtually nobody outside of Scandinavia will swallow a regime of higher taxes nowadays.

Keynesian economics, thoroughly discredited not only by its disastrous economic development policy failures but also by the runaway inflation it started but could not stop in the 1970s, has emerged from under the earth like a protagonist in a George Romero movie. Its devotees still preach the gospel of stimulative government spending and high taxes. But they stress the former and downplay the latter. And, instead of embracing their former program of austerity as the means of overcoming debt, they now accuse their political opponents of practicing it. They have effected this turnabout by redefining the concept of austerity. They now define it as “slashing government spending.”

The full quotation from the Paul Krugman column quoted earlier was: “At this point, the economic case for austerity – for slashing government spending even in the face of a weak economy – has collapsed.” Notice that Krugman says nothing about taxes even though that was a defining characteristic of austerity as pioneered by development-agency Keynesians of his youth. (Krugman does not neglect devaluation, the other linchpin, since he advocates printing many more trillions of dollars than even Ben Bernanke has done so far.)

When Krugman’s Keynesian colleagues originated the policy of austerity, they did it with malice aforethought – using the term themselves while fully recognizing that the high-tax policies would inflict pain on recipients. Now Krugman projects this same attitude on his political opponents by claiming that not only does reduced government spending have harmful effects on real income and employment, but that Republicans will it so. The Republicans, then, are both evil and stupid. Republicans are evil because they “have long followed a strategy of ‘starving the beast,’ slashing taxes so as to deprive the government of the revenue it needs to pay for popular programs.” They are stupid because their reluctance “to run deficits in times of economic crisis” is based on the premise that “politicians won’t do the right thing and pay down the debt in good times.” And, wouldn’t you know, the politicians who refuse to pay down the debt are the Republicans themselves. The Republicans are “a fiscal version of the classic definition of chutzpah…killing your parents, then demanding sympathy because you’re an orphan.”

But the real analytical point is that Krugman, and Democrats in general, are exhibiting the chutzpah. They have taken a policy term originated and openly embraced not merely by Democrats, but by Keynesian Democrats exactly like Krugman himself. They have imputed that policy to Republicans, who would never adopt this Democrat policy tool because its central tenet is excruciatingly high taxes. They have correctly accused Republicans of wanting to reduce government spending but wrongly associated that action with austerity in spite of the fact that their Keynesian Democrat forebears did not include it in the original austerity doctrine.

Why have they done this? For no better reason than that they oppose the Republicans politically. Psychology recognizes a behavior called “projection,” the imputing of a detested personal trait or characteristic to others. Having first developed the policy of austerity in the late 1970s and seen its disastrous consequences, Democrats now project its advocacy on their hated Republican opponents. In Krugman’s case, there are compelling reasons to suspect a psychological root cause for his behavior. His ancillary comments reveal an alarming propensity to ignore reality.

Paul Krugman’s Flight from Reality

In the quoted column alone, Krugman makes numerous factual claims that are so clearly and demonstrably untrue as to suggest a basis in abnormal psychology. Pending a full psychiatric review, we can only compare his statements with the factual record.

“In the United States, government spending programs designed to boost the economy are in fact rare – FDR’s New Deal and President Barack Obama’s much smaller recovery act are the only big examples.” Robert Samuelson’s recent book The Great Inflation and Its Aftermath (2008) covers in detail the growth and history of Keynesian economics in the U.S. In 1965, Time Magazine featured Keynes on its cover to promote a story conjecturing that Keynesian economics had ended the business cycle. Samuelson followed Keynesian economics and such luminaries as Council of Economic Advisers Chairman Walter Heller and Nobel Laureates Paul Samuelson and James Tobin through the Kennedy, Johnson, Carter and Reagan administrations. One of his major theses was precisely that Keynesian economists produced the stagflation of the 1970s by refusing to stop deficit spending and excessive money creation – a view that helped to discredit Keynesianism in the 1980s. There can be no doubt that U.S. economic policy was dominated by Keynesian policies “designed to boost the economy” throughout the 1960s and 1970s.

Moreover, every macroeconomics textbook from the 1950s forward taught the concept of “automatic stabilizers” – government programs in which spending was designed to automatically increase when the level of economic activity declined. These certainly qualify as “big” in terms of their omnipresence, although since Krugman is an inflationist in every way he might deny their bigness in some quantitative sense. But they are certainly government spending programs, they are certainly designed to boost the economy and they are certainly continually operative – which makes Krugman’s statement still more bizarre.

“So the whole notion of perma-stimulus is a fantasy… Still, even if you don’t believe that stimulus is forever, Keynesian economics says not just that you should run deficits in bad times, but that you should pay down debt in good times.” The U.S. government has had one true budget surplus since 1961, bequeathed by the Johnson administration to President Nixon in 1969. (The accounting surpluses during the Clinton administration years of 1998-2001 are suspect due to borrowing from numerous off-budget government agencies like Social Security.) This amply supports the contention that politicians will not balance the budget cyclically, let alone annually. European economies are on the verge of collapse due to sovereign debt held by their banking systems and to the inexorable downward drift of productivity caused by their welfare-state spending. Krugman’s tone and tenor imply that “Keynesian economics” should be given the same weight as a doctor prescribing an antibiotic – a proven therapy backed by solid research and years of favorable results. Yet the history of Keynesian economics is that of a discredited theory whose repeated practical application has failed to live up to its billing. Now Krugman is in a positive snit because we don’t blindly take it on faith that the theory will work as advertised for the first time and that politicians will behave as advertised for the first time. If nothing else, one would expect a rational economist to display humility when arguing the Keynesian case – as Keynesians did when repenting their sins in favor of a greatly revised “New Keynesian Economics” during the mid-1980s.

“Unemployment benefits have fluctuated up and down with the business cycle and as a percentage of GDP they are barely half what they were at their recent peak.” Unemployment benefits have “fluctuated” up to 99 weeks during the Great Recession because Congress kept extending them. The rational Krugman knows that his fellow economists have debated whether these extensions have caused people to stop looking for work and instead rely on unemployment benefits. Robert Barro says they have, and finds that the extensions have added about two percentage points to the unemployment rate. Keynesian economists demur, claiming instead that the addition is more like 0.4 percentage points. In other words, the profession is not arguing about whether the extensions increase unemployment, only about how much. Meanwhile, Krugman is in his own world, pacing the pavement and mumbling “up and down, up and down – they’re only half what they were at their highest point when you measure them as a percentage of GDP!”

“Food stamp use is still rising thanks to a still-terrible labor market, but historical experience suggests that it too will fall sharply if and when the economy really recovers.” Food stamp (SNAP) use has steadily risen to nearly 48 million Americans. Even during the pre-recession years 2000-2008, food-stamp use rose by about 60%. Thus, the growth of the program has far outpaced growth in the rate of poverty. The Obama administration has bent over backward to liberalize criteria for qualification, allowing even high-wealth, low-income households into the program. This does not depict a temporary program whose enrollment fluctuates up and down with economic change, but rather a tightening vise of dependency.

Krugman’s picture of a “still-terrible labor market” cannot be reconciled with his claim that government spending is an effective counter-cyclical tool. If Krugman’s reaction to the anemic response to the Obama administration economic stimulus is a demand for much higher spending, he will presumably pull out that get-out-of-jail-free card no matter what the effects of a spending program are. Why would much higher spending work when the actual amount failed? Krugman makes no theoretical case and cites no historical examples to support his claim – presumably because there are none. Governments need no urging to spend money – European governments are collapsing like dominoes from doing exactly that. European unemployment has lingered in double digits for years despite heavy government spending, recent complaints about “austerity” to the contrary notwithstanding.

“The disastrous turn toward austerity has destroyed many jobs and ruined many lives. And it’s time for a U-turn.” Keep in mind that Krugman’s notion of “austerity” is reduced government spending but not higher taxes. This means that he is claiming that taxes have not gone up – when they have. And he is claiming that government spending has gone down, presumably by a lot since it has “destroyed many jobs and ruined many lives.” But government spending has not gone down; only a trivial reduction in the rate of growth of government spending has occurred during the first four and one-half months of 2013.

“Yet calls for a reversal of the destructive turn toward austerity are still having a hard time getting through.” Krugman’s rhetoric implies that Keynesian economics is a sound, sane voice that cannot be heard above the impenetrable din created by right-wing Republican voices. As a rational Krugman well knows, the mainstream news media has long been completely dominated by the Left wing. (It is the Right wing that should be complaining because the public is unfamiliar with the course of economic research over the last 40 years and the mainstream news media has done nothing to educate them on the subject.) Its day-to-day vocabulary is permeated with Keynesian jargon like “multiplier” and “automatic stabilizers.” The rhetorical advantage lies with Democrats and Keynesians. It is practical reality that has let them down. The economics profession conducted an unprecedented forty-five year research program on Keynesian economics. Its obsession with macroeconomics led to a serious neglect of microeconomics in university research throughout the 40s, 50s and 60s. By approximately 1980, the verdict was in. Keynesian economics was theoretically discredited, although its theoretical superstructure was retained in government and academia. Even textbooks were eventually revised to debunk the Keynesian debunking of Classical economics. Macroeconomic policy tools were retained not because free markets were inherently flawed but because policy was ostensibly a faster way to return to “full employment” than by relying on the slower adjustment processes of the market. The reaction to recent “stimulus” programs has demonstrated that even that modest macroeconomic aim is too ambitious.

Keynesian economics has had no trouble getting a hearing. It has had the longest, fairest hearing in the history of the social sciences. The verdict is in. And Krugman stands in the jury box, screaming that he has been framed by conservative Republicans as the bailiffs try to remove him from the courtroom.

Memory records no comparable flight from reality by a prominent economist.

DRI-280 for week of 11-11-12: Restaurant-Dish Takeaway and Comparative Economic Systems


An Access Advertising EconBrief:

 Restaurant-Dish Takeaway and Comparative Economic Systems

You are eating dinner in a casual restaurant with a spouse. No sooner does the last forkful of food ascend toward your mouth than your waiter whisks away the plate. His request for permission – “Done with that?” – is purely a formality since the plate is gone before you can object.

You have observed a tendency in recent years for restaurant servers to remove dishes with increasing alacrity. You remark on this to your dinner companion who, unlike you, is a non-economist. Her all-purpose explanation of human behavior is binary: Is the object of study a nice guy or not? Nice guys remove dishes quickly so diners have more elbow room to relax.

You are an economist. You believe people act purposefully to achieve their ends. Moreover, you are thoroughly acquainted with tradeoffs. You have often had waiters take your plate before you were through with it. Some people bristle when they perceive others constantly hovering over them. There are even those – not you, of course, but boors and gluttons – who eat the food of others after finishing their own. One of these types might just react by snatching back his plate and declaring, a la John Paul Jones, “I have not yet begun to eat!”

The “nice-guy” explanation won’t suffice, since the quick-takeaway approach will suit many people well but others poorly. Restaurants that follow a consistent policy of quick takeaway risk offending some customers. Offending customers is not something restaurants do lightly. In order to make this risk worthwhile, there should be some strong motivation in the form of a compensating prospect of gain. What might that be?

One way to define economists is to say that they are the kind of people who ask themselves questions like this. And the mark of a good economist is that he can supply not only answers but also further implications and ramifications for social life and government policy.

The Economics of Restaurant Service

Americans have eaten in restaurants ever since America became the United States and before that. While the basic concepts underlying the restaurant sector have remained intact, structural changes have remade the industry in recent decades. The most important contributor has been the institution of franchising.

Fast-service franchising began in the 1920s with A&W root-beer stands and Howard Johnson motel-restaurants. Baskin Robbins, Dairy Queen and Tastee Freeze hopped on the bandwagon in the 1930s and 40s. McDonald’s became big business in the 1950s; Subway followed in the 1960s. The decade of the 1960s saw restaurant franchises zoom to over 100,000 in number. After overcoming legal challenges posed by antitrust and the economic threat of OPEC in the 70s, franchising became the dominant form of restaurant business organization in the 1980s.

Franchising enlarged markets and made competitive entry easier. By standardizing both product and service, it made restaurant operation easier. It raised the stakes involved in success and failure. All these increased the intensity of competition. In turn, this shone the spotlight on even the minutest aspects of restaurant operation. Franchises and food groups ran schools in which they taught their franchisees and managers the fundamentals of restaurant success. Managers went out on their own to put those principles into practice. The level of professional operation ratcheted upward throughout the industry.

The word “professional” means numerous things, but in context it refers to the rigorous, even relentless application of restaurant practices single-mindedly aimed at achieving profitable operation. This entails developing a repeat-customer base and making the largest profit possible from serving that base.

Whether the quality of all types of restaurant food improved is open to debate, but it cannot be doubted that average quality rose. Today, the “greasy spoons” of yesteryear are nearly as scarce as passenger pigeons.

It was during this period of franchise domination that the practice of quick takeaway gained widespread currency. Maximizing the daily turnover of the given restaurant capacity is a commandment in the operations bible for profit-maximization. Minimizing the time between the departure of one set of guests and the arrival of their successors at each table is one way to maximize turnover. One way to reduce the time taken by clearing tables at meal’s completion is to begin the process before departure rather than waiting until the guests get up to leave; that way, fewer dishes remain to remove upon actual departure.

Fast removal of dishes not only maximizes turnover, it also maximizes the revenue take from each separate turnover. From the restaurant owner’s perspective, maximizing the size of each table’s check is another step toward maximizing total profit. After-dinner items like coffee and dessert are the obvious route to that goal. (Alcoholic drinks are the before-dinner complement of this strategy, which is why attainment of a liquor license is a coveted goal for most restaurants.) Quick takeaway aids this strategy in two ways. First, it speeds the transition from dinner to dessert. Second, it aids the server, who is in no position to handle dish removal when arriving at the table laden with desserts.
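The turnover arithmetic described above can be made concrete with a toy calculation. Every number below (a four-hour dinner service, a $40 average check, the specific service and clearing times) is a hypothetical assumption chosen for illustration, not a figure from the article or the industry:

```python
# Toy model of the turnover logic: a table's nightly revenue is
# (number of parties seated) x (average check per party), and the number
# of parties depends on how fast the table can be cleared and re-seated.
# All parameter values are hypothetical.

def turnovers_per_evening(service_minutes, clearing_minutes, evening_minutes=240):
    """How many full parties one table can seat during dinner service."""
    return evening_minutes // (service_minutes + clearing_minutes)

def table_revenue(service_minutes, clearing_minutes, avg_check=40.0):
    """Nightly revenue from one table under the assumed average check."""
    return turnovers_per_evening(service_minutes, clearing_minutes) * avg_check

# Clearing dishes during the meal shrinks the dead time between parties:
slow = table_revenue(service_minutes=60, clearing_minutes=30)  # 2 turns -> $80
fast = table_revenue(service_minutes=60, clearing_minutes=10)  # 3 turns -> $120
```

On these assumptions, trimming clearing time from 30 minutes to 10 buys the table a third seating and raises its nightly take from $80 to $120, which is precisely the owner's incentive for quick takeaway that the section describes.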

“Quick takeaway” has been standard practice throughout most of the industry for quite a while, though. This doesn’t account for a recent speedup. For that, look deeper into the details of restaurant operation.

Table Size, Takeaway and… Demographic Trends?

Concomitant with the trend toward faster takeaway, the economist has also observed a trend toward smaller tables and booths in casual restaurants. Tables, chairs and booths come in standard sizes (there are five different booth sizes, for example), but the observed trend has been toward more booths designed to accommodate two people. Greater usage has been made of bar areas to provide food service, wherein diners can often obtain quicker service at the cost of table space and chairs limited to two people.

To understand the rationale for this changeover, pretend for a moment that all of the restaurant’s patronage consists of parties of two. Larger tables and booths would waste space and unnecessarily limit revenue per turnover, whereas designing for two would maximize the number of people served (and revenue collected) from an individual full-house turnover.

The link between table size and quick takeaway is obvious. Smaller table and booth sizes leave less room to accommodate elbows, books, newspapers, miscellaneous articles – not to mention additional dishes like dessert. (Technically, a smaller table doesn’t mean less room per person, but the whole idea behind the move to smaller tables is to achieve better utilization of capacity – the result leaves much less unused space available than did the larger tables and booths.) Now servers have even more reason to get those vacated dishes moving back to the kitchen, since there was barely room for them on the table to begin with. This reinforces the preexisting motivation for fast table-clearing and enlists the diners’ sympathy on the side of management, since table-crowding has become all too obvious.

There is still one major link left out of the chain of reasoning. In practice, restaurant parties do not consist entirely of twosomes. Casual restaurants usually include a few larger tables and/or booths, but what is to prevent larger parties from dominating smaller ones in the great scheme of things?

The last four decades have seen an increasing demographic trend toward smaller U.S. household size. In 1970, the average U.S. household comprised 3.1 people. By 2000, this had fallen to 2.62; by 2007, to 2.6; and by 2010, to 2.59.

Several forces drove this trend. First has been a shrinking birthrate. Here the U.S. is merely following the lead of other Western industrialized nations, which have seen shrinking birthrates throughout the 20th century. In the U.S., the shrinkage has waxed and waned since the 1930s. The 1990s saw a modest resurgence, and the U.S. fertility rate barely struggled above 2.0 births per woman early in the millennium. That is roughly the replacement point – the level at which births and deaths counterbalance. As noted by leading demographer Ben Wattenberg and others, the large influx of Hispanic immigrants in recent decades undoubtedly spearheaded this comeback. Hispanics tend to be Catholic, fecund and pro-life. But since 2007, the rate has backslid down to 1.9; even the Hispanics seem to have assimilated the American cultural indifference to reproduction.

Other cultural forces have reinforced demography. Birth control has become omnipresent and routine. Divorce and illegitimacy have lost their stigma, thereby conducing to households containing only one parent. Whereas formerly it was commonplace for two men or two women to room together and share expenses, the legal status granted to homosexual partnerships has now placed a question mark around those arrangements. (This applies particularly to males; apparently the politically correct status conferred upon homosexuals does not much reassure two heterosexual men who contemplate cohabitation.) Indeed, it is today less socially questionable for unmarried male/female couples to live together than for same-sex couples – but this is practical only as a substitute for marriage, so its effect on household size is negligible.

The aggregate effect of this cultural attrition has been nearly as potent as the declining birthrate. In 1970, the fraction of households containing one person living alone was 17%. By 2007, it had risen to 27%.

Given this trend toward declining household size, we would expect to see a corresponding decline in the average size of parties at casual restaurants. After all, households (particularly adults) typically dine together rather than separately. Certainly, large groups do assemble on special occasions and regular get-togethers. But the overall trend should follow this declining pattern.

And there you have it. Smaller average household size produces smaller restaurant table and booth size, which in turn produces quick – or rather, quicker – takeaway of dishes at or before meal completion.

Many people instinctively reject this kind of analysis because they can’t picture most restaurant owners and employees thinking this deeply about such minute details or putting their plans into practice. But the foregoing analysis doesn’t necessarily assume that all restaurant owners and managers are this single-minded and obsessive. In a hotly competitive environment, the restaurants that survive and thrive will be those that do take this attitude. They will attract more business – thus, the odds of encountering smaller tables and quick takeaway will be greater even though those practices may not be uniform across the industry. Indeed, this reasoning supports the very notion of profit maximization itself. This survivorship principle was pioneered by the great economist Armen Alchian.

The Larger Meaning of Little Details

Economics is capable of supplying answers to life’s quaint little questions. (Some people would rearrange the wording of that sentence to “quaint little answers to life’s questions.”) But economics was developed to tackle bigger issues. It turns out that the little questions bear on the big ones.

One of the big questions economists ask about the behavior of business firms is: Is it socially beneficial? Business firms exist because, and to the extent that, they produce goods and services cheaper and better than individual households can. The gauge of success is the welfare of consumers.

Smaller tables and quick takeaway enable restaurants to achieve better capacity utilization. This enables them to cut costs and serve more customers. These are beneficial to consumers. The more intense competition serves to lower prices of restaurant food. This also benefits consumers.
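A toy calculation (with hypothetical table counts, not data from any actual restaurant) illustrates the capacity-utilization point: when most parties are couples, a floor of two-seat tables turns over far more diners than the same floor space devoted to four-seat tables.

```python
# Hypothetical illustration: the same floor area holds either ten
# 4-seat tables or twenty 2-seat tables.  Each table hosts at most
# one party per seating; parties here are assumed to be couples.

def diners_served(tables, seats_per_table, party_size=2):
    """Diners served in one seating, one party per table."""
    return tables * min(party_size, seats_per_table)

big_tables = diners_served(tables=10, seats_per_table=4)    # 20 diners
small_tables = diners_served(tables=20, seats_per_table=2)  # 40 diners
print(big_tables, small_tables)
```

With parties of two, the four-seat layout leaves half its seats empty every seating; the two-seat layout wastes none.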

What about the quality of food served? Table size and dish removal do not bear directly on this question, but the industry shift towards corporate control and franchised ownership has sometimes been blamed for a supposed decline in overall food quality. This hypothesis overlooks the analytical nose on its face – the fact that consumers themselves are the only possible judges of quality. Even if we assume that average quality has fallen, we have no basis for second-guessing the willingness of consumers to trade off lower quality for lower price and greater quantity. This is the same sort of tradeoff we make in every other sphere of consumption – housing, clothing, entertainment, medical care, ad infinitum.

The Left wing has recently developed a variation on its theme of corporate malignity in production and distribution of food. Corporations are destroying the health of their customers by purveying food containing too much sugar, salt, fat and taste. Only stringent government regulation of restaurant operations can hope to counteract the otherwise-irresistible lure of corporate advertising and junk food.

This hypothesis is not merely wrongheaded but wrong on the facts. Consumers have every right to trade off lower longevity for heightened enjoyment of life. This is something people often do in non-nutritive contexts such as athletics, extreme leisure pursuits like hang-gliding or public-service activities like missionary work. History indicates that, far from promoting public health, government has aided and abetted the increased incidence of type-II diabetes through wrong-headed dietary insistence on carbohydrate consumption as the foundational building block of nutrition.

Any objective appraisal must recognize that nowhere on earth can consumers find such abundance and diversity of cuisine as in the United States of America. World cuisine is amply represented even in mid-size metropolitan markets like Kansas City, Missouri and Sioux City, Iowa. There is no taste left unfulfilled – even the esoteric insistence on vegetarian meals, organic cultivation and free-range animal raising.

Restaurant Regulation

In order to appreciate the operation of a free market for restaurant meals, we need to dial down our level of abstraction and conduct a comparative-systems analysis. Heretofore we have conducted an imaginative exercise: we have explained a piece of restaurant operations under free-market competition. Now we need to envision how that piece would work under an alternative system like socialism.

In a socialist system, public ownership of the means of production dictates thoroughgoing, top-down regulation of business practice. For example, a regulator will pose the questions: How many booths and tables should the restaurant have? How big should they be? How far apart should they be spaced? How many people should we allow the restaurant to serve and how many should be allowed to sit at each table and booth?

In a socialist system, a regulator or group of them will ask these questions in a centralized fashion. That is, he will ask them for a large grouping of restaurants – perhaps all restaurants, or perhaps all fast-service restaurants, all bar-restaurants, all casual sit-down restaurants and all fine-dining restaurants separately. Or perhaps regulators will choose to group the restaurant industry differently. But group it they will, and regulate each group on a one-size-fits-all basis.

How will the regulator decide what regulations to impose? He will have government statistics at his disposal, such as the information cited above on average household size. It will be up to him to decide which information is relevant and how to apply the aggregate or collective information that governments collect to each individual restaurant being regulated. Even in the wildly unlikely instance that a regulator could actually visit each regulated restaurant, that could hardly happen more than once per year.

As we have just seen, free markets don’t work that way. One of the most misleading of popular perceptions is that free markets are “unregulated.” In reality, they are subject to the most stringent regulation of all – that of competition. But because the regulation part of competition works invisibly, people seem to miss its importance completely.

Instead of waiting for a central authority to certify its product as tasty and wholesome, markets supply their own verdict. Consumers try it for themselves. They ask their friends or take note when opinions are volunteered. They seek out reviews in newspapers, online and on television. When the verdict is unfavorable, bad news travels fast. This applies even more strongly to the aspect of health, by the way. Nothing empties a restaurant quicker than food-borne illness or even the rumor of it – as entrepreneurs know only too well.

In contrast, government health regulation doesn’t move nearly this fast. The cumbersome process of visits by the health inspector, trial-by-checklist followed by re-inspection – a pattern broken only rarely by a shutdown – is a classic example of bureaucracy at work. Political favoritism can affect the choice of inspections and the result. The de facto health inspector is the free market, not the government employee who holds that title.

Competitive regulation is decentralized. In our restaurant example, decisions about table size and restaurant takeaway are not made by a far-off government authority and applied uniformly. They are made on the spot, at each restaurant, on a day-by-day basis. Restaurant owners and managers may possibly have the same government-collected information available to regulators, although it seems likely that they will be too busy to spend much time evaluating it. More to the point, though, they will have what the late Nobel laureate F. A. Hayek called “the knowledge of the particular circumstances of time and place.” That is the time- and place-specific information about each particular restaurant that only its owner and managers can mobilize.

Merely because average household size has fallen across the U.S. does not mean that households in each and every individual neighborhood are smaller. It may be the case, for example, that in Hispanic neighborhoods – not gripped by declining birthrates or an epidemic of divorce – average household size has not fallen as it mostly has elsewhere. Those restaurants would not feel the urge to decrease table size and speed up dish collections in line with most restaurants. And well they shouldn’t, since they would serve their particular customers better by not blindly playing follow-the-leader with national trends.

Would centralized regulators pick up on this distinction? No – they would have to be clairvoyant to sort out the kind of exceptions that markets automatically catch. After all, their aggregate statistics simply do not sift the data finely enough to make individual distinctions and differences visible.

But decentralized markets make those individual differences keenly felt by the people most affected. For restaurants, variations in consumer preference are felt by the very people who serve the consumer groups. Changes in demographic trends are witnessed by those whose very livelihoods are at stake. Competitive regulation works because it is on the spot, informed by the exact information needed and directed by the very people – on both sides of the market – with the motivation and expertise needed to make it effective.

Free markets allow participants to collect, disperse and heed information from any source, but do not force people to respond to it. They do, however, provide incentives to respond proportionately to the magnitude of the information provided. A huge disruption of the supply of something will produce a big increase in price, suggesting to people that they reduce their consumption of that good a lot. A small decrease in a good’s price will offer a gentle inducement to increase consumption of it, but not to go hog wild.
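The proportionality of that price signal can be sketched with a stylized linear demand curve (hypothetical parameters, purely for illustration):

```python
# Stylized linear demand (hypothetical): quantity demanded Q = 100 - 2P,
# so the market-clearing price for a given supply S is P = (100 - S) / 2.

def clearing_price(supply):
    """Price at which demand Q = 100 - 2P equals the given supply."""
    return (100 - supply) / 2

baseline = clearing_price(60)   # normal supply
small_cut = clearing_price(55)  # small disruption: 5 units lost
big_cut = clearing_price(30)    # huge disruption: 30 units lost
print(baseline, small_cut, big_cut)
```

The small disruption nudges the price up modestly, while the large one raises it sharply – the size of the signal tracks the size of the shock.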

Again and again, we find ourselves saying that free markets nudge people in the right direction – towards doing the thing we would want done if we could somehow magically observe all economic activity and direct it by waving a magic wand. Economists laconically define this quality as being “efficient.”

Restaurant Economics and Rational Behavior

This object lesson in restaurant economics reminds us of a perceptive argument for free markets put forward by Hayek. He was responding to longtime arguments put forth by critics on the Left. The same arguments have recently reechoed following the housing bubble, financial crisis and ensuing Great Recession. Free markets may be logical, the critics concede, but only if people are rational. Since people behave irrationally, free markets must fail in practice, however well grounded their principles might be.

Hayek observed that the critics had it backwards. Markets do not require rational behavior by participants in order to function. Instead, markets encourage rational behavior by rewarding those who act rationally and penalizing those who do not. The history of mankind reveals a gradual movement towards more rational behavior; the widely noted reduction in the incidence of warfare is one noteworthy example of this.

The Audience Responds With a Burst of Applause

Can you imagine a nobler progression from the trivially mundane to the globally significant? That is what economists do.

And, by way of gratitude for this insight, your dinner companion rewards you by inquiring: “OK, now explain why restaurants are so stingy with the butter these days.”