DRI-192 for week of 5-24-15: Why Incremental Reform of Government Is a Waste of Time

An Access Advertising EconBrief:

Why Incremental Reform of Government Is a Waste of Time

Any adult American who follows politics has seen it, heard it and read it ad infinitum. A person of prominence proposes to reform government. The reform is supposed to “make government work better.” Nothing earthshaking, understand, just something to improve the dreadful state that confronts us. And if there’s one thing that everybody agrees on, it’s that government is a mess.

Newspapers turn them out by the gross – it’s one of the few things that newspapers still publish in bulk. They can be found virtually every day in opinion sections. Let’s look at a brand-spanking new one, bright and shiny, just off the op-ed assembly line. It appeared in The Wall Street Journal (5/27/2015). The two authors are a former governor of Michigan (John Engler) and the current president of North America’s Building Trades Unions (Sean McGarvey). The title – “It’s Amazing Anything Ever Gets Built” – aptly expresses the current level of exasperation with day-to-day government.

The authors think that infrastructure in America – “airports, power plants and factories” are cited specifically – is absurdly difficult to build, improve and replace. The difficulty, they feel, is mostly in acquiring government permission to proceed. “The permitting process for infrastructure projects… is burdensome, slow and inconsistent.” Why? “Gaining approval to build a new bridge or factory typically involves review by multiple federal agencies – such as the Environmental Protection Agency, the U.S. Forest Service, the Interior Department, the U.S. Army Corps of Engineers and the Bureau of Land Management – with overlapping jurisdictions and no real deadlines. Often, no single federal entity is responsible for managing the process. Even after a project is granted permits, lawsuits can hold things up for years – or, worse, halt a half-completed construction project.”

Gracious. These are men with impressive-sounding titles and prestigious resumes. They traffic in the measured prose of editorialists rather than the adjective-strewn rhetoric of alarmists. And their language seems all the more reasonable for its careful wording and conclusions. Naturally, having taken good care to gain the reader’s attention, they now hold it with an example: “The $3 billion TransWest Express [is] a multi-state power line that would bring upward of 3,000 megawatts of wind-generated electricity from Wyoming to about 1.8 million homes and businesses from Las Vegas to San Diego. The project delivers on two of President Obama’s priorities, renewable power and job creation, so the administration in October 2011 named [it] one of seven transmission projects to ‘quickly advance’ through federal permitting.”

You guessed it; the TransWest Express “has languished under federal review since 2007.” That’s eight (count ’em) years for a project that the Obama administration favors; we can all imagine how less well-regarded projects are doing, can’t we? In fact, we don’t have to use our imaginations, since we have the example of the Keystone XL Pipeline before us.

Last month, the Bureau of Land Management pronounced the ink dry on an environmental-impact statement well done. That left only the EPA, the Federal Highway Administration, the Corps of Engineers, the Forest Service, the National Park Service, the Bureau of Reclamation, the U.S. Fish and Wildlife Service (!) and the Bureau of Indian Affairs (!!) to be heard from. At the rate these agencies are careening through the approval process, the TransWest Express should come online about the time that the world supply of fossil fuels is entirely extinguished – a case of exquisitely timed federal permitting.

According to Messrs. Engler and McGarvey, the worst thing about this egregious case study in federal-government overreach is that it leaves “thousands of skilled craft construction workers [to] sit on their hands.” Apparently, the Obama administration was in general agreement with this line of thought, because “President Obama’s Jobs Council examined how other countries expedite the approval of large projects” and its gaze fell upon Australia.

“Australia used to be plagued with overlapping layers of regulatory jurisdiction that resemble the current regulatory structure in the U.S.” before it installed the type of reform that the two authors are laying before us. The Australian state of New South Wales “now prioritizes permit applications based on their potential economic impact, and agreements among various reviewing agencies ensure that projects are subject to a single set of requirements.” As a result of this sunburst of reformist illumination, “permitting times have shrunk… from a once-typical 249 days to 134 days.”

Mind you, that was the President’s Jobs Council talking, not the authors. And the President, listening intently, created an “interagency council… dedicated to streamlining the permitting process.” Just to make sure we knew the President wasn’t kidding, “the White House also launched an online dashboard to track the progress of select federal permit applications.”

At this point, readers might envision the two authors reading their op-ed to a live audience consisting of Wall Street Journal readers – who would greet the previous two paragraphs with a few seconds of incredulous silence, followed by gales of hilarious laughter. Doubtless sensing the pregnancy of these passages, the authors follow with some rhetorical throat-clearing: “It has become clear, however, that congressional action is needed to make these improvements permanent and to require meaningful schedules and deadlines for permit review. Fortunately, Sens. Rob Portman (R-Ohio) and Claire McCaskill (D-Mo.) have introduced the Federal Permitting Improvement Act.”

“The bill would require the government to designate a lead agency to manage the review process when permits from multiple agencies are needed. It would establish a new executive office to oversee the speed of permit processing and to maintain the online dashboard that tracks applications.”

“The bill would also impose sensible limits on the subsequent judicial review of permits by reducing the statute of limitations on environmental lawsuits from six years to two years and by requiring courts to weigh potential job losses when considering injunction requests.”

Ah-hah. Let’s summarize this. President Obama, whose world renown for taking unilateral action to achieve his ends was earned by his selective ignoring and rewriting of law, confronted a situation in which two of his administration’s priorities were being thwarted by federal agencies over which he, as the nation’s Chief Executive, wielded administrative power. What action did he take? He turned to a presidential council – a century-old political buck-passing dodge to avoid making a decision. The council proceeded to do a study – another political wheeze that dates back at least to the 19th century and has never failed to waste money while failing to solve the problem at hand. When the study ostensibly uncovered an administrative reform purporting to achieve incremental gains in efficiency, the President (a) “streamlined the process” by telling two of the agencies that were creating the worst problems in the first place to cooperate with each other via an additional layer of bureaucracy (an “interagency council”) and (b) created an “online dashboard” so that we could all watch the ensuing slow-motion failure more closely. All these presidential actions took place in 2011. It is now mid-2015.

And what do our two intrepid authors propose to deal with this metastatic bureaucratic cancer? Congress will point its collective finger at one of the agencies causing the original problem and give it more power by making it “manager” of the review process. (This action implies that the root cause of the problem is that somebody in government doesn’t have enough power.) Of course, the premise that “permits from multiple agencies are needed” is taken completely for granted. Next, Congress would establish still another layer of bureaucracy (the “executive office”) to “oversee” the very problem that is supposedly being solved (i.e., “speed of permit processing”). (This implies that we have uncovered two more root causes of the problem – not enough layers of bureaucracy and not enough oversight exercised by bureaucrats.) A classic means of satisfying everybody in government is getting every branch of government into the act. Accordingly, Congress points its collective finger at “the courts” and tells them to “weigh” job losses when considering requests for injunctions against projects. (The fact that this conflicts with the original “potential economic impact” mandate doesn’t seem to have concerned Congress or, for that matter, Messrs. Engler and McGarvey.) Finally, Congress throws a last glance at this unfolding Titanic scenario and, collective chins resting on fists, rearranges one last deck chair with a four-year reduction in the statute of limitations on environmental lawsuits.

The most amazing thing is not that anything ever gets built, but that these two authors could restrain their own laughter long enough to submit this op-ed for publication. The above summary reads more like a parody submitted for consideration by Saturday Night Live or Penn and Teller.

Two questions zoom, rocket-like, to the reader’s lips upon reading this op-ed and the above summary. What good, if any, could possibly result from this kind of proposal? Why do these proposals pop up with monotonous regularity in public print? The answers to those questions give rise in turn to a third question: What are the elements of a truly effective program for government reform and why has it not emerged?

Why Doesn’t Incremental Reform Work? 

The reform proposed by Messrs. Engler and McGarvey is best characterized as “incremental” because it does not change the structure of government in any fundamental way; it merely tinkers with its operational details. It aims merely to change one small part of the vast federal regulatory apparatus (permitting) by improving one element (its speed of operation) to a noticeable but modest degree (reducing the typical time needed to secure a permit from 249 days to 134 days). And the rhetoric employed by the authors stresses this point – aside from the attention-grabbing headline, they are at pains to emphasize their modest goal as a major selling point of their proposal. They’re not trying to change the world here. “Americans of all stripes know that something is seriously wrong when other advanced countries can build infrastructure faster and more efficiently than the U.S., the country that built the Hoover Dam.” They use words like “bipartisan proposal” and “strengthen the administration’s efforts” rather than heaping ridicule on the blatant hypocrisy and stark contradiction of the Obama administration’s actions. They want to get a bill passed. But do they want actual reform?

Superficially, it seems odd that two authors would propose reform while opposing reform. Yet close inspection confirms that hypothesis not only for this op-ed, but in general. The authors deploy the standard op-ed bureaucratic argle-bargle that we have absorbed by osmosis from thousands of other op-eds – “infrastructure,” “permitting,” “priorities,” “job creation,” “streamline [government] process,” “expedite approval,” “implemented reforms,” “economic impact,” “manage the review process,” “lead agency,” “executive office.” The trouble is that if all this really worked, we wouldn’t be where we are today. The TransWest Express review wouldn’t have begun in 2007 and still be in limbo today. The Obama Administration wouldn’t have started remedial measures in 2011 and still be waiting on them to take effect in 2015. The U.S. wouldn’t be staggering under a cumulative debt load exceeding its GDP. The federal government wouldn’t have unfunded liabilities exceeding $24 trillion. The Western world wouldn’t be supporting a welfare state that is teetering on the brink of collapse.

Who are John Engler and Sean McGarvey? John Engler was formerly the Governor of Michigan. At one time, he was considered the bright hope of the Republican Party. He began by trying to reform state government in Michigan. He failed. Instead, he was co-opted by big government. Detroit went on to declare bankruptcy. John Engler left office and went to work for the Business Roundtable. Business organizations like the Chamber of Commerce exist today for the same reason that other special-interest organizations like La Raza and AARP exist – to secure special government favors for their members and protect them from being skewered by the special favors doled out to other special-interest organizations. Sean McGarvey is President of North America’s Building Trades Unions, a department of the AFL-CIO that performs coordinative, lobbying and “research” (i.e., public-relations) functions. Unions can achieve higher wages for their members only by affecting either the supply of labor or the demand for it. There is precious little they can do to affect the demand for labor, which comes from businesses, not unions. Unions can affect the supply of labor only by reducing it, which they do in various ways. This causes unemployment, which in turn exerts continuous public-relations pressure on unions to support “job creation” measures. But true job creation can come only from the combination of consumer demand and labor productivity, which underlie the economic concept of marginal value productivity of labor.

In the jargon of economics, all these organizations are rent-seekers that seek benefits unobtainable in the marketplace. They represent their members in their capacities as producers or input suppliers, not in their capacities as consumers. In other words, rent-seekers and the op-eds they write structure their pleas for “reform” to raise the prices of goods and inputs supplied by their member/constituents and/or provide jobs to them. Virtually all the op-eds appearing in print are written by rent-seekers striving to shape pseudo-reforms in ways that suit their particular interests.

In the Engler-McGarvey case, there are two possibilities. Possibility 1: The Federal Permitting Improvement Act actually passes Congress and actually achieves the incremental improvement promised. In this wildly unlikely case, Mr. Engler’s business clients benefit from the modest reduction in permitting times. Since the entire wage and hiring process for infrastructure projects – government or otherwise – is grossly biased in favor of union labor, Mr. McGarvey’s clients benefit as well. Possibility 2: As the above Summary suggests, the likelihood of actual incremental improvement is infinitesimal even if the legislation were to pass, since it requires efficient behavior by the same government bureaucracy that has caused the problems requiring reform in the first place. So the chances are that the result of the reform proposal will be nil.

As far as you and I are concerned, this represents a colossal waste of time and money. But for Messrs. Engler and McGarvey, this is not so. They are creatures of government. The next-best alternative to positive benefits for their client-constituents is no change in the status quo. For Mr. Engler, the status quo gives the biggest companies big advantages over smaller competitors. For Mr. McGarvey, the status quo gives unions and union labor big advantages that they cannot begin to earn in the competitive marketplace. Unions have been losing market share steadily in the private sector for many years. But they have been gaining influence and membership in the government sector, which is ruled by legislation and lobbyists.

Op-eds and reform proposals like this one allow people like Mr. Engler and Mr. McGarvey to earn their lucrative salaries as lobbyist and union president/lobbyist, respectively, by sponsoring and promoting pseudo-reform policies whose effects on their client-constituents can be characterized as “heads we win, tails we break even.”

But what about the effects on the rest of us?

What Would Real Reform Require – and Why Don’t We Get It?

A fundamental insight of economics – we might even call it THE fundamental insight – is that consumption is the end-in-view behind all economic activity. All of us are consumers. But this very fact works against us in the realm of big government, because this diffuses the monetary stake each one of us has in any one particular issue as a consumer. A tax on an imported good will raise its price, which rates to be a bad thing for millions of Americans. But because that good forms only a small part of the total consumption of each person, the money it costs him or her will be small. The cost will not be enough to motivate him or her to organize politically against the tax. On the other hand, a worker threatened with losing his or her job to the competition posed by the imported good may have a very large sum of money at stake – or may believe that to be true. The same is true for owners of domestic import-competing firms. Consequently, there are many lobbyists for legislation against imports and almost no lobbyists in favor of free, untaxed international trade. Yet economists know that free international trade will create more happiness, more overall goods and services and almost certainly more jobs than will international trade that is limited by taxes and quotas.
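The arithmetic behind this asymmetry of incentives is easy to sketch. The numbers below are purely hypothetical, chosen only to illustrate why a cost diffused over millions of consumers fails to motivate political organization while the same sum concentrated on a few producers motivates intense lobbying:

```python
# A hypothetical sketch of the "diffuse costs, concentrated benefits"
# arithmetic behind rent-seeking. All dollar figures and group sizes are
# invented for illustration; they are not drawn from any actual tariff.

def per_person_stake(total_dollars, group_size):
    """Average dollars at stake for each member of a group."""
    return total_dollars / group_size

# Suppose a tariff raises consumer prices by $200 million per year in total,
# spread over 100 million consumers, while channeling $150 million per year
# to 10,000 workers and owners in the protected industry.
consumer_stake = per_person_stake(200_000_000, 100_000_000)
producer_stake = per_person_stake(150_000_000, 10_000)

print(f"Per-consumer cost: ${consumer_stake:,.2f} per year")   # $2.00
print(f"Per-producer gain: ${producer_stake:,.2f} per year")   # $15,000.00

# Although consumers lose more in aggregate than producers gain, no single
# consumer has enough at stake to justify the cost of lobbying, while each
# producer has ample incentive to organize against the import.
```

Under these assumed numbers, the tariff destroys more value than it transfers, yet the political pressure all runs one way – which is the point of the paragraph above.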

This explains why so many op-ed writers are rent-seekers and so few argue in favor of economic efficiency. True reform of government would not focus on the aims of rent-seekers. It would not strive to preserve the artificial advantages currently enjoyed by large companies – neither, for that matter, would it seek to preserve the presence of small companies merely for their own sake. True reform would allow businesses to perform their inherent function; namely, to produce the goods and services that consumers value the most. The only way to effect that reform is to remove the artificial influence of government from markets and confine government to its inherent limited role in preventing fraud and coercion.

Based on this evaluation, we might expect to see economists writing op-eds opposing the views of rent-seekers. Instead, this happens only occasionally. Economists are just as keenly attuned to their self-interest as other people. Most economists are employed by government, either directly as government employees or indirectly as teachers in public universities or fellows in research institutions funded by government. At best, these economists will favor the status quo rather than true reform. Only the tiny remnant of economists who work outside government for free-market oriented research organizations can be relied upon to support true reform.

Incremental Reform Vs. Structural Reform 

Incremental reforms are sponsored by rent-seekers. They are designed either to fail or, if they succeed, to yield rents to special interests instead of real reform. Real reform must be pro-consumer in nature. But the costs of organizing consumers are vast. In order to mobilize a reform of that scale, it must offer benefits that are just as vast or greater in size and scope. That means that true reform must be structural rather than incremental. It cannot merely preserve the status quo; it must overturn it.

In other words, true reform must be revolutionary. This does not imply that it must be violent. The reform that overturned Soviet Communism, perhaps the most powerful totalitarian dictatorship in human history, was almost completely non-violent. Admittedly, it had outside help in the form of political and moral support from people like Lech Walesa, Pope John Paul II, British Prime Minister Margaret Thatcher and, most of all, President Ronald Reagan.

As the efforts of the Tea Party have recently demonstrated, pro-consumer reform cannot be “organized” in the mechanistic sense. It can only arise spontaneously because that is the least costly way – and therefore the only feasible way – to achieve it.

We are unlikely to read about such a reform in the public prints because most of them are owned or sponsored by people who have vested interests in big government. These interests are usually financial but may sometimes be purely ideological. Big government may be a means of suppressing competition. It may be a means of subsidizing their enterprise. It may be a means of providing a bailout when digital competition becomes too fierce. In any event, we cannot look to the op-ed pages for leadership of real government reform.

DRI-248 for week of 10-19-14: The Economic Inoculation Against Terrorism

An Access Advertising EconBrief:

The Economic Inoculation Against Terrorism

Last week’s EconBrief analyzed the military adventures undertaken by Great Britain and the United States over the last two centuries and found uncanny and unsettling similarities. In particular, we detected a growing tendency to intervene militarily to settle disputes coupled with a growing distaste for war per se. Given the lack of close substitutes for complete victory in military conflict, this is a disastrous combination.

Both Great Britain and the United States found increasing need to use military force but were increasingly reluctant to apply maximum force with promptness and dispatch. The British dithered when confronted by Islamic fanaticism in the Sudan and ended up suffering the loss of a national hero, vast prestige and the need to intervene finally anyway. The British then faced one revolt after another in southern Africa, Ireland, India and Palestine. In each case, they reacted in measured ways only to be excoriated when finally forced to take stronger action. Ultimately, they abandoned their empire rather than take the actions necessary to preserve it.

Compare the actions taken by Great Britain in India against the passive resistance led by Gandhi with those that would have been taken by, say, a totalitarian nation like Nazi Germany, Soviet Russia or Communist China. The British were repeatedly forced to back down from using force against Gandhi – not by superior force or numbers wielded by the Mahatma but by their own moral qualms about exerting the force necessary to prevail. By contrast, Gandhi would never have gained any public notice, let alone worldwide acclaim, had he lived and operated under the Third Reich. Hitler’s minions would have murdered him long before he rose to public prominence. In Soviet Russia, Gandhi would have earned a bullet in the back of his head and unmarked burial in a mass grave. In Red China, Gandhi would either have undergone re-education or joined the faceless millions on the funeral pyre in tribute to revolutionary communism.

Lawrence in Arabia: Visionary or Myopic Mystic?

Director David Lean’s magnificent film Lawrence of Arabia acquainted the world with the story of British Col. T.E. Lawrence, an obscure officer who seized the opportunity to unite disparate and warring Arab tribes in guerilla warfare against the Turks of the Ottoman Empire during World War I. Playwright Robert Bolt’s screenplay depicts the Arabs as simple, childlike victims of wily colonial exploiters. Lawrence is a martyr who seeks to restore Arabs to their former historical glory by casting out the foreign devils from Arabia – “Arabia for the Arabs.”

Lawrence is continually frustrated in his campaign to organize the Arabs into an effective and cohesive fighting force. Tribal and religious divisions separate Arabs from each other almost as much as from the Turks.  Why can’t they view themselves as Arabs, he wonders, rather than as members of particular tribes or sects?

When Lawrence succeeds in whipping his guerilla force into fighting shape, he turns them into a virtual column of the British army and becomes instrumental in winning the war in the Middle East. He assumes that, once united in war, the Arabs will remain so in peacetime. They will stand fast against the British and French colonialists and reclaim their heritage. When this hope proves illusory, he retreats home to England in disillusion.

The perspective of economics allows us the insight that Lawrence was doomed to disappointment from the start. In wartime, people of all races, creeds and nationalities are able and willing to put aside personal priorities in favor of the mutual overriding priority of winning the war. At war’s end, however, there is no longer any single overriding priority strong enough to claim universal allegiance. Now each pursues his or her own interest. Of course, this pursuit of individual interest can still produce broadly beneficial results. Indeed, it should do just that – provided the disciplining forces of free markets and competition are given free play. But in post-World War I Arabia, the ideas of Adam Smith and free markets were as alien as Dixieland jazz. Economically, Arabia was primitive and aboriginal. Its tribes were dedicated to warfare and plunder – just as the aboriginal peoples of Australia, New Guinea, North America, South America and Africa were before modern civilization caught up with them. There was a tradition of trade or exchange in aboriginal culture – but no tradition of freedom, free markets and property rights.

The Flame that Ignited the Arab Spring

Of course, Arab society did not stall out completely at the aboriginal stage of primitive, nomadic desert life. Arabs were naturally blessed with copious quantities of petroleum, the vital economic resource of the 20th century. Though mostly unable to develop this resource themselves, they did play host to companies from Western industrialized nations that created infrastructure for that purpose. The resulting cultural interaction paved the way for modernization and a measure of secularization. Thus, from a distance the major cities of the Middle East might be hard to distinguish from those of the West. Up close, though, the differences are stark.

The noted South American economist and political advisor Hernando De Soto led a joint research study into the origins of the Arab Spring of 2011. He recounted his experiences in the recent Wall Street Journal op-ed, “The Capitalist Cure for Terrorism” (Saturday-Sunday, October 11-12, 2014). The seminal event of this movement was the self-immolation of a 26-year-old Tunisian man named Mohamed Bouazizi. Judging from Western coverage of the Middle East, one would expect him to have been unemployed, disaffected and despairing of his plight. As De Soto and his team discovered, the truth was far different.

Bouazizi was not unemployed. He was a street merchant, one of the most common occupational categories in the Arab world. He began trading at age 12, graduating to the responsible position of bookkeeper at the local market by the time he was 19. At the time of his death, he was “selling fruits and vegetables from different carts and sites;” i.e., he was a multi-product, multiple-location entrepreneur. It seems clear that he was not driven to extremity by idleness and despair. So what drove him to public suicide?

Like most in his trade, Bouazizi operated illegally. His dream was to obtain the capital to expand his business into the legal economy. He wanted to buy a pickup truck for delivering his vegetables to retail outlets. He longed to form a legal company as an umbrella under which to operate – stake clear title to assets, establish collateral, get a loan for the truck.

This dream seems modest to American ears. But for Bouazizi it was unattainable. “Government inspectors made Bouazizi’s life miserable, shaking him down for bribes when he couldn’t produce [business] licenses that were (by design) virtually unobtainable. He tired of the abuse. The day he killed himself, inspectors had come to seize his merchandise and his electronic scale for weighing goods. A tussle began. One municipal inspector, a woman, slapped Bouazizi across the face. That humiliation, along with the confiscation of just $225 worth of his wares, is said to have led the young man to take his own life.”

“Tunisia’s system of cronyism, which demanded payoffs for official protection at every turn, had withdrawn its support from Bouazizi and ruined him. He could no longer generate profits or repay the loans he had taken to buy the confiscated merchandise. He was bankrupt, and the truck that he dreamed of purchasing was now also out of reach. He couldn’t sell and relocate because he had no legal title to his business to pass on. So he died in flames – wearing Western-style sneakers, jeans, a T-shirt and a zippered jacket, demanding the right to work in a legal market economy.”

Asked if Bouazizi had left a legacy, his brother replied, “Of course. He believed the poor had a right to buy and sell.”

Mohamed Bouazizi was not alone. In the next two months, at least 63 people in Tunisia, Algeria, Morocco, Yemen, Saudi Arabia and Egypt set themselves afire in imitation of and sympathy with Bouazizi. Some of them survived to tell stories similar to his. Their battle cry was “we are all Mohamed Bouazizi.” It became the rallying cry of the Arab Spring, bringing down no fewer than four political regimes.

The Western news media have been heretofore silent about the true origins of the Arab Spring. It did not originate in “pleas for political or religious rights or for higher wage subsidies.” None of the “dying statements [of the 63] referred to religion or politics.” Instead, the survivors spoke of “economic exclusion,” a la Bouazizi. “Their great objective was ‘ras el mel‘ (Arabic for ‘capital’), and their despair and indignation sprang from the arbitrary expropriation of what little capital they had.”

Das Kapital or Capital?

Nobody speaks with greater force on this subject than Hernando De Soto. He is the Latin American Adam Smith, the South American champion of free markets and property rights. He is now the world’s leading property-rights theorist, having ascended upon the deaths of Ronald Coase and Armen Alchian. And he put his own ideas into successful practice in his home country of Peru by leading the world’s only successful counter-terrorist movement in the 1980s.

The Shining Path was a Marxist band of terrorist revolutionaries who tried to overthrow the Peruvian government in the 1980s. They were led by a onetime university professor named Abimael Guzman. Guzman posed as the champion of Peru’s poor farmers and farm workers. He organized Peru’s Communist Party around the idea of massive farming communes and used the Shining Path as the recruiting arm for these communes. Some 30,000 resisters were murdered. Officials were kidnapped and held for ransom. This strategy gave Shining Path control of the Peruvian countryside by 1990.

De Soto was the government advisor charged with combatting Shining Path. He didn’t forswear the use of military force, but his first move was toward the library and the computer rather than the armory. “What changed the debate, and ultimately the government’s response, was proof that the poor in Peru weren’t unemployed or underemployed laborers or farmers, as the conventional wisdom held at the time. Instead, most of them were small entrepreneurs, operating off the books in Peru’s ‘informal economy.’ They accounted for 62% of Peru’s population and generated 34% of its gross domestic product – and they had accumulated some $70 billion worth of real-estate assets” [emphasis added].

This new learning completely confuted the stylized portrayal of poverty depicted by Guzman and his Shining Path ideologues. It enabled De Soto and his colleagues to do something that is apparently beyond the capabilities of Western governments – eliminate three-quarters of the regulations and red tape blocking the path of entrepreneurs and workers, allow ordinary citizens to file complaints and legal actions against government and provide formal recognition of the property rights of those citizens. An estimated 380,000 businesses and 500,000 jobs came out of the shadows of the informal economy and into the sunlight of the legal, taxed economy. One result of this was an extra $8 billion of government revenue, which rewarded government for its recognition of the private sector.

Having put the property rights of the poor on a firm footing, De Soto could now set about eradicating Shining Path, confident that Peru, once it had won the guerrilla war, would not lose the peace that followed. In true free-market fashion, Peru reworked its army into an all-volunteer force four times its previous size. That force rapidly defeated the guerrillas.

In this connection, it is instructive to compare the effect of military intervention in Peru with that undertaken elsewhere. The military interventions undertaken by the U.S. and earlier by Great Britain served to recruit volunteers for terrorist groups by creating the specter of a foreign invader imposing an alien ideology on the poor. In Peru, volunteers flocked to an anti-terrorist cause that was empowering them rather than threatening them, enriching them and their neighbors rather than bombing them.

Peru stands out because the economic medicine was actually given. Other links between poverty, terrorism and lack of property rights can be cited. In the 1950s and 60s, Indonesia was home to Communist and terrorist movements. It was also a land that consistently thwarted its entrepreneurs, many of whom were immigrant Chinese, in ways reminiscent of an Arab state. The southern half of Africa has long been known for stifling entrepreneurship through bureaucratic controls and monopoly, often combined with nepotism and corruption. This began as a colonial inheritance and has passed down to the line of despots that has ruled Africa since the advent of independence.

All We Are Saying Is Give Economics a Chance

The American public is repeatedly sold the proposition that the world is dangerous and becoming more so with each passing day. Alas, the kind of military interventions practiced by the U.S. have not lessened the danger in the past and have, in fact, increased it. The only tried-and-true, time-tested solution to the problems posed by terrorism is economic, not military. We refer retrospectively to World War II as “the good war” because our cause seems so unimpeachably just when juxtaposed alongside the evils of Fascism and the Holocaust. But it is not moral afflatus and good intentions that justify war. It is the postwar economic miracles worked in Germany and Japan that set an invisible seal on our rosy memories of World War II. By contrast, for example, the defeat of Germany in World War I now seems Pyrrhic because the war and subsequent draconian peace terms produced Germany’s interwar economic upheaval and resulting lurch into Fascism.

The evil of war lies in the rarity of its success, not the oft-cited barbarity of its practice. The U.S. went to war in Korea, Vietnam, Kuwait, Iraq and Afghanistan to counter real evils. We enjoyed considerable military success and achieved some of our goals. But we did not achieve victory. Last week’s EconBrief reminds us how overwhelmingly difficult it was even for Great Britain and the U.S. – each far and away the foremost military power of its day – to achieve their ends through war. Only in South Korea was long-term success attained, and there it was due to economic victory rather than military victory.

Careful study of world poverty and terrorism will uncover an economic phenomenon, against which military measures are largely unavailing and police tactics are merely a stopgap.

“They” Can’t Adapt to Free Markets and Institutions

One entrenched obstacle to adopting Hernando De Soto’s game plan against terrorism is the conventional thinking that certain cultures are inherently unable to absorb the principles of economics and free markets. This argument is so vaguely made that it is never clear whether proponents are arguing the genetic or cultural inferiority of the affected peoples. Recently it has been applied to post-Soviet Russia, when attempts to acclimate the Russian people to free markets failed. The interesting thing about this episode is that it began with the proposition that Western economic consultants could design market institutions and then superimpose them on the Russian people. In other words, elite analysts began by assuming that Russians could easily adapt to whatever economic system was designed by others for their benefit, but then took the polar opposite position that Russians were incapable of adapting to free markets. No provision was made for the possibility that – having lived for centuries under rigid autocracy – Russians might need time to adapt to free institutions.

For centuries, Chinese were considered inferior and suitable only for low-skilled labor. That is the task to which most immigrant Chinese were consigned in 19th-century America. While Chinese in China failed to achieve economic development throughout most of the 20th century, immigrant Chinese were the world’s great ethnic economic development success story. Eventually Taiwan and mainland China joined the ranks of the developed world and another development myth bit the dust.

When the short-term results of the Arab Spring dislocations disappointed many in the West, Arabs became the latest people accorded the dishonor of being deemed unable to accommodate freedom and free markets. Perhaps the most concise response to this line of thought was given indirectly by Arab leaders responding to De Soto’s charge that their countries lacked the legal infrastructure to bring the poor into the formal economy. “You don’t need to tell us this,” one replied. “We’ve always been for entrepreneurs. Your prophet chased the merchants from the temple. Our prophet was a merchant!” In other words, the Arab tradition accommodates trade, even if their legal system is hostile to it.

Once again, this space stresses the distinction between the Rule of Law – which abhors privilege and worships freedom – and mere adherence to statutory law – which often cements tyranny into place.

Bringing Free Markets and Property Rights to the Middle East

As far as Western elites and the Western news media are concerned, the only kind of Middle East economic reform worth mentioning is foreign aid. But over a half-century of government-to-government foreign aid has proven to be an unqualified disaster. Economists like William Easterly and the late Lord Peter Bauer have written copiously on the pretensions of Western development economists and the corruption of Western development agencies. This is the deadest of dead ends.

De Soto’s approach is the only institutional approach worth considering. Apparently, it is actually receiving consideration by the beneficiaries of the Arab Spring. Egypt’s President, Abdel Fattah Al Sisi, commissioned De Soto and his team to study Egypt’s informal economy. That study found that Egypt’s poor get as much income from capital in the informal economy as they do from salaries in the formal economy. More precisely, some 24 million salaried citizens earn about $21 billion per year in salaries while owning some $360 billion in unrecognized assets that throw off roughly an equivalent amount of yearly income. As De Soto recognizes, these assets are approximately 100 times the total of all Western financial, military and development aid to Egypt. They are also “eight times more than the value of all foreign direct investment in Egypt since Napoleon invaded more than 200 years ago.”
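The arithmetic behind the income-equivalence claim is worth making explicit. A quick back-of-the-envelope check – a sketch using only the rounded figures quoted above, not official statistics – shows the annual yield that the informal asset stock would have to generate:

```python
# Back-of-the-envelope check of the rounded Egypt figures quoted above.
# These are the article's numbers, not official data.
salaries = 21e9   # annual formal-economy salaries (USD)
assets = 360e9    # estimated unrecognized informal-economy assets (USD)

# If informal capital throws off income roughly equal to formal salaries,
# the implied annual yield on those assets is:
implied_yield = salaries / assets
print(f"Implied yield on informal assets: {implied_yield:.1%}")  # ~5.8%
```

A yield on the order of 6% is plausible for small-scale real estate and businesses, so the claim that informal capital income roughly matches formal salary income is at least internally consistent.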

The problem is that much of this value is locked up in bureaucratic limbo. “It can take years to do something as simple as validating a title in real estate.”

This is the real secret to achieving economic development in the Middle East. It is also the secret to fighting terrorism and preserving American security.

DRI-284 for week of 8-10-14: All Sides Go Off Half-Cocked in the Ferguson, MO Shooting

An Access Advertising EconBrief:

All Sides Go Off Half-Cocked in the Ferguson, MO Shooting

By now most of America must wonder secretly whether the door to race relations is marked “Abandon all hope, ye who enter here.” Blacks – mostly teenagers and young adults, except for those caught in the crossfire – are shot dead every day throughout the country by other blacks in private quarrels, drug deals gone bad and various attempted crimes. Murder is the leading cause of death for young black males in America. We are inured to this. But the relative exception of a black youth killed by a white man causes all hell to break loose – purely on the basis of the racial identities of the principals.

The latest chilling proof of this racial theorem comes from Ferguson, MO, the St. Louis suburb where a policeman shot and killed an unarmed 18-year-old black man on Saturday. The fact that the shooter is a policeman reinforces the need for careful investigation and unflinching analysis of the issues involved. The constant intrusion of racial identity is a mountainous obstacle to this process.

The Two Sides to the Story, As Originally Told

The shooting occurred on Saturday afternoon, August 9, 2014, in Ferguson, MO, where 14,000 of the 21,000 inhabitants are black and 50 of 53 assigned St. Louis County Police officers are white. The two sides of the story are summarized in an Associated Press story carrying the byline of Jim Suhr and carried on MSN News 08/13/2014. “Police have said the shooting happened after an [then-unnamed] officer encountered 18-year-old Michael Brown and another man on the street. They say one of the men pushed the officer into his squad car, then physically assaulted him in the vehicle and struggled with the officer over the officer’s weapon. At least one shot was fired inside the car. The struggle then spilled onto the street, where Brown was shot multiple times. In their initial news conference about the shooting, police didn’t specify whether Brown was the person who scuffled with the officer in the car and have refused to clarify their account.”

“Jackson said Wednesday that the officer involved sustained swelling facial injuries.”

“Dorian Johnson, who says he was with Brown when the shooting happened, has told a much different story. He has told media outlets that the officer ordered them out of the street, then tried to open his door so close to the men that it ‘ricocheted’ back, apparently upsetting the officer. Johnson says the officer grabbed his friend’s neck, then tried to pull him into the car before brandishing his weapon and firing. He says Brown started to run and the officer pursued him, firing multiple times. Johnson and another witness both say Brown was on the street with his hands raised when the officer fired at him repeatedly.”

The Reaction by Local Blacks: Protests and Violence

When a white citizen is shot by police under questionable circumstances – an occurrence that is happening with disturbing frequency – the incident is not ignored. But the consequent public alarm is subdued and contained within prescribed channels. Newspapers editorialize. Public figures express concern. Private citizens protest by writing or proclaiming their discontent.

The stylized reaction to a white-on-black incident like the one in Ferguson is quite different. Ever since the civil-rights era that began in the 1950s, these incidents are treated as presumptive civil-rights violations; that is, they are treated as crimes committed because the victim was black. Black “leaders” bemoan the continuing victim status of blacks, viewing the incident as more proof of same – the latest in an ongoing, presumably never-ending, saga of brutalization of blacks by whites. “Some civil-rights leaders have drawn comparisons between Brown’s death and that of 17-year-old Trayvon Martin.”

Rank-and-file blacks gather and march in protest, holding placards and chanting slogans tailored to the occasion. “Some protestors… raised their arms above their heads as they faced the police… The most popular chant has been ‘Hands up! Don’t shoot!'”

Most striking of all is the contrast struck by headlines like “Protests Turn Violent in St. Louis Suburb.” There is no non-black analogue to behavior like this: “Protests in the St. Louis suburb turned violent Wednesday night, with people lobbing Molotov cocktails at police, who responded with smoke bombs and tear gas to disperse the crowd.” This is a repetition of behavior begun in the 1960s, when massive riots set the urban ghettos of Harlem, Philadelphia and Detroit afire.

Joseph Epstein Weighs In

The critic and essayist Joseph Epstein belongs on the short list of the most trenchant thinkers and writers in the English language. His pellucid prose has illumined subjects ranging from American education to gossip, political correctness and Fred Astaire. The utter intractability of race in America is demonstrated irrefutably by the fact that the subject reduced Epstein to feeble pastiche.

In his Wall Street Journal op-ed “What’s Missing in Ferguson, MO” (August 13, 2014), Epstein notes the stylized character of the episode: “…the inconsolable mother, the testimony of the dead teenager’s friends to his innocence, the aunts and cousins chiming in, the police chief’s promise of a thorough investigation… The same lawyer who represented the [Trayvon] Martin family, it was announced, is going to take this case.”

But according to Epstein, the big problem is that it isn’t stylized enough. “Missing… was the calming voice of a national civil-rights leader of the kind that was so impressive during the 1950s and ’60s. In those days there was Martin Luther King Jr…. Roy Wilkins… Whitney Young… Bayard Rustin…. – all solid, serious men, each impressive in different ways, who through dignified forbearance and strategic action, brought down a body of unequivocally immoral laws aimed at America’s black population.”

But they are long dead. “None has been replaced by men of anywhere near the same caliber. In their place today there is only Jesse Jackson and Al Sharpton…One of the small accomplishments of President Obama has been to keep both of these men from becoming associated with the White House.” Today, the overriding problem facing blacks is that “no black leader has come forth to set out a program for progress for the substantial part of the black population that has remained for generations in the slough of poverty, crime and despair.”

Wait just a minute here. What about President Obama? He is, after all, a black man himself. That was ostensibly the great, momentous breakthrough of his election – the elevation of a black man to the Presidency of the United States. This was supposed to break the racial logjam once and for all. If a black man occupying the Presidency couldn’t lead the black underclass to the Promised Land, who could?

No, according to Epstein, it turns out that “President Obama, as leader of all the people, is not well positioned for the job of leading the black population that finds itself mired in despond.” Oh. Why not? “Someone is needed who commands the respect of his or her people, and the admiration of that vast – I would argue preponderate [sic] – number of middle-class whites who understand that progress for blacks means progress for the entire country.”

To be sure, Epstein appreciates the surrealism of the status quo. “In Chicago, where I live, much of the murder and crime… is black-on-black, and cannot be chalked up to racism, except secondarily by blaming that old hobgoblin, ‘the system.’ People march with signs reading ‘Stop the Killing,’ but everyone knows that the marching and the signs and the sweet sentiments of local clergy aren’t likely to change anything. Better education… a longer school day… more and better jobs… get the guns off the street… the absence of [black] fathers – … the old dead analyses, the pretty panaceas, are paraded. Yet nothing new is up for discussion… when Bill Cosby, Thomas Sowell or Shelby Steele… have dared to speak up about the pathologies at work… these black figures are castigated.”

The Dead Hand of “Civil Rights Movement” Thinking

When no less an eminence than Joseph Epstein sinks under the waves of cliché and outmoded rhetoric, it is a sign of rhetorical emergency: we need to burn away the deadwood of habitual thinking.

Epstein is caught in a time warp, still living out the decline and fall of Jim Crow. But that system is long gone, the men who destroyed it and those who desperately sought to preserve it alike. The Kings and Youngs and Wilkins’ and Rustins are gone just as the Pattons and Rommels and Ridgways and MacArthurs and Montgomerys are gone. Leaders suit themselves to their times. Epstein is lamenting the fact that the generals of the last war are not around to fight this one.

Reflexively, Epstein hearkens back to the old days because they were days of triumph and progress. He is thinking about the Civil Rights Movement in exactly the same way that the political left thinks about World War II. What glorious days, when the federal government controlled every aspect of our lives and we had such a wonderful feeling of solidarity! Let’s recreate that feeling in peacetime! But those feelings were unique to wartime, when everybody subordinates their personal goals to the one common goal of winning the war. In peacetime, there is no such unitary goal because we all have our personal goals to fulfill. We may be willing to subordinate those goals temporarily to win a war but nobody wants to live that way perpetually. And the mechanisms of big government – unwieldy agencies, price and wage controls, tight security controls, etc. – may suffice to win a war against other big governments but cannot achieve prosperity and freedom in a normal peacetime environment.

In the days of Civil Rights, blacks were a collective, a clan, a tribe. This made practical, logistical sense because the Jim Crow laws treated blacks as a unit. It was a successful strategic move to close ranks in solidarity and choose leaders to speak for all. In effect, blacks were forming a political cartel to counter the political setbacks they had been dealt. That is to say, they were bargaining with government as a unit and consenting to be assigned rights as a collective (a “minority”) rather than as free individuals. In social science terms, they were what F. A. Hayek called a “social whole,” whose constituent individual parts were obliterated and amalgamated into the opaque unitary aggregate. This dangerous strategy has since come back to haunt them by obscuring the reality of black individualism.

Consider Epstein’s position. Indian tribes once sent their chief – one who earned respect as an elder, religious leader or military captain, what anthropologists called a “big man” – to Washington for meetings with the Great White Father. Now, Epstein wants to restore the Civil Rights days when black leaders analogously spoke out for their tribal flock. Traditionally, the fate of individuals in aboriginal societies is governed largely by the wishes of the “big man” or leader, not by their own independent actions. This would be unthinkable for (say) whites; when was the last time you heard a call for a George Washington, Henry Ford or Bill Gates to lead the white underclass out of its malaise?

In fact, this kind of thinking was already anachronistic in Epstein’s Golden Age, the heyday of Civil Rights. Many blacks recognized the trap they were headed towards, but took the path of least resistance because it seemed the shortest route to killing off Jim Crow. Now we can see the pitiful result of this sort of collective thinking.

An 18-year-old black male is killed by a police officer under highly suspicious circumstances. Is the focus on criminal justice, on the veracity of the police account, on the evidence of a crime? Is the inherent danger of a monopoly bureaucracy investigating itself and exercising military powers over its constituency highlighted? Not at all.

Instead, the same old racial demons are summoned from the closet using the same ritual incantations. Local blacks quickly turn a candlelight protest vigil into a violent riot. Uh oh – it looks like the natives are getting restless; too much firewater at the vigil, probably. Joseph Epstein bemoans the lack of a chieftain who can speak for them. No, wait – the Great Black Father in Washington has come forward to chastise the violent and exalt the meek and the humble. His lieutenant Nixon has sent a black chief to comfort his brothers. (On Thursday, Missouri Governor Jay Nixon sent Missouri Highway Patrol Captain Ron Johnson, a black man, heading a delegation of troopers to take over security duties in Ferguson.) The natives are mollified; the savage breast is soothed. “All the police did was look at us and shoot tear gas. Now we’re being treated with respect,” a native exults happily. “Now it’s up to us to ride that feeling,” another concludes. “The scene [after the Missouri Highway Patrol took over] was almost festive, with people celebrating and honking horns.” The black chief intones majestically: “We’re here to serve and protect… not to instill fear.” All is peaceful again in the village.

Is this the response Joseph Epstein was calling for? No, this is the phony-baloney, feel-good pretense that he decried, the same methods he recognized from his hometown of Chicago and now being deployed there by Obama confidant Rahm Emanuel. The restless natives got the attention they sought. Meanwhile, lost in the festive party atmosphere was the case of Michael Brown, which wasn’t nearly as important as the rioters’ egos that needed stroking.

But the Highway Patrol will go home and the St. Louis County Police will be back in charge and the Michael Brown case will have to be resolved. Some six days after the event, the police finally got around to revealing pertinent details of the case; namely, that Michael Brown was suspected of robbing a convenience store of $48.99 worth of boxed cigars earlier that day in a “strong-arm robbery.” Six-year veteran policeman Darren Wilson, now finally identified by authorities, was one of several officers dispatched to the scene.

Of course, the blacks in Ferguson, MO, and throughout America aren’t Indian tribesmen or rebellious children – they are nominally free American individuals with natural rights protected by the U.S. Constitution. But if they expect to be treated with respect 365 days a year they will have to stop acting like juvenile delinquents, stop delegating the protection of their rights to self-serving politicians and hustlers and start asserting the individuality they possess.

The irony of this particular case is that it affords them just that opportunity. But it demands that they shed what Epstein calls “the too-comfortable robes of victimhood.” And they will have to step out from behind the shield of the collective. The Michael Brown case is not important because “blacks” are affronted. It is important because Michael Brown was an individual American just like the whites who get shot down by police every year. If Dorian Johnson is telling the truth, Brown’s individual rights were violated just as surely whether he was black, white, yellow or chartreuse.

Policing in America Today – and the Michael Brown Case

For at least two decades, policing in America has followed two clearly discernible trends. The first is the deployment of paramilitary equipment, techniques and thinking. The second is a philosophy of placing the police officer’s well-being above all other considerations. Both trends place the welfare of police bureaucrats, employees and officers above that of their constituents in the public.

To an economist, this is a striking datum. Owners or managers of competitive firms cannot place their welfare above that of their customers; if they do, the firm will go bankrupt and cease to exist, depriving the owners of an asset (wealth) and real income and the managers of a job and real income. So what allows a police force (more specifically, the Chief of Police and his lieutenants) to do what a competitive firm cannot do? Answer: The police have a monopoly on the use of force to enforce the law. In the words of a well-known lawyer, the response to the generic question “Can the police do that?” is always “Sure they can. They have guns.”

All bureaucracies tend to be inefficient, even corrupt. But corporate bureaucracies must respond to the public and they must earn profits. So they cannot afford to ignore consumer demand. The only factor to which government bureaucracies respond is variations in their budget, which are functions of political rather than economic variables.

All of these truths are on display in this case. The police have chosen to release only a limited, self-serving account of the incident. Their version of the facts is dubious to say the least, although it could conceivably be correct. Their suppression of rioting protestors employed large, tank-like vehicles carrying officers armed with military gear, weapons and tear gas. Dorian Johnson’s account of the incident is redolent of the modern police philosophy of “self-protection first”; at the first hint of trouble, the officer’s focus is on downing anybody who might conceivably offer resistance, armed or not, dangerous or not.

What does all this have to do with the racial identities of the principals? Absolutely nothing. Oh, it’s barely possible that officer Wilson might have harbored some racial animosity toward Brown or blacks in general. But it’s really quite irrelevant because white-on-black, white-on-white and black-on-white police incidents have cropped up from sea to shining sea in recent years. Indeed, this is an issue that should unite the races rather than dividing them since police are not reluctant to dispatch whites (or Hispanics or Asians, for that matter). While some observers claim the apparent increase in frequency of these cases is only because of the prevalence of cell phones and video cameras, this is also irrelevant; the fact that we may be noticing more abuses now would not be a reason to decry the new technology. As always, the pertinent question is whether or not an abuse of power took place. And those interested in the answer to that question, which should be every American, will have to contend with the unpromising prospect of a police department – a monopoly bureaucracy – investigating itself.

That is the very real national problem festering in Ferguson, MO – not a civil-rights problem, but a civil-wrongs problem.

The Battle Lines

Traditionally, ever since the left-wing counterculture demonized police as “pigs” in the 1960s, the right wing has reflexively supported the police and opposed those who criticized them. Indeed, some of this opposition to the police has been politically tendentious. But the right wing’s general stance is wrongheaded for two powerful reasons.

First, support for law enforcement itself has become progressively less equated to support for the Rule of Law. The number and scope of laws has become so large and excessive that support for the Rule of Law would actually require opposition to the existing body of statutory law.

Second, the monopoly status of the police has enabled them to become so abusive that they now threaten everybody, not merely the politically powerless. Considering the general decrease in crime rates driven by demographic factors, it is an open question whether most people are more threatened by criminals or by abusive police.

Even a bastion of neo-conservatism like The Wall Street Journal is becoming restive at the rampant exercise of monopoly power by police. Consider these excerpts from the unsigned editorial, “The Ferguson Exception,” on Friday, August 15, 2014: “One irony of Ferguson is that liberals have discovered an exercise of government power that they don’t support. Plenary police powers are vast, and law enforcement holds a public trust to use them in proportion to the threats. The Ferguson police must prevent rioting and looting and protect their own safety, though it is reasonable to wonder when law enforcement became a paramilitary operation [emphasis added]. The sniper rifles, black armored convoys and waves of tear gas deployed across Ferguson neighborhoods are jarring in a free society…Police contracts also build in bureaucratic privileges that would never be extended to other suspects. The Ferguson police department has refused to… supply basic information about the circumstances and status of the investigation [that], if it hasn’t been botched already, might help cool passions… how is anyone supposed to draw a conclusion one way or the other without any knowledge of what happened that afternoon?”

The Tunnel… and the Crack of Light at the End

The pair of editorial reactions in The Wall Street Journal typifies the alternatives open to those caught in the toils of America’s racial strife. We can play the same loop over and over again in such august company as Joseph Epstein. Or we can dunk ourselves in ice water, wake up and smell the coffee – and find ourselves rubbing shoulders with the Journal editors.

DRI-292 for week of 6-29-14: One in Six American Children is Hungry – No, Wait – One in Five!

An Access Advertising EconBrief:

One in Six American Children is Hungry – No, Wait – One in Five!

You’ve heard the ad. A celebrity – or at least somebody who sounds vaguely familiar, like singer Kelly Clarkson – begins by intoning somberly: “Seventeen million kids in America don’t know where their next meal is coming from or even if it’s coming at all.” One in six children in America is hungry, we are told. And that’s disgraceful, because there’s actually plenty of food, more than enough to feed all those hungry kids. The problem is just getting the food to the people who need it. Just make a donation to your local food pantry and together we can lick hunger in America. This ad is sponsored by the Ad Council and Feeding America.

What was your reaction? Did it fly under your radar? Did it seem vaguely dissonant – one of those things that strikes you wrong but leaves you not quite sure why? Or was your reaction the obvious one of any intelligent person paying close attention – “Huh? What kind of nonsense is this?”

Hunger is not something arcane and mysterious. We’ve all experienced it. And the world is quite familiar with the pathology of hunger. Throughout human history, hunger has been mankind’s number one enemy. In nature, organisms are obsessed with absorbing enough nutrients to maintain their body weight. It is only in the last few centuries that tremendous improvements in agricultural productivity have liberated us from the prison of scratching out a subsistence living from the soil. At that point, we began to view starvation as atypical, even unthinkable. The politically engineered famines that killed millions in the Soviet Union and China were viewed with horror; the famines in Africa attracted sympathy and financial support from the West. Even malnutrition came to be viewed as an aberration, something to be cured by universal public education and paternalistic government. In the late 20th century, the Green Revolution multiplied worldwide agricultural productivity manifold. As the 21st century dawned, the end of mass global poverty and starvation beckoned within a few decades and the immemorial problem of hunger seemed at last to be withering away.

And now we’re told that in America – for over a century the richest nation on Earth – our children – traditionally the first priority for assistance of every kind – are hungry at the ratio of one in six?

WHAT IS GOING ON HERE?

The Source of the Numbers – and the Truth About Child Hunger

Perhaps the most amazing thing about these ads, which constitute a full-fledged campaign, is the general lack of curiosity about their origins and veracity. Seemingly, they should have triggered a firestorm of criticism and investigation. Instead, they have been received with yawns.

The ads debuted last Fall. They were kicked off with an article in the New York Times on September 5, 2013, by Jane L. Levere, entitled “New Ad Campaign Targets Childhood Hunger.” The article is one long promotion for the ads and for Feeding America, but most of all for the “cause” of childhood hunger. That is, it takes for granted that a severe problem of childhood hunger exists and demands close attention.

The article cites the federal government as the source for the claim that “…close to 50 million Americans are living in ‘food insecure’ households,” or ones in which “some family members lacked consistent access throughout the year to adequate food.” It claims that “…almost 16 million children, or more than one in 5, face hunger in the United States.”

The ad campaign is characterized as “the latest in a long collaboration between Ad Council and Feeding America,” which supplies some 200 food banks across the country that in turn supply more than 61,000 food pantries, soup kitchens and shelters. Feeding America began in the late 1990s as another organization, America’s Second Harvest, which enlisted the support of A-list celebrities such as Matt Damon and Ben Affleck. This was when the partnership with the Ad Council started.

Priscilla Natkins, a vice-president of the Ad Council, noted that in the early days “only” one out of ten Americans was hungry. Now the ratio is one out of seven Americans and more than one out of five children. “We chose to focus on children,” she explained, “because it is a more poignant approach to illustrating the problem.”

Further research reveals that, mirabile dictu, this is not the first time that these ads have received skeptical attention. In 2008, Chris Edwards of Cato Institute wrote about two articles purporting to depict “hunger in America.” That year, the Sunday supplement Parade Magazine featured an article entitled “Going Hungry in America.” It stated that “more than 35.5 million Americans, more than 12% of the population and 17% of our children, don’t have enough food, according to the Department of Agriculture.” Also in 2008, the Washington Post claimed that “about 35 million Americans regularly go hungry each year, according to federal statistics.”

Edwards’ eyebrows went up appropriately high upon reading these accounts. After all, this was even before the recession had been officially declared. Unlike the rest of the world, though, Edwards actually resolved to verify the claims. Here is what he found upon checking with the Department of Agriculture.

In 2008, the USDA declared that approximately 24 million Americans were living in households that faced conditions of “low food security.” The agency defined this condition as eating “less varied diets, participat[ing] in Federal food-assistance programs [and getting] emergency food from community food pantries.” Edwards contended that this meant those people were not going hungry – by definition. And indeed, it is semantically perverse to define a condition of hunger by describing the multiple sources of food and change in composition of food enjoyed by the “hungry.”

The other 11 million (of the 35 million figure named in the two articles) people fell into a USDA category called “very low food security.” These were people whose “food intake was reduced at times during the year because they had insufficient money or other resources for food” [emphasis added]. Of these, the USDA estimated that some 430,000 were children. These would (then) comprise about 0.6% of American children, not the 17% mentioned by Parade Magazine, Edwards noted. Of course, having to reduce food on one or more occasions to some unnamed degree for financial reasons doesn’t exactly constitute “living in hunger” in the sense of not knowing where one’s next meal was coming from, as Edwards observed. The most that could, or should, be said was that the 11 million and the 430,000 might constitute possible candidates for victims of hunger.

On the basis of this cursory verification of the articles’ own sources, Chris Edwards concluded that hunger in America ranked with crocodiles in the sewers as an urban myth.

We can update Edwards’ work. The USDA figures come from survey questions distributed and tabulated by the Census Bureau. The most recent data available were released in December 2013 for calendar year 2012. About 14.5% of households fell into the “low food security” category and about 5.7% of households were in the “very low food security” pigeonhole. Assuming the current average of roughly 2.58 persons per household, this translates to approximately 34 million people in the first category and just under 13.5 million people in the second category. If we assume the same fraction of children in these at-risk households as in 2008, that would imply about 635,000 children in the high-risk category, or less than 0.9% of the nation’s children. That is a far cry from the 17% of the nation’s children mentioned in the Parade Magazine article of 2008. It is a farther cry from the 17,000,000 children mentioned in the current ads, which would be over 20% of America’s children.
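The shares above are simple arithmetic, and a reader can check them in a few lines. This sketch uses the figures cited in this essay; the 74 million total for U.S. children in 2012 is an assumed round number of Census order, not one taken from the USDA report.

```python
# Sanity-check the child-hunger shares discussed above (a sketch).
# ASSUMPTION: roughly 74 million U.S. children in 2012 (a round
# Census-order figure, not drawn from the USDA report itself).
US_CHILDREN_2012 = 74_000_000

high_risk_children = 635_000    # updated "very low food security" estimate above
ad_claim_children = 17_000_000  # number of hungry children implied by the ads

high_risk_share = high_risk_children / US_CHILDREN_2012
ad_claim_share = ad_claim_children / US_CHILDREN_2012

print(f"High-risk share of children: {high_risk_share:.2%}")  # well under 1%
print(f"Share implied by the ads:    {ad_claim_share:.1%}")   # over 20%
```

Whatever exact child-population figure one assumes, the two estimates remain more than an order of magnitude apart.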

The USDA’s Work is From Hunger

It should occur to us to wonder why the Department of Agriculture – Agriculture, yet – should now reign as the nation’s arbiter of hunger. As it happens, economists are well situated to answer that question. They know that the federal food-stamp program began in the 1940s primarily as a way of disposing of troublesome agricultural surpluses. The federal government spent the decade of the 1930s throwing everything but the kitchen sink at the problem of economic depression. Farmers were suffering because world trade had imploded; each nation was trying to protect its own businesses by taxing imports of foreign producers. Since the U.S. was the world’s leading exporter of foodstuffs, its farmers were staggering under this impact. They were swimming in surpluses and bled so dry by the resulting low prices that they burned, buried or slaughtered their own output without bringing it to market in an effort to raise food prices.

The Department of Agriculture devised various programs to raise agricultural prices, most of which involved government purchases of farm goods to support prices at artificially high levels. Of course, that left the government with lots of surplus food on its hands, which it stored in Midwestern caves in a futile effort to prevent spoilage. Food distribution to the poor was one way of ridding itself of these surpluses, and this was handled by the USDA which was already in possession of the food.

Just because the USDA runs the food-stamp program (now run as a debit-card operation) doesn’t make it an expert on hunger, though. Hunger is a medical and nutritional phenomenon, not an agricultural one. Starvation is governed by the intake of sufficient calories to sustain life; malnutrition is caused by the maldistribution of nutrients, vitamins and minerals. Does the Census Bureau survey doctors on the nutritional status of their patients to provide the USDA with its data on “food insecurity?”

Not hardly. The Census Bureau simply asks people questions about their food intake and solicits their own evaluation of their nutritional status. Short of requiring everybody to undergo a medical evaluation and submit the findings to the government, it could hardly be otherwise. But this poses king-sized problems of credibility for the USDA. Asking people whether they ever feel hungry or sometimes don’t get “enough” food is no substitute for a medical evaluation of their status.

People can and do feel hungry without coming even close to being hungry in the sense of risking starvation or even suffering a nutritional deficit. Even more to the point, their feelings of hunger may signal a nutritional problem that cannot be cured by money, food pantries, shelters or even higher wages and salaries. The gap between the “low food security” category identified by the USDA and starving peoples in Africa or Asia is probably a chasm the size of the Grand Canyon.

The same America that is supposedly suffering rampant hunger among both adults and children is also supposedly suffering epidemics of both obesity and diabetes. There is only one way to reconcile these contradictions: by recognizing that our “hunger” is not the traditional sort of starvation or malnutrition but rather the kind associated with diabetes and, hence, obesity. Over-ingestion of simple carbohydrates and starches can cause upward spikes in blood sugar among susceptible populations, triggering the release of insulin that stores the carbohydrate as fat. Since the carbohydrate is stored as fat rather than burned for energy, the body remains starved for energy and hungry even though it is getting fat. Thus do hunger and obesity coexist.

The answer is not more government programs, food stamps, food pantries and shelters. Nor, for that matter, is it more donations to non-profit agencies like Feeding America. It is not more food at all, in the aggregate. Instead, the answer is a better diet – something that millions of Americans have found out for themselves in the last decade or so. In the meantime, there is no comparison between the “hunger” the USDA is supposedly measuring and the mental picture we form in our minds when we think of hunger.

This is not the only blatant contradiction raised by the “hunger in America” claims. University of Chicago economist Casey Mulligan, in his prize-winning 2012 book The Redistribution Recession, has uncovered over a dozen government-program and rule changes that reduced the incentive to work and earn. He assigns these changes primary blame for the huge drop in employment and lag in growth that the U.S. has suffered since 2007. High on his list are the changes in the food-stamp program that substituted a debit card for stamps, eliminated means tests and allowed recipients to remain on the program indefinitely. A wealthy nation in which 46 million out of 315 million citizens are on the food dole cannot simultaneously be suffering a problem of hunger. Other problems, certainly – but not that one.

What About the Real Hunger?

That is not to say that real hunger is completely nonexistent in America. Great Britain’s BBC caught word of our epidemic of hunger and did its own story on it, following the New York Times, Washington Post, Parade Magazine party line all the way. The BBC even located a few appropriately dirty, ragged children for website photos. But the question to ask when confronted with actual specimens of hunger is not “why has capitalism failed?” or “why isn’t government spending enough money on food-security programs?” The appropriate question is “why do we keep fooling ourselves into thinking that more government spending is the answer when the only result is that the problem keeps getting bigger?” After all, the definition of insanity is doing the same thing over and over again and expecting a different result.

The New York Times article in late 2013 quoted two academic sources that were termed “critical” of the ad campaign. But they said nothing about its blatant lies and complete inaccuracy. No, their complaint was that it promoted “charity” as the solution rather than their own pet remedies, a higher minimum wage and more government programs. This calls to mind the old-time wisecrack uttered by observers of the Great Society welfare programs in the 1960s and 70s: “This year, the big money is in poverty.” The real purpose of the ad campaign is to promote the concept of hunger in America in order to justify big-spending government programs and so-called private programs that piggyback on the government programs. And the real beneficiaries of the programs are not the poor and hungry but the government employees, consultants and academics whose jobs depend on the existence of “problems” that government purports to “solve” but that actually get bigger in order to justify ever-more spending for those constituencies.

That was the conclusion reached, ever so indirectly and delicately, by Chris Edwards of Cato Institute in his 2008 piece pooh-poohing the “hunger in America” movement. It applies with equal force to the current campaign launched by non-profits like the Ad Council and Feeding America, because the food banks, food pantries and shelters are supported both directly and indirectly by government programs and the public perception of problems that necessitate massive government intervention. It is the all-too-obvious answer to the cry for enlightenment made earlier in this essay.

In this context, it is clear that the answer to any remaining pockets of hunger is indeed charity. Only private, voluntary charity escapes the moral hazard posed by the bureaucrat/consultant class that has no emotional stake in the welfare of the poor and unfortunate but a big stake in milking taxpayers. This is the moral answer because it does not force people to contribute against their will but does allow them to exercise free will in choosing to help their fellow man. A moral system that works must be better than an immoral one that fails.

Where is the Protest?

The upshot of our inquiry is that the radio ads promoting “hunger in America” and suggesting that America’s children don’t know where their next meal is coming from are an intellectual fraud. There is no evidence that those children exist in large numbers, but their existence in any numbers indicts the current system. Rather than rewarding the failure of our current immoral system, we should be abandoning it in favor of one that works.

Our failure to protest these ads and publicize the truth is grim testimony to how far America has fallen from its origins and ideals. In the first colonial settlements at Jamestown and Plymouth, colonists learned the bitter lesson that entitlement was not a viable basis for civilization and work was necessary for survival. We are in the process of re-learning that lesson very slowly and painfully.

DRI-304 for week of 3-2-14: Subjugating Florists: Power, Freedom and the Rule of Law

An Access Advertising EconBrief:

Subjugating Florists: Power, Freedom and the Rule of Law

A momentous struggle for human freedom is playing out in a mundane setting. Two people in Washington state are planning to wed. They want their florist, Arlene’s Flowers and Gifts, to supply flowers for the wedding. The owner, Barronelle Stutzman, refuses the job. The couple wants her to be compelled by law to provide service to them.

Even without knowing that particular facts distinguish this situation, we might suspect it. In this case, the couple consists of two homosexual men, Robert Ingersoll and Curt Freed. Ms. Stutzman’s refusal stems from an unwillingness to participate in – and thus implicitly sanction – a ceremony of which she disapproves on religious grounds.

The points at issue are two: First, does existing law forbid Ms. Stutzman’s refusal on the grounds that it is an illegal “discrimination” against the couple? Second, is that interpretation the proper one, regardless of its legality?

The first point is a matter for lawyers. (Washington’s Attorney General has filed suit against Ms. Stutzman.) The second point is a matter for all of us. On it may hinge the survival of freedom in the United States of America.

The Facts of the Case

The prospective married couple, Messrs. Ingersoll and Freed, has granted numerous interviews to publicize their side of the case. To the Christian Broadcasting Network (CBN), they described themselves as “loyal customers for a decade” of Arlene’s.

“It [Stutzman’s refusal] really hurt because it was somebody I knew,” Ingersoll confided. “We stayed awake all night Saturday. It was eating at our souls.”

For her part, Ms. Stutzman declared that “you have to make a stand somewhere in your life on what you believe….” The unspoken implication was that she had faced repeated challenges to her convictions, culminating in this decision to stand fast. “In America, the government is supposed to protect freedom, not… intimidate citizens into acting contrary to their faith convictions.”

The attitude displayed by major media outlets reflects the Zeitgeist, which decrees: Ms. Stutzman is guilty of illegal discrimination on grounds of sexual orientation. It is significant that this verdict crosses political boundaries. On the Sunday morning discussion program Face the Nation, longtime conservative columnist and commentator George Will claimed that “public-accommodations law” had long ago “settled” the relevant legal point regarding the requirement of a business owner to provide service to all comers once doors have been opened to the public at large. But Mr. Will nonetheless expressed dissatisfaction with the apparent victory of the homosexual couple over the florist. “They [homosexuals in general] have been winning…this makes them look like bad winners.” Mr. Will seemed to suggest that the couple should forego their legal right and let Ms. Stutzman off the hook as a matter of good manners.

Legal, Yes; Proper, No

The fact that the subjugation of the florist is legal does not make it right. For decades, the Zeitgeist has been growing ever more totalitarian. Today, the United States of America approaches a form of authoritarian polity called an absolute democracy. In an absolute monarchy, one person rules. In an absolute democracy, the government is democratically elected but it holds absolute power over the citizens.

The inherent definition of freedom is the absence of external constraint. In this case, that would imply that Messrs. Ingersoll and Freed would be free to engage or refuse the services of Ms. Stutzman and Ms. Stutzman would be free to provide or refuse service to Messrs. Ingersoll and Freed – on any basis whatsoever. That is what freedom means. A concise way of describing the operation of the Rule of Law would be that all (adult) citizens enjoy freedom of contract.

But in our current unfree country, Messrs. Ingersoll and Freed are free to patronize Arlene’s or not but Ms. Stutzman is not free. She is required to serve Messrs. Ingersoll and Freed, like it or not. The couple’s sexual orientation has earned them the status of a privileged class. They have the privilege of compelling service. This is a privilege enjoyed by a comparative few.

George Will and company may pontificate about settled law, but the truth is that refusals of service happen daily in American business. Businesses often refuse jobs and refer customers to other businesses as a courtesy, typically in acknowledgement of their own shortcomings or lack of specialized knowledge or expertise. Sometimes a business will frankly admit that a would-be customer falls outside its target customer class. This sort of refusal rarely, if ever, leads to recriminations. After all, who really wants to pay for a product or service unwillingly supplied? The only exception comes when the customer falls within one of the government-protected categories covered by the anti-discrimination laws. Then the fear of litigation, financial and criminal penalties and adverse publicity kicks in.

This may be the clearest sign that the Rule of Law no longer prevails in America. The Rule of Law does not mean scrupulous adherence to statutory law. It means the absence of privilege. In America today, privilege is alive and growing like a cancer. In the past, we associated the term with wealth and social position. That is no longer true. Now it connotes special treatment by government.

The Role of Competition Under the Rule of Law

Under the Rule of Law, Messrs. Ingersoll and Freed would not be able to compel Ms. Stutzman to provide flowers to their wedding. But this would not leave them without recourse. The Rule of Law supports the existence of free competitive markets. The couple could simply call up another florist. True, they would be denied the service of their longtime acquaintance and supplier. But nobody is entitled to a lifetime guarantee of the best of everything. What if Ms. Stutzman was ill on their wedding day, or called out of town, or struck down by a beer truck? What if she went bankrupt or retired? The Rule of Law simply protects a free, competitive market from which Messrs. Ingersoll and Freed can pick and choose a florist.

That is not the only benefit the couple gets from the Rule of Law and competition. In a competitive market, any seller who refuses service to a willing buyer must pay a penalty or cost in the form of foregone revenue. In strict, formal theory, a competitive market produces an equilibrium result in which the amount of output produced at the equilibrium price is exactly equal to the ex ante amount desired by consumers. A seller who turns away a buyer is throwing money down the drain. This is not something sellers will do lightly. Anybody who doubts this has never run a business and met a payroll. Thus, free competitive markets offer strong disincentives to discrimination.

Of course, that does not mean that businesses will never refuse a customer; the instant case proves that. But refusals of conscience like the one made by Ms. Stutzman will be comparatively rare, because it will be unusual for the owner to value the moral issue more than the revenue foregone.

The existence of competition under the Rule of Law is the safeguard that makes freedom and democracy possible. Without it, we would have to fear the tyranny of the majority over minorities. With it, we can safely rely on markets to protect the rights and welfare of minorities.

The Rule of Law and Limited Government

Free choice by both buyers and sellers is not the enemy of minority rights. The real danger to minorities is government itself – the very government that is today advertised as the champion of minorities.

After the Civil War, newly freed and enfranchised blacks entered the free economy in the South. They began to compete with unskilled and skilled white labor. This competition was successful, both because blacks were willing to work for lower wages and because some blacks had mastered valuable skills while slaves. For example, professional baseball originated in the 1860s and increased steadily in popularity; blacks participated in this embryonic period.

White laborers resented this labor-market competition. In order to artificially increase the wages of their members, labor unions had to restrict the supply of labor. Denying union membership to blacks was a common means of catering to member desires while furthering wage objectives. But the competition provided by blacks was difficult to suppress because employers had a clear incentive to hire low-wage labor that was also productive and skillful. Businesses had a strong monetary incentive not to refuse service to blacks because the money offered by blacks was just as green as anybody else’s money.

The solution found by the anti-black forces was the so-called “Jim Crow” laws. These forbade the hiring of blacks on equal terms and denied blacks equal rights to public accommodations and service. In effect, the Jim Crow laws cartelized labor and product markets in a way that would not otherwise have occurred. Governments also handed out special privileges to labor unions that enabled them to compel membership and deny it at will. Historically, labor unions excluded blacks from membership for the bulk of the 20th century. Blacks were banned from organized baseball and most other professional sports until the 1940s, when sports became the first wedge driven into the Jim Crow laws.

The apartheid laws passed in southern Africa in the early 20th century also arose in order to thwart successful competition offered to white labor by black labor. Left alone, competitive labor markets were enabling black South Africans to enjoy rising wages and employment. South African labor unions agitated for government protection against black workers. The result was the “pass laws” or “color bar” or apartheid system, not unlike the Jim Crow laws prevailing in America. Once again, the purpose was to cartelize labor markets in order to erect barriers to the competition offered to white labor by black workers.

The rationale behind public utilities was ostensibly to limit the pricing power and profits enjoyed by firms that would otherwise have been “natural monopolies.” In actual practice, by guaranteeing public utilities a “normal profit,” government removed the specter of a loss of revenue and profit associated with discrimination against black customers and employees. Sure enough, public utilities were among the chief practitioners of discrimination against blacks – along with government itself, which also did not fear a loss of profit resulting from its actions.

A recurring effect of government regulation of business in all its forms has been the erosion of competition. Sometimes that erosion has been caused by costly compliance with regulation, driving businesses bankrupt and reducing market competition through attrition. Sometimes it has come from direct government cartelization of competitive markets, resulting from measures like marketing orders and quotas in milk and citrus fruit. Sometimes it has come from price supports, target prices and acreage allotments that have reduced agricultural output and raised prices or, alternatively, raised prices while creating costly surpluses for which taxpayers must pay. Sometimes it has come from anti-trust laws like the Robinson-Patman Act, deliberately designed to raise prices and restrict competition in retail business.

There is no formal, coherent theory of regulation. Instead, regulatory legislation is accompanied by vague protestations of good will and good intentions that have no unambiguous translation into policy. The typical result is that regulators either take over the role of controlling business decisions from market participants or they become the patrons and protectors of businesses within the industries they regulate. The latter attitude has evolved within the financial sector, where regulators have gradually taken the view that the biggest competitors are “too big to fail.” That is, the effects of failure would spill over onto too many other firms, causing widespread adverse effects. This, in turn, precludes discipline imposed by competitive markets, which force businesses to serve consumers well or go out of business.

The enemy of minorities is government, not free competitive markets. Government harms minorities directly by passing discriminatory laws against them or indirectly by foreclosing or lessening competition.

The Two-Edged Sword of Government Power

Many people find it difficult to perceive government as the threat because government vocally broadcasts its beneficence and cloaks its intentions in the vocabulary of good intentions. It bestows noble and high-sounding names on its legislative enactments. It endows them with historic significance. Like Chantecler, the rooster protagonist of Edmond Rostand’s play, government pretends that its will causes the sun to rise and set and only its benevolence stands between us and disaster.

But the blessings of government are a two-edged sword. “A government powerful enough to give us everything we want is powerful enough to take from us everything we have.” One by one, the beneficiaries of arbitrary government power have also been stung by the exercise of that same power.

In 1954, government insisted that “separate was inherently unequal” and that the segregated education received by blacks must be inferior to that enjoyed by whites. Instead of introducing competition to schools, government intruded into education more than ever before. Now, six decades later, blacks still struggle for educational parity. And today, it is government that stands in the schoolhouse door to thwart blacks – not through segregation, but by resolutely opposing the educational competition introduced by charter schools in New York City. The overwhelming majority of charter patrons are black, who embrace the charter concept wholeheartedly. But Mayor Bill de Blasio has vowed to fight charter schools tooth and claw. The state and federal governments can be relied upon to sit on their hands, since teacher unions – diehard enemies of charter schools – are a leading constituency of the Democrat Party.

For over a century, blacks have lived and died by government and the Democrat Party. Now they are cut by the other edge of the government sword.

The print and broadcast news media have been cheerleaders for big government and the Democrat Party throughout the 20th century and beyond. First-Amendment absolutism has been a staple of left-wing thought. Recently, FCC regulators in the Obama administration hatched a plan to study journalists and their employers with a view towards tighter regulation. The pretext for the FCC’s Multi-Market Study of Critical Information Needs was that FCC broadcast licenses come with an obligation to serve the public – and how can government determine whether licensees are serving the public without thoroughly studying them? All hell has suddenly broken loose at the prospect that journalists themselves might be subjected to the same stifling regulation as other industries.

Of course, in a competitive market it is quite unnecessary for regulators to “study” the market to gauge whether it is working. Consumers make that judgment themselves. If businesses don’t serve consumers, consumers desert them and the businesses fold. Other businesses take their place and provide better service – or they join their predecessors on the scrap heap. But the presumption of government is that regulation must be necessary to promote competition – otherwise “market failure” will strand consumers up the creek without locomotion.

For decades, the knee-jerk reflex of journalists to any perceived problem has been that “no government regulation exists” to solve it. Now journalists tremble as they test the opposite edge of the government sword.

Now homosexuals are the latest group to successively experience both blades of the government sword. After years of life spent in the shadow of criminal prosecution, homosexuals have witnessed the gradual dismantling of state anti-sodomy laws. State-level bans on marriage by couples of the same gender have been invalidated by the U.S. Supreme Court. Not satisfied with their newly won freedom, homosexuals strive to wield power over their fellow citizens through coercion.

This is the only sense in which George Will was correct. His characterization of homosexuals as “bad winners” was infantile; it portrayed a serious issue of human freedom as a schoolboy exercise in bad manners. But he correctly sensed that homosexuals were winning something – even if he wasn’t quite sure what – and that this latest shift toward subjugating florists was a disastrous change in direction.

What Do Homosexuals Want? What Are They Owed Under the Rule of Law?

The holistic fallacy treats homosexuals as an organic unity with homogeneous wants and goals. In reality, they are individuals with diverse personalities and political orientations. But the homosexual movement follows a clearly discernible left-wing agenda, just as Hispanic activist organizations like La Raza hew to a left-wing line not representative of most Hispanics.

The homosexual political agenda strives to normalize and legitimize homosexual behavior by winning the imprimatur of government and the backing of government force. This movement feeds off the angst of people like Ingersoll and Freed – “It really hurt…it was eating at our souls” – who ache from the sting of rejection. The movement is selling government approval as a psychological substitute for parental and societal approval and economic rents as revenge for rejection. Homosexuals have observed the success of blacks, women and other protected classes in pursuing gains via this route.

There was a time, not so long ago when measured by the relative standard of history, when male homosexuals were not merely criminals but were subjected to a kind of informal “Jim Crow” persecution. They were routinely beaten and rolled not only by ordinary citizens but even by police. It is worth noting that these attitudes began to change decades ago, even before the advent of so-called “affirmative action” programs ostensibly designed to redress the grievances of other victim classes.

The Rule of Law demands that homosexuals receive the same rights and due-process protections as other people. It applies the same standards of consent to all sexual relationships between consenting adults. It grants the same freedom of contract – marital and otherwise – to all.

By the same token, the Rule of Law abhors privilege. It rejects the chimerical notion that the past harms suffered by individual members of groups can be compensated somehow by committing present harms that grant privilege and real income to different members of those same victimized groups.

The Rule of Law and Social Harmony

Sociologists and political scientists used to marvel at the comparative social harmony of American society – achieved despite the astonishing ethnic, racial, religious and political diversity of the citizenry. The consensus assigned credit to the American “melting pot.” The problem with this explanation is that a culture must first exist before new entrants can assimilate into it – and what mechanism achieved the original reconciliation of diverse elements?

Adherence to the Rule of Law within competitive markets made social harmony possible. It allowed the daily exchange of goods and services among individuals in relative anonymity, without disclosure of the multitudinous conflicts that might have otherwise produced stalemate and rejection. Milton Friedman observed astutely that free markets permit us to transact with the butcher, baker and candlestick maker without inquiring into their political or religious convictions. We need agree only on price and quantity. The need for broader consensus would bring ordinary life as we know it to a grinding halt; government would have to step in with coercive power in order to break the stalemate.

When everybody wears their politics, religion and sexual orientation on their sleeves, it makes life unpleasant, worrisome and exhausting. Shouldering chips weighs us down and invites conflict. This is the real source of the “polarization” complained of far and wide, not the relatively trivial differences between Republicans and Democrats. (The two parties are in firm agreement on the desirability of big government; they disagree vehemently only on who will run the show.)

Intellectuals wrongly assumed that the anonymity fostered by the Rule of Law masked irreconcilable contradictions within society that would eventually erupt in violence, as in the Stonewall riots of 1969. The truth was that the Rule of Law reconciled the contradictory views of individuals and allowed peaceful social change to occur gradually. Homosexuals were able to live, work and achieve outside the glare of the public spotlight. It slowly dawned on the American public, at first subliminally and then consciously, that homosexuals were contributing successfully to every segment of American life. The achievements pointed to with pride today by homosexual activists were possible only because the Rule of Law facilitated this gradual, peaceful process. They were not caused by self-righteous activists and an all-powerful government bitch-slapping an ignorant, recalcitrant public into submission.

Subjugating Florists: A Pyrrhic Victory

Free competitive markets cash the checks written by the Rule of Law. Homosexuals have lived and prospered within those free-market boundaries, mirroring the tradition of Jews, blacks and other stigmatized minority groups. For centuries, homosexuals have faced ostracism and even death in various societies around the world. That remains true in certain countries even now. While it is true that homosexuals were formerly treated cruelly in America, it is also true that their cultural, economic and political gains here have been remarkably rapid by historical standards. Historical memory, rather than etiquette, should counsel against trashing the free-market institutions that have midwifed that progress.

Violating the Rule of Law in exchange for the power to compel service by businesses would be far worse than a display of bad manners. It would be the worst kind of tradeoff for homosexuals, gaining a temporary political and public-relations triumph at the expense of long-run economic stability.

Of course, homosexual activists are hardly the first or the only ones grasping at the levers of government power. The history of 20th-century America is dominated by such attempts, emanating at first from the political Left but now from the Right as well. It is grimly amusing to recall that early efforts along these lines were hailed by political scientists as encouraging examples of “pluralism” and “inclusiveness” – they were supposed to be signs that the downtrodden and marginalized were at last participating in the political process. Today, everybody and his brother-in-law is trying to work local, state or federal government for an edge or a subsidy. Nobody can pretend now that this is anything but the unmistakable indicator of societal disintegration and decay.

Heretofore, the visible traits of democracy – representative government, elections, checks and balances – have been considered both necessary and sufficient to guarantee freedom. The falsity of that presumption is now dawning upon us with the appreciation of democratic absolutism as an impending reality. Subjugating florists may provide the homosexual movement with the thrills of political blood sport, but any victories won will prove Pyrrhic.