DRI-219 for week of 11-23-14: A Columnist’s Dawning Recognition of Deadly Auto-Safety Regulation

An Access Advertising EconBrief:

A Columnist’s Dawning Recognition of Deadly Auto-Safety Regulation

We are familiar with investigative reports by reporters in print and broadcast media and, in recent years, online. We view these as the mechanism for regulating institutions not subject to the constraints of the marketplace. Government is chief among these.

This routine has accustomed us to casting the news media in the role of cynical watchdog, always looking for wrongdoing and too prone to suspect the motives of those it covers. Of course, we may suspect the press of pre-existing bias – in favor of Democrats, for instance. But for the most part, we believe that their interests are served by finding scandal, wrongdoing and malfeasance, because these things are news.

The possibility that the press itself may be naïve and complacent is the last one we consider. It should not be overlooked.

Air Bag Safety 

Wall Street Journal columnist Holman Jenkins has written a series of columns about auto safety and regulation. Many of them followed the regulatory travails of Toyota, which endured a prolonged crucifixion when its vehicles were ostensibly subject to a problem of “unintended acceleration.” Although it was all too clear that the problem was caused by drivers unwittingly depressing the accelerator instead of the brake pedal, the company was beset by the fable that a bug in the car’s computer code was causing cars to accelerate when they should be slowing. Despite the conspicuous lack of scientific evidence for this hypothesis – not surprising in view of its impossibility – Toyota eventually was forced to pay out hundreds of millions of dollars in settlement money to make the issue go away.

Having set a tone of skepticism toward regulators, Jenkins turned next to the recent disclosure by Takata that its air bags are defective. Toyota and Honda have recalled over 8 million vehicles to replace the air bags. The defect (apparently caused by moisture entering the air bag’s inflator) degrades the ammonium nitrate propellant tablets, causing them to burn faster and explode more violently than normal. In turn, this shreds the metal housing surrounding the tablets and sends a shower of shrapnel into the driver and front-seat passenger (if any).

Jenkins noted that the demand by federal automobile regulators that the companies recall millions more vehicles is suspiciously timed to coincide with the end of hearings on the response by Japanese automakers to the finding of defective air bags. He reserved his strongest note of skepticism, though, for the use of air bags as safety devices.

“The faulty Takata air bags are connected to five deaths in 13 years, which is a tiny fraction of the deaths known to be caused by air bags working as designed [emphasis added]. When the Takata mess is cleaned up, we’ll still be left with a highly problematic safety technology.”

What’s this? Air bags themselves cause automobile-occupant deaths? They’re supposed to prevent deaths, not cause them. This is surely news to the general public, which is why Jenkins continues with a brief chronology of air bags’ journey from industrial infancy to ubiquity. “Washington began pushing automakers to install air bags in the 1980s, and ever since Washington has been responsible for research that confirms that air bags save hundreds of lives a year. These studies, though, credit air bags with saving people who were also wearing seat belts, when considerable evidence indicates seat belts alone do the job.”

“These studies also assume that deaths in collisions where air bags deployed are always attributable to the collision, never [to] the air bag.”

Jenkins does not mention that the push for air bags coincided with a federal push for mandatory wearing of seat belts. The first state law requiring the wearing of a seat belt meeting federal specifications – essentially, a three-point belt buckling over the lap and including a shoulder restraint – was passed in 1984. States received seven-figure federal bounties for passing a mandatory seat-belt law and achieving estimated compliance above a specified rate.

“A 2005 study by Mary C. Meyer and Tremika Finney published by the American Statistical Association tried to correct for these errors and found that the clearest effect of air bags was an increased risk of death for unbelted occupants in low-speed crashes. Likewise, a 2002 study of 51,000 fatal accidents by University of Washington epidemiologists found that air bags (unlike seat belts) contributed little to crash survivability.”

“[Thus] air bags began as simple bombs buried in the dashboard designed to protect the typical non-seat belt wearing accident victim – the typical unbelted victim being a 170-pound teenage male. In 1997 came the reckoning: Air bags designed to meet the government’s criteria were shown to be responsible for the deaths of dozens of children and small adults in otherwise survivable accidents.”

This is the key point in Jenkins’ chronology, the point at which the reader’s eyebrows shoot up and he shakes his head in disbelief. Dozens of deaths? Adults and children? How did I miss the public furor over this? After all, when one or two people die owing to an automobile defect that a company knew about or should have known about, all hell breaks loose. In fact, the history of government suppression of unfavorable air-bag performance goes back decades. But Jenkins makes no mention of this; instead, he moves on.

“Since then, air bags have become ‘smarter,’ with computers modulating their deployment depending on type of crash, passenger characteristics and whether seat belts are being worn. Undoubtedly the technology has improved but still debatable is whether the benefits outweigh the risks and costs. Air bags remain one of the biggest reasons for vehicle recalls – and no wonder, given that these devices, which are dangerous to those who manufacture them and to those who repair vehicles, are expected to go years without maintenance or testing and then work perfectly.”

“Because, in the minds of the public, not to mention in the slow-motion videos on the evening news, air bags are seen as gentle, billowy clouds of perfect safety, yet another problem is the potential encouragement they give motorists to drive more aggressively or forgo the hassle of buckling up.” Now Jenkins has driven himself and his readers into water over their heads and is stalled. His column will drown unless it is rescued promptly. He has made a strong case that air bags are inherently unsafe, but is now suggesting something else – a different source of harm from their use. The fatal stretch of water was entered with the words “…yet another problem is the potential encouragement they [air bags] give motorists to drive more aggressively or forgo the hassle of buckling up.” This requires the services of a professional economist.

The Economic Principle of “Risk Compensation”

People tend to increase their indulgence in risky activities when they perceive that the safety of those activities has been enhanced. Risk should be treated just like any other consumption good – when the price of risk goes down, we should expect people to purchase more of it.

The first of the previous two sentences would meet with the approval of most people. The second would not. Yet from the economist’s perspective they might be interpreted as saying the same thing. Commercial spacecraft are currently being readied to carry tourists; wouldn’t we expect tourists to be more enthusiastic when improvements in launch and flight safety reduce the risk of death and serious injury for passengers? Still, we would expect space travel to carry an appreciable risk for the indefinite future, wouldn’t we? When improvements in contraception result in better prophylactics, don’t we expect people to have sex more often, despite the fact that they still run a risk of contracting a sexually transmitted disease?

In 1975, economist Sam Peltzman published a seminal article in the Journal of Political Economy. He analyzed the effects of a series of government-mandated safety devices introduced beginning in the mid-1960s. His analysis suggested that the net effect on safety was approximately zero. Peltzman offered two explanations for this surprising result, the most plausible being that requiring drivers to use seat belts causes some people to take more driving risk than they would have taken driving beltless. This additional risk-taking produced more accidents. While the increased use of safety devices tended to produce fewer injuries and fatalities among automobile occupants, the additional accidents also involved people outside the vehicle, such as pedestrians, motorcyclists and bicyclists. These additional injuries and fatalities tended to offset those prevented by seat belts, so that the net result in driving statistics such as “fatalities per million miles driven” was a wash.
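A minimal numerical sketch may make the offsetting logic concrete. The figures below are invented purely for illustration – they are not Peltzman’s estimates – but they show how a device that cuts occupant deaths per crash can leave total fatalities roughly unchanged once riskier driving raises the number of crashes:

```python
# Illustrative sketch of the Peltzman offset. All numbers are hypothetical,
# chosen only to show how fewer deaths per crash can be offset by more crashes.

def fatalities(crashes, occupant_deaths_per_crash, bystander_deaths_per_crash):
    """Total deaths: vehicle occupants plus people outside the vehicle."""
    return crashes * (occupant_deaths_per_crash + bystander_deaths_per_crash)

# Before the mandate: fewer crashes, but each crash is deadlier for occupants.
before = fatalities(crashes=1000, occupant_deaths_per_crash=0.020,
                    bystander_deaths_per_crash=0.005)

# After the mandate: belts cut occupant deaths per crash by 20%, but riskier
# driving raises the crash count by 20%; bystander risk per crash is unchanged.
after = fatalities(crashes=1200, occupant_deaths_per_crash=0.016,
                   bystander_deaths_per_crash=0.005)

print(before, after)  # 25.0 vs. 25.2 -- essentially a wash
```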

Over the succeeding forty years, this kind of outcome became proverbial throughout the social sciences, not just economics. In 2006, Smithsonian Magazine published an article summarizing the powerful effect that Peltzman’s work has had on the world. His ideas are grouped under the heading of “risk compensation,” an evocative term implying that we satisfy our appetite for risk by compensating for added safety with the “purchase” of more risk. The principle has been observed in nations around the world, among adults and children, in activities ranging from driving to playground behavior to sports. The economist N. Gregory Mankiw, former Chairman of the Council of Economic Advisers under President George W. Bush, blogged about “Sam Peltzman, who taught us all that mandatory seat-belt laws cause drivers to drive more recklessly.” Mankiw dubbed the relevant principle the “Peltzman Effect,” making Peltzman one of a select group to have a scientific principle named after him.

Despite the scientific status of risk compensation and the Peltzman Effect, Holman Jenkins shows no sign of having heard of it. He speaks of the “potential encouragement” offered by air bags to more aggressive driving by motorists as if he had just excavated a Stone Age cave and stumbled upon a rectangular version of the wheel therein. And he applies the principle to air bags with no apparent awareness of its equal applicability to seat belts.

The Perils of Mandatory Safety

“By now,” Jenkins laments, “those of a certain age remember that Detroit was the villain that opposed putting explosive devices in their vehicles, plumping instead for mandatory seat-belt laws (which, amazingly, certain safety groups opposed).” No economist is surprised that Detroit opposed the idea of being forced to increase the cost of production by providing a safety benefit that (a) consumers didn’t want and (b) didn’t work, which would (c) expose them to endless litigation as well as threaten them personally. Seat belts were several orders of magnitude less expensive and the loss of freedom to consumers did not represent a business loss to automobile companies.

Jenkins’ failure to understand the opposition to mandatory seat-belt laws is astonishing, though, since it is based on the very same principle that he just invoked to oppose air bags. There is a lot to be said for seat belts when provided as a voluntary option for consumers. There is everything wrong with mandatory seat-belt laws because they encourage (force?) risk-loving drivers to obey the law by buckling up, then to fulfill their love of risk by driving more aggressively – and to do this as a substitute for going unbuckled in the first place. An unbuckled risk-lover is a driver who is himself bearing the risk he chooses to run. A buckled-up aggressive driver is a risk lover who is imposing the risk he chooses to run on other drivers – and pedestrians and cyclists – who may be more risk averse. This is bad theoretically because it is economically inefficient. Economic inefficiency is bad in the practical sense because it misaligns cost and benefit. In this case, it allows risk lovers to benefit from the risks they run but imposes some of the costs on other people who don’t benefit because they are risk-averse individuals who didn’t want to run those risks in the first place.

The practical side of all this has been seen in various ways. New Hampshire is the only state that has not passed a mandatory seat-belt law in the interval since the mid-1980s. It has poorer-than-average weather and topography, so we would expect it to have worse-than-average traffic-fatality results, all other things equal. Since traffic-safety experts predicted that mandatory seat-belt laws would usher in traffic-safety nirvana by reducing fatalities hugely, we would expect to find that New Hampshire highways had become a veritable slaughterhouse – if mandatory seat belts were the predicted panacea, that is.

Instead, New Hampshire traffic statistics have improved to near the top of the national rankings despite its singular lack of a mandatory seat-belt law. New Hampshire should be the poster-state for mandatory seat-belt laws; instead, it is the smoking gun that points to their guilt. This fact has gone completely unremarked in the national news media, which is probably why Holman Jenkins hasn’t noticed it.

But there is no excuse for Jenkins’ failure to notice the slowing improvement in nationwide traffic statistics that has accompanied the installation of air bags and the passage of mandatory seat-belt laws. The rate of fatalities per million vehicle miles driven has been falling since the 1930s, thanks to the development of modern automobiles, highways, safety methods, signage and improved quality control in production and repair. The federal highway-safety bureaucracy makes a point of announcing the yearly fatality data because they usually represent a recent low point. What it fails to announce is the slowing rate of decline. Indeed, fatalities have recently risen in spite of the poor economy and reduced auto travel.

Jenkins apparently considers himself daring for suggesting that air bags are counterproductive and should be eliminated. He cites the myriad of safety innovations that have come on line in the last few years: automatic lane-violation warning devices, automatic skid-correction devices, automatic collision avoidance and braking sensors, automatic stabilizers and design features that direct crash energy away from passengers. “One imponderable is how much faster progress might have been without the bureaucracy’s forced diversion of industry capital to air bags … Each stride tilts the calculation away from having an IED in the dashboard as a net benefit to motorists, bringing closer the day when a new safety innovation will be announced: an air bag-free vehicle.”

Actually, Jenkins’ repudiation of air bags and reaffirmation of mandatory seat belts puts him about 40 years behind the times – about where we stood before Sam Peltzman wrote in 1975. Jenkins deserves credit for daring to break out of the regulatory mindset by opposing air bags, something other journalists have failed to do. That this should count as daring indicates the intellectual depths to which the downward market spiral of journalism has taken us.

What Jenkins should be doing is calling for abolition of the Department of Transportation, not air bags. Consider this: According to Jenkins’ own logic, the DOT mounted a nationwide campaign for mandatory seat-belt laws while also insisting upon mandatory air-bag installation in vehicles. But this is crazy. Using the language of game theory, we would say that the presence of air bags “dominates” seat-belt use, making it superfluous. With an air bag, one of two things happens: the air bag deploys as intended – in which case the passenger is protected in the accident – or the air bag explodes – in which case the passenger is maimed or killed. Either way the seat belt is superfluous. Wearing a seat belt doesn’t add protection if the air bag works and doesn’t protect against shrapnel if the air bag explodes. There is also a third possibility: the air bag might explode prematurely, killing or maiming the passenger even though there is no accident. And the seat belt is superfluous in this case as well.

Although we shouldn’t be too hard on Holman Jenkins, we shouldn’t feel bound by his intellectual limitations or his inhibitions. Now that we know that both mandatory air bags and mandatory seat-belt laws are abominations enacted in the name of automobile safety, what are we to make of a federal-government safety bureaucracy that insists upon them even after their demonstrated failures? And with the technology of self-driving cars a demonstrated reality, what are we to think when that same bureaucracy is distinctly reluctant to allow it to proceed?

Why Does DOT Tend to Hinder Rather Than Promote Automobile Safety?

The heading for this section will anger many readers. The conventional view of federal regulation has been described by the late Nobel laureate James Buchanan as the “romantic” view of government. Roughly speaking, it is that government regulators act nobly and altruistically in the public interest. Upon very close examination, the term “public interest” will be found so vague as to defy precise definition. However, this is advantageous in practice, as it allows each user to define it according to his or her individual desires – it makes the theory of government into a sort of fairy-tale, wish-fulfillment affair. No wonder this approach has survived so long with so little clear-eyed scrutiny! Everybody is afraid to look at it too hard for fear that their fondest dreams will go up in smoke. And indeed, that is what actually happens when we try to put this theory into practice.

Suppose we depart from this sentimental approach by inquiring into the incentives that confront bureaucrats in the Department of Transportation (DOT). First, ask what happens if DOT develops an innovative safety technology that saves the lives of consumers. Let’s say, for example, that they develop an improved seat belt, such as the three-point seat belt which turned the failed two-point lap belt into a viable safety device. Will the individual researcher(s) in DOT get a bonus? Will he or she (or they) patent the device and earn substantial royalties? Will they become famous? The answers are no, no and no, respectively. Thus, there are no positive incentives motivating DOT to improve automobile safety.

On the other hand, suppose DOT does just the opposite. Suppose it actually worsens auto safety. Indeed, suppose DOT does exactly what the political Left routinely accuses capitalist businessmen of doing; namely, kills its “customers” (in DOT’s case, this would be the consumers who are the ostensible beneficiaries of regulation).

What an irritating, outrageous question to pose! We all know that government regulatory agencies exist to protect the public, so it is unforgivably irresponsible to suggest that they would actually kill the people they are supposed to protect. But – let’s face it – that is exactly what Holman Jenkins is implying, isn’t it? He never has the cojones to blurt it out, but the statements that “Washington has been responsible for research” and “air bags designed to meet the government’s criteria were shown to be responsible for the deaths of dozens of children and small adults in otherwise survivable accidents” don’t leave much to the imagination, do they? As it happens, there is plenty more dirty linen in the government’s closet that Jenkins leaves unaired.

As long as everybody else is as deferential (or as cowardly) as Jenkins, the general public will not link government with the deaths in the way that private businesses are linked with the deaths of consumers. When more consumers die, what happens is this: government benefits. DOT uses this as the excuse to hire more people, beef up research and spend more money. Larger staffs and bigger budgets are the bureaucratic equivalent of higher profits, but this differs from the private-sector outcome in that higher profits are normally associated with better outcomes for consumers while, if anything, the reverse is true of bureaucratic expansion in government.

Suppose DOT were to recommend that we proceed at breakneck speed to adopt driverless cars in order to eliminate virtually all of our current 30,000+ annual highway fatalities. Suppose the agency even brings about this outcome within just a few years. There would be little or nothing left for the agency to do; it would have succeeded so well that it would have innovated itself out of existence. No wonder that DOT is dragging its feet to slow the acceptance of driverless cars!

In the private sector, there is an incentive to solve problems. In government, there is never an incentive to solve problems because that will usually leave government with no excuse to exist, to grow and expand. When a private firm solves a problem, it makes a big pile of money that it can use to expand or enter some new line of business – even if the solution to the problem leaves it with no reason to continue producing its current product. There is no government analogue to this reward and consequently no incentive for government to succeed, only incentives for it to fail. Indeed, there are even incentives for it to do harm. And in the arena of automobile safety, that is exactly what it has done.

Just to reinforce the point, let’s generalize it by broadening our evidentiary base beyond federal regulation. Earlier we cited the numerous safety improvements that are being incorporated piecemeal into automobiles by the major auto companies: lane-violation detection, automatic braking, collision avoidance and others. Driverless cars include all of these and more besides. The state of California has recently passed regulatory legislation forcing all driverless cars to allow a human driver to “take over in an emergency” – that is, to bypass the sensors that govern the driverless car’s actions. But every one of the safety improvements listed was designed expressly to produce mistake-proof behavior by the car in various emergency situations. In other words, the regulatory legislation has the effect of defeating the safety purpose of the driverless car. Oh, some nobler, more romantic rationale is advanced, but that is the effective result of the law.

The case of the DOT is not unique at the federal level, either. Hundreds of thousands of corpses could attest to the harm the FDA has done by blocking the approval of new drugs. Many economists could detail, and have detailed, the harm done to competition by application of the antitrust laws ostensibly designed to preserve and protect it.

Economists are the real investigative reporters. Most of the time, their tools consist of logic and arithmetic rather than confidential informants and leaked documents. But when it comes to exposés, their stories put those of journalists in the shade.

DRI-284 for week of 8-10-14: All Sides Go Off Half-Cocked in the Ferguson, MO Shooting

An Access Advertising EconBrief:

All Sides Go Off Half-Cocked in the Ferguson, MO Shooting

By now most of America must wonder secretly whether the door to race relations is marked “Abandon all hope, ye who enter here.” Blacks – mostly teenagers and young adults, except for those caught in the crossfire – are shot dead every day throughout the country by other blacks in private quarrels, drug deals gone bad and various attempted crimes. Murder is the leading cause of death for young black males in America. We are inured to this. But the relative exception of a black youth killed by a white man causes all hell to break loose – purely on the basis of the racial identities of the principals.

The latest chilling proof of this racial theorem comes from Ferguson, MO, the St. Louis suburb where a policeman shot and killed an unarmed 18-year-old black man on Saturday, August 9. The fact that the shooter is a policeman reinforces the need for careful investigation and unflinching analysis of the issues involved. The constant intrusion of racial identity is a mountainous obstacle to this process.

The Two Sides to the Story, As Originally Told

The shooting occurred on Saturday afternoon, August 9, 2014, in Ferguson, MO, where 14,000 of the 21,000 inhabitants are black and 50 of 53 assigned St. Louis County Police officers are white. The two sides of the story are summarized in an Associated Press story carrying the byline of Jim Suhr and carried on MSN News 08/13/2014. “Police have said the shooting happened after an [then-unnamed] officer encountered 18-year-old Michael Brown and another man on the street. They say one of the men pushed the officer into his squad car, then physically assaulted him in the vehicle and struggled with the officer over the officer’s weapon. At least one shot was fired inside the car. The struggle then spilled onto the street, where Brown was shot multiple times. In their initial news conference about the shooting, police didn’t specify whether Brown was the person who scuffled with the officer in the car and have refused to clarify their account.”

“Jackson said Wednesday that the officer involved sustained swelling facial injuries.”

“Dorian Johnson, who says he was with Brown when the shooting happened, has told a much different story. He has told media outlets that the officer ordered them out of the street, then tried to open his door so close to the men that it ‘ricocheted’ back, apparently upsetting the officer. Johnson says the officer grabbed his friend’s neck, then tried to pull him into the car before brandishing his weapon and firing. He says Brown started to run and the officer pursued him, firing multiple times. Johnson and another witness both say Brown was on the street with his hands raised when the officer fired at him repeatedly.”

The Reaction by Local Blacks: Protests and Violence

When a white citizen is shot by police under questionable circumstances – an occurrence that is happening with disturbing frequency – the incident is not ignored. But the consequent public alarm is subdued and contained within prescribed channels. Newspapers editorialize. Public figures express concern. Private citizens protest by writing or proclaiming their discontent.

The stylized reaction to a white-on-black incident like the one in Ferguson is quite different. Ever since the civil-rights era that began in the 1950s, these incidents are treated as presumptive civil-rights violations; that is, they are treated as crimes committed because the victim was black. Black “leaders” bemoan the continuing victim status of blacks, viewing the incident as more proof of same – the latest in an ongoing, presumably never-ending, saga of brutalization of blacks by whites. “Some civil-rights leaders have drawn comparisons between Brown’s death and that of 17-year-old Trayvon Martin.”

Rank-and-file blacks gather and march in protest, holding placards and chanting slogans tailored to the occasion. “Some protestors… raised their arms above their heads as they faced the police… The most popular chant has been ‘Hands up! Don’t shoot!'”

Most striking of all is the contrast struck by headlines like “Protests Turn Violent in St. Louis Suburb.” There is no non-black analogue to behavior like this: “Protests in the St. Louis suburb turned violent Wednesday night, with people lobbing Molotov cocktails at police, who responded with smoke bombs and tear gas to disperse the crowd.” This is a repetition of behavior begun in the 1960s, when massive riots set the urban ghettos of Harlem, Philadelphia and Detroit afire.

Joseph Epstein Weighs In

The critic and essayist Joseph Epstein belongs on the short list of the most trenchant thinkers and writers in the English language. His pellucid prose has illumined subjects ranging from American education to gossip to political correctness to Fred Astaire. The utter intractability of race in America is demonstrated irrefutably by the fact that the subject reduced Epstein to feeble pastiche.

In his op-ed “What’s Missing in Ferguson, MO.” (The Wall Street Journal, Wednesday, August 13, 2014), Epstein notes the stylized character of the episode: “…the inconsolable mother, the testimony of the dead teenager’s friends to his innocence, the aunts and cousins chiming in, the police chief’s promise of a thorough investigation… The same lawyer who represented the [Trayvon] Martin family, it was announced, is going to take this case.”

But according to Epstein, the big problem is that it isn’t stylized enough. “Missing… was the calming voice of a national civil-rights leader of the kind that was so impressive during the 1950s and ’60s. In those days there was Martin Luther King Jr…. Roy Wilkins… Whitney Young… Bayard Rustin…. – all solid, serious men, each impressive in different ways, who through dignified forbearance and strategic action, brought down a body of unequivocally immoral laws aimed at America’s black population.”

But they are long dead. “None has been replaced by men of anywhere near the same caliber. In their place today there is only Jesse Jackson and Al Sharpton…One of the small accomplishments of President Obama has been to keep both of these men from becoming associated with the White House.” Today, the overriding problem facing blacks is that “no black leader has come forth to set out a program for progress for the substantial part of the black population that has remained for generations in the slough of poverty, crime and despair.”

Wait just a minute here. What about President Obama? He is, after all, a black man himself. That was ostensibly the great, momentous breakthrough of his election – the elevation of a black man to the Presidency of the United States. This was supposed to break the racial logjam once and for all. If a black man occupying the Presidency couldn’t lead the black underclass to the Promised Land, who could?

No, according to Epstein, it turns out that “President Obama, as leader of all the people, is not well positioned for the job of leading the black population that finds itself mired in despond.” Oh. Why not? “Someone is needed who commands the respect of his or her people, and the admiration of that vast – I would argue preponderate [sic] – number of middle-class whites who understand that progress for blacks means progress for the entire country.”

To be sure, Epstein appreciates the surrealism of the status quo. “In Chicago, where I live, much of the murder and crime… is black-on-black, and cannot be chalked up to racism, except secondarily by blaming that old hobgoblin, ‘the system.’ People march with signs reading ‘Stop the Killing,’ but everyone knows that the marching and the signs and the sweet sentiments of local clergy aren’t likely to change anything. Better education… a longer school day… more and better jobs… get the guns off the street… the absence of [black] fathers – … the old dead analyses, the pretty panaceas, are paraded. Yet nothing new is up for discussion… when Bill Cosby, Thomas Sowell or Shelby Steele… have dared to speak up about the pathologies at work… these black figures are castigated.”

The Dead Hand of “Civil Rights Movement” Thinking

When no less an eminence than Joseph Epstein sinks under the waves of cliché and outmoded rhetoric, it is a sign of rhetorical emergency: we need to burn away the deadwood of habitual thinking.

Epstein is caught in a time warp, still living out the decline and fall of Jim Crow. But that system is long gone, along with the men who destroyed it and those who desperately sought to preserve it. The Kings and Youngs and Wilkinses and Rustins are gone just as the Pattons and Rommels and Ridgways and MacArthurs and Montgomerys are gone. Leaders suit themselves to their times. Epstein is lamenting the fact that the generals of the last war are not around to fight this one.

Reflexively, Epstein hearkens back to the old days because they were days of triumph and progress. He is thinking about the Civil Rights Movement in exactly the same way that the political left thinks about World War II. What glorious days, when the federal government controlled every aspect of our lives and we had such a wonderful feeling of solidarity! Let’s recreate that feeling in peacetime! But those feelings were unique to wartime, when everybody subordinates their personal goals to the one common goal of winning the war. In peacetime, there is no such unitary goal because we all have our personal goals to fulfill. We may be willing to subordinate those goals temporarily to win a war but nobody wants to live that way perpetually. And the mechanisms of big government – unwieldy agencies, price and wage controls, tight security controls, etc. – may suffice to win a war against other big governments but cannot achieve prosperity and freedom in a normal peacetime environment.

In the days of Civil Rights, blacks were a collective, a clan, a tribe. This made practical, logistical sense because the Jim Crow laws treated blacks as a unit. It was a successful strategic move to close ranks in solidarity and choose leaders to speak for all. In effect, blacks were forming a political cartel to counter the political setbacks they had been dealt. That is to say, they were bargaining with government as a unit and consenting to be assigned rights as a collective (a “minority”) rather than as free individuals. In social science terms, they were what F. A. Hayek called a “social whole,” whose constituent individual parts were obliterated and amalgamated into the opaque unitary aggregate. This dangerous strategy has since come back to haunt them by obscuring the reality of black individualism.

Consider Epstein’s position. Indian tribes once sent their chief – one who earned respect as an elder, religious leader or military captain, what anthropologists called a “big man” – to Washington for meetings with the Great White Father. Now, Epstein wants to restore the Civil Rights days when black leaders analogously spoke out for their tribal flock. Traditionally, the fate of individuals in aboriginal societies is governed largely by the wishes of the “big man” or leader, not by their own independent actions. This would be unthinkable for (say) whites; when was the last time you heard a call for a George Washington, Henry Ford or Bill Gates to lead the white underclass out of its malaise?

In fact, this kind of thinking was already anachronistic in Epstein’s Golden Age, the heyday of Civil Rights. Many blacks recognized the trap they were headed towards, but took the path of least resistance because it seemed the shortest route to killing off Jim Crow. Now we can see the pitiful result of this sort of collective thinking.

An 18-year-old black male is killed by a police officer under highly suspicious circumstances. Is the focus on criminal justice, on the veracity of the police account, on the evidence of a crime? Is the inherent danger of a monopoly bureaucracy investigating itself and exercising military powers over its constituency highlighted? Not at all.

Instead, the same old racial demons are summoned from the closet using the same ritual incantations. Local blacks quickly turn a candlelight protest vigil into a violent riot. Uh oh – it looks like the natives are getting restless; too much firewater at the vigil, probably. Joseph Epstein bemoans the lack of a chieftain who can speak for them. No, wait – the Great Black Father in Washington has come forward to chastise the violent and exalt the meek and the humble. His lieutenant Nixon has sent a black chief to comfort his brothers. (On Thursday, Missouri Governor Jay Nixon sent Missouri Highway Patrol Captain Ron Johnson, a black man, heading a delegation of troopers to take over security duties in Ferguson.) The natives are mollified; the savage breast is soothed. “All the police did was look at us and shoot tear gas. Now we’re being treated with respect,” a native exults happily. “Now it’s up to us to ride that feeling,” another concludes. “The scene [after the Missouri Highway Patrol took over] was almost festive, with people celebrating and honking horns.” The black chief intones majestically: “We’re here to serve and protect… not to instill fear.” All is peaceful again in the village.

Is this the response Joseph Epstein was calling for? No, this is the phony-baloney, feel-good pretense that he decried, the same methods he recognized from his hometown of Chicago and now being deployed there by Obama confidant Rahm Emanuel. The restless natives got the attention they sought. Meanwhile, lost in the festive party atmosphere was the case of Michael Brown, which wasn’t nearly as important as the rioters’ egos that needed stroking.

But the Highway Patrol will go home and the St. Louis County Police will be back in charge and the Michael Brown case will have to be resolved. Some six days after the event, the police finally got around to revealing pertinent details of the case; namely, that Michael Brown was suspected of robbing a convenience store of $48.99 worth of boxed cigars earlier that day in a “strong-arm robbery.” Six-year veteran policeman Darren Wilson, now finally identified by authorities, was one of several officers dispatched to the scene.

Of course, the blacks in Ferguson, MO, and throughout America aren’t Indian tribesmen or rebellious children – they are nominally free American individuals with natural rights protected by the U.S. Constitution. But if they expect to be treated with respect 365 days a year they will have to stop acting like juvenile delinquents, stop delegating the protection of their rights to self-serving politicians and hustlers and start asserting the individuality they possess.

The irony of this particular case is that it affords them just that opportunity. But it demands that they shed what Epstein calls “the too-comfortable robes of victimhood.” And they will have to step out from behind the shield of the collective. The Michael Brown case is not important because “blacks” are affronted. It is important because Michael Brown was an individual American just like the whites who get shot down by police every year. If Dorian Johnson is telling the truth, Brown’s individual rights were violated just as surely whether he was black, white, yellow or chartreuse.

Policing in America Today – and the Michael Brown Case

For at least two decades, policing in America has followed two clearly discernible trends. The first is the deployment of paramilitary equipment, techniques and thinking. The second is a philosophy of placing the police officer’s well-being above all other considerations. Both trends place the welfare of police bureaucrats, employees and officers above that of their constituents in the public.

To an economist, this is a striking datum. Owners or managers of competitive firms cannot place their welfare above that of their customers; if they do, the firm will go bankrupt and cease to exist, depriving the owners of an asset (wealth) and real income and the managers of a job and real income. So what allows a police force (more specifically, the Chief of Police and his lieutenants) to do what a competitive firm cannot do? Answer: The police have a monopoly on the use of force to enforce the law. In the words of a well-known lawyer, the response to the generic question “Can the police do that?” is always “Sure they can. They have guns.”

All bureaucracies tend to be inefficient, even corrupt. But corporate bureaucracies must respond to the public and they must earn profits. So they cannot afford to ignore consumer demand. The only factor to which government bureaucracies respond is variations in their budget, which are functions of political rather than economic variables.

All of these truths are on display in this case. The police have chosen to release only a limited, self-serving account of the incident. Their version of the facts is dubious to say the least, although it could conceivably be correct. Their suppression of rioting protestors employed large, tank-like vehicles carrying officers armed with military gear, weapons and tear gas. Dorian Johnson’s account of the incident is redolent of the modern police philosophy of “self-protection first”: at the first hint of trouble, the officer’s focus is on downing anybody who might conceivably offer resistance, armed or not, dangerous or not.

What does all this have to do with the racial identities of the principals? Absolutely nothing. Oh, it’s barely possible that officer Wilson might have harbored some racial animosity toward Brown or blacks in general. But it’s really quite irrelevant because white-on-black, white-on-white and black-on-white police incidents have cropped up from sea to shining sea in recent years. Indeed, this is an issue that should unite the races rather than dividing them since police are not reluctant to dispatch whites (or Hispanics or Asians, for that matter). While some observers claim the apparent increase in frequency of these cases is only because of the prevalence of cell phones and video cameras, this is also irrelevant; the fact that we may be noticing more abuses now would not be a reason to decry the new technology. As always, the pertinent question is whether or not an abuse of power took place. And those interested in the answer to that question, which should be every American, will have to contend with the unpromising prospect of a police department – a monopoly bureaucracy – investigating itself.

That is the very real national problem festering in Ferguson, MO – not a civil-rights problem, but a civil-wrongs problem.

The Battle Lines

Traditionally, ever since the left-wing counterculture demonized police as “pigs” in the 1960s, the right wing has reflexively supported the police and opposed those who criticized them. Indeed, some of this opposition to the police has been politically tendentious. But the right wing’s general stance is wrongheaded for two powerful reasons.

First, support for law enforcement itself has become progressively less equivalent to support for the Rule of Law. The number and scope of laws have become so large and excessive that support for the Rule of Law would actually require opposition to the existing body of statutory law.

Second, the monopoly status of the police has enabled them to become so abusive that they now threaten everybody, not merely the politically powerless. Considering the general decrease in crime rates driven by demographic factors, it is an open question whether most people are more threatened by criminals or by abusive police.

Even a bastion of neo-conservatism like The Wall Street Journal is becoming restive at the rampant exercise of monopoly power by police. Consider these excerpts from the unsigned editorial, “The Ferguson Exception,” on Friday, August 15, 2014: “One irony of Ferguson is that liberals have discovered an exercise of government power that they don’t support. Plenary police powers are vast, and law enforcement holds a public trust to use them in proportion to the threats. The Ferguson police must prevent rioting and looting and protect their own safety, though it is reasonable to wonder when law enforcement became a paramilitary operation [emphasis added]. The sniper rifles, black armored convoys and waves of tear gas deployed across Ferguson neighborhoods are jarring in a free society…Police contracts also build in bureaucratic privileges that would never be extended to other suspects. The Ferguson police department has refused to… supply basic information about the circumstances and status of the investigation [that], if it hasn’t been botched already, might help cool passions… how is anyone supposed to draw a conclusion one way or the other without any knowledge of what happened that afternoon?”

The Tunnel… and the Crack of Light at the End

The pair of editorial reactions in The Wall Street Journal typifies the alternatives open to those caught in the toils of America’s racial strife. We can play the same loop over and over again in such august company as Joseph Epstein. Or we can dunk ourselves in ice water, wake up and smell the coffee – and find ourselves rubbing shoulders with the Journal editors.

DRI-292 for week of 6-29-14: One in Six American Children is Hungry – No, Wait – One in Five!

An Access Advertising EconBrief:

One in Six American Children is Hungry – No, Wait – One in Five!

You’ve heard the ad. A celebrity – or at least somebody who sounds vaguely familiar, like singer Kelly Clarkson – begins by intoning somberly: “Seventeen million kids in America don’t know where their next meal is coming from or even if it’s coming at all.” One in six children in America is hungry, we are told. And that’s disgraceful, because there’s actually plenty of food, more than enough to feed all those hungry kids. The problem is just getting the food to the people who need it. Just make a donation to your local food pantry and together we can lick hunger in America. This ad is sponsored by the Ad Council and Feeding America.

What was your reaction? Did it fly under your radar? Did it seem vaguely dissonant – one of those things that strikes you wrong but leaves you not quite sure why? Or was your reaction the obvious one of any intelligent person paying close attention – “Huh? What kind of nonsense is this?”

Hunger is not something arcane and mysterious. We’ve all experienced it. And the world is quite familiar with the pathology of hunger. Throughout human history, hunger has been mankind’s number one enemy. In nature, organisms are obsessed with absorbing enough nutrients to maintain their body weight. It is only in the last few centuries that tremendous improvements in agricultural productivity have liberated us from the prison of scratching out a subsistence living from the soil. At that point, we began to view starvation as atypical, even unthinkable. The politically engineered famines that killed millions in the Soviet Union and China were viewed with horror; the famines in Africa attracted sympathy and financial support from the West. Even malnutrition came to be viewed as an aberration, something to be cured by universal public education and paternalistic government. In the late 20th century, the Green Revolution multiplied worldwide agricultural productivity manifold. As the 21st century dawned, the end of mass global poverty and starvation beckoned within a few decades and the immemorial problem of hunger seemed at last to be withering away.

And now we’re told that in America – for over a century the richest nation on Earth – our children – traditionally the first priority for assistance of every kind – are hungry at the ratio of one in six?

WHAT IS GOING ON HERE?

The Source of the Numbers – and the Truth About Child Hunger

Perhaps the most amazing thing about these ads, which constitute a full-fledged campaign, is the general lack of curiosity about their origins and veracity. Seemingly, they should have triggered a firestorm of criticism and investigation. Instead, they have been received with yawns.

The ads debuted last Fall. They were kicked off with an article in the New York Times on September 5, 2013, by Jane L. Levere, entitled “New Ad Campaign Targets Childhood Hunger.” The article is one long promotion for the ads and for Feeding America, but most of all for the “cause” of childhood hunger. That is, it takes for granted that a severe problem of childhood hunger exists and demands close attention.

The article cites the federal government as the source for the claim that “…close to 50 million Americans are living in ‘food insecure’ households,” or ones in which “some family members lacked consistent access throughout the year to adequate food.” It claims that “…almost 16 million children, or more than one in 5, face hunger in the United States.”

The ad campaign is characterized as “the latest in a long collaboration between Ad Council and Feeding America,” which supplies some 200 food banks across the country that in turn supply more than 61,000 food pantries, soup kitchens and shelters. Feeding America began in the late 1990s as another organization, America’s Second Harvest, which enlisted the support of A-list celebrities such as Matt Damon and Ben Affleck. This was when the partnership with the Ad Council started.

Priscilla Natkins, a Vice-President of Ad Council, noted that in the early days “only” one out of 10 Americans was hungry. Now the ratio is 1 out of 7 and more than 1 out of 5 children. “We chose to focus on children,” she explained, “because it is a more poignant approach to illustrating the problem.”

Further research reveals that, mirabile dictu, this is not the first time that these ads have received skeptical attention. In 2008, Chris Edwards of Cato Institute wrote about two articles purporting to depict “hunger in America.” That year, the Sunday supplement Parade Magazine featured an article entitled “Going Hungry in America.” It stated that “more than 35.5 million Americans, more than 12% of the population and 17% of our children, don’t have enough food, according to the Department of Agriculture.” Also in 2008, the Washington Post claimed that “about 35 million Americans regularly go hungry each year, according to federal statistics.”

Edwards’ eyebrows went up appropriately high upon reading these accounts. After all, this was even before the recession had been officially declared. Unlike the rest of the world, though, Edwards actually resolved to verify these claims. This is what Edwards found upon checking with the Department of Agriculture.

In 2008, the USDA declared that approximately 24 million Americans were living in households that faced conditions of “low food security.” The agency defined this condition as eating “less varied diets, participat[ing] in Federal food-assistance programs [and getting] emergency food from community food pantries.” Edwards contended that this meant those people were not going hungry – by definition. And indeed, it is semantically perverse to define a condition of hunger by describing the multiple sources of food and change in composition of food enjoyed by the “hungry.”

The other 11 million (of the 35 million figure named in the two articles) people fell into a USDA category called “very low food security.” These were people whose “food intake was reduced at times during the year because they had insufficient money or other resources for food” [emphasis added]. Of these, the USDA estimated that some 430,000 were children. These would (then) comprise about 0.6% of American children, not the 17% mentioned by Parade Magazine, Edwards noted. Of course, having to reduce food on one or more occasions to some unnamed degree for financial reasons doesn’t exactly constitute “living in hunger” in the sense of not knowing where one’s next meal was coming from, as Edwards observed. The most that could, or should, be said was that the 11 million and the 430,000 might constitute possible candidates for victims of hunger.

On the basis of this cursory verification of the articles’ own sources, Chris Edwards concluded that hunger in America ranked with crocodiles in the sewers as an urban myth.

We can update Edwards’ work. The USDA figures come from survey questions distributed and tabulated by the Census Bureau. The most recent data available were released in December 2013 for calendar year 2012. About 14.5% of households fell into the “low food security” category and about 5.7% of households were in the “very low food security” pigeonhole. Assuming the current average of roughly 2.58 persons per household, this translates to approximately 34 million people in the first category and just under 13.5 million people in the second category. If we assume the same fraction of children in these at-risk households as in 2008, that would imply about 635,000 children in the high-risk category, or less than 0.9 of 1% of the nation’s children. That is a far cry from the 17% of the nation’s children mentioned in the 2008 Parade Magazine article. It is a farther cry from the 17,000,000 children mentioned in the current ads, which would be over 20% of America’s children.
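For readers who want to check the percentages, here is a minimal back-of-the-envelope sketch. The child counts are the figures quoted above; the total U.S. child population of roughly 74 million under-18s is an assumption supplied for illustration, since the survey reports work in shares of households rather than of children:

```python
# Back-of-the-envelope check of the child-hunger percentages discussed above.
# Child counts come from the text; the total child population (~74 million
# under-18s, a rough Census-era figure) is an assumption for illustration.

TOTAL_US_CHILDREN = 74_000_000  # assumed

def share_of_children(count):
    """Express a count of children as a percentage of all U.S. children."""
    return 100 * count / TOTAL_US_CHILDREN

print(f"{share_of_children(430_000):.1f}%")     # ~0.6% -- Edwards' 2008 'very low food security' figure
print(f"{share_of_children(635_000):.1f}%")     # ~0.9% -- the updated 2012 estimate
print(f"{share_of_children(17_000_000):.1f}%")  # ~23%  -- the 17 million children claimed by the ads
```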

The USDA’s Work is From Hunger

It should occur to us to wonder why the Department of Agriculture – Agriculture, yet – should now reign as the nation’s arbiter of hunger. As it happens, economists are well situated to answer that question. They know that the federal food-stamp program began in the 1940s primarily as a way of disposing of troublesome agricultural surpluses. The federal government spent the decade of the 1930s throwing everything but the kitchen sink at the problem of economic depression. Farmers were suffering because world trade had imploded; each nation was trying to protect its own businesses by taxing imports of foreign producers. Since the U.S. was the world’s leading exporter of foodstuffs, its farmers were staggering under this impact. They were swimming in surpluses and bled so dry by the resulting low prices that they burned, buried or slaughtered their own output without bringing it to market in an effort to raise food prices.

The Department of Agriculture devised various programs to raise agricultural prices, most of which involved government purchases of farm goods to support prices at artificially high levels. Of course, that left the government with lots of surplus food on its hands, which it stored in Midwestern caves in a futile effort to prevent spoilage. Food distribution to the poor was one way of ridding itself of these surpluses, and this was handled by the USDA which was already in possession of the food.

Just because the USDA runs the food-stamp program (now a debit-card operation) doesn’t make it an expert on hunger, though. Hunger is a medical and nutritional phenomenon, not an agricultural one. Starvation is governed by the intake of sufficient calories to sustain life; malnutrition is caused by an inadequate or unbalanced intake of nutrients, vitamins and minerals. Does the Census Bureau survey doctors on the nutritional status of their patients to provide the USDA with its data on “food insecurity”?

Not hardly. The Census Bureau simply asks people questions about their food intake and solicits their own evaluation of their nutritional status. Short of requiring everybody to undergo a medical evaluation and submit the findings to the government, it could hardly be otherwise. But this poses king-sized problems of credibility for the USDA. Asking people whether they ever feel hungry or sometimes don’t get “enough” food is no substitute for a medical evaluation of their status.

People can and do feel hungry without coming even close to being hungry in the sense of risking starvation or even suffering a nutritional deficit. Even more to the point, their feelings of hunger may signal a nutritional problem that cannot be cured by money, food pantries, shelters or even higher wages and salaries. The gap between the “low food security” category identified by the USDA and starving peoples in Africa or Asia is probably a chasm the size of the Grand Canyon.

The same America that is supposedly suffering rampant hunger among both adults and children is also supposedly suffering epidemics of both obesity and diabetes. There is only one way to reconcile these contradictions: by recognizing that our “hunger” is not the traditional kind associated with starvation or malnutrition but rather the kind associated with obesity and diabetes. Over-ingestion of simple carbohydrates and starches can cause upward spikes in blood sugar among susceptible populations, triggering the release of insulin that stores the carbohydrate as fat. Since the carbohydrate is stored as fat rather than burned for energy, the body remains starved for energy and hungry even though it is getting fat. Thus do hunger and obesity coexist.

The answer is not more government programs, food stamps, food pantries and shelters. Nor, for that matter, is it more donations to non-profit agencies like Feeding America. It is not more food at all, in the aggregate. Instead, the answer is a better diet – something that millions of Americans have found out for themselves in the last decade or so. In the meantime, there is no comparison between the “hunger” the USDA is supposedly measuring and the mental picture we form in our minds when we think of hunger.

This is not the only blatant contradiction raised by the “hunger in America” claims. University of Chicago economist Casey Mulligan, in his prize-winning 2012 book The Redistribution Recession, has uncovered over a dozen government program and rule changes that reduced the incentive to work and earn. He assigns these changes primary blame for the huge drop in employment and lag in growth that the U.S. has suffered since 2007. High on his list are the changes in the food-stamp program that substituted a debit card for stamps, eliminated means tests and allowed recipients to remain on the program indefinitely. A wealthy nation in which 46 million out of 315 million citizens are on the food dole cannot simultaneously be suffering a problem of hunger. Other problems, certainly – but not that one.

What About the Real Hunger?

That is not to say that real hunger is completely nonexistent in America. Great Britain’s BBC caught word of our epidemic of hunger and did its own story on it, following the New York Times, Washington Post and Parade Magazine party line all the way. The BBC even located a few appropriately dirty, ragged children for website photos. But the question to ask when confronted with actual specimens of hunger is not “why has capitalism failed?” or “why isn’t government spending enough money on food-security programs?” The appropriate question is “why do we keep fooling ourselves into thinking that more government spending is the answer when the only result is that the problem keeps getting bigger?” After all, the definition of insanity is doing the same thing over and over again and expecting a different result.

The New York Times article in late 2013 quoted two academic sources who were termed “critical” of the ad campaign. But they said nothing about its blatant lies and complete inaccuracy. No, their complaint was that it promoted “charity” as the solution rather than their own pet remedies, a higher minimum wage and more government programs. This calls to mind the old-time wisecrack uttered by observers of the Great Society welfare programs in the 1960s and 70s: “This year, the big money is in poverty.” The real purpose of the ad campaign is to promote the concept of hunger in America in order to justify big-spending government programs and so-called private programs that piggyback on the government programs. And the real beneficiaries of those programs are not the poor and hungry but the government employees, consultants and academics whose jobs depend on the existence of “problems” that government purports to “solve” but that actually get bigger in order to justify ever-more spending for those constituencies.

That was the conclusion reached, ever so indirectly and delicately, by Chris Edwards of the Cato Institute in his 2008 piece pooh-poohing the “hunger in America” movement. It applies with equal force to the current campaign launched by non-profits like the Ad Council and Feeding America, because the food banks, food pantries and shelters are supported both directly and indirectly by government programs and by the public perception of problems that necessitate massive government intervention. It is the all-too-obvious answer to the cry for enlightenment made earlier in this essay.

In this context, it is clear that the answer to any remaining pockets of hunger is indeed charity. Only private, voluntary charity escapes the moral hazard posed by the bureaucrat/consultant class that has no emotional stake in the welfare of the poor and unfortunate but a big stake in milking taxpayers. This is the moral answer because it does not force people to contribute against their will but does allow them to exercise free will in choosing to help their fellow man. A moral system that works must be better than an immoral one that fails.

Where is the Protest?

The upshot of our inquiry is that the radio ads promoting “hunger in America” and suggesting that America’s children don’t know where their next meal is coming from are an intellectual fraud. There is no evidence that those children exist in large numbers, but their existence in any numbers indicts the current system. Rather than rewarding the failure of our current immoral system, we should be abandoning it in favor of one that works.

Our failure to protest these ads and publicize the truth is grim testimony to how far America has fallen from its origins and ideals. In the first colonial settlements at Jamestown and Plymouth, colonists learned the bitter lesson that entitlement was not a viable basis for civilization and work was necessary for survival. We are in the process of re-learning that lesson very slowly and painfully.

DRI-304 for week of 3-2-14: Subjugating Florists: Power, Freedom and the Rule of Law

An Access Advertising EconBrief:

Subjugating Florists: Power, Freedom and the Rule of Law

A momentous struggle for human freedom is playing out in a mundane setting. Two people in Washington state are planning to wed. They want their florist, Arlene’s Flowers and Gifts, to supply flowers for the wedding. The owner, Barronelle Stutzman, refuses the job. The couple wants her to be compelled by law to provide service to them.

Even without knowing that particular facts distinguish this situation, we might suspect it. In this case, the couple consists of two homosexual men, Robert Ingersoll and Curt Freed. Ms. Stutzman’s refusal stems from an unwillingness to participate in – and thus implicitly sanction – a ceremony of which she disapproves on religious grounds.

The points at issue are two: First, does existing law forbid Ms. Stutzman’s refusal on the grounds that it is an illegal “discrimination” against the couple? Second, is that interpretation the proper one, regardless of its legality?

The first point is a matter for lawyers. (Washington’s Attorney General has filed suit against Ms. Stutzman.) The second point is a matter for all of us. On it may hinge the survival of freedom in the United States of America.

The Facts of the Case

The prospective married couple, Messrs. Ingersoll and Freed, has granted numerous interviews to publicize their side of the case. To the Christian Broadcasting Network (CBN), they described themselves as “loyal customers for a decade” of Arlene’s.

“It [Stutzman’s refusal] really hurt because it was somebody I knew,” Ingersoll confided. “We stayed awake all night Saturday. It was eating at our souls.”

For her part, Ms. Stutzman declared that “you have to make a stand somewhere in your life on what you believe….” The unspoken implication was that she had faced repeated challenges to her convictions, culminating in this decision to stand fast. “In America, the government is supposed to protect freedom, not… intimidate citizens into acting contrary to their faith convictions.”

The attitude displayed by major media outlets reflects the Zeitgeist, which decrees: Ms. Stutzman is guilty of illegal discrimination on grounds of sexual orientation. It is significant that this verdict crosses political boundaries. On the Sunday morning discussion program Face the Nation, longtime conservative columnist and commentator George Will claimed that “public-accommodations law” had long ago “settled” the relevant legal point regarding the requirement of a business owner to provide service to all comers once doors have been opened to the public at large. But Mr. Will nonetheless expressed dissatisfaction with the apparent victory of the homosexual couple over the florist. “They [homosexuals in general] have been winning…this makes them look like bad winners.” Mr. Will seemed to suggest that the couple should forego their legal right and let Ms. Stutzman off the hook as a matter of good manners.

Legal, Yes; Proper, No

The fact that the subjugation of the florist is legal does not make it right. For decades, the Zeitgeist has been growing ever more totalitarian. Today, the United States of America approaches a form of authoritarian polity called an absolute democracy. In an absolute monarchy, one person rules. In an absolute democracy, the government is democratically elected but it holds absolute power over the citizens.

The inherent definition of freedom is the absence of external constraint. In this case, that would imply that Messrs. Ingersoll and Freed would be free to engage or refuse the services of Ms. Stutzman and Ms. Stutzman would be free to provide or refuse service to Messrs. Ingersoll and Freed – on any basis whatsoever. That is what freedom means. A concise way of describing the operation of the Rule of Law would be that all (adult) citizens enjoy freedom of contract.

But in our current unfree country, Messrs. Ingersoll and Freed are free to patronize Arlene’s or not but Ms. Stutzman is not free. She is required to serve Messrs. Ingersoll and Freed, like it or not. The couple’s sexual orientation has earned them the status of a privileged class. They have the privilege of compelling service. This is a privilege enjoyed by a comparative few.

George Will and company may pontificate about settled law, but the truth is that refusals of service happen daily in American business. Businesses often turn away customers and refer them to other businesses as a courtesy, typically as an acknowledgement of their own shortcomings or lack of specialized knowledge or expertise. Sometimes a business will frankly admit that a would-be customer falls outside its target customer class. This sort of refusal rarely, if ever, leads to recriminations. After all, who really wants to pay for a product or service unwillingly supplied? The only exception comes when the customer falls within one of the government-protected categories covered by the anti-discrimination laws. Then the fear of litigation, financial and criminal penalties and adverse publicity kicks in.

This may be the clearest sign that the Rule of Law no longer prevails in America. The Rule of Law does not mean scrupulous adherence to statutory law. It means the absence of privilege. In America today, privilege is alive and growing like a cancer. In the past, we associated the term with wealth and social position. That is no longer true. Now it connotes special treatment by government.

The Role of Competition Under the Rule of Law

Under the Rule of Law, Messrs. Ingersoll and Freed would not be able to compel Ms. Stutzman to provide flowers for their wedding. But this would not leave them without recourse. The Rule of Law supports the existence of free competitive markets. The couple could simply call up another florist. True, they would be denied the service of their longtime acquaintance and supplier. But nobody is entitled to a lifetime guarantee of the best of everything. What if Ms. Stutzman were ill on their wedding day, or called out of town, or struck down by a beer truck? What if she went bankrupt or retired? The Rule of Law simply protects a free, competitive market from which Messrs. Ingersoll and Freed can pick and choose a florist.

That is not the only benefit the couple get from the Rule of Law and competition. In a competitive market, any seller who refuses service to a willing buyer must pay a penalty or cost in the form of foregone revenue. In strict, formal theory, a competitive market produces an equilibrium result in which the amount of output produced at the equilibrium price is exactly equal to the ex ante amount desired by consumers. A seller who turns away a buyer is throwing money down the drain. This is not something sellers will do lightly. Anybody who doubts this has never run a business and met a payroll. Thus, free competitive markets offer strong disincentives to discrimination.
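
To make the foregone-revenue point concrete, here is a minimal sketch in standard textbook notation (the symbols are illustrative, not drawn from any particular source): let p* be the market-clearing price, Qs and Qd the quantities supplied and demanded, and c the seller’s marginal cost of serving one more customer. Then

\[
Q_s(p^*) = Q_d(p^*), \qquad \text{margin sacrificed by one refusal} = p^* - c > 0.
\]

Turning away a willing buyer at p* forfeits exactly that margin, which is why a refusal of service carries a real price tag in a competitive market.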

Of course, that does not mean that businesses will never refuse a customer; the instant case proves that. But refusals of conscience like the one made by Ms. Stutzman will be comparatively rare, because it will be unusual for the owner to value the moral issue more than the revenue foregone.

The existence of competition under the Rule of Law is the safeguard that makes freedom and democracy possible. Without it, we would have to fear the tyranny of the majority over minorities. With it, we can safely rely on markets to protect the rights and welfare of minorities.

The Rule of Law and Limited Government

Free choice by both buyers and sellers is not the enemy of minority rights. The real danger to minorities is government itself – the very government that is today advertised as the champion of minorities.

After the Civil War, newly freed and enfranchised blacks entered the free economy in the South. They began to compete with unskilled and skilled white labor. This competition was successful, both because blacks were willing to work for lower wages and because some blacks had mastered valuable skills while slaves. For example, professional baseball originated in the 1860s and increased steadily in popularity; blacks participated in this embryonic period.

White laborers resented this labor-market competition. In order to artificially increase the wages of their members, labor unions had to restrict the supply of labor. Denying union membership to blacks was a common means of catering to member desires while furthering wage objectives. But the competition provided by blacks was difficult to suppress because employers had a clear incentive to hire low-wage labor that was also productive and skillful. Businesses had a strong monetary incentive not to refuse service to blacks because the money offered by blacks was just as green as anybody else’s money.

The solution found by the anti-black forces was the so-called “Jim Crow” laws. These forbade the hiring of blacks on equal terms and denied blacks equal rights to public accommodations and service. In effect, the Jim Crow laws cartelized labor and product markets in a way that would not otherwise have occurred. Governments also handed out special privileges to labor unions that enabled them to compel membership and deny it at will. Historically, labor unions excluded blacks from membership for the bulk of the 20th century. Blacks were banned from organized baseball and most other professional sports until the 1940s, when sports became the first wedge driven into the Jim Crow laws.

The apartheid-style laws passed in South Africa in the early 20th century also arose in order to thwart the successful competition offered to white labor by black labor. Left alone, competitive labor markets were enabling black South Africans to enjoy rising wages and employment. South African labor unions agitated for government protection against black workers. The result was the “pass laws” or “color bar” or apartheid system, not unlike the Jim Crow laws prevailing in America. Once again, the purpose was to cartelize labor markets in order to erect barriers to the competition offered to white labor by black workers.

The rationale behind public utilities was ostensibly to limit the pricing power and profits enjoyed by firms that would otherwise have been “natural monopolies.” In actual practice, by guaranteeing public utilities a “normal profit,” government removed the specter of a loss of revenue and profit associated with discrimination against black customers and employees. Sure enough, public utilities were among the chief practitioners of discrimination against blacks – along with government itself, which also did not fear a loss of profit resulting from its actions.

A recurring effect of government regulation of business in all its forms has been the erosion of competition. Sometimes that has been caused by costly compliance with regulation, driving businesses bankrupt and reducing market competition through attrition. Sometimes this has come from direct government cartelization of competitive markets, resulting from measures like marketing orders and quotas in milk and citrus fruit. Sometimes that has come from price supports, target prices and acreage allotments that have reduced agricultural output and raised prices or, alternatively, raised prices while creating costly surpluses for which taxpayers must pay. Sometimes the reduction in competition results from anti-trust laws like the Robinson Patman Act, deliberately designed to raise prices and restrict competition in retail business.

There is no formal, coherent theory of regulation. Instead, regulatory legislation is accompanied by vague protestations of good will and good intentions that have no unambiguous translation into policy. The typical result is that regulators either take over the role of controlling business decisions from market participants or they become the patrons and protectors of businesses within the industries they regulate. The latter attitude has evolved within the financial sector, where regulators have gradually taken the view that the biggest competitors are “too big to fail.” That is, the effects of failure would spill over onto too many other firms, causing widespread adverse effects. This, in turn, precludes discipline imposed by competitive markets, which force businesses to serve consumers well or go out of business.

The enemy of minorities is government, not free competitive markets. Government harms minorities directly by passing discriminatory laws against them or indirectly by foreclosing or lessening competition.

The Two-Edged Sword of Government Power

Many people find it difficult to perceive government as the threat because government vocally broadcasts its beneficence and cloaks its actions in the vocabulary of good intentions. It bestows noble and high-sounding names on its legislative enactments. It endows them with historic significance. Like Edmond Rostand’s protagonist Chantecler, government pretends that its will causes the sun to rise and set and that only its benevolence stands between us and disaster.

But the blessings of government are a two-edged sword. “A government powerful enough to give us everything we want is powerful enough to take from us everything we have.” One by one, the beneficiaries of arbitrary government power have also been stung by the exercise of that same power.

In 1954, government insisted that “separate was inherently unequal” and that the segregated education received by blacks must be inferior to that enjoyed by whites. Instead of introducing competition to schools, government intruded into education more than ever before. Now, six decades later, blacks still struggle for educational parity. And today, it is government that stands in the schoolhouse door to thwart blacks – not through segregation, but by resolutely opposing the educational competition introduced by charter schools in New York City. The overwhelming majority of charter-school patrons are black families, who embrace the charter concept wholeheartedly. But Mayor Bill de Blasio has vowed to fight charter schools tooth and nail. The state and federal governments can be relied upon to sit on their hands, since teacher unions – diehard enemies of charter schools – are a leading constituency of the Democrat Party.

For over a century, blacks have lived and died by government and the Democrat Party. Now they are cut by the other edge of the government sword.

The print and broadcast news media have been cheerleaders for big government and the Democrat Party throughout the 20th century and beyond. First-Amendment absolutism has been a staple of left-wing thought. Recently, FCC regulators in the Obama administration hatched a plan to study journalists and their employers with a view towards tighter regulation. The pretext for the FCC’s Multi-Market Study of Critical Information Needs was that FCC broadcast licenses come with an obligation to serve the public – and how can government determine whether licensees are serving the public without thoroughly studying them? All hell has suddenly broken loose at the prospect that journalists themselves might be subjected to the same stifling regulation as other industries.

Of course, in a competitive market it is quite unnecessary for regulators to “study” the market to gauge whether it is working. Consumers make that judgment themselves. If businesses don’t serve consumers, consumers desert them and the businesses fold. Other businesses take their place and provide better service – or they join their predecessors on the scrap heap. But the presumption of government is that regulation must be necessary to promote competition – otherwise “market failure” will strand consumers up the creek without locomotion.

For decades, the knee-jerk reflex of journalists to any perceived problem has been that “no government regulation exists” to solve it. Now journalists tremble as they test the opposite edge of the government sword.

Now homosexuals are the latest group to successively experience both blades of the government sword. After years of life spent in the shadow of criminal prosecution, homosexuals have witnessed the gradual dismantling of state anti-sodomy laws. State-level bans on marriage by couples of the same gender have been invalidated by the U.S. Supreme Court. Not satisfied with their newly won freedom, homosexuals strive to wield power over their fellow citizens through coercion.

This is the only sense in which George Will was correct. His characterization of homosexuals as “bad winners” was infantile; it portrayed a serious issue of human freedom as a schoolboy exercise in bad manners. But he correctly sensed that homosexuals were winning something – even if he wasn’t quite sure what – and that this latest shift toward subjugating florists was a disastrous change in direction.

What Do Homosexuals Want? What Are They Owed Under the Rule of Law?

The holistic fallacy treats homosexuals as an organic unity with homogeneous wants and goals. In reality, they are individuals with diverse personalities and political orientations. But the homosexual movement follows a clearly discernible left-wing agenda, just as Hispanic activist organizations like La Raza hew to a left-wing line not representative of most Hispanics.

The homosexual political agenda strives to normalize and legitimize homosexual behavior by winning the imprimatur of government and the backing of government force. This movement feeds off the angst of people like Ingersoll and Freed – “It really hurt…it was eating at our souls” – who ache from the sting of rejection. The movement is selling government approval as a psychological substitute for parental and societal approval and economic rents as revenge for rejection. Homosexuals have observed the success of blacks, women and other protected classes in pursuing gains via this route.

There was a time, not so long ago when measured by the relative standard of history, when male homosexuals were not merely criminals but were subjected to a kind of informal “Jim Crow” persecution. They were routinely beaten and rolled not only by ordinary citizens but even by police. It is worth noting that these attitudes began to change decades ago, even before the advent of so-called “affirmative action” programs ostensibly designed to redress the grievances of other victim classes.

The Rule of Law demands that homosexuals receive the same rights and due-process protections as other people. It applies the same standards of consent to all sexual relationships between consenting adults. It grants the same freedom of contract – marital and otherwise – to all.

By the same token, the Rule of Law abhors privilege. It rejects the chimerical notion that the past harms suffered by individual members of groups can be compensated somehow by committing present harms that grant privilege and real income to different members of those same victimized groups.

The Rule of Law and Social Harmony

Sociologists and political scientists used to marvel at the comparative social harmony of American society – achieved despite the astonishing ethnic, racial, religious and political diversity of the citizenry. The consensus assigned credit to the American “melting pot.” The problem with this explanation is that a culture must first exist before new entrants can assimilate within it – and what mechanism achieved the original reconciliation of diverse elements?

Adherence to the Rule of Law within competitive markets made social harmony possible. It allowed the daily exchange of goods and services among individuals in relative anonymity, without disclosure of the multitudinous conflicts that might have otherwise produced stalemate and rejection. Milton Friedman observed astutely that free markets permit us to transact with the butcher, baker and candlestick maker without inquiring into their political or religious convictions. We need agree only on price and quantity. The need for broader consensus would bring ordinary life as we know it to a grinding halt; government would have to step in with coercive power in order to break the stalemate.

When everybody wears their politics, religion and sexual orientation on their sleeves, it makes life unpleasant, worrisome and exhausting. Shouldering chips weighs us down and invites conflict. This is the real source of the “polarization” complained of far and wide, not the relatively trivial differences between Republicans and Democrats. (The two parties are in firm agreement on the desirability of big government; they disagree vehemently only on who will run the show.)

Intellectuals wrongly assumed that the anonymity fostered by the Rule of Law reflected irreconcilable contradictions within society that would eventually cause violence like the Stonewall riots in 1969. The truth was that the Rule of Law reconciled contradictory views of individuals and allowed peaceful social change to occur gradually. Homosexuals were able to live, work and achieve outside of the glare of the public spotlight. It slowly dawned on the American public, at first subliminally and then consciously, that homosexuals were successfully contributing to every segment of American life. The achievements pointed to with pride today by homosexual activists were possible only because the Rule of Law facilitated this gradual, peaceful process. They were not caused by self-righteous activists and an all-powerful government bitch-slapping an ignorant, recalcitrant public into submission.

Subjugating Florists: A Pyrrhic Victory

Free competitive markets cash the checks written by the Rule of Law. Homosexuals have lived and prospered within those free-market boundaries, mirroring the tradition of Jews, blacks and other stigmatized minority groups. For centuries, homosexuals have faced ostracism and even death in various societies around the world. That remains true in certain countries even now. While it is true that homosexuals were formerly treated cruelly in America, it is also true that their cultural, economic and political gains here have been remarkably rapid by historical standards. Historical memory, rather than etiquette, should counsel against trashing the free-market institutions that have midwifed that progress.

Violating the Rule of Law in exchange for the power to compel service by businesses would be far worse than a display of bad manners. It would be the worst kind of tradeoff for homosexuals, gaining a temporary political and public-relations triumph at the expense of long-run economic stability.

Of course, homosexual activists are hardly the first or the only ones grasping at the levers of government power. The history of 20th-century America is dominated by such attempts, emanating at first from the political Left but now from the Right as well. It is grimly amusing to recall that early efforts along these lines were hailed by political scientists as encouraging examples of “pluralism” and “inclusiveness” – they were supposed to be signs that the downtrodden and marginalized were now participating in the political process. Today, everybody and his brother-in-law are trying to work local, state or federal government for an edge or a subsidy. Nobody can pretend now that this is anything but the unmistakable indicator of societal disintegration and decay.

Heretofore, the visible traits of democracy – representative government, elections, checks and balances – have been considered both necessary and sufficient to guarantee freedom. The falsity of that presumption is now dawning upon us with the appreciation of democratic absolutism as an impending reality. Subjugating florists may provide the homosexual movement with the thrills of political blood sport but any victories won will prove Pyrrhic.

DRI-267 for week of 10-27-13: ObamaCare and the Point of No Return

An Access Advertising EconBrief:

ObamaCare and the Point of No Return

The rollout of ObamaCare – long-awaited by its friends, long-dreaded by its foes – took place at the beginning of this month. In this case, the term “rollout” is apropos, since the program is not exactly up on its feet. Tuesday, Oct. 1, 2013 marked the debut of HealthCare.gov, the ObamaCare website, where prospective customers of the program’s health-insurance exchanges go to apply for coverage. By comparison, Facebook’s IPO was a rip-roaring success.

A diary of highlights seems like the best way to do justice to this fiasco. We are indebted to the Heritage Foundation for the chronology and many of the specific details that follow.

Tuesday, Oct. 1, 2013: This is ribbon-cutting day for the website, through which ObamaCare’s state health-insurance exchanges expect to do most of their business. One of the most fundamental reforms sought by free-market economists is the geographic market integration of health care in the U.S. Historically, each state has its own laws and regulatory apparatus governing insurance. This hamstrings competition. It requires companies to deal with 50 different bureaucracies in order to compete nationally and limits consumers solely to companies offering policies in their state. But ObamaCare is dedicated to the proposition that health care of, by and for government shall not perish from the earth, so it not only perpetuates but complicates this setup by interposing the artificial creation of a health-care exchange for each state, operating under a federal aegis.

Only 36 of those state exchanges open for business on time today, however. Last-minute rehearsals have warned of impending chaos, and frantic responses have produced delays. Sure enough, as the day wears on, 47 states eventually report applicant complaints of “frequent error messages.” Despite massive volume on the ObamaCare site, there is almost no evidence of actual completed applications.

Wednesday, Oct. 2, 2013: The Los Angeles Times revises yesterday’s report of 5 million “hits” on HealthCare.gov from applicants in California downward just a wee bit, to 645,000. But there is still no definitive word on actual completed applications, leading some observers to wonder whether there are any.

Thursday, Oct. 3, 2013: The scarcity of actual purchasers of health insurance on the ObamaCare exchanges leads a Washington Post reporter to compare them in print to unicorns. More serious, though, are the growing reports of thousands of policy cancellations suffered by Americans across the nation. The culprit is ObamaCare itself; victims’ current coverage doesn’t meet new ObamaCare guidelines on matters such as openness to pre-existing conditions. Ordinarily, a significant pre-existing health condition would preclude coverage or rate a high premium. In other words, writing policies that ignore pre-existing conditions is not insurance in the true, classical sense; insurance substitutes cost for risk and the former must be an increasing function of the latter in order for the process to make any sense. ObamaCare is not really about insurance, despite its protestations to the contrary.

Friday, Oct. 4, 2013: CNBC estimates that only 1% of website applicants can proceed fully to completion and obtain a policy online because the system cannot generate sufficient valid information to process the others. A few states – notably Kentucky – have reported thousands of successful policies issued, but the vast bulk of these now appear to be Medicaid enrollees rather than health-insurance policyholders. Meanwhile, the Department of Health and Human Services (HHS) announces that its website will be offline for repairs and upgrading.

Saturday, Oct. 5, 2013: In an interview with Fox News, Treasury Secretary Jack Lew refuses to cite a figure for completed applications on the HealthCare.gov website. Among the few who have successfully braved the process, premiums seem dramatically higher than those previously paid. One example was a current policyholder whose monthly premium of $228 ballooned to $1,208 under the new ObamaCare health-care exchange policy.

Monday, Oct. 7, 2013: Dissatisfaction with the process of website enrollment is now so general that application via filling out paper forms has become the method of choice. It is highly ironic that well into the 21st century, a political administration touting its technological progressivity has fallen back on the tools of the 19th century to advance its signature legislative achievement.

Official Reaction

This diary of the reception to ObamaCare conveys the impression of a public that is more than sullen in its initial reaction to the program – it is downright mutinous. It was hardly surprising, then, that President Obama chose to respond to public complaints by holding a press conference in the White House Rose Garden three weeks after rollout.

Mr. Obama’s attitude can best be described as “What’s the problem?” His tone combined the unique Obama blend of hauteur and familiarity. The Affordable Care Act, he insisted, was “not just a website.” If people were having trouble accessing the website or completing the application process or making contact with an insurance company to discuss an actual plan – why, then, they could just call the government on the phone and “talk to somebody directly and they can walk you through the application process.” (How many of the President’s listeners hearkened back at this point to their previous soul-satisfying experiences on the phone with, let’s say, the IRS?) This would take about 25 minutes for an individual, Mr. Obama assured his viewers, and about 45 minutes for a family. He gave out a 1-800 number for his viewers to call. Reviews of the President’s performance noted his striking resemblance to infomercial pitchmen.

Sean Hannity was so inspired by the President’s call to action that he resolved to heed it. He called the toll-free number on-air during his AM-radio show. He spoke with a call-center employee who admitted that “we’re having a lot of glitches in the system.” She read the script that she had been given to use in dealing with disgruntled callers. Hannity thanked her and complimented her on her courtesy and honesty. She was fired the next day. Hannity declared he would compensate her for one year’s lost salary and vowed to set up a fund for callers who wanted to contribute in her behalf.

Health and Human Services Secretary Kathleen Sebelius was next up on the firing line. Cabinet officials were touring eight cities and selected regional sites to promote the program, and at her first stop, a community center in Austin, TX, Sebelius held a press conference to respond to public outrage over the glitches in the program.

On October 26, 2013, the Fox News website sported the headline: “Sebelius Suggests Republicans to Blame for ObamaCare Website Woes.” Had the Republican Party chosen the IT contractor responsible for setting up HealthCare.gov‘s website?

No. “Sebelius suggest[ed] that Republican efforts to delay and defund the law contributed to HealthCare.gov‘s glitch-ridden debut.” Really. How? Sebelius “conceded that there wasn’t enough testing done on the website, but added that her department had little flexibility to postpone the launch against the backdrop of Washington’s unforgiving politics. ‘In an ideal world, there would have been a lot more testing, but we did not have the luxury of that. And the law said the go-time was Oct. 1. And frankly, a political atmosphere where the majority party, at least in the House, was determined to stop this any way they possibly could…was not an ideal atmosphere.’”

It takes the listener a minute or so to catch a breath in the face of such effrontery. The Obama Administration had three years in which to prepare for the launch of the program. True, there were numerous changes to the law and to administrative procedures, but these were all made by the administration itself for policy reasons. The Democrat Party, not the Republican Party, is the majority party. The Republican Party – no, make that the Tea Party wing of the Republican Party – proposed a debt-limit settlement in which the individual mandate for insurance-policy ownership would be delayed. It was rejected by the Obama Administration. Ms. Sebelius is blaming the Republican Party for the fact that Democrats were rushed when the Republicans in fact offered the Democrats a delay that the Democrats refused.

Were Ms. Sebelius a high-level executive in charge of rolling out a new product, her performance to date would result in her dismissal. But when queried about the possibility of stepping down, she responded “The majority of people calling for me to resign, I would say, are people I don’t work for and who did not want this program to work in the first place.” Parsing this statement yields some very uncomfortable conclusions. Ms. Sebelius’s employer is not President Obama or his administration; it is the American people. Anybody calling for her resignation is also an American. But clearly she does not see it that way. Obviously, the people calling for her resignation are Republicans. And she does not see herself as working for Republicans. The question is: Who is she working for?

Two possibilities stand out. Possibility number one is that she is working for the Democrat Party. In other words, she sees the executive branch as a spoils system belonging to the political party in power. Her allegiance is owed to the source of her employment; namely, her party. Possibility number two is that she sees her allegiance as owed to President Obama, her nominal boss. This might be referred to as the corporatist (as opposed to corporate) view of government, in which government plays the role of corporation and there are no shareholders.

Neither one of these possible conceptions is compatible with republican democracy, in which ultimate authority resides with the voters. In this case, the voters are expressing vocal dissatisfaction and Ms. Sebelius is telling them to take a hike. In a free-market corporation, Ms. Sebelius would be the one unfolding her walking papers and map.

Whose Back is Against the Wall?

It is tempting to conclude that ObamaCare is the Waterloo that the right wing has been predicting and planning for President Obama ever since Election Day, 2008. And this does have a certain superficial plausibility. ObamaCare is this Administration’s signature policy achievement – indeed, practically its only one. There is no doubt that the Administration looks bad, even by the relaxed standards of performance it set during the last five years.

Unfortunately, this view of President Obama with his back against the wall, despairing and fearful, contemplating resignation or impeachment, simply won’t survive close scrutiny. It is shattered by a sober review of Barack Obama’s past utterances on the subject of health care.

A dedicated man of the Left, Barack Obama holds a progressive vision of health care in America that follows one guiding star: the single-payer system. That single payer is the federal government. Barack Obama and the progressive Left are irrevocably wedded to the concept of government ownership and control of health care, a la Great Britain’s National Health Service. In speeches and interviews going back to the beginning of his career, Obama has pledged allegiance to this flag and to the collective for which it stands, one organic unity under government, indivisible, with totalitarianism and social justice for all.

The fact that ObamaCare is now collapsing around our ears may be temporarily uncomfortable for the Obama Administration, but it is in no way incompatible with this overarching goal. Just the opposite, in fact. In order to get from where we are now to a health-care system completely owned and operated by the federal government, our private system of doctors, hospitals and insurance companies must be either subjugated, occupied or destroyed, respectively. That process has now started in earnest.

Oh, the Administration would rather that private medicine went gentle into that good night. It would have preferred killing private health insurance via euthanasia rather than brutal murder, for example. But the end is what matters, not the means.

Certainly the Administration would have preferred to maintain its hypnotic grip on the loyalty of the mainstream news media. Instead, the members of the broadcast corps are reacting to ObamaCare’s meltdown as they did upon first learning that they were not the product of immaculate conception. But this is merely a temporary dislocation, not a permanent loss. What will the news media do when the uproar dies down – change party affiliation?

For anybody still unconvinced about the long-run direction events will take, the Wednesday, October 30, 2013 lead editorial in The Wall Street Journal is the clincher.

“Americans are Losing Their Coverage by Political Design”

“For all of the Affordable Care Act’s technical problems,” the editors observe, “at least one part is working on schedule. The law is systematically dismantling the private insurance market, as its architects intended from the start.”

It took a little foresight to see this back when the law was up for passage. The original legislation included a passage insisting that it should not “be construed to require that an individual terminate coverage that existed as of March 23, 2010.” This “Preservation of Right to Maintain Existing Coverage” was the fig leaf shielding President Obama’s now-infamous declaration that “if you like your existing policy, you can keep it.” Yeah, right.

Beginning in June, 2010, HHS started generating new regulations that chipped away at this “promise.” Every change in policy, no matter how minor, became an excuse for terminating existing coverage at renewal time. This explains the fact that some 2 million Americans have received cancellation notices from their current insurers. Of course, the Obama Administration has adopted the unified stance that these cancellations are the “fault” of the insurance companies – which is a little like blaming your broken back on your neighbor because he jumped out of the way when you fell off your roof instead of standing under you to cushion your fall. Stray callers to AM radio can be heard maintaining that at least half of these cancellations will be reinstated with new policies at lower cost in the ObamaCare exchanges. If only those hot-headed Tea Partiers would stop dumping boxes of tea and behaving like pirates! Alas, a Rube Goldberg imitation of a market cannot replace the genuine article – with apologies to Mr. Goldberg, whose roundabout contraptions actually worked.

ObamaCare creates 10 types of legally defined medical benefits. They include general categories like hospitalization and prescription drugs. No policy that fails to meet the exact standards defined within the law can survive the ObamaCare review. It is widely estimated that about 80% of all individual plans, which cover 7% of the U.S. population under age 65, will fall victim to the ObamaCare scythe.
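
Taking those two widely cited percentages at face value (they are rough estimates, not precise counts), a back-of-the-envelope calculation puts the share of the under-65 population facing cancellation at roughly

\[
0.80 \times 0.07 \approx 0.056,
\]

or something on the order of 5 to 6 percent of Americans under age 65.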

The law is replete with the Orwellian rhetoric of progressive liberalism. HHS defines its purpose as the “offer [of] a small number of meaningful choices.” Uh…what about allowing individuals to gauge the tradeoff between price and quality of care that best suits their own preferences, incomes and particular medical circumstances? No, that would have “allowed extremely wide variation across plans in the benefits offered” and thus “would not have assured consumers that they would have coverage for basic benefits.” This is doublespeak for “we are restricting your range of choice for your own good, dummy.”

Liberals typically respond with a mixture of outrage and indignation when exposed as totalitarians. It is certainly true that they are not eradicating freedom of choice merely for the pure fun of it. They must create a fictitious product called “insurance” to serve a comparatively small population of people who cannot be served by true insurance – people with pre-existing conditions that make them uninsurable, or insurable only at very high premiums or with coverage exclusions. The exorbitant costs of serving this market through government require that the tail wag the dog – that the large number of young, healthy people pay ridiculously high premiums for a product they don’t want or need in order to balance the books on this absurd enterprise. (Formerly, governments simply borrowed the money to pay for such pay-as-you-go boondoggles, but the financial price tag on this modus operandi is now threatening to bring down European welfare states around the ears of their citizens – so this expedient is no longer viable.) In order to justify enrolling everybody and his brother-in-law in coverage, government has to standardize coverage by including just about every conceivable benefit and excluding practically nothing. After all, we’re forcing people to sign up, so we can’t very well turn around and deny them coverage for something the way a real, live insurance company would, can we?

It is well known that the bulk of all medical costs arise from treating the elderly. In a rational system, this would be no problem because people would save for their own old age and generate the real resources necessary to fund it. But the wrong turn in our system began in World War II, when the tax-free status of employer-provided health benefits encouraged the substitution of job-related health insurance for the wage increases that were proscribed by wartime government wage and price controls. The gradual dominance of third-party payment for health care meant that demand went through the roof, dragging health-care prices upward with it.

Now Generation X finds itself stuck with the mother of all tabs by the President whom it elected. The Gen X’ers are paying Social Security taxes to support their feckless parents and grandparents, who sat still for a Ponzi scheme and now want their children to make good. To add injury to injury, the kids are also stuck with gigantic prices for involuntary “insurance” they don’t want and can’t afford to support their elders, the uninsurables – and the incredibly costly government machinery to administer it all.

It’s just as the old-time leftist revolutionaries used to say: you can’t make an omelette without breaking eggs. Across the nation, we have heard the sound of eggs cracking for the last week.

The Point of No Return

The “point of no return” is a familiar principle in international aviation. It is the point beyond which it is closer to the final destination than to the point of origination, or the point beyond which it makes no sense to turn back. This is particularly applicable to trans-oceanic travel, where engine trouble or some other unexpected problem might make the fastest possible landing necessary.

In our case, the Obama Administration has kept this concept firmly in mind. By embroiling as many Americans as deeply as possible in the tentacles of government, President Obama intends to create a state of affairs in which – no matter how bad the current operation of ObamaCare may be – it will seem preferable to most Americans to go forward to a completely government-run system rather than “turn back the clock” to a free-market system.

A free-market system works because competition works. On the supply side of the market, eliminating state regulation of insurance would enable companies to expand across state borders and compete with each other. But this involves relying upon companies to serve consumers. And companies are the entities that just got through issuing all those cancellation notices. For millions of Americans today, the only disciplinary mechanism affecting companies is something called “government regulation” that forces them to do “the right thing” by bludgeoning them into submission. That is what regulatory agencies are doing right now – beating up on Wall Street firms and banks for causing the financial crisis of 2008 and ensuing Great Recession. The fact that this never seems to prevent the next crisis doesn’t seem to penetrate the public consciousness, for the only antidote for the failure of government regulation is more and stronger government regulation.

On the demand side of a free market, consumers scrutinize the products and services available at alternative prices and choose the ones they prefer the most. But consumers are not used to buying their own health care and vaguely feel that the idea is both dishonest and unfair. “Health care should be a right, not a privilege,” is the rallying cry of the left wing – as if proclaiming this state of affairs is tantamount to executing it. No such thing as a guaranteed right to goods and services can exist, since giving one person a political right to goods is the same thing as denying the right to others. In the financial sense, somebody must pay for the goods provided. In the real sense, virtually all goods are produced using resources that have alternative uses, so producing more of some goods always means producing fewer other goods.
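
The scarcity argument in the preceding paragraph can be written in a single line. Suppose, purely for illustration, that producing a unit of health care absorbs a units of society’s resources, a unit of all other goods absorbs b units, and the total endowment R is fixed:

\[
aH + bG = R \quad \Longrightarrow \quad \Delta G = -\frac{a}{b}\,\Delta H.
\]

Any increase in health care H must be paid for by a decrease in other goods G; declaring health care a “right” does nothing to loosen the constraint.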

This is not what the “health-care-should-be-a-right-not-a-privilege” proclaimers are talking about. Their idea is that we will give everybody more of this one thing – health care – and have everything else remain the same as it is now. That is a fantasy. But this fantasy is the prevailing mental state throughout much of the nation. One widely quoted comment by a bitterly disappointed victim of policy cancellation is revealing: “I was all for ObamaCare until I found out I was going to have to pay for it.” On right-wing talk radio, this remark is considered proof of public disillusion with President Obama. But note: The victim did not say: “I was all for ObamaCare until I found out what I was going to have to pay for it.” The distinction is vital. Today, a free lunch is considered only fitting and proper in health care. And the only free lunch to be had is the pseudo-free lunch offered by a government-run, single-payer system.

As it stands now, few if any Americans can recall what it was like to pay for their own health care. Few have experienced a true free market in medicine and health care. Thus, they will be taking the word of economists on faith that it would be preferable to a government-run system like the one in Great Britain. It is a tribute to the power of ideas that a commentator like Rush Limbaugh can make repeated references to individuals paying for their own care without generating a commercially fatal outpouring of outrage from his audience.

Grim as this depiction may seem, it accurately describes the dilemma we face.

DRI-312 for week of 9-29-13: Suppose They Gave a Government Shutdown and Nobody Cared?

An Access Advertising EconBrief:

Suppose They Gave a Government Shutdown and Nobody Cared?

Midnight on Tuesday, Oct. 1, 2013 is the deadline for the shutdown of the federal government. That is the start of the new federal fiscal year. The U.S. Constitution specifies that Congress must authorize spending by the executive branch. Strange as it seems to a country by now inured to executive and regulatory high-handedness, the government cannot legally initiate operations by writing checks on its own hook. Fiscal delinquency, delay and deceit have long been the hallmarks of Congressional action, so it seems only fitting that Congress has failed to agree on the spending authorizations that would get the federal government’s departments up and running in the new fiscal year. And this year’s calendar offers a special treat, since the Oct. 17 deadline for default looms on the horizon as the next bureaucratic drop-dead date for civilization as we know it.

Amid the breathless media countdown to Armageddon, a sober pause for introspection is in order. How big an emergency is the federal-government shutdown, really? What underlying significance does a government shutdown have? How did we reach this position? Has the underlying economic significance of our situation been correctly conveyed by commentators and news media?

OMG! The Federal Government is About to Shut Down! Oh, Wait, Time for Vacation…

The attitude of Congressional representatives toward the prospect of government shutdown might best be compared to that of college students facing final exams. The exam schedule is announced at the beginning of the semester; indeed, it is printed in the course catalog distributed at registration. The course syllabus carefully explains the importance of the final to the student’s grade. The student knows the format of the exam, its location and exact time of day.

So, having had nearly four months to prepare and all the advance warning anybody could ask for, students are well versed, confident and unruffled in the waning days of the semester, right? On the night before the exam, they spend a short time reviewing basic ideas before retiring to get a good night’s sleep, to arise refreshed and eager to meet their task on final-exam day, don’t they? And they pass the exam with flying colors?

No, students generally seek out any excuse to avoid studying the material – and excuses emerge in profusion. As time passes and the semester ages, the knowledge of their approaching fate weighs on students’ minds, producing a buildup of anxiety and kinetic energy in their bodies. This demands an outlet, and late semester is a popular time for beer busts and other recreational modes of escape. The waning days before the final exam are spent in frantic efforts to complete course work and accomplish several months’ worth of study in a few days. The culmination of this crash program arrives on execution eve, when the students cram as many isolated facts as possible into their brain cells, relying on short-term memory to pinch-hit for solid comprehension. The surprising success rate of this modus operandi is owed less to its inherent effectiveness than to the grade inflation that has overwhelmed higher education in recent decades.

Anybody who expected their Congressional representative to behave in a more mature, sensible fashion than a college underclassman has been bitterly disillusioned by experience. Consider this latest example of budget brinkmanship.

The end of the fiscal year is not a national secret. Congress has known all year it was coming. The issues dividing the two major parties were well-known from the first day; ObamaCare has been a dinosaur-sized bone of contention since its proposal and passage in 2010 and its shocking reaffirmation by the Supreme Court in 2012. There was ample time to resolve differences or remove the legislation as a political roadblock to process.

As the year wound down, it became increasingly clear that opponents were eyeball to eyeball, each waiting for the others to blink. Now it was August, with only two months left in which to stave off a shutdown. When the going gets tough, the tough… go on vacation – which was exactly what Congress did, for five weeks.

For the last week, leaders like Senate Majority Leader Harry Reid (D-Nevada) and Senator Ted Cruz (R-Texas) have suddenly come alive with frantic last-ditch efforts. Each side has crafted and passed proposals (in the Democrat-controlled Senate and the Republican House) that the other side has torpedoed. At this writing, we are down to the last-minute cramming… but it should not escape notice that Sen. Reid was not too appalled by the prospect of shutdown to leave town the weekend after superintending the defeat of Sen. Cruz’s proposal in the Senate.

A few of the more cynical commentators have observed that we have been down this road before without careening off the highway and down a mountainside into oblivion. One set of talking points refers to this as our third experience with actual shutdown, but this is far from true. In fact, the federal government has survived 19 previous shutdowns – 17 since 1977 alone, according to the Congressional Research Service. Most have lasted a few days; the most recent (and most famous) one, in 1995-96, lasted for 21 days. So much for the artificially contrived atmosphere of urgency surrounding this one, which has been another production of political theater brought to you by your national news media.

Is There a Point to the Shutdown? If So, What is It?

It should be obvious that the hype surrounding the shutdown is phony. Even if we make allowances for the timing coincidence of the fiscal-year dividing line and debt-ceiling deadline, the attention paid to the shutdown is out of all proportion to its real effects on American well-being. But even though the shutdown may be relatively innocuous in its effects, that does not make it a good idea. What does it accomplish – ostensibly or actually?

It goes without saying that detractors of the Tea Party and the Republican Congressional leadership foresee nothing good coming of the shutdown. Since these are the people who got America into the mess that now plagues it – or stood by while that happened – we can disregard their opinions.

If there is an overriding goal of those who drove the events leading to the shutdown, it is opposition to ObamaCare. This opposition has taken the form of attempts to “defund” it; that is, to deny the Obama Administration the money necessary to implement the program. Were this successful, the program would remain on the books de jure as law, but would be repealed de facto by the lack of funds to run it. The most direct means taken to achieve this end is by passing a spending authorization bill containing a rider that defunds the Affordable Care Act. The problem with this measure is that the President will never sign this bill; ObamaCare is the signature legislation of his presidency.

Plan B of the defunders has been to replace that defunding rider with one that delays implementation of ObamaCare provisions for individual citizens by one year. This is directly analogous to the delay instituted by the Obama administration itself for businesses; in effect, it simply gives private individuals the same one-year reprieve given to employers by the President himself. This measure has not only the virtue of symmetry but also those of fairness and logistical convenience. It is not clear why the law should be delayed for businesses but not for everybody else. The state exchanges that would enable individuals to acquire health insurance are not up and running anyway in several states, so the delay would give the system more time to iron out the kinks. And this delay is entirely legal, being instituted by Congress and submitted for Presidential signature; the business delay was a flagrantly illegal action imposed by Presidential fiat.

But President Obama is not about to agree to this compromise measure, either. He knows that the longer ObamaCare is postponed, the longer opposition will have to build and the longer its defects will have to become manifest. Once in place, a national system this massive and bureaucratic will be almost impossible to dislodge, if only because of the inertia that will set in. The President delayed applying its business provisions only out of dire necessity; everybody was so unprepared that imposition would have led to a complete fiasco. So Obama wants to get half of ObamaCare going while the going is good – or at least feasible.

Republican Speaker of the House John Boehner is already confronted by panic in the ranks. Republican representatives have scarcely faced the first unfriendly fire from the news media – accusing them of irresponsibly jeopardizing the welfare of the nation for their own petty political purposes – before bolting for cover. Boehner’s queries as he gives them the flat of his sword are: What is this all about if not standing on principle against the President’s program? If we can’t work for the repeal of a terrible law as soundly unpopular as ObamaCare, when will we ever oppose the President? If now isn’t the time to stand firm against policies that are spending the country into the ground and destroying the heritage of our children, when will a better time come around? Over 30 years ago, President Ronald Reagan asked: If not now, when? Well, we didn’t do it then. If we don’t reform the budget now, when?

And those are indeed the relevant questions. Most Republicans oppose ObamaCare, all right; they have enough political courage to stand up against a law when the polls proclaim it heavily unpopular. But ObamaCare is merely the tip of the spending iceberg; the entitlement programs lie jutting beneath the surface waiting to scuttle the most unsinkable of reformers. There is no sign of Republicans boarding icebreakers, kissing wives and children goodbye and signing on for the duration of the voyage to clear the sea lanes of entitlements.

And Now for Some Opinion Completely Different

Holman Jenkins of The Wall Street Journal is a commentator not noted for his sunny optimism. Nevertheless, his take on the federal-budget stalemate is decidedly more upbeat than others commonly bruited. “What if, 10 years ago, Greece had made itself a laughingstock, sacrificed its credibility, brought shame on itself – all phrases used against Washington this week – by shutting down its government because certain legislators saw ideological and electoral rewards to be gained from making a fuss over unsustainable spending? Greek TV hosts would have shouted ‘Athens is broken!'”

Instead, as Jenkins knows only too well, Greece went sleepily on its corrupt, lazy, insouciant way, only to collapse in a heap nearly a decade later. Meanwhile, Americans today “all shake a fist at Washington, denouncing its irresponsibility because politicians are ‘playing politics’ with the debt ceiling and government shutdown.” But “then again, politics is how we govern ourselves. It’s better than despotism not because each moment is a model of stately order and reason, [but] because America is a diverse, fractious society. The only way it works is by the endless grinding out of political compromises amid shrieking and making threats and turning blue.”

Jenkins anticipates the typical facile rejoinder. “The usual suspects at this point will be stamping their feet and insisting the U.S. isn’t Greece, as if this is an insight. No country can borrow and spend infinite amounts of money, and no political system is immune to the incentive to keep trying anyway. Herein lies the real point that applies as much to Washington as to Athens.”

“It would be nice if today’s fight were genuinely about the future. Oh, wait, that’s exactly what the ObamaCare fight is about. By trying to stop a brand new entitlement before it gets started, in a country already palpably and indisputably committed to more entitlement spending than it wants to pay for, those radical House Republicans aren’t trying to chop current spending amid a sluggish recovery (however much one begins to doubt that pump-priming from Washington is the solution the economy needs). Those terrible House Republicans aren’t trying to force colleagues to commit painful votes to take away established goodies from established voting blocs – votes that neither Republicans nor Democrats have the slightest yearning to cast.”

“Those disgraceful House Republicans have made the fight exactly about the long term. Where’s the grudging approval from our Keynesian friends who constantly say immediate spending must be protected and reform saved for the long term?” Again, Jenkins knows full well that Keynesian economics is no longer a putatively consistent set of theoretical propositions; it is now a policy admonition in search of a theory and for sale to any political sponsor willing to fork over lucrative, visible jobs to Keynesian economists.

“Not only will there be more such shutdowns,” Jenkins predicts. “What passes for progress each time will be tiny – until it’s not. The 2011 sequester, which caused critics to engage in choruses of disapproval and the S&P to downgrade U.S. debt, set us on a path to today’s modestly smaller current deficits. This week’s peculiar fight may be resolved by a near-meaningless repeal of ObamaCare’s self-defeating medical-device tax – a teensy if desirable adjustment, having no bearing on the deficit tsunami that begins when the baby boomers start demanding their benefits.”

Jenkins’s peroration combines elements of Churchill and Pericles. “We are at the beginning of the beginning. Yet the birth pangs of entitlement reform that will one day inspire the world (as we did with tax reform in the ’80s) may be what we’re witnessing.” Hence the title of the column: “Behind the Noise, Entitlement Reform.”

Too Little, Too Late

Holman Jenkins’s vision is seductive, but unconvincing. Its visceral appeal lies in its pragmatism and its familiarity. Pragmatism is the great American virtue. We have grown up learning to accept and adapt to incremental change. Surely the changes necessary to cope with the downsizing of the welfare state will be just one more set of adjustments – painful but bearable. How many times have we heard Cassandras prophesy doom? How many times has it appeared? This is apparently the comforting set of rationalizations that insulates us from the truth of our situation.

Unfortunately, there is no reason to believe that a long series of small changes will be both timely and sufficient to our purpose. Not only have the Obama Administration’s fiscal policies shifted the velocity of fiscal decline to warp speed, but its monetary policies have changed our major problem from financial crisis to monetary collapse. Financial crisis is something that both individual countries and world systems recover from. Monetary collapse can lead to the destruction of a nation’s economy and the end of the civil order. We not only have to change our ways, we must reverse course by 180 degrees. And do it quickly.

It seems that Jenkins envisions entitlement reform from the austere perspective of an actuary contemplating the future of a program like Social Security. Tut-tut, the actuary admonishes, this program will be bankrupt in another 20 years or so. Well, in 20 years, a solid plurality of Wall Street Journal readers will be dead or near death. Quite a few will be financially independent of the program. Most of the rest view a date 20 years hence as impossibly remote – ample time to recover from the financial improvidence of youth.

But the real crisis on our trip planner is not actuarial. Social Security will bite long before it actually becomes insolvent, because its unfunded status will be factored into the calculations of bond traders, credit-rating agencies and interest-rate setters. We may or may not be at Jenkins’s “beginning of the beginning,” but we are certainly not standing in the starting blocks in terms of debt. Government at every level is in hock up to its hairline. The private sector has been making a valiant effort to deleverage – and for its pains has caught hell from Keynesian economists who lecture us about the evils of saving in the middle of a recession. (Just prior to the Great Recession, the same people were decrying our “consumption binge” and running public-service ads begging us to save more.) American banks have trillions loitering in excess reserves, just spoiling for the chance to torch the value of the dollar at home and abroad. Foreign holders of dollars and dollar-denominated assets are dying for a convenient chance to unload them. Business forecasters need binoculars to view the upside potential of U.S. interest rates. And when interest rates skyrocket, interest payments on the federal debt will crowd out practically everything else on the docket, making the budget wars of today look like Sunday-school theology debates. The endpoint of this process is monetary collapse, when the U.S. dollar is abandoned as a medium of exchange, unit of account and store of value.
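To make the crowding-out arithmetic concrete, here is a back-of-the-envelope sketch. The debt figure of roughly $17 trillion approximates the gross federal debt at this writing; the two interest rates are purely illustrative round numbers, not projections:

\[
\text{annual interest bill} \;=\; \text{debt outstanding} \times \text{average interest rate}
\]
\[
\$17\ \text{trillion} \times 2\% \;\approx\; \$340\ \text{billion}, \qquad
\$17\ \text{trillion} \times 6\% \;\approx\; \$1.02\ \text{trillion}.
\]

A rise in the government’s average borrowing cost from 2% to 6% would roughly triple the annual interest bill to more than a trillion dollars, a claim that must be paid ahead of virtually everything else on the docket.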

Oh, and just in case the foregoing doesn’t fill you with a sense of dread, there’s the little matter of international “currency war” to ponder. During the Great Depression, many nations used monetary expansion to deliberately trash the value of their own currencies. Their aim was to make their goods look cheap to foreigners, thereby hiking their number and value of their exports and increasing employment in their export industries. Since their depreciated domestic currency would also buy fewer imports, this would supposedly encourage the citizenry to buy fewer goods from abroad, thereby increasing domestic employment in import-competing industries. This game plan is well-known to economic historians as the “beggar thy neighbor” strategy. Its inherent flaw is that it can work if, and only if, only one nation employs it. When all or most nations do it simultaneously, the effects cancel each other out in the currency market and the only result is that international trade evaporates – which is roughly what did happen during the Depression. Since international trade is a good thing which makes practically everybody better off, its virtual elimination was a disaster for everybody. And guess what? The rest of the world, watching Ben Bernanke and the Fed at work creating money like there’s no tomorrow, may well suspect the U.S. of trying just this tactic. Whether they’re right or wrong is beside the point, since it is their belief that will determine whether they retaliate by starting a trade war that mimics the devastation of the 1930s.
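The cancellation argument can be reduced to one stylized equation. Let $p_A$ and $p_B$ stand for the values of country A’s and country B’s currencies in terms of some common reference (gold, in the 1930s), so that the bilateral exchange rate is $e = p_A/p_B$, and let $d_A$ and $d_B$ be the fractions by which each country manages to depreciate its currency. These symbols are introduced here purely for illustration, and everything else that moves exchange rates is ignored:

\[
e_{\text{new}} \;=\; \frac{p_A\,(1-d_A)}{p_B\,(1-d_B)} \;=\; e
\quad \text{whenever } d_A = d_B .
\]

Country A gains a price advantage only if it out-depreciates country B, that is, only if $d_A > d_B$. When every country devalues in step, bilateral exchange rates and competitive positions end up exactly where they started, and the only lasting results are the side effects described above: retaliation and evaporating trade.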

It is barely possible that Congress might embark on a program of haphazard, gradual deficit reduction a la Jenkins. But a thoroughgoing reform of the process is not in the cards. Thus, the danger is not a collapse caused by a government shutdown. The danger is a collapse caused by the failure to shut the government down. It is government at all levels that has turned itself into a machine for spending citizens’ money to benefit employees without providing substantial benefits to the citizens. Since there is virtually no competition for government services, there is little or no way to gauge whether any government good or service is worth what we have to pay to get it. So government just keeps rolling along, like Old Man River, carrying us all along with the flow.

The mass delusion afflicting America is cognitive dissonance. Most of us agree with Jenkins that no government can increase spending indefinitely. Yet we do not admit that this requires our government to actually cut spending for the purposes that (we believe) benefit us – or, at least, we do not admit this necessity in our lifetime. The same people who normally consider government to be intrusive, inept and unproductive magically reverse their position 180 degrees and assume that government is efficient and productive when pursuing their pet project, benefitting them and saving the world from their latest hobgoblin. This is the politico-economic equivalent of William Saroyan’s lament that everybody had to die but he had always assumed that an exception would be made in his case.

The dissonance is actually three-sided. We fail to recognize not only its quantitative dimension but also its qualitative side – government’s utter failure to solve problems and produce things of value. Thus, the real entrenched constituency for big government is not its ostensible beneficiaries – the poor, downtrodden, minorities and such. It is the bureaucrats and their minions, who collect paychecks but whose real net contribution to the social product is negative.

Until this dissonance is dispelled, it is idle to blame politicians for acting true to form.