DRI-284 for week of 8-10-14: All Sides Go Off Half-Cocked in the Ferguson, MO Shooting

An Access Advertising EconBrief:

All Sides Go Off Half-Cocked in the Ferguson, MO Shooting

By now most of America must wonder secretly whether the door to race relations is marked “Abandon all hope, ye who enter here.” Blacks – mostly teenagers and young adults, except for those caught in the crossfire – are shot dead every day throughout the country by other blacks in private quarrels, drug deals gone bad and various attempted crimes. Murder is the leading cause of death for young black males in America. We are inured to this. But the relative exception of a black youth killed by a white man causes all hell to break loose – purely on the basis of the racial identities of the principals.

The latest chilling proof of this racial theorem comes from Ferguson, MO, the St. Louis suburb where a policeman shot and killed an unarmed 18-year-old black man on Saturday. The fact that the shooter is a policeman reinforces the need for careful investigation and unflinching analysis of the issues involved. The constant intrusion of racial identity is a mountainous obstacle to this process.

The Two Sides to the Story, As Originally Told

The shooting occurred on Saturday afternoon, August 9, 2014, in Ferguson, MO, where 14,000 of the 21,000 inhabitants are black and 50 of the town's 53 police officers are white. The two sides of the story are summarized in an Associated Press story under the byline of Jim Suhr, carried on MSN News 08/13/2014. “Police have said the shooting happened after an [then-unnamed] officer encountered 18-year-old Michael Brown and another man on the street. They say one of the men pushed the officer into his squad car, then physically assaulted him in the vehicle and struggled with the officer over the officer’s weapon. At least one shot was fired inside the car. The struggle then spilled onto the street, where Brown was shot multiple times. In their initial news conference about the shooting, police didn’t specify whether Brown was the person who scuffled with the officer in the car and have refused to clarify their account.”

“[Ferguson Police Chief Thomas] Jackson said Wednesday that the officer involved sustained swelling facial injuries.”

“Dorian Johnson, who says he was with Brown when the shooting happened, has told a much different story. He has told media outlets that the officer ordered them out of the street, then tried to open his door so close to the men that it ‘ricocheted’ back, apparently upsetting the officer. Johnson says the officer grabbed his friend’s neck, then tried to pull him into the car before brandishing his weapon and firing. He says Brown started to run and the officer pursued him, firing multiple times. Johnson and another witness both say Brown was on the street with his hands raised when the officer fired at him repeatedly.”

The Reaction by Local Blacks: Protests and Violence

When a white citizen is shot by police under questionable circumstances – an occurrence that is happening with disturbing frequency – the incident is not ignored. But the consequent public alarm is subdued and contained within prescribed channels. Newspapers editorialize. Public figures express concern. Private citizens protest by writing or proclaiming their discontent.

The stylized reaction to a white-on-black incident like the one in Ferguson is quite different. Ever since the civil-rights era that began in the 1950s, these incidents have been treated as presumptive civil-rights violations; that is, as crimes committed because the victim was black. Black “leaders” bemoan the continuing victim status of blacks, viewing the incident as more proof of same – the latest in an ongoing, presumably never-ending, saga of brutalization of blacks by whites. “Some civil-rights leaders have drawn comparisons between Brown’s death and that of 17-year-old Trayvon Martin.”

Rank-and-file blacks gather and march in protest, holding placards and chanting slogans tailored to the occasion. “Some protestors… raised their arms above their heads as they faced the police… The most popular chant has been ‘Hands up! Don’t shoot!'”

Most striking of all is the contrast presented by headlines like “Protests Turn Violent in St. Louis Suburb.” There is no non-black analogue to behavior like this: “Protests in the St. Louis suburb turned violent Wednesday night, with people lobbing Molotov cocktails at police, who responded with smoke bombs and tear gas to disperse the crowd.” This is a repetition of behavior begun in the 1960s, when massive riots set the urban ghettos of Harlem, Philadelphia and Detroit afire.

Joseph Epstein Weighs In

The critic and essayist Joseph Epstein belongs on the short list of the most trenchant thinkers and writers in the English language. His pellucid prose has illumined subjects ranging from American education to gossip to political correctness to Fred Astaire. The utter intractability of race in America is demonstrated irrefutably by the fact that the subject reduced Epstein to feeble pastiche.

In his Wall Street Journal op-ed “What’s Missing in Ferguson, MO.” (Wednesday, August 13, 2014), Epstein notes the stylized character of the episode: “…the inconsolable mother, the testimony of the dead teenager’s friends to his innocence, the aunts and cousins chiming in, the police chief’s promise of a thorough investigation… The same lawyer who represented the [Trayvon] Martin family, it was announced, is going to take this case.”

But according to Epstein, the big problem is that it isn’t stylized enough. “Missing… was the calming voice of a national civil-rights leader of the kind that was so impressive during the 1950s and ’60s. In those days there was Martin Luther King Jr…. Roy Wilkins… Whitney Young… Bayard Rustin…. – all solid, serious men, each impressive in different ways, who through dignified forbearance and strategic action, brought down a body of unequivocally immoral laws aimed at America’s black population.”

But they are long dead. “None has been replaced by men of anywhere near the same caliber. In their place today there is only Jesse Jackson and Al Sharpton… One of the small accomplishments of President Obama has been to keep both of these men from becoming associated with the White House.” Today, the overriding problem facing blacks is that “no black leader has come forth to set out a program for progress for the substantial part of the black population that has remained for generations in the slough of poverty, crime and despair.”

Wait just a minute here. What about President Obama? He is, after all, a black man himself. That was ostensibly the great, momentous breakthrough of his election – the elevation of a black man to the Presidency of the United States. This was supposed to break the racial logjam once and for all. If a black man occupying the Presidency couldn’t lead the black underclass to the Promised Land, who could?

No, according to Epstein, it turns out that “President Obama, as leader of all the people, is not well positioned for the job of leading the black population that finds itself mired in despond.” Oh. Why not? “Someone is needed who commands the respect of his or her people, and the admiration of that vast – I would argue preponderate [sic] – number of middle-class whites who understand that progress for blacks means progress for the entire country.”

To be sure, Epstein appreciates the surrealism of the status quo. “In Chicago, where I live, much of the murder and crime… is black-on-black, and cannot be chalked up to racism, except secondarily by blaming that old hobgoblin, ‘the system.’ People march with signs reading ‘Stop the Killing,’ but everyone knows that the marching and the signs and the sweet sentiments of local clergy aren’t likely to change anything. Better education… a longer school day… more and better jobs… get the guns off the street… the absence of [black] fathers – … the old dead analyses, the pretty panaceas, are paraded. Yet nothing new is up for discussion… when Bill Cosby, Thomas Sowell or Shelby Steele… have dared to speak up about the pathologies at work… these black figures are castigated.”

The Dead Hand of “Civil Rights Movement” Thinking

When no less an eminence than Joseph Epstein sinks under the waves of cliché and outmoded rhetoric, it is a sign of rhetorical emergency: we need to burn away the deadwood of habitual thinking.

Epstein is caught in a time warp, still living out the decline and fall of Jim Crow. But that system is long gone, along with the men who destroyed it and those who desperately sought to preserve it. The Kings and Youngs and Wilkinses and Rustins are gone just as the Pattons and Rommels and Ridgways and MacArthurs and Montgomerys are gone. Leaders suit themselves to their times. Epstein is lamenting the fact that the generals of the last war are not around to fight this one.

Reflexively, Epstein hearkens back to the old days because they were days of triumph and progress. He is thinking about the Civil Rights Movement in exactly the same way that the political left thinks about World War II. What glorious days, when the federal government controlled every aspect of our lives and we had such a wonderful feeling of solidarity! Let’s recreate that feeling in peacetime! But those feelings were unique to wartime, when everybody subordinated their personal goals to the one common goal of winning the war. In peacetime, there is no such unitary goal because we all have our personal goals to fulfill. We may be willing to subordinate those goals temporarily to win a war, but nobody wants to live that way perpetually. And the mechanisms of big government – unwieldy agencies, price and wage controls, tight security controls, etc. – may suffice to win a war against other big governments but cannot achieve prosperity and freedom in a normal peacetime environment.

In the days of Civil Rights, blacks were a collective, a clan, a tribe. This made practical, logistical sense because the Jim Crow laws treated blacks as a unit. It was a successful strategic move to close ranks in solidarity and choose leaders to speak for all. In effect, blacks were forming a political cartel to counter the political setbacks they had been dealt. That is to say, they were bargaining with government as a unit and consenting to be assigned rights as a collective (a “minority”) rather than as free individuals. In social science terms, they were what F. A. Hayek called a “social whole,” whose constituent individual parts were obliterated and amalgamated into the opaque unitary aggregate. This dangerous strategy has since come back to haunt them by obscuring the reality of black individualism.

Consider Epstein’s position. Indian tribes once sent their chief – one who earned respect as an elder, religious leader or military captain, what anthropologists called a “big man” – to Washington for meetings with the Great White Father. Now, Epstein wants to restore the Civil Rights days when black leaders analogously spoke out for their tribal flock. Traditionally, the fate of individuals in aboriginal societies is governed largely by the wishes of the “big man” or leader, not by their own independent actions. This would be unthinkable for (say) whites; when was the last time you heard a call for a George Washington, Henry Ford or Bill Gates to lead the white underclass out of its malaise?

In fact, this kind of thinking was already anachronistic in Epstein’s Golden Age, the heyday of Civil Rights. Many blacks recognized the trap they were headed towards, but took the path of least resistance because it seemed the shortest route to killing off Jim Crow. Now we can see the pitiful result of this sort of collective thinking.

An 18-year-old black male is killed by a police officer under highly suspicious circumstances. Is the focus on criminal justice, on the veracity of the police account, on the evidence of a crime? Is the inherent danger of a monopoly bureaucracy investigating itself and exercising military powers over its constituency highlighted? Not at all.

Instead, the same old racial demons are summoned from the closet using the same ritual incantations. Local blacks quickly turn a candlelight protest vigil into a violent riot. Uh oh – it looks like the natives are getting restless; too much firewater at the vigil, probably. Joseph Epstein bemoans the lack of a chieftain who can speak for them. No, wait – the Great Black Father in Washington has come forward to chastise the violent and exalt the meek and the humble. His lieutenant Nixon has sent a black chief to comfort his brothers. (On Thursday, Missouri Governor Jay Nixon sent Missouri Highway Patrol Captain Ron Johnson, a black man, heading a delegation of troopers to take over security duties in Ferguson.) The natives are mollified; the savage breast is soothed. “All the police did was look at us and shoot tear gas. Now we’re being treated with respect,” a native exults happily. “Now it’s up to us to ride that feeling,” another concludes. “The scene [after the Missouri Highway Patrol took over] was almost festive, with people celebrating and honking horns.” The black chief intones majestically: “We’re here to serve and protect… not to instill fear.” All is peaceful again in the village.

Is this the response Joseph Epstein was calling for? No, this is the phony-baloney, feel-good pretense that he decried, the same methods he recognized from his hometown of Chicago, where they are now deployed by Obama confidant Rahm Emanuel. The restless natives got the attention they sought. Meanwhile, lost in the festive party atmosphere was the case of Michael Brown, which wasn’t nearly as important as the rioters’ egos that needed stroking.

But the Highway Patrol will go home and the St. Louis County Police will be back in charge and the Michael Brown case will have to be resolved. Some six days after the event, the police finally got around to revealing pertinent details of the case; namely, that Michael Brown was suspected of robbing a convenience store of $48.99 worth of boxed cigars earlier that day in a “strong-arm robbery.” Six-year veteran policeman Darren Wilson, now finally identified by authorities, was one of several officers dispatched to the scene.

Of course, the blacks in Ferguson, MO, and throughout America aren’t Indian tribesmen or rebellious children – they are nominally free American individuals with natural rights protected by the U.S. Constitution. But if they expect to be treated with respect 365 days a year they will have to stop acting like juvenile delinquents, stop delegating the protection of their rights to self-serving politicians and hustlers and start asserting the individuality they possess.

The irony of this particular case is that it affords them just that opportunity. But it demands that they shed what Epstein calls “the too-comfortable robes of victimhood.” And they will have to step out from behind the shield of the collective. The Michael Brown case is not important because “blacks” are affronted. It is important because Michael Brown was an individual American just like the whites who get shot down by police every year. If Dorian Johnson is telling the truth, Brown’s individual rights were violated just as surely whether he was black, white, yellow or chartreuse.

Policing in America Today – and the Michael Brown Case

For at least two decades, policing in America has followed two clearly discernible trends. The first of these is the deployment of paramilitary equipment, techniques and thinking. The second is a philosophy of placing the police officer’s well-being above all other considerations. Both of these trends place the welfare of police bureaucrats, employees and officers above that of the public they serve.

To an economist, this is a striking datum. Owners or managers of competitive firms cannot place their welfare above that of their customers; if they do, the firm will go bankrupt and cease to exist, depriving the owners of an asset (wealth) and real income and the managers of a job and real income. So what allows a police force (more specifically, the Chief of Police and his lieutenants) to do what a competitive firm cannot do? Answer: The police have a monopoly on the use of force to enforce the law. In the words of a well-known lawyer, the response to the generic question “Can the police do that?” is always “Sure they can. They have guns.”

All bureaucracies tend to be inefficient, even corrupt. But corporate bureaucracies must respond to the public and they must earn profits. So they cannot afford to ignore consumer demand. The only factor to which government bureaucracies respond is variation in their budgets, which is a function of political rather than economic variables.

All of these truths are on display in this case. The police have chosen to release only a limited, self-serving account of the incident. Their version of the facts is dubious, to say the least, although it could conceivably be correct. Their suppression of rioting protestors employed large, tank-like vehicles carrying officers armed with military gear, weapons and tear gas. Dorian Johnson’s account of the incident is redolent of the modern police philosophy of “self-protection first”: at the first hint of trouble, the officer’s focus is on downing anybody who might conceivably offer resistance, armed or not, dangerous or not.

What does all this have to do with the racial identities of the principals? Absolutely nothing. Oh, it’s barely possible that officer Wilson might have harbored some racial animosity toward Brown or blacks in general. But it’s really quite irrelevant, because white-on-black, white-on-white and black-on-white police incidents have cropped up from sea to shining sea in recent years. Indeed, this is an issue that should unite the races rather than divide them, since police are not reluctant to dispatch whites (or Hispanics or Asians, for that matter). While some observers claim the apparent increase in the frequency of these cases merely reflects the prevalence of cell phones and video cameras, this too is irrelevant; the fact that we may be noticing more abuses now would not be a reason to decry the new technology. As always, the pertinent question is whether or not an abuse of power took place. And those interested in the answer to that question – which should be every American – will have to contend with the unpromising prospect of a police department, a monopoly bureaucracy, investigating itself.

That is the very real national problem festering in Ferguson, MO – not a civil-rights problem, but a civil-wrongs problem.

The Battle Lines

Traditionally, ever since the left-wing counterculture demonized police as “pigs” in the 1960s, the right wing has reflexively supported the police and opposed those who criticized them. Indeed, some of the criticism of the police has been politically tendentious. But the right wing’s general stance is wrongheaded for two powerful reasons.

First, support for law enforcement as such has become progressively divorced from support for the Rule of Law. The number and scope of laws have become so large and excessive that genuine support for the Rule of Law would actually require opposition to much of the existing body of statutory law.

Second, the monopoly status of the police has enabled them to become so abusive that they now threaten everybody, not merely the politically powerless. Considering the general decrease in crime rates driven by demographic factors, it is an open question whether most people are more threatened by criminals or by abusive police.

Even a bastion of neo-conservatism like The Wall Street Journal is becoming restive at the rampant exercise of monopoly power by police. Consider these excerpts from the unsigned editorial, “The Ferguson Exception,” on Friday, August 15, 2014: “One irony of Ferguson is that liberals have discovered an exercise of government power that they don’t support. Plenary police powers are vast, and law enforcement holds a public trust to use them in proportion to the threats. The Ferguson police must prevent rioting and looting and protect their own safety, though it is reasonable to wonder when law enforcement became a paramilitary operation [emphasis added]. The sniper rifles, black armored convoys and waves of tear gas deployed across Ferguson neighborhoods are jarring in a free society…Police contracts also build in bureaucratic privileges that would never be extended to other suspects. The Ferguson police department has refused to… supply basic information about the circumstances and status of the investigation [that], if it hasn’t been botched already, might help cool passions… how is anyone supposed to draw a conclusion one way or the other without any knowledge of what happened that afternoon?”

The Tunnel… and the Crack of Light at the End

The pair of editorial reactions in The Wall Street Journal typifies the alternatives open to those caught in the toils of America’s racial strife. We can play the same loop over and over again in such august company as Joseph Epstein. Or we can dunk ourselves in ice water, wake up and smell the coffee – and find ourselves rubbing shoulders with the Journal editors.

DRI-258 for week of 8-3-14: ’10 Things You Shouldn’t Order in a Restaurant’ Are Really 11 Signs of Cultural Decay

An Access Advertising EconBrief: 

’10 Things You Shouldn’t Order in a Restaurant’ Are Really 11 Signs of Cultural Decay

Mainstream economics has become overwhelmingly quantitative. Economic theory is exclusively mathematical while empirical economics is highly statistical. Alas, this is faux quantification, since the theoretical mathematics is wrongly focused, badly used and too narrow to be of much practical use. Statistical economics, consisting of ever-more-complex regression and simulation models, is even more dubious. Leading economists have committed a succession of embarrassing faux pas, notably an obsessive misuse of the concept of “statistical significance.”

These errors have diverted attention away from cultural study. Economic consultants blithely assumed that they could convert the Soviet Union into a kind of blackboard version of a market economy by advising central planners. When this proved chimerical, the consultants collected their fees and went back to their figurative drawing boards full of models.

This space has often cited the observation by F.A. Hayek that free capitalist markets promote rational behavior rather than requiring it for successful operation. That is, by rewarding those who behave rationally, free markets tend to impart an evolutionary bias in favor of reason, much as the principle of natural selection tends to reward characteristics conducive to survival of a species.

It is time to ponder the reversibility of this relationship. We should ask whether the disparagement of capitalism and the suppression of free markets do not, in fact, tend to encourage human irrationality.

A recent article generated by the website answer.com and syndicated on the Internet provides a case in point.

10 Things [Really 11] That You Shouldn’t Order in a Restaurant

The article is a familiar sort of contemporary “insider investigative writing.” Somebody purporting to be in the know offers special knowledge to the hoi polloi as a gratuitous prophylactic gesture.

You, among millions of Americans every day, walk into a McDonald’s restaurant. You consider your order. To energize your system, you choose a McCafe…

No, you don’t. Because, according to the anonymous author of this article, “…the machines are routinely neglected and rarely cleaned at most McDonald’s restaurants. Most employees are not even trained on its cleaning and maintenance. All the McCafe beverages [are] run through the dirty machine.”

This is a representative specimen of the 10 Things You Shouldn’t Order In A Restaurant. In fact, there were actually 11 cautionary cases in the particular article sampled, not 10 – but that is probably the least important thing about the piece. Indeed, it is not even a defect; since when does a consumer object to getting more goods and services than promised? The phrase “particular article sampled” highlights what will soon emerge as an overpowering irony – answer.com is actually serving up previously prepared content, repackaged and recycled with new titles and spruced up with a fresh example or two. That is, it is giving its readers exactly what it accuses the ‘offending’ restaurants in this article of serving their customers.


For our purposes, it is best to group these cases according to the logical principle that unites them.

Restaurants Are Guilty of Implicit Dishonesty in Product “Labeling”

According to the author, restaurants are guilty of implicit dishonesty because they produce and present their products in ways that you would not approve of if you knew about them. Notice the use of “you” above rather than “we.” The difference between “you” and “we” is “me” or, in this case, the universal “he” represented by the author. He does know about it – or claims to. That is the tenor of the article: an insider explaining to the suckers how they are being taken for a ride. We might compare it to a con man gone straight, explaining the tricks of the trade as expiation for his sins. Readers are expected to suck in their breath and wince at each new revelation.

“Wendy’s Chili” is a prime example. “The meat in the chili is the leftover meat that dries up until there is enough for a batch of chili…they just add hot water to [unsold leftover chili] and mix it up.” Shocking! But compare that with the depredation represented by “Meatloaf.” “There is often more filler than meat in most meat loaf served in restaurants. They drown the meatloaf in sauce and seasonings, so you won’t notice the lack of taste and meat.” Who knew?

The “BBQ Sandwiches at KFC” example offers a truly stunning exposé. “The BBQ sandwich is made up from old chicken and soaked in BBQ sauce.” The news that BBQ sandwiches at Kentucky Fried Chicken are composed of chicken soaked in BBQ sauce – rather than, say, veal marinated in white wine – is a crushing blow. But at least we can hope the higher-quality menu items live up to their implicit billing. No, nothing escapes the author’s merciless gaze. Even the “Gourmet Burgers” are destined for infamy. “Many restaurants try and use one expensive ingredient in a burger, and then label it ‘gourmet’ with a significantly higher price tag. Most so-called expensive ingredients are overrated, used in extremely small amounts, or mixed with other ingredients making them cheaper to use.”

What about the specials? No good; “restaurant[s] usually put…extra ingredients about to expire…out as the next day’s special, disguised in a sauce. They use the sauce to hide the fact that the ingredients are… past their prime.” Well, we can surely patronize the “Best Sellers” and be assured of top-quality dishes of fresh vintage, right? Certainly not. “Many think that best sellers have high turnover. However, to keep up with demand, restaurants will pre-make the top sellers, leaving them to sit around and develop food borne illness. By ordering a less popular item, it is more likely to be made to order.”

Restaurants are Ripping You Off

Why are restaurants behaving like jerks? Do they have a focused dislike for you particularly? (Indeed, it would seem that the prior question should be “why are restaurants ‘behaving’ at all?” since bricks, mortar, stone and wood do not ordinarily assume human characteristics and perform human actions.) Although the author does not address this question directly, he does provide an indirect answer: Restaurants are trying to rip you off by providing poor value for price.

Take the outrageous case of “Ice Cream.” “…Unless it is made in house or is some sort of exotic ice cream, it is not worth it. You can go to any grocery store and buy the exact same thing they are serving you for ten times the price (sic).” Not content with a 1000% markup, the author doubles the ante. “Iceberg Lettuce” is “…one of the cheapest items that you can buy in a grocery store… ninety eight percent water and then marked up at least twenty times at a restaurant. Plus germs can hide in the cracks of the lettuce.”

Restaurants Are Criminally Negligent of their Customers’ Health

Speaking of germs, that brings us to a really serious matter. Restaurants aren’t merely trying to take your money. They’re trying to rob you of your health, too. It isn’t just that they don’t clean their machines. Why, they deliberately “leave” their biggest-selling items to “sit around and develop food borne illness!” (And you thought it was people who became ill, not foods.)

In “Hot Dogs at the BallPark,” the author slaughters a venerable American sacred cow – or perhaps a porcine metaphor might be more fitting. “[Hot dogs] are kept in water after [!] they are grilled. Any hot dogs left at the end of the day are put back in the refrigerator, and [sold subsequently].” Presumably, the bacteriological horror committed here is so ghastly that it must be left to the imagination.

We should note that the change in venue illustrates the recycled nature of the answer.com product. Since the “ball park” is hardly a restaurant, this example is obviously cribbed from another article such as “10 Things You Should Never Order Out.” Curiously, readers did not comment on this clear-cut violation of the article’s product “label.”

Reader Comments

And that brings us to reader comments, often the most interesting and enlightening aspect of these articles. This author was able to peruse only a representative sample of comments, but they proved as chillingly illuminating as a proverbial slap in the face.

With a few exceptions, readers were scathing in their condemnation of the author’s writing style (“he’s lucky to have a job”) and his impressionistic approach to his subject. Yet the comments were most memorable for what they did not say. They apparently failed to recognize the implicit absurdity of the diatribe against fast food in general. They missed the utter economic illogic running through the entire piece. Worst of all, they missed the ideological implications with which the article was pregnant.

The comment by one David Goodwin is particularly meaningful. After refuting several of the author’s specific examples – apparently on the basis of Goodwin’s personal experience in the restaurant business – Mr. Goodwin continues in this schizophrenic vein: “…Almost ALL fast food joints use already prepared food that the store just customizes. It mostly comes from Sysco Foods, PFG and US Food Service. Yes, everywhere you go uses PREPARED food from these companies. Even a 5-star restaurant I was sous chef in used one or more of these companies. If you want good wholesome food you will have to make it yourself. Capitalist America does NOT CARE ABOUT YOUR HEALTH. [italic emphasis added]”

Mr. Goodwin, like most readers, recognizes the factual inaccuracy of this article. Yet this does not stop him from embracing the author’s pejorative view of the restaurant industry in general or fast food in particular. To top off this contradictory evaluation, he blames the shortcomings of the status quo on capitalism – opening the door to the implication that “socialist America” could produce the “good wholesome food” that as of now can only be had through self-sufficiency.

The Truth About Restaurants and Fast Food

The truth about restaurants and fast food lies at the opposite pole from the rantings of the answer.com author. It is equally distant from the reactionary leftism of Mr. Goodwin.

It is unnecessary to employ statistical regression to verify that restaurants in general, and fast-food (or casual-service) restaurants in particular, constitute a ferociously competitive sector – arguably the most competitive in the U.S. economy. Entry into the business is easy. Franchising has given countless average Americans a paint-by-numbers entrée to the corporate business world that was heretofore unimaginable. But new restaurants fail in large numbers because the discipline and demands of the business are fearsome.

Each of the abuses complained of by the author could conceivably exist in the United States of America on a random, occasional basis. But the author clearly intends that we consider them universal, permanent features of the restaurant landscape, as witness his title “10 Things You Should Never Order in A[ny] Restaurant.” This is insanely ridiculous. Evaluated in this sense, his article is dreck.

Let us consider the charges and specifications brought by the author against restaurants and fast food. First comes the charge of dishonesty in product labeling. Contemplate the “phony meatloaf” charge leveled against restaurants that substitute “filler” for meat, then “drown the meatloaf in sauce and seasonings so you won’t notice the lack of taste and meat.” If there were laws of logic, the author would be charged with a capital crime for this passage. How can a dish “drown[ed] in sauce and seasonings” suffer the defect of “lack of taste?” But setting aside this mammoth contradiction, ponder the implication of the words “so you won’t notice.” Taking those words at face value, this restaurateur (or chef) has achieved a feat worthy of a James Beard culinary prize and a Nobel Prize in economics. He has substituted flavorings for meat and cheaper for more costly inputs while producing a final product that the consumer likes equally well. And we are supposed to complain about this feat of magic?

Of course, the truth is that the author was so blinded by his prejudice and inner need to prejudice the reader against restaurants that he lost the ability to write coherent prose. Examine the one case not yet discussed – “Anything With Steak or Beans at Taco Bell.” “The beans look like cat food when they come to the restaurant. Employees just add water and stir…When the steak sits too long it becomes like hair gel.” Here, the author has no crime to allege and is reduced to hurling insults – and not even at the final product but at the inputs. Do the beans taste bad? No, they “look like cat food” – a quixotic sort of insult since cat food looks just as appetizing as human foods like canned tuna and salmon. The author does not say whether the steak “becomes like hair gel” in texture, taste or appearance, leaving the reader to puzzle over what hair-gel experience the author is bringing to the table.

His overweening prejudice leads the author to criminalize food-preparation practices followed by every household and every reader of this article. Is there a household in America that does not take food uneaten during a meal and “put [it] back in the refrigerator,” to “come out again the next day” like the hot dogs at the ball park? Is there a reader of the article unfamiliar with the concept of leftovers?

Similarly, the author criminalizes acts of standard restaurant business practice that are followed by every household. What household chef has not had to “pre-make” dinner for a working spouse or children engaged in extra-curricular activities? If a precise estimate of demand is not available for the average household of 2.1 persons, why is it a crime for a restaurant unable to gauge its dinner demand precisely to pre-make food? Or to retain excess or unordered food? Could a household remain solvent if it followed the food-retention policies implicitly demanded by the author – e.g., discarding all “ingredients” still usable but “about to expire”? Indeed, could the author? Does he?

The acid test of fast food or any restaurant food is not what the author thinks its inputs look like or how it was put together or what the author thinks consumers think they are getting. It is not even what regulators think or what the health inspector thinks. The acid test is what consumers think about the food’s taste, its effect on their health and well-being and about their reaction to the dining experience – if any. It is superfluous to point out that consumers don’t need the author’s opinions to render their judgments. The fast-food industry alone accounts for $117 billion in annual sales in the U.S. The tacit premise of this article is that those consumers are idiots.

The usual effect of substituting filler – whatever that might be – for meat is a less palatable meatloaf. And the usual effect of that is that consumers stop buying the product. That is how markets work. Any restaurant that can reverse this presumption is rewarding consumers as well as itself; it should be lauded, not condemned. It is the restaurant’s job to use sauces to improve dishes, whether by “disguising” flavors or enhancing them. How well it succeeds is decided by consumers; that is how markets work. If restaurants follow practices that promote the spread of “food borne [sic] disease,” then consumers get sick and attribute their illness to their restaurant experience. When that happens, other consumers shun the offending restaurants in droves. Disease-spreading restaurants go out of business. That is how markets work.

And that is the most insidious effect of this answer.com article. Without ever admitting what he is up to – perhaps even without conscious intent – the author is nourishing the suspicion in the minds of his readers that markets don’t work. If we take him at face value, they don’t work because you – the consumer – are too bloody stupid. You can be taken in by those evil geniuses, the “restaurants,” who trick you with fillers and sauces and grillers and just plain hot water – boy, you’re dumb. Not like the author. He is smart. He is on to those evil restaurateurs.

If You’re So Smart, Why Aren’t You Rich?

The great economic historian Deirdre (formerly Donald) McCloskey would interrupt the author at this point with what she calls “the American question”: “If you’re so smart, why aren’t you rich?” Our author is a restaurant insider, wise in the ways and wiles of the industry. He knows what is good for consumers, who are being hosed by incumbent producers – according to him. Sellers are enjoying 1000% and 2000% markups on some items that can be had in groceries – or so he claims. He wants to help consumers; that is why he writes articles like this. But why stop at telling them their troubles? Why not actually help them, and make a fortune in the process? After all, a successful restaurant can make good money; that is the return for all those long, hard hours of work and the expertise needed to give diners what they want.

The author could go into partnership with Mr. Goodwin, who – unlike capitalist America – cares about the health of Americans. Their business plan is already written; their blueprint is clear. Mr. Goodwin knows that prepared food is not “good wholesome food.” (Notice that Mr. Goodwin levels two charges against the Syscos and US Foods of the world: their food is neither palatable nor safe to eat.) Mr. Goodwin, the former sous chef, also has restaurant experience. Why don’t these two join forces to show Taco Bell and Wendy’s and McDonald’s how it’s done? Since they are convinced that one of the world’s largest countries is crying out for something good and safe to eat, they presumably are assured of a market for their products. With industry expertise, a guaranteed market, a pricing umbrella spread out by corporate price leaders and yawning pricing margins just waiting to be exploited, they certainly shouldn’t have any trouble attracting investors for their venture. New restaurants still open up every day with poorer prospects than these.

The two share a gospel for restaurant practice: When in doubt, throw it out! Prepare no dish before its time! Make only products that consumers cannot make for themselves. “Gourmet” foods must not only taste exceptional, their cost must also be exceptional. And everything sold at low prices and no markups, just as it is in those gastronomic paradises, the socialist countries of the world.

And no long lines, of course. It wouldn’t be fast food otherwise, would it?

What are they waiting for?


Reader Reaction

The cultural inferences stemming from this article do not flow from its poverty of fact, logic, grammar, or punctuation. They are unrelated to its ideological tendentiousness, lamentable though that is. They derive from readers’ reactions to it.

As noted above, readers intuitively grasped the basic falsity of the article itself – its contradictions, its factual deficit, even its absurd implications. Yet despite this, they said hardly a good word about fast food or restaurants in general. And they showed no realization of the blessings bestowed upon them by free markets.


There is no doubt whatever that these blessings are real. Even ignoring longtime mainstays like McDonald’s, Kentucky Fried Chicken, Wendy’s and Taco Bell, America is embracing fast food with increasing fervor. Regional favorites are overtaking the industry leaders. The favorites are a mix of old and new. Some of them, like White Castle, Krystal and Friendly’s, date back to the 1920s and 30s. Others, like In-N-Out Burger, Whataburger, Blake’s Lotaburger and TacoTime, are relics of the 1940s, 50s and 60s. Bojangles’, Culver’s and Raising Cane’s are part of the newer generation. If we are to assume that this massive, widespread and growing embrace of fast food is some grotesque market failure, some huge bamboozle of consumers by “the big corporations” – penetrable only by wise guys who write badly like the answer.com author – then it has been ongoing for nearly a century, affecting four generations of American consumers.

During the rise of fast food, America has become a nation of restaurant diners. Unfortunately, the proliferation of subsidies – ultra-low interest rates midwifed by the Federal Reserve, loans and other subsidies at every level of government – has produced a restaurant bubble that has been deflating since before the Great Recession. But nobody can cogently deny that Americans can choose from a greater variety of restaurants serving more and better food than ever before. All this has coexisted with rising standards of living, rising life expectancy and falling incidence of disease.

Americans owe all of this to capitalism and free markets. (Not coincidentally, they owe much of it to relative freedom of immigration as well.) But the readers of the answer.com travesty were tongue-tied when it came time to defend the system. They were all-too-willing to be mute witnesses to the libel of the fast-food and restaurant industries.

Economic freedom is under assault. The outcome is in doubt. When the primary beneficiaries of the system approve of the attacks, or abstain rather than taking up arms, the battle is very nearly lost by default. This is happening today. Why do the beneficiaries of a process refuse to defend it? The gradual diminution of freedom in markets over the course of the twentieth century has produced successively less rational generations of Americans. This is not the same thing as being less educated or less intelligent, because economic reasoning is a specialized form of abstract logic. The answer.com article and reader comments show the effect of this across a range of the political spectrum, but most incisively in the mainstream. There have always been ill-conceived attacks from the political left, but over time we are losing both the will and the ability to answer those attacks cogently.

DRI-291 for week of 7-27-14: How to Debate Bill Moyers

An Access Advertising EconBrief:

How to Debate Bill Moyers

In the course of memorializing a fellow economist who died young, Milton Friedman observed that “we are all of us teachers.” He meant the word in more than the academic sense. Even those economists who live and work outside the academy are still required to inculcate economic fundamentals in their audience. The general public knows less about economics than a pig knows about Sunday – a metaphor justly borrowed from Harry Truman, whose opinion of economists was famously low.

Successful teachers quickly sense that their persuasive skills have been entered in a rhetorical contest against the students’ inborn resistance to learning. Economists face the added handicap that most people overrate their own understanding of the subject matter and are reluctant to jettison the emotional baggage that hinders their absorption of economic logic.

All this puts an economist behind the eight-ball as educator. But in public debate, economists usually find themselves frozen against the rail as well (to continue the analogy with pocket billiards). The most recent case of this competitive disadvantage was the appearance by Arthur C. Brooks, titular head of the conservative American Enterprise Institute (AEI), on the PBS interview program hosted by longtime network fixture Bill Moyers.

Brooks vs. Moyers: An Unequal Contest

At first blush, one might consider the pairing of Brooks – seasoned academic, Ph.D. and author of ten books – with Moyers – onetime divinity student and ordained minister who left the ministry for life in politics and journalism – to be an unequal contest. And so it was. Brooks spent the program figuratively groping for a handhold on his opponent while Moyers railed against Brooks with abandon. It seemed clear that each had different objectives. Moyers was insistent on painting conservatives (directly) and Brooks (indirectly) as insensitive, unfeeling and uncaring, while Brooks seemed content that he understood the defensive counterarguments he made in his own behalf, even if nobody else did.

Moyers never lost sight of the fact that he was performing to an audience whose emotional buttons he knew from memory and long experience. Brooks was speaking to a critic in his own head rather than playing to an alien house whose sympathies were presumptively hostile.

To watch with a rooting interest in Brooks’ side of the debate was to risk death from utter frustration. In this case, the only balm of Gilead lies in restaging Brooks’ reactions to Moyers’ sallies. This should amount to a debater’s handbook for economists in dealing with the populists of the hard political left wing.

Who is Bill Moyers?

It is important for any debater to know his opponent going into the debate. Moyers is careful to put up a front as an honest broker in ideas. Brooks’ appearance on Moyers’ show is headlined as “Arthur C. Brooks: The Conscience of a Compassionate Conservative,” as if to suggest that Moyers is giving equal time in good faith to an ideological opponent.

This is sham and pretense. Bill Moyers is a professional hack who has spent his whole life in the service of the political left wing. While in his teens, he became a political intern to Texas Senator Lyndon Johnson. After acquiring a B.A. degree in journalism from the University of Texas at Austin, Moyers got an M.A. from the Southwestern Baptist Theological Seminary in Fort Worth, Texas. After ordination, he forsook the ministry for a career in journalism and left-wing politics, two careers that have proved largely indistinguishable over the last few decades. He served in the Peace Corps from 1961-63 before joining the Johnson Administration, serving as LBJ’s Press Secretary from 1965-67. He performed various dirty tricks under Johnson’s direction, including spearheading an FBI investigation of Goldwater campaign aides to uncover usable dirt for the 1964 Presidential campaign. (Apparently, only one traffic violation and one illicit love affair were unearthed among the fifteen staffers.) A personal rift with Johnson led to his resignation in 1967. Moyers edited the Long Island publication Newsday for three years, then alternated between broadcast journalism (PBS, CBS, back to PBS) and documentary-film production until his elevation to the presidency of the Schuman Center for Media and Democracy in 1990. Now 80 years old, he occupies a position best described as “political-hack emeritus.”

With this resume under his belt, Moyers cannot maintain any pretense as an honest broker in ideas, his many awards and honorary degrees notwithstanding. After all, the work of America’s leading investigative reporters, James Steele and Donald Barlett, has been exposed in this space as shockingly inept and politically tendentious. Journalists are little more than political advocates and Bill Moyers has thrived in this climate.

In the 1954 movie Night People, Army military intelligence officer Gregory Peck enlightens American politician Broderick Crawford about the true nature of the East German Communists who have kidnapped Crawford’s son. “These are cannibals…bloodthirsty cannibals who are trying to eat us up,” Peck insists. That describes Bill Moyers and his ilk, who are among those aptly characterized by F.A. Hayek as the “totalitarians in our midst.”

This is the light in which Arthur Brooks should have viewed his debate with Bill Moyers. Unfortunately, Brooks seemed stuck in defensive mode. His emphasis on “conscience” and “compassion” seemed designed to stress that he had a conscience – why leave the inference that this was in doubt? – and that he was a compassionate conservative – as opposed to what other kind, exactly? Thus, he began by giving hostages to the enemy before even sitting down to debate.

Brooks spent the interview crouched in this posture of defense, thereby guaranteeing that he would lose the debate even if he won the argument.

Moyers’ Talking Points – and What Brooks Should Have Said

Moyers’ overall position can be summarized in terms of what the great black economist Thomas Sowell has called “volitional economics.” The people Moyers disapproves of – that is, right-wingers and owners of corporations – have bad intentions and are, ipso facto, responsible for the ills and bad outcomes of the world.

Moyers: “Workers at Target, McDonald’s and Wal-Mart need food stamps to survive…Wal-Mart pays their workers so little that their average worker depends on $4,000 per year in government subsidies.”

Brooks: “Well, we could pay them a higher minimum wage – then they would be unemployed and be completely on the public dole…”

Moyers: “Because the owners of Wal Mart would not want to pay them that higher minimum wage [emphasis added].”


WHAT BROOKS SHOULD HAVE SAID: “Wait a minute. Did you just say that the minimum wage causes higher unemployment because business owners don’t want to pay it? Is that right? [Don't go on until he agrees.] So if the business owners just went ahead and paid all their low-skilled employees the higher minimum wage instead of laying off some of them, everything would be fine, right? That’s what your position is? [Make him agree.]

Well, then – WHY DON’T YOU DO IT? WHY DON’T YOU – BILL MOYERS – GO BUY A MCDONALD’S FRANCHISE AND PAY EVERY LOW-SKILLED EMPLOYEE CURRENTLY MAKING THE MINIMUM WAGE AND EVERY NEW HIRE THE HIGHER MINIMUM WAGE YOU ADVOCATE. SHOW US ALL HOW IT’S DONE. DON’T JUST CLAIM THAT I’M WRONG – PROVE IT FOR ALL THE WORLD TO SEE. THEN YOU CAN HAVE THE LAUGH ON ME AND ALL MY RIGHT-WING FRIENDS.

[When he finishes sputtering:] You aren’t going to do it, are you? You certainly can’t claim that Bill Moyers doesn’t have the money to buy a franchise and hire a manager to run it. And you certainly can’t claim that the left-wing millionaires and billionaires of the world don’t have the money – not with Tom Steyer spending a hundred million dollars advertising climate change. The minimum wage has been in force since the 1930s and the left wing has been singing its praises for my whole life – but when push comes to shove the left-wing businessmen pay the same wages as the right-wing businessmen. Why? Because they don’t want to go broke, that’s why.

WHY IT IS IMPORTANT TO SAY THIS: The audience for Bill Moyers’ program consists mainly of people who agree with Bill Moyers; that is, of economic illiterates who do their reasoning with their gall bladders. It is useless to use formal economic logic on them because they are impervious to it. It is futile to cite studies on the minimum wage because the only studies they care about are the recent ones – dubious in the extreme – that claim to prove the minimum wage has only small adverse effects on employment.

The objective with these people is roughly the same as with Moyers himself: take them out of their comfort zone. There is no way they can fail to understand the idea of doing what Moyers himself advocates because it is what they themselves claim to want. All Brooks would be saying is: Put your money where your mouth is. This is the great all-purpose American rebuttal. And he would be challenging people known to have money, not the poor and downtrodden.

This is the most straightforward, concrete, down-to-earth argument. There is no way to counter it or reply to it. Instead of leaving Brooks at best even with Moyers in a “he-said, he-said” sort of swearing contest, it would have left him on top of the argument with his foot on Moyers’ throat, looking down. At most, Moyers could have limply responded with, “Well, I might just do that,” or some such evasion.

Moyers: “Just pay your workers more… [But] instead of paying a living wage… [owners] do stock buy-backs…”

Brooks: [ignores the opportunity].

WHAT BROOKS SHOULD HAVE SAID: “Did you just use the phrase ‘LIVING WAGE,’ Mr. Moyers? Would you please explain just exactly what a LIVING WAGE is? [From here on, the precise language will depend on the exact nature of his response, but the general rebuttal will follow the same pattern as below.] Is this LIVING WAGE a BIOLOGICAL LIVING WAGE? I mean, will workers DIE if they don’t receive it? But they don’t have it NOW, right? And they’re NOT dying, right? So the term as you use it HAS NOTHING TO DO WITH LIVING OR DYING, does it? It’s just a colorful term that you use because you hope it will persuade people to agree with you by getting them to feel sorry for workers, isn’t it?

There are over 170 countries in the world, Mr. Moyers. In almost all of those countries, low-skilled workers work for lower wages than they do here in the United States. Did you know that? In many countries, low-skilled workers earn the equivalent of less than $1,000 per year in U.S. dollars. In a few countries, they earn just a few hundred dollars worth of dollar-equivalent wages per year. PER YEAR, Mr. Moyers. For you to sit here and use the term “LIVING WAGE” for a wage THIRTY TO FIFTY TIMES HIGHER THAN THE WAGE THEY EARN IS POSITIVELY OBSCENE. Don’t you agree, MR. MOYERS? They don’t die either – BUT I BET THEY GET PRETTY HUNGRY SOMETIMES. What do you bet – MR. MOYERS?

WHY IT IS IMPORTANT TO SAY THIS: The phrase “living wage” has been a left-wing catch-phrase longer than most people today have been alive. Its use is “free” because users are never challenged to explain or defend it. It sounds good because it has a nice ring of urgency and necessity to it. But upon close examination it disintegrates like toilet tissue in a bowl. There is no such wage as a wage necessary to sustain life in the biological sense. For one thing, it would vary across a fairly wide range depending on various factors ranging from climate to gender to race to nutrition to prices to wealth to… well, the factors are numerous. It would also be a function of time. Occasionally, classical economists like David Ricardo and Karl Marx would broach the issue, but they never answered any of the basic questions; they just assumed them away in the time-honored manner of economists everywhere. For them, any concept of a living wage was purely theoretical or algebraic, not concrete or numerical. Today, for the left wing, the living wage is purely a polemical concept with zero concreteness. It is merely a club to beat the right wing with. It is without real-world significance or content.

Given this, it is madness to allow your debate opponent the use of this club. Take the club away from him and use it against him.
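
As an arithmetic aside, the “thirty to fifty times” multiple in the hypothetical speech above is easy to check. Below is a minimal sketch, assuming the 2014 U.S. federal minimum wage of $7.25 an hour and a standard 2,080-hour work year; the $300 and $500 annual wages are illustrative stand-ins for the “few hundred dollars” per year mentioned above, not measured data.

```python
# Rough check of the "thirty to fifty times" multiple, assuming the 2014
# U.S. federal minimum wage ($7.25/hour) and a 2,080-hour work year.
# The $300 and $500 annual wages are illustrative stand-ins for the
# "few hundred dollars" per year mentioned above, not measured data.

us_minimum_annual = 7.25 * 2080   # roughly $15,080 per year at the federal floor

for foreign_annual in (300, 500):
    multiple = us_minimum_annual / foreign_annual
    print(f"${foreign_annual}/year abroad -> U.S. minimum wage is about {multiple:.0f}x higher")

# Prints roughly 50x at $300/year and 30x at $500/year -- consistent with
# the "thirty to fifty times" range used in the speech.
```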

Bill Moyers: “Wal Mart, which earned $17 billion in profit last year…”

Arthur Brooks: [gives no sign of noticing or caring].

WHAT ARTHUR BROOKS SHOULD HAVE SAID: “You just said that Wal Mart earned $17 billion in profit last year. You did say that, didn’t you – I don’t want to be accused of misquoting you. Does that seem like a lot of money to you? [He will respond affirmatively.] Why? Is it a record of some kind? Did somebody tell you it was a lot of money? Or does it just sort of sound like a lot? I’m asking this because you seem to think that sum of money has a lot of significance, as though it were a crime, or a sin, or special in some way. You seem to think it justifies special notice on your part. You seem to think it justifies your demanding that Wal Mart pay higher wages to their workers than they’re doing now. And my question is: WHY? Unless my ears deceive me, you seem to be making these claims on the basis of the PURE SIZE of the amount. You think Wal Mart should “give” some of this money to its low-skilled workers – is that right? [He will agree enthusiastically.]

OK then. Here’s what I think: WHY DON’T YOU, MR. MOYERS? [He will pretend not to understand.] I MEAN EXACTLY WHAT I SAID. WHY DON’T YOU DO IT, MR. MOYERS, IF THAT’S WHAT YOU BELIEVE? [He will smile or laugh: “Because I’m not Wal-Mart, that’s why.”] BUT YOU ARE, MR. MOYERS. OR YOU CAN BE. ANYBODY CAN BE. FOR THAT MATTER, THOSE WAL-MART WORKERS WHOSE WELFARE YOU CLAIM TO CARE FOR SO MUCH CAN BE, TOO. ALL YOU HAVE TO DO IS BUY WAL-MART STOCK. IT TRADES PUBLICLY, YOU KNOW.

IF YOU THINK WAL-MART SHOULD GIVE ITS MONEY AWAY, THEN BUY WAL-MART STOCK, TAKE THE DIVIDENDS IT PAYS YOU AND GIVE THE MONEY AWAY TO WHEREVER YOU THINK IT SHOULD GO. AFTER ALL, ONCE YOU BUY WAL-MART STOCK…NOW YOU’RE WAL-MART. YOU OWN THE COMPANY. AT LEAST, YOU OWN A FRACTION OF IT, JUST LIKE ALL THE OTHER OWNERS OF WAL-MART DO. YOU WANT WAL-MART TO GIVE ITS PROFITS AWAY? OK, GIVE THEM AWAY YOURSELF. WHY SHOULD THE GOVERNMENT WASTE MILLIONS OF DOLLARS IN BUREAUCRATIC OVERHEAD ACCOMPLISHING SOMETHING THAT YOU CAN ACCOMPLISH CHEAPLY FOR THE COST OF A DISCOUNT BROKERAGE COMMISSION?

And you can deduct it from your income tax as a charitable contribution…MR. MOYERS.

As far as that’s concerned, as a matter of logic, if Wal-Mart’s workers really agree with you that Wal-Mart is scrooging away in profits the money that should go to them in wages, then the workers could do the same thing, couldn’t they? They could buy Wal-Mart’s stock and earn that share of the profit that you want the company to give them. It’s no good claiming they don’t have the money to do it because they’d not only be getting a share of these profits you say are so fabulous, they’d also be owning the company that you’re claiming is such a super profit machine that it’s got profits to burn – or give away. If what you say is really true, you should be screaming at Wal-Mart’s workers to buy shares instead of wasting time trying to get the government to take money away from Wal-Mart so some of it can trickle down to the workers.

Of course, that’s the catch. I don’t even know if YOU YOURSELF BELIEVE THE BALONEY YOU’VE BEEN SPREADING AROUND IN THIS INTERVIEW. I don’t think you even know the truth about all three of those companies that you claim are so flush with profits. To varying degrees, they’re actually in trouble, MR. MOYERS. It’s all in the financial press, MR. MOYERS – which you apparently haven’t read and don’t care to read. McDonald’s has had to reinvent itself to recover its sales. Wal-Mart is floundering. Target has lost touch with its core customers. And the $17 billion that seems like so much profit to you doesn’t constitute such a great rate of return when you spread it over the hundreds of thousands of individual Wal-Mart shareholders – as you’re about to find out when you take my advice to put your money where your great big mouth is – MR. MOYERS.

WHY IT IS IMPORTANT TO SAY THIS: The mainstream press has been minting headlines out of absolute corporate profits for decades. The most prominent victim of this has been the oil companies because they have been the biggest private companies in the world. Any competent economist knows that it is the rate of return that reveals true profitability, not the absolute size of profits. Incredibly, this fact has not permeated the public consciousness despite the popularity of 401(k) retirement-investment accounts.
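To make the distinction concrete, here is a minimal sketch. The $17 billion profit figure comes from the interview; the capital figures are hypothetical round numbers chosen for illustration, not Wal-Mart’s actual balance sheet:

```python
# Why rate of return, not absolute profit, measures profitability.
# The $17 billion profit figure comes from the interview; the capital
# figures are hypothetical round numbers, not Wal-Mart's actual books.

profit = 17e9                 # annual profit (from the interview)
modest_capital = 80e9         # assumed invested capital for a smaller firm
large_capital = 300e9         # assumed invested capital for a much larger firm

# The identical dollar profit implies very different profitability:
print(f"Return on $80B of capital:  {profit / modest_capital:.1%}")  # ~21.2%
print(f"Return on $300B of capital: {profit / large_capital:.1%}")   # ~5.7%
```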

Buying Wal-Mart stock is just another way of implementing the “put your money where your mouth is” strategy discussed earlier. If Bill Moyers’ view of the company were correct – which it isn’t, of course – buying the stock and giving away the dividends would make much more sense than redistributing money via government coercion.

The Goal of Debate

If you play poker and nobody ever calls your bluff, it will pay you to bluff on the slightest excuse. In debate, you have to call your debate opponent’s bluffs; otherwise, you will be bluffed down to your underwear even when your opponent isn’t holding any cards. Arthur Brooks was just as conservative in his debating style as in his ideology – he refused to call even Moyers’ most ridiculous bluffs. This guaranteed that the best outcome he could hope for was a draw even if his performance was otherwise flawless. It wasn’t, so he came off poorly.

Of course, he was never going to “win” the debate in the sense of persuading hard-core leftists to convert to a right-wing position. His job was to leave them shaken and uncomfortable by denying Bill Moyers the ease and comfort of taking his usual polemical stances without fear of challenge or rebuttal. This would have delighted the few right-wingers tuned in and put the left on notice that they were going to be bloodied when they tried their customary tactics in the future. In order to accomplish this, it was necessary to do two things. First, take the battle to Bill Moyers on his own level by forcing him to take his own advice, figuratively speaking. Second, clearly indicate by your contemptuous manner that you do not respect him and are not treating him as an intellectual equal and an honest broker of ideas. People react not only to what you say but to how you say it. If you respect your opponent, they will sense it and accord him that same respect. If you despise him, this will come through – as it should in this case. That is just as important as the intellectual part of the debate.

In a life-and-death struggle with cannibals, not getting eaten alive can pass for victory.

DRI-283 for week of 7-20-14: The Mentality of Emotional Entitlement: the Steve Bartman Incident Revisited

An Access Advertising EconBrief: 

The Mentality of Emotional Entitlement: the Steve Bartman Incident Revisited

The legacy of wisdom left by Nobel laureate F.A. Hayek includes his observation that markets work not because participants are rational, but because the very act of participation itself tends to make us more rational. Since markets reward rational behavior, capitalist societies become more rational over time as successful members prosper and multiply.

When segments of society resist this evolutionary progress, it behooves us to wonder why. It took almost a century and a half for sports executives to adapt the economic logic of choice into a tool to improve the on-field performance of sports teams. In the last decade, that modus operandi has swept the industry like an antibiotic-resistant germ. The business end of sports obeys Hayek’s evolutionary dictum.

Sports fans are a different story; they resist economic logic as baloney rejects the grinder. The Oscar-nominated film telling the story of “moneyball” was a succès d’estime but a poor performer at the box office. Even more telling is sports fans’ faith in the so-called “economic development benefits” conferred by a professional sports franchise on its host city. Study after study has found no incremental gain accruing from the lavish subsidies conferred upon sports teams in the form of stadium-construction and/or -repair subsidies, rent forgiveness and relocation bonuses. To the extent that such benefits exist at all, the absolute size of the economic benefit from a sports franchise is comparable to that of a mid-sized department store, according to various estimates by economists. But when subsidy proponents cite ostensible “multiplier effects” from sports teams that would make even a Keynesian economist blush with shame, sports fans fall for it like eggs from a tall chicken.

To rationalize such arguments in the name of economics – the science of rational choice – requires industrial-strength chutzpah. Where does this come from? The answer is best provided by an illustration – the notorious Steve Bartman incident.

The Shame of a City and a Profession: the Steve Bartman Incident

In the 2003 baseball season, the Florida Marlins and Chicago Cubs competed in the National League championship series for the right to play in the World Series. For the Cubs and their fans, this was a climactic moment. The team had not played in the World Series since its 1945 loss to the Detroit Tigers. Their only World Series victory had come in 1908, over the Tigers. This 95-year championship drought was (and remains) the longest of any American professional sports team.

The Cubs whetted their fans’ appetites further by taking a 3-1 lead in games won before losing game five. In the sixth game, they led 3-0 going into the eighth inning. Now just six outs away from their first World Series appearance in 58 years, they could feel the taste of clubhouse champagne bubbling in their mouths. When the first Marlins’ hitter was retired, there were only five outs to go. The next hitter doubled, placing a runner at second base. That brought up Luis Castillo, who worked the ball-and-strike count to its maximum of three balls and two strikes. Then he lifted a fly ball down the left-field line to the edge of the grandstand.

Cubs’ left-fielder Moises Alou was an outstanding hitter but a rather poor outfielder. Nevertheless, he raced to the grandstand, arriving just as the ball reached the stands. As Fox announcer Thom Brennaman called the play (and endless replays confirmed), Alou had to jump high in the air and reach backhanded over the grandstand rail to reach the ball, because the stands were elevated well above ground level at that point in Wrigley Field. (Presumably, this progressive elevation toward the outfield is a deliberate construction device intended to improve sight lines for fans.)

Outfielder Alou wasn’t the only person reaching for the ball. Reconciling the various accounts, replays and photographs of the incident suggests that at least six fans reached for the ball and at least three fans were very close to it. We know the names of two of those fans. One of them was a man named Pat Conley, who later commented quite volubly about his attempt to catch the ball. However, the ball passed behind him, just beyond his reach.

As fate would have it, the ball came down within a foot or two of the fan occupying Aisle 4, Row 8, Seat 113. The occupant was a 26-year-old man named Steve Bartman. He was a lifelong Chicago Cubs’ fan who attended the game with a good friend and the friend’s girlfriend. Bartman had played baseball himself as a boy and was now coach of a youth team. The words “came down” should not be interpreted to mean that the ball descended directly to the floor of the grandstand. Although it was difficult to distinguish what happened in live action, slow-motion replays clearly show that Bartman extended his arms directly out from his body and touched the ball, interrupting its course toward Alou’s glove below. The ball bounced away into the grandstand, where it was picked up by another fan. It was simply a foul ball, and Castillo’s at-bat continued.

Alou leaped and pivoted away from the grandstand, gesturing back toward the fans who had reached for the ball and grimacing in dismay and disapproval. At first, the fans in the stadium seemed unaware of the exact source of his annoyance. But in the Fox Television broadcast booth, announcer Brennaman told his viewers “that was a Cubs fan that tried to make that catch.” Broadcast partner Steve Lyons volunteered his reaction: “Why? I’m surprised that someone hasn’t thrown that fan onto the field.” Cell phones throughout the stadium began to ring as fans received calls from friends and acquaintances telling them that a Cubs fan had prevented Alou from making the catch – and identifying the fan by describing his clothing and the headphones he wore. Bartman, it seems, was such a hard-core fan that he brought his radio to the game to listen to the play-by-play even while watching on-site.

Very soon, resounding chants of “Asshole! Asshole!” filled the air and Bartman became the focus of attention both within the stadium and on camera. Stadium security guards arrived at his seat and surrounded him as a protective measure. Meanwhile, the game continued as follows: Castillo drew a walk, with ball four being a wild pitch that allowed the runner on second to advance to third; the next hitter singled, driving in the Marlins’ first run; the next hitter slapped an easy ground ball to star-fielding shortstop Alex Gonzalez. According to most accounts, it seemed to be a made-to-order double-play ball that would have ended the eighth inning with the Cubs ahead by two runs. But Gonzalez fumbled the ball, leaving the bases loaded. The next hitter doubled, driving in two runs and tying the game, 3-3. With men on second and third, the next Marlin was walked intentionally. Jeff Conine hit a sacrifice fly that scored the Marlins’ go-ahead run while the inning’s second out was recorded. (Had Alou caught Castillo’s foul ball, this would have been the third out of the inning – assuming that everything subsequently would have happened the same as it did, in fact, happen.) The throw to the infield was not cut off, allowing the runners to advance. Another intentional walk was followed by a double that drove in three more Marlin runs, followed by still one more run-scoring single. Luis Castillo returned to the plate and ended the inning with a pop-up, the Marlins now ahead 8-3. This became the game’s final score.

But Steve Bartman never got to see the inning through to completion. As the fans’ chants grew louder and their threats more menacing, the guards escorted him away from his seat. As Bartman entered the tunnel leading outside the stadium, he was pelted with beer, cups and other debris. One fan tried to assault him; another fan tried to restrain the assailant and ended up in a fight for his pains. Bartman was eventually accompanied to the home of one of the security guards, as it was felt he might not be safe in his own home. His name and address had been obtained and callers to sports-talk shows that night proposed to organize expeditions to his home in order to kill him.

In the seventh and deciding game of the National League championship series, the Cubs trailed 3-0 but came back to take a 5-3 lead. They could not hold the lead and lost, 9-6. In two subsequent playoff series, they also failed to reach the World Series.

The furor over Bartman did not abate with the end of the game or even the end of the series. It was plain that substantial segments of the public blamed Bartman for the Cubs’ loss. This was so despite official statements by the team, the National League and the Commissioner of Baseball deploring the actions of fans and the persecution of Bartman and denying his effect on the game’s outcome. Several Cubs players insisted that they, not Bartman, bore sole responsibility for the loss. Bartman himself later issued a two-paragraph statement in which he apologized for his actions (“I’m truly sorry”) and stressed his undying loyalty to the Cubs. Replying to those, notably sportswriters, who criticized his failure to yield the right of way to Alou, he explained that “I had my eyes glued on the approaching ball the entire time and was so caught up in the moment that I did not even see Moises Alou, much less that he may have had a play.”

The magnitude of public obsession with the Bartman incident can be gauged by the fate of the ball that Bartman and Alou failed to catch. It was retrieved by a Chicago lawyer who sold it at auction for $113,000. But the ball was not retired to a glass case. The buyer arranged for its public detonation (!), with the steam generated by the explosion being piped off and infused into a special batch of spaghetti sauce (!!). This ritual was promoted as a kind of exorcism of the evil spirits that were supposedly thwarting the Cubs – and, by extension, the hopes and dreams of their fans.

Bartman did not lack for defenders. Sportswriters professed to be appalled by the fans’ behavior. They found the announcers’ actions questionable. Over time, they admired Bartman’s unconditional refusal of all monetary compensation for appearances, promotions or first-person accounts. But their support for Bartman has been strangely reserved and conditional. They have stressed his apology, as they would for a criminal who has paid a debt owed to society and now deserves to be let alone. They have hedged their conclusions by admitting that Bartman was at least partially at fault for his own persecution – for reasons that vary from not “try[ing] his best to get out of the way, even if he wasn’t of a mind to see Alou approaching,” to sitting silent and looking guilty before being led away from his seat, to not being vigorous in his own defense. At least one writer sees no benefit to the team in urging Bartman to appear at Wrigley Field to receive a public apology, believing fans would “rip him to pieces.”

In 2011, Oscar-winning documentarian Alex Gibney’s film on the Bartman incident, Catching Hell, was shown on the ESPN network after making the rounds of the festival circuit. The documentary gathers the relevant archival materials on the case and utilizes the contemporary quick-cut, soundbite editing technique of presentation. Its most shocking feature is its presentation of fans and commentators critical of Bartman, such as the one who blandly maintains that Bartman’s headphone-wearing, deadpan presence was an insufferable provocation to fans and viewers. The fan did not provide a rationale for this stance or say how Bartman should have behaved.

Although Catching Hell was clearly intended as a tacit condemnation of the persecution of Steve Bartman, it hews to the party line in refusing to absolve Bartman of blame for his own crucifixion. Its closing lines express regret that Bartman couldn’t “have sacrificed his dream of catching a foul ball and pulled his hands away to allow Alou to catch that ball.”

According to friends, Bartman continues to avoid going out in public – particularly to Wrigley Field – for fear of being recognized. He refuses interviews; when tracked down by a reporter, he deferred responses pending consultation with his “legal team.” He is employed by a financial-consulting firm in the Chicago area.

As recently as 2013 – ten years after the Bartman Incident – the New York Times characterized Steve Bartman as “the most infamous fan, perhaps, in the history of American sports.”

It is time to lay the myth of Steve Bartman to rest once and for all.

The Truth About the Steve Bartman Incident

The most efficient way to reveal the truth about the Steve Bartman Incident is to isolate its individual components and uproot the axioms planted in each one.

The Cubs would have won game six if Moises Alou had caught Luis Castillo’s foul fly ball. This is perhaps the most deeply planted axiom of the entire episode. It explains the otherwise inexplicable persistence of the widespread antagonism toward Bartman. (However, it is not the source of that antagonism, as we shall see.) Apparently, there are thousands of people in the Chicago metropolitan area who are convinced that destiny had decreed a Cub victory that was derailed by Steve Bartman.

This is pure conjecture. There is no logical basis for believing or doubting it. It belongs to the same historical category as “what would have happened if John Wilkes Booth had not shot Abraham Lincoln in April, 1865.” The uncertainty derives from the fact that the counterfactual (Alou’s catch, which we are now assuming but which didn’t actually happen) changes the past and makes subsequent events indeterminate. If we were to assume that all subsequent plays unfolded exactly as they did in reality, then the second out of the inning – Jeff Conine’s sacrifice fly – would in fact have become the third out of the inning with the Cubs leading, 3-2. (Castillo, who scored the Marlins’ second run, would have been an out rather than a baserunner.) But that is a conjectural assumption on our part. For example, the runner on first who cautiously advanced only to third base with one out on the Marlins’ game-tying double might well have tried to score had there been two outs. Why? Because with one out a runner on third can still score on an out such as a sacrifice fly, which argues against a risky try for the plate; with two outs, any out ends the inning, so taking a greater baserunning risk to score the tying run is justified.

To grasp the importance of conjecture, contrast the actual situation in that playoff game with a different hypothetical case. Suppose that there had been two outs in the top of the ninth inning with the Cubs leading. Now it would be certain that the Cubs would have won if Alou had caught the ball. (There are no “subsequent events” subject to conjecture.) Of course, that would still leave the most interesting questions unanswered, but at least this question would be settled. In the actual case, though, the question “what would have happened if Alou had caught the ball” is unanswered and unanswerable. It does point to another relevant question – and another planted axiom.

Alou would have caught the ball if Bartman had not interfered with the play. This is another axiom that is buried deeply in the soil of Wrigley Field – and should be exhumed for study. Virtually everybody takes it for granted. Ironically, Alou himself was quoted as follows by the Associated Press in 2008: “You know the funny thing? I wouldn’t have caught it anyway.” This would seem a rather astonishing admission, since it renders moot virtually everything said since the conclusion of the game. To muddy the waters even further, though, Catching Hell interviews Alou in a contrary vein: “I’m convinced 100% that I had that ball in my glove” if not for Bartman.

Alou can hardly be viewed as a neutral witness, but his opinion is really irrelevant since this belongs in the same category of conjecture as the preceding premise. We simply don’t know whether Alou, left to his own devices, would have caught Castillo’s foul ball or not. What we do know is that it would have been a very difficult play and that Alou was a poor outfielder. More precisely, Alou would have had to make a leaping, backhanded catch while coming to an abrupt stop after a long run and while leaning over a barrier. As baseball players of all ages know, catching a baseball while undergoing bodily strain can cause a player to reflexively close his gloved hand prematurely, leading to a missed catch. There is also the difficulty of following a spinning foul ball under windy conditions to consider.

Once again, Alou’s catch is a conjectural possibility rather than a presumed certainty. If he would have missed the ball anyway, then Bartman’s role in the episode is irrelevant. This, in turn, leads to another contention that many people have taken for granted.

Castillo should have been called out because Bartman’s actions constituted “interference” with Alou according to the rules of baseball. Two lawyers who are also Cubs’ fans tried to make this case in a “brief” they wrote as a public service. Their claim was based on the fact that still photographs taken from some angles seemed to show that Bartman’s momentum had carried his arms across the (imaginary) vertical plane separating the playing field from the grandstand.

Unfortunately, Bartman has a conclusive defense to this charge. The same combination of replays and photographs shows with absolute certainty that Alou had to leap up and reach backhanded across the grandstand wall in order to position himself for the catch. Thus, the locus of the play itself was on the grandstand side of the imaginary plane, not on the playing-field side. Bartman may have penetrated the imaginary plane if his forward momentum placed his arms slightly across the plane, but this was irrelevant to the question of interference because that penetration occurred after the ball was disturbed on its downward path toward Alou’s glove. Presumably, this explains why the umpire did not invoke the interference rule and call Castillo out.

Like many laws or rules passed by people anxious to dispose of a problem, the interference rule works well enough in ordinary situations. This was one of the proverbial “hard cases that make bad law.” The logical solution to the problem of interference is for baseball-team owners to prevent it, either by building solid grandstand walls thick enough to prevent fans from reaching players or by placing barriers on the outside of the grandstand walls to achieve the same effect. It is inherently illogical to make fans, the paying customers of a sports contest, bear responsibility for ensuring the integrity of that contest.

And this, in turn, leads us to the biggest planted axiom of all.

Practically everybody who has spoken publicly on the episode, including Bartman himself, assumes that Bartman did something wrong. But if the outcome of the game was conjectural even if Alou had caught Castillo’s ball, if we cannot even be sure that Alou would have caught it, and if Bartman cannot be adjudged legally guilty of interference according to baseball rules, exactly what did Steve Bartman do wrong? The fine print on the back of tickets sold by major-league baseball teams contains a disclaimer of liability for injuries incurred by causes such as foul balls and accidentally released or splintered bats. If teams are not responsible, it follows that fans must have the absolute right to protect themselves while within the confines of the grandstand.

Everybody seems to take it for granted that Steve Bartman acted deliberately to catch Luis Castillo’s foul ball as a souvenir. He may or may not have, but that is a non sequitur; every fan has the inherent right to self-protection. A foul ball such as the one hit by Castillo travels with sufficient force to fracture bones, even a skull. In a recent case now in litigation, a fan lost an eye to a line drive. In very rare cases, such injuries can be fatal; Little League players have suffered cardiac arrest when struck in the chest by a thrown baseball.

Sportswriters have sometimes suggested that Bartman should have evaded Castillo’s fly ball to allow Alou to make the catch. This suggestion is utterly fatuous. Consider the implied logistics. The elapsed time of Castillo’s foul fly ball has been estimated at four seconds. In principle, there is only one person who should have to take such evasive action – the person whose seat is the ultimate destination of the ball. But at the time the ball is hit, nobody can know which person that is because nobody has the power to precisely determine where the ball will land. Even when the ball begins its downward trajectory, its true destination is still in doubt; that is why so many people were recorded reaching for it. At most, Steve Bartman had perhaps a second of realization (give or take a few fractions) during which he knew that the ball was heading for him. In an episode of the old Twilight Zone television series, he might have snapped his fingers, frozen time for everybody and everything at the stadium except him, looked around to assess the situation, glanced downward at the field to see Moises Alou nearing the grandstand, and moved a safe distance away – then snapped his fingers to start time rolling again. Alas, he did not operate in Rod Serling’s world of imagination.

In reality, his own account dovetails exactly with rational expectation. Bartman, the former player and current youth coach, did exactly what he taught his players to do – he followed the flight of the ball. He had no reason to think that Alou had a play on the ball because of the elevated seating and because he could not predict exactly where the ball would land. By the time he knew where the ball would end up, he had about a second to react. Instinct took over, as it would with anybody else in his position. He may have been partly trying to catch the ball – his father said that he “taught Steve to catch foul balls whenever he could” – but the chances are that he was just trying to get his hands on the ball to fend it off, the normal avoidance reflex toward approaching harm. The accounts describing the incident often say that Bartman “deflected” or “knocked” the ball away from Alou – these are not synonyms for “caught.”

Bartman didn’t simply let the ball go because he was instinctively concentrated on making contact with it. Until the last fraction of a second before the ball reached him, he couldn’t tell if it would hit him or miss him by a foot or so, so he followed his instinctive “plan” to catch or deflect it.

Should Bartman have ducked down at the last second to avoid the ball? Sometimes people do this when the last-second realization of the baseball hurtling at them and their lack of a baseball glove makes them bail out. But in this case Bartman had no place to go and would probably have run into Alou instead of deflecting the ball. And somebody else reaching for the ball would have moved in to occupy his space and deflect or catch it instead.

Bartman paid for his seat. He had no legal or moral obligation to leave it unless that would improve his chances of self-protection. If everybody who thought the ball might come close to them when the ball was first hit were to get up and run to a new location, this would mean that a whole vector of ticketholders would be up and running simultaneously. This would produce chaos and the likelihood of injuries and wouldn’t improve the chances of avoiding interference even if other fans could be prevented from simultaneously moving toward the area. No, the avoidance idea is just plain silly. Besides, Bartman didn’t have time or reason to think of it.

Because fans like to fantasize, the catching of foul balls has acquired a kind of cachet. It seldom happens because struck baseballs are hard enough to catch with a baseball glove and because the partial contact between bat and foul ball imparts spin that increases the difficulty of a catch. But this cachet gave people the pretext they needed to blame Bartman. He was selfish, out souvenir-hunting, not thinking of his team – of their team – the way a loyal fan should.

This issue is a non sequitur, a red herring. Bartman had a legal and moral right to catch the ball in self-protection, keep it as a souvenir, sell it or give it away to charity. He was not obligated to avoid the problem or anticipate it.

Bill Buckner played first base for the Boston Red Sox in the 1986 World Series. His fielding error cost them the sixth game, prolonging a supposed “curse” that had frustrated the team since pitcher Babe Ruth had helped secure its 1918 World Series triumph. Buckner sums up the inherent logic of the Steve Bartman incident. “To get crucified the way he did is mind-boggling. He didn’t do anything. Take a major-league ballplayer and sit him in that seat and he would have done the same thing. I would have done the same thing.”

Why Did Fans React As They Did to the Steve Bartman Incident?

Baseball has been America’s national pastime for over a century. Passions and loyalties have run hot for the game. But no episode as ugly and wide-ranging as the Bartman Incident comes to mind. In past years, Americans might have disapproved of Bartman’s actions. But they would have restrained themselves from behaving as savagely as Cubs’ fans did that night. They would have reserved judgment before behaving hurtfully toward Bartman.

We began by noting an observation by F.A. Hayek. Hayek also made a critical distinction between freedom and power. The Cubs’ fans were not exercising their freedom of speech or action that night because they did not have the right to behave as they did. A right is only valid if its exercise does not deprive somebody else of his rights. The fans deprived Bartman of his personal safety, liberty and pursuit of happiness by assaulting him and persecuting him on false grounds. At least one announcer incited fans to commit a criminal act.

The fans felt entitled to assert power over Bartman because their emotions were so strongly aroused by their loyalty to the Cubs and by the vicarious investment they held in the success of “their” sports team. A theory of just entitlement, such as a property right, implies a logical grounding to that entitlement. But the fans had no logic on their side, as we have just demonstrated. The strength of their emotions was their only argument. Thus, they operated on the basis of an implicit theory that we can call a theory of emotional entitlement. Their emotions were so strongly engaged that they immediately sought validation from the flimsiest of pretexts. This same mentality of emotional entitlement closes fans’ minds to the case against professional sports as an engine of economic development.

The modern entitlement society is an artifact of the welfare state. Its hold is even stronger in the rest of the world than it is here in the United States. It is therefore no coincidence that soccer riots involving dozens of deaths are an occasional feature of life in those foreign countries. The Steve Bartman Incident is a low-level version of this same descent into nihilistic decadence, where the Rule of Law gives way to the sway of emotional entitlement.

DRI-284 for week of 7-13-14: Why Big Government is Rotten to the Core: The Tale of the Taxpayers’ Defender Inside Federal Housing

An Access Advertising EconBrief:

Why Big Government is Rotten to the Core: The Tale of the Taxpayers’ Defender Inside Federal Housing

Today the trajectory of our economic lives is pointed steeply downward. This space has been disproportionately devoted to explaining both how and why. That explanation has often cited the theory of government failure, in which the purported objects of government action are subordinated to the desires of politicians, bureaucrats, government employees and consultants. Economists have been excoriated for sins of commission and omission. The resulting loss of personal freedom and marketplace efficiency has been decried. The progressive march toward a totalitarian state has been chronicled.

A recent column in The Wall Street Journal ties these themes together neatly. Mary Kissel’s “Weekend Interview” column of Saturday/Sunday, July 12/13, 2014, is entitled “The Man Who Took On Fannie Mae.” It describes the working life of “career bureaucrat” and economist, Edward DeMarco, whose most recent post was acting director of the Federal Housing Finance Agency. Ms. Kissel portrays him as the man “who fought to protect American taxpayers” and “championed fiscal responsibility” in government. As we shall see, however, he is really integral to the malfunctioning of big government in general and economics in particular.

The Career of Edward DeMarco

Edward DeMarco is that contradictory combination, a career government bureaucrat who is also a trained economist. He received a Ph.D. in economics from the University of Maryland in the late 1980s and went to work for the General Accounting Office (GAO). As “low man on the totem pole,” he was handed the job of evaluating Fannie Mae and Freddie Mac. They had been around since the 1930s but were known to few and understood by fewer in Congress. The decade-long, painful series of savings-and-loan bailouts had scalded the sensibilities of representatives and regulators alike. DeMarco’s job was to determine if Fannie and Freddie were another bailout landmine lying in wait for detonation.

His answer was: yes. The implicit taxpayer backstop provided to these two institutions – not written into their charter but tacitly acknowledged by everybody in financial markets – allowed them to borrow at lower interest rates than competitors. This meant that they attracted riskier borrowers, which set taxpayers up to take a fall. And the Congressional “oversight” supposedly placing the two under a stern, watchful eye was actually doing the opposite – acting in cahoots with them to expand their empire in exchange for a cut of the proceeds.

DeMarco sounded the alarm in his report. And sure enough, Congress acted. In 1992, it established the Office of Federal Housing Enterprise Oversight (OFHEO). A triumph for government regulation! A vindication of the role of economics in government! A victory for truth, justice and the American way!

Yeah, right.

DeMarco pinned the tail on this donkey right smack on the hindquarters. “‘The Fannie and Freddie Growth Act,'” he called it, “because it told the market ‘Hey, we really care about these guys, and we’re concerned about them because they’re really important.'” In other words, the fix was in: Congress would never allow Fannie and Freddie to fail, and their implicit taxpayer guarantee was good as gold.

This was the first test of DeMarco’s mettle. In that sense, it was the key test, because the result jibed with the old vaudeville punchline, “we’ve already agreed on what you are; now we’re just haggling about the price.” As soon as the ineffectual nature of OFHEO crystallized, DeMarco should have screamed bloody murder. But the “low man on the totem pole” in a government bureaucracy can’t do that and still hope for a career; DeMarco would have had to say sayonara to the security of government employment in order to retain his integrity. Instead, he kept his mouth shut.

Kissel discreetly overlooks this because it doesn’t jibe with her picture of DeMarco as heroic whistleblower. She is acting as advocate rather than journalist, as editor rather than reporter.

Any doubts about the fairness of this judgment are dispelled by Kissel’s narrative. “After stints at the Treasury and Social Security Administration, DeMarco found himself working at the very oversight office that his reports to Congress had helped create.” Oh, he “found himself” working there, did he? At the very office that had doublecrossed and betrayed him? “It was 2006, when Fannie and Freddie’s growth had been turbocharged by the government’s mortgages-for-all mania. Mr. DeMarco recalls that during his ‘first couple of weeks’ at the agency, he attended a conference for supervision staffers organized to tell them ‘about great, new mortgage instruments’ – subprime loans, he says, with a sardonic chuckle.” But what exactly did he do about all this while it was in progress, other than chuckling sardonically?

The first twenty years of Edward DeMarco’s career illustrate the workings of big government to a T. They depict the “invisible handshake” between orthodox, mainstream economics and the welfare state that has replaced the “invisible hand” of the marketplace that economics used to celebrate.

The Mainstream Economist as Patsy for Politicians and Bureaucrats

Mainstream economists are trained to see themselves as “social engineers.” Like engineers, they are trained in advanced mathematics. Like engineers, they are trained as generalists in a wide-ranging discipline, but specialize in sub-disciplines – civil, mechanical and chemical engineering for the engineer, macroeconomics and microeconomics for the economist. Like engineers, economists hone their specialties even more finely into sub-categories like monetary economics, international economics, industrial organization, labor economics, financial economics and energy economics. Economists are trained to think of themselves as high theoreticians applying optimizing solutions to correct the failures of human society in general and markets in particular. They take it for granted that they will command both respect and power.

This training sets economists up to be exploited by practical men of power and influence. Lawyers utilize the services of economists as expert witnesses because economists can give quantitative answers to questions that are otherwise little more than blind guesses. Of course, the precision of those quantitative answers is itself suspect. If economists really could provide answers to real-world questions that are as self-assured and precise as they pretend on the witness stand, why would they be wasting their lives earning upper-middle-class money as expert witnesses? Why are they not fabulously rich from – let us say – plying those talents as traders in commodity or financial markets? Still, economists can fall back on the justified defense that nobody else can provide better estimates of (say) wages foregone by an injured worker or business profits lost due to tortious interference. The point is, though, that economists owe their status as experts to default; their claim on expertise is what the late Thorstein Veblen would call “ceremonial.”
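The kind of quantitative answer at issue – wages foregone by an injured worker, say – can be illustrated with a stylized present-value calculation of the sort an expert witness produces. Every input below is an assumed round number for illustration; none comes from any actual case:

```python
# A stylized lost-wages estimate of the kind an economist gives as an
# expert witness. Every input is an assumed round number for illustration.

annual_wage = 52_000.0    # injured worker's pre-injury annual earnings (assumed)
years_lost = 12           # working years the injury is assumed to take away
wage_growth = 0.02        # assumed annual growth of the foregone wages
discount_rate = 0.05      # assumed rate for discounting future dollars

# Present value of the growing stream of foregone wages:
# PV = sum over t = 0..years_lost-1 of wage*(1+g)^t / (1+r)^(t+1)
present_value = sum(
    annual_wage * (1 + wage_growth) ** t / (1 + discount_rate) ** (t + 1)
    for t in range(years_lost)
)
print(f"Estimated present value of foregone wages: ${present_value:,.0f}")
```

The apparent precision of the printed figure illustrates the point in the text: the number is only as good as the assumed growth and discount rates behind it.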

When economists enter the realm of politics, they are the veriest babes in the savage wood. Politicians want to take other people’s money and use it for their own – almost always nefarious – purposes. They must present a pretense of legitimacy, competence and virtue. They will use anybody and everybody who is useful to them. Economists hold doctorates; they teach at universities and occupy positions of respect. Therefore, they are ideal fronts for the devices of politicians.

Politicians use economists. They hire them or consult with them or conspicuously call them to testify in Congress. This satisfies the politicians’ debt to legitimacy, competence, virtue and conscience (if they have one). Have they not conferred with the best available authority? And having done so, politicians go on to do whatever they intended to do all along. They either ignore the economist or twist his advice to suit their intentions.

That is exactly what happened to Edward DeMarco. His superiors gave him an assignment. Like a dutiful economist, he fulfilled it and sat back waiting for them to act on his advice. They acted, all right – by creating an oversight body that perverted DeMarco’s every word.

Deep down, mainstream economists envision themselves as philosopher kings – either as (eventual) authority figures or as Talleyrands, the men behind the throne who act as ventriloquists to power. When brought face-to-face with the bitter disillusion of political reality, they react either by retreating into academia in a funk or by retreating into their bureaucratic shell. There is a third alternative: occupational prostitution. Some economists abandon their economic principles and become willing mouthpieces for politicians. They are paid in money and/or prestige.

It is clear that DeMarco took the path of bureaucratic compliance. Despite the attempt of WSJ’s Kissel to glamorize his role, his career has obviously been that of follower rather than either leader or whistleblower. His current comments show that he harbors great resentment over being forced to betray his principles in order to make the kind of secure living he craved.

For our purposes, we should see him as the wrong man for the job of taxpayers’ defender. That job required an extraordinary man, not a bureaucrat.

DeMarco, DeMartyr

The second career of Edward DeMarco – that of “DeMarco, DeMartyr” to the cause of fiscal responsibility and taxpayer interests – began after the housing collapse and financial panic of 2008. After bailing out Fannie and Freddie, Congress had to decide whether to close them down or reorganize them. It fell back on an old reliable default option – create a new agency, the Federal Housing Finance Agency, whose job it was to ride herd on the “toxic twins.” When FHFA’s director, James Lockhart, left in August, 2009, Treasury Secretary Timothy Geithner appointed DeMarco as acting director.

DeMarco began by raising executive salaries to stem the exodus of senior management. This got him bad press and hostility from both sides of the Congressional aisle. DeMarco set out to reintroduce the private sector to the mortgage market by reducing loan limits and shrinking the mortgage portfolios of Fannie and Freddie. But we shouldn’t get the wrong idea here – DeMarco wasn’t actually trying to recreate a free market in housing. “I wasn’t trying to price Fannie and Freddie out of the market so much as get the price closer so that the taxpayer capital is getting an appropriate rate of return and that, more important, we start selling off this risk,” DeMarco insists. He was just a meliorist, trying to fine-tune a more efficient economic outcome by the lights of the academic mainstream. Why, he even had the President and the Financial Stability Oversight Council (FSOC) on his side.

Ms. Kissel depicts DeMarco as a staunch reformer who was on his way to turning the housing market around. “Mr. DeMarco’s efforts started to show results. Housing prices recovered, both [Fannie and Freddie] started to make money – lots of it – and private insurers eyed getting back into the market. Then in August 2012 the Obama administration decided to ‘sweep’ Fannie and Freddie’s profits, now and in the future, into the government’s coffers. The move left the companies unable to build up capital reserves, and shareholders sued.”

That was just the beginning. DeMarco was pressured by Congress and the administration to write down principal on the loans of borrowers whose homes were “underwater,” i.e., worth less at current market value than the balance remaining on the mortgage. He also opposed creation of a proposed housing trust fund (or “slush fund,” as Kissel aptly characterizes it). Apart from the obvious moral hazard involved in systematically redrawing contracts to favor one side of the transaction, DeMarco noted the hazard to taxpayers in giving mortgagors – 80% of whom were still making timely payments – an incentive to default or plead hardship in order to benefit financially. How could mortgage markets attract investment and survive in the face of this attitude?

This intelligent evaluation won him the undying hatred of “members of Congress [and] President Obama’s liberal allies [including] White House adviser Van Jones, [who] told the Huffington Post ‘you could have the biggest stimulus program in America by getting rid of one person’” – namely, DeMarco. “Realtors, home builders, the Mortgage Bankers Association, insured depositories and credit unions” fronted for the White House by pressuring DeMarco to “degrade lending standards” to the least creditworthy borrowers – a practice that epitomized the housing bubble at its frothiest. “Protestors organized by progressive groups showed up more than once outside [DeMarco’s] house in Silver Spring, MD, demanding his ouster. A demonstration in April last year brought out 500 picketers with ‘Dump DeMarco’ signs and 15-foot puppets fashioned to look like him. ‘My first reaction was of course one of safety,’ [said DeMarco]. ‘When I first saw them, I was standing a few feet from the window of a ground-level family room and they’re less than 10 feet away through this pane of glass, and it was a crowd of people so big I couldn’t tell how many people were out there. And then all the chanting and yelling started.’ His wife had gone to pick up their youngest daughter… ‘so I had to get on the phone and tell her “Don’t come.”’ Then he called the police, who eventually cleared the scene. ‘It was unsettling,’ he says. ‘I think it was meant to be unsettling… They wanted me to start forgiving debt on mortgages.’” This is what Ms. Kissel calls “the multibillion-dollar do-over,” to which “Mr. DeMarco’s resistance made him unpopular in an administration that was anxious to refire the housing market.” Ms. Kissel’s metaphor of government as arsonist is the most gripping writing in the article.

Epilogue at FHFA

Edward DeMarco was the “acting” director at FHFA. The Senate capitulated to pressure for his removal by approving Mel Watt, Majority Leader Harry Reid’s pick, as permanent director. Watt immediately began implementing the agenda DeMarco had resisted. DeMarco had successfully scheduled a series of increases in loan-guarantee fees as one of several measures to entice private insurers back into the market. Watt delayed them. He refused to lower loan limits for Fannie and Freddie from their $625,000 level. He directed the two companies to seek out “underserved, creditworthy borrowers;” i.e., people who can’t afford houses. He assured the various constituencies clamoring for DeMarco’s ouster that “government will remain firmly in control of the mortgage market.”

DeMarco’s valedictory on all this is eye-opening in more ways than one. Reviewing what Ms. Kissel primly calls “government efforts to promote affordable housing,” DeMarco dryly observes: “Let’s say it was a failed effort… To me, if you go through a 50-year period, and you do all these things to promote housing, and the homeownership rate is [the same as it was 50 years ago], I think the market’s telling you we’re at an equilibrium.” If we assume that only government can foster homeownership among people “below median income,” that “suggests a troubling view of markets themselves.”

And now the whole process is starting all over again. “If we have another [sic] recession, if there’s some foreign crisis that… affects our economy, it doesn’t matter whatever the instigating event is, the point is that if we have another round of house-price declines like we’ve had, we’re going to erode most of that remaining capital support.” Characteristically, he refuses to forthrightly state the full implications of his words, which are: we are tottering on the brink of full-scale financial collapse.

Edward DeMarco: Blackboard Economist

The late Nobel laureate Ronald Coase derided what he called “blackboard economists” – the sort who pretended to solve practical problems by proposing a theoretical solution that assumed they possessed information they didn’t and couldn’t have. (Usually the solution came in the form of either mathematical equations or graphical geometry depicted on a classroom blackboard, hence the term.)

Was Coase accusing his fellow economists of laziness? Yes and no. Coase believed that transactions costs were a key determinant of economic outcomes. Instead of investigating transactions costs of action in particular cases, economists were all too prone to assume those costs were either zero (allowing markets to work perfectly) or prohibitive (guaranteeing market failure). Coase insisted that this was pure laziness on the part of the profession.

But information isn’t just lying around in the open waiting for economists to discover it. One of Coase’s instructors at the London School of Economics, future Nobel laureate F.A. Hayek, pointed out that orthodox economic theory assumed that everybody already knew all the information needed to make optimal decisions. In reality, the relevant information was dispersed in fragmentary form inside the minds of billions of people rather than concentrated in easily accessible form. The market process was not a mere formality of optimization using given data. Instead, it was markets that created the incentives and opportunities for the generation and collation of this fragmented, dispersed information into usable form.

Blackboard economists were not merely lazy. They were unforgivably presumptuous. They assumed that they had the power to effectuate what could only be done by markets, if at all.

That lends a tragic note to Ms. Kissel’s assurance that “Mr. DeMarco isn’t against government support for housing – if done properly.” After spending his career as “the loneliest man in government” while fighting to stem the tide of the housing bubble, Edward DeMarco now confesses that he doesn’t oppose government interference in the housing market after all! The problem is that the government didn’t ask him how to go about it – they didn’t apply just the right optimizing formula, didn’t copy his equations off the blackboard.

And when President Obama and Treasury Secretary Geithner and the housing lobbyists and the realtors and builders and mortgage bankers and lenders and progressive ideologues hear this explanation, what is their reaction? Do they smack their foreheads and cry out in dismay? Do they plead, “Save us from ourselves, Professor DeMarco?”

Not hardly. The mounted barbarians run roughshod over Mr. DeMarco waving his blackboard formula and leave him rolling in the dust. They then park their horses outside Congress and testify: “See? He’s in favor of government intervention, just as we are – we’re just haggling about the price.” Politicians with a self-interested agenda correctly view any attempt at compromise as a sign of weakness, an invitation to “let’s make a deal.” It invites contempt rather than respect.

That is exactly what happened to Edward DeMarco. He is left licking the wounds of 25 years of government service and whining about the fact that politicians are self-interested, that government regulators do not really regulate but in fact serve the interests of the regulated, and that the political left wing will stop at nothing, including physical intimidation and force.

No spit, Spurlock. We are supposed to stand up and cheer for a man who is only now learning this after spending 25 years in the belly of the savage beast? Whose valiant efforts at reform consisted of recommending optimizing nips and tucks in the outrageous government programs he supervised? Whose courageous farewell speech upon being run out of office, a la Douglas MacArthur, is “I’m not against government support for housing if done properly?”

Valedictory for Edward DeMarco

The sad story of Edward DeMarco is surely one more valuable piece of evidence confirming the theory of big government as outlined in this space. Those who insist that government is really full of honest, hard-working, well-meaning people full of idealistic good intentions doing a dirty job the best they can will now have an even harder time saying it with a straight face. It is one thing when big government opposes exponents of laissez faire; we expect bank robbers to shoot at the police. But gunning down an innocent bystander for shaking his fist in reproof shows that the robber is a hardened killer rather than a starving family man. When the welfare state steamrolls over an Edward DeMarco’s efforts to reform it at the margins, it should be clear to one and all that big government is rotten to the core.

Even so, the fact that Edward DeMarco was and is an honest man who thought he was doing good does not make him a hero. Edward DeMarco is not a martyr. He is a cautionary example. The only way to counteract big government is to oppose it openly and completely by embracing free markets. Anything less fails while giving aid and comfort to the enemy. Failure coupled with career suicide can only be redeemed by service to the clearest and noblest of principles.

DRI-254 for week of 7-6-14: The Selling of Environmentalism

An Access Advertising EconBrief:

The Selling of Environmentalism

The word “imperialism” was popularized by Lenin, who used it to define a process of exploitation employed by developed nations in the West on undeveloped colonies in the Eastern hemisphere. In recent years, though, it has been used in a completely different context – to describe the use of economic logic to explain practically everything in the world. Before the advent of the late Nobel laureate Gary Becker, economists were parochial in their studies, confining themselves almost exclusively to the study of mankind in its commercial and mercantile life. Becker trained the lens of economic theory on the household, the family and the institution of marriage. Ignoring the time-honored convention of treating “capital” as plant and equipment, he (along with colleagues like Theodore Schultz) treated human beings as the ultimate capital goods.

Becker ripped the lid off Pandora’s Box and the study of society will never be the same again. We now recognize that any and every form of human behavior might profitably be seen in this same light. To be sure, that does not mean employing the sterile and limiting tools of the professional economist; namely, advanced mathematics and formal statistics. It simply means subjecting human behavior to the logic of purposeful action.

Environmentalism Under the Microscope

The beginnings of the environmental movement are commonly traced to the publication of Silent Spring in 1962 by marine biologist Rachel Carson. That book sought to dramatize the unfavorable effects of pesticides, industrial chemicals and pollution upon wildlife and nature. Carson had scientific credentials – she had previously published a well-regarded book on oceanography – but this book, completed during her terminal illness, was a polemic rather than a sober scientific tract. Its scientific basis has been almost completely undermined in the half-century since publication. (A recent book devoted entirely to re-examination of Silent Spring by scientific critics is decisive.) Yet this book galvanized the movement that has since come to be called environmentalism.

An “ism” ideology is, or ought to be, associated with a set of logical propositions. Marxism, for example, employs the framework of classical economics as developed by David Ricardo but deviates in its creation of the concept of “surplus value” as generated by labor and appropriated by capitalists. Capitalism is a term intended invidiously by Marx but that has since morphed into the descriptor of the system of free markets, private property rights and limited government. What is the analogous logical system implied by the term “environmentalism?”

There isn't one. Generically, the word connotes an emotive affinity for nature and a corresponding distaste for industrial civilization. Beyond that, its only concrete meaning is political. The problem of definition arises because, in and of itself, an affinity for nature is insufficient as a guide to human action. For example, consider the activity of recycling. Virtually everybody would consider it de rigueur as part of an environmentalist program. The most frequently stated purpose of recycling is to relieve pressure on landfills, which are ostensibly filling up with garbage and threatening to overwhelm humanity. The single greatest component of landfills is newsprint. But the leachates created by the recycling of newsprint are extremely harmful to "the environment"; their acidic content poisons soils and water, and they are very costly to divert. We have arrived at a contradiction – is recycling "good for the environment" or "bad for the environment?" There is no answer to the question as posed; the effects of recycling must be couched in terms of tradeoffs. In other words, the issue depends on economics, not emotion alone.

No matter where we turn, “the environment” confronts us with such tradeoffs. Acceptance of the philosophy of environmentalism depends on getting us to ignore these tradeoffs by focusing on one side and ignoring the other. Environmental advocates of recycling, for instance, customarily ignore the leachates and robotically beat the drums for mandatory recycling programs. When their lopsided character is exposed, environmentalists retreat to the carefully prepared position that the purity of their motives excuses any lapses in analysis and overrides any shortcomings in their programs.

Today’s economist does not take this attitude on faith. He notes that the political stance of environmentalists is logically consistent even if their analysis is not. The politics of environmentalism can be understood as a consistent attempt to increase the real income of environmentalists in two obvious ways: first, by redistributing income in favor of their particular preferences for consumption (enjoyment) of nature; and second, by enjoying real income in the form of power exerted over people whose freedom they constrain and real income they reduce through legislation and administrative and judicial fiat.

Thus, environmentalism is best understood as a political movement existing to serve economic ends. In order to do that, its adherents must “sell” environmentalism just as a producer sells a product. Consumers “buy” environmentalism in one of two ways: by voting for candidates who support the legislation, agencies, rules and rulings that further the environmental agenda; and by donating money to environmental organizations that provide real income to environmentalists by employing them and lobbying for the environmental agenda.

Like the most successful consumer products, environmentalism has many varieties. Currently, the most popular and politically successful one is called “climate change,” which is a model change from the previous product, “global warming.” In order to appreciate the economic theory of environmentalism, it is instructive to trace the selling of this doctrine in recent years.

Why Was the Product Called “Climate Change” Developed?

The doctrine today known as "climate change" grew out of a long period of climate research on a phenomenon called "global warming." This began in the 1970s. Just as businessmen spend years or even decades developing products, environmentalists use scientific (or quasi-scientific) research as their product-development laboratory, in which promising products are developed for future presentation on the market. Although global warming was "in development" throughout the 1970s and 80s, it did not receive its full "rollout" as a full-fledged environmental product until the early 1990s. We can regard the publication of Al Gore's Earth in the Balance in 1992 as the completed rollout of global warming. In that book, Gore presented the full-bore apocalyptic prophecy that human-caused global warming threatened the destruction of the Earth within two centuries.

Why was global warming “in development” for so long? And after spending that long in development limbo, why did environmentalists bring it “to market” in the early 1990s? The answers to these questions further cement the economic theory of environmentalism.

Global warming joined a long line of environmental products that were brought to market beginning in the early 1960s. These included conservation, water pollution, air pollution, species preservation, forest preservation, overpopulation, garbage disposal, inadequate food production, cancer incidence and energy insufficiency. The most obvious, logical business rationale for a product to be brought to market is that its time has come, for one or more reasons. But global warming was brought to market by a process of elimination. All of the other environmental products were either not "selling" or had reached dangerously low levels of "sales." Environmentalists desperately needed a flagship product and global warming was the only candidate in sight. Despite its manifest deficiencies, it was brought to market "before its time;" i.e., before its scientific merits had been demonstrated. In this regard, it differed from most (although not all) of the previous environmental products.

Those are the summary answers to the two key questions posed above. Global warming (later climate change) spent decades in development because its scientific merits were difficult if not impossible to demonstrate. It was brought to market in spite of that limitation because environmentalists had no other products with equivalent potential to provide real income and had to take the risks of introducing it prematurely in order to maintain the “business” of environmentalism as a going concern. Each of these contentions is fleshed out below.

The Product Maturation Suffered by Environmentalism

Businesses often find that their products lead limited lives. These limitations may be technological, competitive or psychological. New and better processes may doom a product to obsolescence. Competitors may imitate a product into senescence or even extinction. Fads may simply lose favor with consumers after a period of infatuation.

As of the early 1990s, the products offered by environmentalism were in various stages of maturity, decline or death.

Air pollution was a legitimate scientific concern when environmentalism adopted it in the early 1960s. It remains so today because the difficulty of enforcing private property rights in air makes a free-market solution to the problem of air pollution elusive. But by the early 1990s, even the inefficient solutions enforced by the federal government had reduced the problem of air pollution to full manageability.

Between 1975 and 1991, the six air pollutants tracked by the Environmental Protection Agency (EPA) fell by between 24% and 94%. Even if we go back to 1940 as a standard of comparison – forcing us to use emissions as a proxy for the pollution we really want to measure, since the latter wasn't calculated prior to 1975 – we find that three of the six were lower in 1991 and that total emissions were also lower in 1991. (Other developed countries showed similar progress during this time span.)

Water pollution was already decreasing when Rachel Carson wrote and continued to fall throughout the 1960s, 70s and 80s. The key was the introduction of wastewater treatment facilities to over three-quarters of the country. Previously polluted bodies of water like the Cuyahoga River, the Androscoggin River, the northern Hudson River and several of the Great Lakes became pure enough to host sport-fishing and swimming. The Mississippi River became one of the industrialized world's purest major rivers. Unsafe drinking water became a non-problem. Again, this was accomplished despite the inefficient efforts of local governments, the worst of these being the persistent refusal to price water at the margin to discourage overuse.

Forests were thriving in the early 1990s, despite the rhetoric of environmental organizations that inveighed against "clear-cutting" by timber companies. In reality, the number of wooded acres in the U.S. had grown by 20% over the previous two decades. The state of Vermont had been only 35% forested in the late nineteenth century. By the early 1990s, forest coverage had risen to 76%.

This improvement was owed to private-sector timber companies, which practiced the principle of "sustainable yield" timber management. By the early 1990s, annual timber growth had exceeded harvest every year since 1952. By 1992, the actual timber harvest was a minuscule 384,000 acres, six-tenths of 1% of the land available for harvest. Average annual U.S. wood growth was three times greater than in 1920.

Environmentalists whined about the timberlands opened up for harvest by the federal government in the national parks and wildlife refuges, but less logging was occurring in the National Forests than at any time since the early 1950s. Clear-cut timber was being replaced with new, healthier stands that attracted more wildlife diversity than the harvested “old-growth” forest.

As always, this progress occurred in spite of government, not because of it. The mileage of roads hacked out of national-park land by the Forest Service is three times greater than that of the federal Interstate highway system. The subsidized price at which the government sells logging rights on park land is a form of corporate welfare for timber companies. But the private sector bailed out the public in a manner that would have made John Muir proud.

Garbage disposal and solid-waste management may have been the most unheralded environmental victory won by the private sector. At the same time that Al Gore complained that "the volume of garbage is now so high that we are running out of places to put it," modern technology had solved the problem of solid-waste disposal. The contemporary landfill had a plastic bottom and clay liner that together prevented leakage. It was topped with dirt to prevent odors and run-off. The entire estimated U.S. supply of solid waste for the next 500 years could be safely stored in one landfill 100 yards deep and 20 miles on a side. The only remaining difficulty was siting, owing to the NIMBY ("not in my back yard") philosophy fomented by environmentalism. The only benefit to be derived from recycling could be had from private markets by recycling only those materials whose benefits (sales revenue) exceeded their reclamation costs (including a "normal" profit).
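The arithmetic behind that single-landfill claim is easy to check. Below is a back-of-envelope sketch in Python; the annual waste tonnage and compacted density are assumptions supplied here for illustration (the text gives only the landfill's dimensions), so the output is an order-of-magnitude plausibility check, not a precise figure.

```python
# Back-of-envelope check of the single-landfill claim.
# Assumptions (not from the essay): roughly 200 million tons of U.S.
# municipal solid waste per year and a compacted density of about
# 0.8 tons per cubic yard; both are illustrative values only.

MILE_IN_YARDS = 1760

side_yd = 20 * MILE_IN_YARDS            # 20 miles on a side
depth_yd = 100                          # 100 yards deep
capacity_yd3 = side_yd ** 2 * depth_yd  # total landfill volume

tons_per_year = 200e6                   # assumed annual solid-waste output
density_tons_per_yd3 = 0.8              # assumed compacted density
volume_per_year_yd3 = tons_per_year / density_tons_per_yd3

years = capacity_yd3 / volume_per_year_yd3
print(f"Capacity: {capacity_yd3:.3e} cubic yards")
print(f"Years of storage: {years:.0f}")  # roughly 500 under these assumptions
```

Under these assumed inputs the landfill holds close to five centuries of waste, which is consistent with the claim in the text; different tonnage or density assumptions move the answer, but not by an order of magnitude.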

Overpopulation was once the sales leader of environmentalism. In 1968’s The Population Bomb, leading environmentalist Paul Ehrlich wrote that “the battle to feed all of humanity is over. In the 1970s, the world will undergo famines – hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now. At this late date, nothing can prevent a substantial increase in the world death rate….” Ehrlich also predicted food riots and plummeting life expectancy in the U.S. and biological death for a couple of the Great Lakes.

Ehrlich was a great success at selling environmentalism. His book, and its 1990 sequel The Population Explosion, sold millions of copies and recruited untold converts to the cause. Unfortunately, his product had a limited shelf life because his prophecies were spectacularly inaccurate. The only famines were politically, not biologically, triggered; deaths were in the hundreds of thousands, not millions. Death rates declined instead of rising. The Great Lakes did not die; they were completely rehabilitated. Even worse, Ehrlich made a highly publicized bet with economist Julian Simon that the prices of five metals handpicked by Ehrlich would rise in real terms over a ten-year period. (The loser would pay the algebraic sum of the price changes incurred.) The prices went down in nominal terms despite the rising general level of prices over the interval – another spectacular prophetic failure by Ehrlich.
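The mechanics of the wager are worth making concrete. The sketch below implements the settlement rule as commonly described – a fixed notional stake in each of the five metals, settled on the algebraic change in the basket's inflation-adjusted value – but the prices are hypothetical placeholders, not the actual 1980 and 1990 quotes.

```python
# Sketch of the Simon-Ehrlich settlement rule. The five metals match
# standard accounts of the bet; the prices below are hypothetical
# placeholders used only to show the mechanics.

STAKE_PER_METAL = 200.0  # notional dollars of each metal at the start

# (start real price, end real price) per unit -- illustrative numbers only
prices = {
    "chromium": (3.90, 3.70),
    "copper":   (1.02, 0.90),
    "nickel":   (3.06, 2.60),
    "tin":      (8.72, 3.88),
    "tungsten": (13.57, 9.60),
}

settlement = 0.0
for metal, (p0, p1) in prices.items():
    quantity = STAKE_PER_METAL / p0      # units bought with $200 at the start
    settlement += quantity * (p1 - p0)   # algebraic price change x quantity

# A negative settlement means real prices fell on net, so the
# doomsayer (Ehrlich) pays the optimist (Simon) that amount.
print(f"Net change in basket value: ${settlement:+.2f}")
```

The historical outcome went the same direction as this illustration: every metal fell in real terms, so Ehrlich paid Simon the full algebraic decline in the basket's value.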

It’s not surprising that Ehrlich, rather than the population, bombed. In the 1960s, the world’s annual population growth was about 2.0%. By the 1990s, it would fall to 1.6%. (Today, of course, our problem is falling birth rates – the diametric opposite of that predicted by environmentalism.)

Thus the phantom population growth predicted by environmentalism failed to supply one half of the food crisis that environmentalists had foreseen with such uncanny inaccuracy. Ehrlich and others had posited a Malthusian scenario in which rising population overtook diminishing agricultural productivity. They were just as wrong about productivity as about population. The Green Revolution ushered in by Norman Borlaug et al led one of the world's leading agricultural economists to declare that "the scourge of famine due to natural causes has been almost conquered…."

The other leg of environmentalism’s collapsing doomsday scenario of inadequate food was based on cancer incidence. Not only would the food supply prove insufficient, according to environmentalists, it was also unsafe. Industrial chemicals and pesticides were entering the food supply through food residues and additives. They were causing cancer. How did we know this? Tests on animals – specifically, on mice and rats – proved it.

There was only one problem with this assertion. Scientifically speaking, it was complete hooey. The cancer risk of one glass of wine was about 10,000 to 12,000 times greater than that posed by the additives and pesticide residues (cumulatively) in most food products. Most of our cancer risk comes from natural sources, such as sunlight and natural pesticides produced by plants, some of which occur in common foods. Meanwhile, cancer rates had remained steady or fallen over the previous fifty years except for lung cancers attributable to smoking and melanomas attributable to ultraviolet light. Cancer rates among young adults had decreased rapidly. Age-adjusted death rates had mostly fallen.

Energy insufficiency had been brought to market by environmentalists in the 1970s, during the so-called Energy Crisis. It sold well when OPEC was allowed to peg oil prices at stratospheric levels. But when the Reagan administration decontrolled prices, domestic production rose and prices fell. As the 1990s rolled around, environmentalists were reduced to citing "proven reserves" of oil (45 years) and natural gas (63 years) as "proof" that we would soon run out of fossil fuels and energy prices would then skyrocket. Of course, this was more hooey; proven reserves are the energy equivalent of inventory. Businesses hold inventory as the prospective benefits and costs dictate. Current inventories say nothing about the long-run prospect of shortages.

In 1978, for example, proven reserves of oil stood at 648 billion barrels, or 29.2 years' worth at then-current levels of usage. Over the next 14 years, we used about 84 billion barrels, but – lo and behold – proven reserves rose to nearly a trillion barrels by 1992. That happened because it was now profitable to explore for and produce oil in a newly free market of fluctuating oil prices, making it cost-efficient to hold larger inventories of proven reserves. (And in today's energy market, it is innovative technologies that are driving discoveries and production of new shale oil and gas.) Really, it is an idle pastime to estimate the number of years of "known" resources remaining, because nobody knows how much of a resource remains. It is not worth anybody's time to make an accurate estimate; it is easier and more sensible to simply let the free market take its course. If the price rises, we will produce more and discover more reserves to hold as "inventory." If we can't find any more, the resultant high prices will give us the incentive to invent new technologies and find substitutes for the disappearing resource. That is exactly what has just happened with the process called "fracking." We have long known that conventional methods of oil drilling left 30-70% of the oil in the ground because it was too expensive to extract. When oil prices rose high enough, fracking allowed us to get at those sequestered supplies. We knew this in the early 1990s, even if we didn't know exactly what technological process we would ultimately end up using.
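The inventory logic can be illustrated with the 1978 figures above. In the sketch below, the annual usage rate is the one implied by the essay's own 29.2-year static ratio, while the discovery rate is a pure assumption chosen only to show the mechanism: so long as profitable exploration replenishes reserves at least as fast as consumption draws them down, the "years remaining" clock never runs out.

```python
# Why "years of proven reserves" is an inventory figure, not a doomsday
# clock. Initial reserves and the 29.2-year static ratio come from the
# essay; the discovery rate is an assumption for illustration only.

reserves = 648.0                 # billion barrels, 1978 (from the essay)
annual_use = reserves / 29.2     # usage implied by the 29.2-year ratio
annual_discoveries = 30.0        # assumed; exploration responds to price

for year in range(1978, 1993):
    static_years = reserves / annual_use
    print(f"{year}: reserves {reserves:6.0f} bn bbl -> "
          f"'{static_years:.1f} years left'")
    reserves += annual_discoveries - annual_use

# The static ratio drifts up rather than down, because discoveries
# replace the inventory faster than consumption depletes it.
```

The point is not the particular numbers but the shape of the result: a static reserves-to-usage ratio measures how much inventory it pays to hold, not how much resource exists.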

Conservation was the first product packaged and sold by environmentalism, long predating Rachel Carson. It dated back to the origin of the national-park system in Theodore Roosevelt's day and the times of John Muir and John James Audubon. By the early 1990s, conservation was a mature product. The federal government was already the biggest landowner in the U.S. We already had more national parks than the federal government could hope to manage effectively. Environmentalists could no longer make any additional sales using conservation as the product.

Just about the only remaining salable product the environmentalists had was species preservation. Environmentalism flogged it for all it was worth, but that wasn’t much. After the Endangered Species Act was passed and periodic additions made to its list, what was left to do? Not nearly enough to support the upper-middle-class lifestyles of a few million environmentalists. (It takes an upper-middle-class income to enjoy the amenities of nature in all their glory.)

Environmentalism Presents: Global Warming

In the late 1980s, the theory that industrial activity was heating up the atmosphere by increasing the amount of carbon dioxide in the air began to gain popular support. In 1989, Time Magazine modified its well-known "Man of the Year" award to "Planet of the Year," which it gave to "Endangered Earth." It described the potential effects of this warming process as "scary." The Intergovernmental Panel on Climate Change (IPCC), an organization of environmentalists dedicated to selling their product, estimated that warming could average as much as 0.5 degrees Fahrenheit per decade over the next century, resulting in a cumulative increase of some 5.4 degrees in average temperature. This would cause polar ice caps to melt and sea levels to rise, swamping coastal settlements around the world – and that was just the beginning of the adverse consequences of global warming.

No sooner had the rollout begun than skepticism rolled in along with the new product. Scientists could prove that atmospheric carbon dioxide was increasing and that industrial activity was behind that, but they could not prove that carbon dioxide was causing the amount of warming actually measured. As a matter of fact, there wasn't actually an unambiguous case to be made for warming. What warming could be found had mostly occurred at night, in the winter and in the Southern Hemisphere (not the locus of most industrial activity). And to top it all off, it was not clear whether we should ascribe warming to very long-run cyclical forces that have alternated the Earth between Ice Ages and tropical warming periods for many thousands of years. By 1994, Time Magazine (which needed a continuous supply of exciting new headlines just as much as environmentalists needed a new supply of products with which to scare the public) had given up on global warming and resuscitated a previous global-climate scare from the 1970s, the "Coming Ice Age."

It is easy to see the potential benefits of the global-warming product for environmentalists. Heretofore, almost all environmentalist products had an objective basis. That is, they spotlighted real problems. Real problems have real solutions, and the hullabaloo caused by purchase of those products led to varying degrees of improvement in the problems. Note this distinction: the products themselves did not cause or lead to the improvement; it was the uproar created by the products that did the job. Most of the improvement was midwifed by economic measures, and environmentalism rejects economics the way vampires reject the cross. This put environmentalists in an anomalous position. Their very (indirect) success had worked against them. Their real income was dependent on selling environmentalism in any of various ways. Environmentalists cannot continue to sell more books about (say) air pollution when existing laws, regulations and devices have brought air quality to an acceptable level. They cannot continue to pass more coercive laws and regulations when the legally designated quality has been reached. Indeed, they will be lucky to maintain sales of previously written books to any significant degree. They cannot continue to (credibly) solicit donations on the strength of a problem that has been solved, or at least effectively managed.

Unfortunately for environmentalists, the environmental product is not like an automobile that gives service until worn out and needs replacement, ad infinitum. It is more like a vaccine that, once taken, needn’t be retaken. Once the public has been radicalized and sensitized to the need for environmentalism, it becomes redundant to keep repeating the process.

Global warming was a new kind of product with special features. Its message could not be ignored or softened. Either we reform or we die. There was no monkeying around with tradeoffs.

Unlike the other environmental products, global warming was not a real problem with real solutions. But that was good. Real problems get solved – which, from the environmentalist standpoint, was bad. Global warming couldn’t even be proved, let alone solved. That meant that we were forced to act and there could be no end to the actions, since they would never solve the problem. After all, you can’t solve a problem that doesn’t exist in the first place! Global warming, then, was the environmentalist gift that would keep on giving, endlessly beckoning the faithful, recruiting ever more converts to the cause, ringing the cash register with donations and decorating the mast of environmentalism for at least a century. Its very scientific dubiety was an advantage, since that would keep it in the headlines and keep its critics fighting against it – allowing environmentalists the perfect excuse to keep pleading for donations to fend off the evil global-warming deniers. Of course, lack of scientific credibility is also a two-edged sword, since environmentalists cannot force the public to buy their products and can never be quite sure when the credibility gap will turn the tide against them.

When you’re selling the environmentalist product, the last thing you want is certainty, which eliminates controversy. Controversy sells. And selling is all that matters. Environmentalists certainly don’t want to solve the problem of global warming. If the problem is solved, they have nothing left to sell! And if they don’t sell, they don’t eat, or at least they don’t enjoy any real income from environmentalism. Environmentalism is also aimed at gaining psychological benefits for its adherents by giving their lives meaning and empowering them by coercing people with whom they disagree. If there is no controversy and no problem, there is nothing to give their lives meaning anymore and no basis for coercing others.

The Economic Theory of Environmentalism

Both environmentalists and their staunchest foes automatically treat the environmental movement as a romantic crusade, akin to a religion or a moral reform movement. This is wrong. Reformers or altruists act without thought of personal gain. In contrast, environmentalists are self-interested individuals in the standard tradition of economic theory. Some of their transactions lie within the normal commercial realm of economics and others do not, but all are governed by economic logic.

That being so, should we view environmentalism in the same benign light as we do any other industry operating in a free market? No, because environmentalists reject the free market in favor of coercion. If they were content to persuade others of the merits of their views, their actions would be unexceptional. Instead, they demand subservience to their viewpoint via legal codification and all forms of legislative, executive, administrative and judicial tyranny. Their adherents number a few would-be dictators and countless petty dictators. Their alliance with science is purely opportunistic; one minute they accuse their opponents of being anti-scientific deniers and the next they are praying to the idol of Gaia and Mother Earth.

The only thing anti-environmentalists have found to admire about the environmental movement is its moral fervor. That concession is a mistake.

DRI-292 for week of 6-29-14: One in Six American Children is Hungry – No, Wait – One in Five!

An Access Advertising EconBrief:

One in Six American Children is Hungry – No, Wait – One in Five!

You’ve heard the ad. A celebrity – or at least somebody who sounds vaguely familiar, like singer Kelly Clarkson – begins by intoning somberly: “Seventeen million kids in America don’t know where their next meal is coming from or even if it’s coming at all.” One in six children in America is hungry, we are told. And that’s disgraceful, because there’s actually plenty of food, more than enough to feed all those hungry kids. The problem is just getting the food to the people who need it. Just make a donation to your local food pantry and together we can lick hunger in America. This ad is sponsored by the Ad Council and Feeding America.

What was your reaction? Did it fly under your radar? Did it seem vaguely dissonant – one of those things that strikes you wrong but leaves you not quite sure why? Or was your reaction the obvious one of any intelligent person paying close attention – “Huh? What kind of nonsense is this?”

Hunger is not something arcane and mysterious. We’ve all experienced it. And the world is quite familiar with the pathology of hunger. Throughout human history, hunger has been mankind’s number one enemy. In nature, organisms are obsessed with absorbing enough nutrients to maintain their body weight. It is only in the last few centuries that tremendous improvements in agricultural productivity have liberated us from the prison of scratching out a subsistence living from the soil. At that point, we began to view starvation as atypical, even unthinkable. The politically engineered famines that killed millions in the Soviet Union and China were viewed with horror; the famines in Africa attracted sympathy and financial support from the West. Even malnutrition came to be viewed as an aberration, something to be cured by universal public education and paternalistic government. In the late 20th century, the Green Revolution multiplied worldwide agricultural productivity manifold. As the 21st century dawned, the end of mass global poverty and starvation beckoned within a few decades and the immemorial problem of hunger seemed at last to be withering away.

And now we’re told that in America – for over a century the richest nation on Earth – our children – traditionally the first priority for assistance of every kind – are hungry at the ratio of one in six?

WHAT IS GOING ON HERE?

The Source of the Numbers – and the Truth About Child Hunger

Perhaps the most amazing thing about these ads, which constitute a full-fledged campaign, is the general lack of curiosity about their origins and veracity. Seemingly, they should have triggered a firestorm of criticism and investigation. Instead, they have been received with yawns.

The ads debuted last Fall. They were kicked off with an article in the New York Times on September 5, 2013, by Jane L. Levere, entitled “New Ad Campaign Targets Childhood Hunger.” The article is one long promotion for the ads and for Feeding America, but most of all for the “cause” of childhood hunger. That is, it takes for granted that a severe problem of childhood hunger exists and demands close attention.

The article cites the federal government as the source for the claim that “…close to 50 million Americans are living in ‘food insecure’ households,” or ones in which “some family members lacked consistent access throughout the year to adequate food.” It claims that “…almost 16 million children, or more than one in 5, face hunger in the United States.”

The ad campaign is characterized as "the latest in a long collaboration between Ad Council and Feeding America," the latter of which supplies some 200 food banks across the country that in turn supply more than 61,000 food pantries, soup kitchens and shelters. Feeding America began in the late 1990s as another organization, America's Second Harvest, which enlisted the support of A-list celebrities such as Matt Damon and Ben Affleck. This was when the partnership with the Ad Council started.

Priscilla Natkins, a vice-president of the Ad Council, noted that in the early days "only" one out of 10 Americans was hungry. Now the ratio is 1 out of 7, and more than 1 out of 5 children. "We chose to focus on children," she explained, "because it is a more poignant approach to illustrating the problem."

Further research reveals that, mirabile dictu, this is not the first time that these ads have received skeptical attention. In 2008, Chris Edwards of Cato Institute wrote about two articles purporting to depict “hunger in America.” That year, the Sunday supplement Parade Magazine featured an article entitled “Going Hungry in America.” It stated that “more than 35.5 million Americans, more than 12% of the population and 17% of our children, don’t have enough food, according to the Department of Agriculture.” Also in 2008, the Washington Post claimed that “about 35 million Americans regularly go hungry each year, according to federal statistics.”

Edwards' eyebrows went up appropriately high upon reading these accounts. After all, this was even before the recession had been officially declared. Unlike the rest of the world, though, Edwards actually resolved to verify these claims. Here is what he found upon checking with the Department of Agriculture.

In 2008, the USDA declared that approximately 24 million Americans were living in households that faced conditions of “low food security.” The agency defined this condition as eating “less varied diets, participat[ing] in Federal food-assistance programs [and getting] emergency food from community food pantries.” Edwards contended that this meant those people were not going hungry – by definition. And indeed, it is semantically perverse to define a condition of hunger by describing the multiple sources of food and change in composition of food enjoyed by the “hungry.”

The other 11 million people (of the 35 million figure named in the two articles) fell into a USDA category called "very low food security." These were people whose "food intake was reduced at times during the year because they had insufficient money or other resources for food" [emphasis added]. Of these, the USDA estimated that some 430,000 were children. These would (then) comprise about 0.6% of American children, not the 17% mentioned by Parade Magazine, Edwards noted. Of course, having to reduce food on one or more occasions to some unnamed degree for financial reasons doesn't exactly constitute "living in hunger" in the sense of not knowing where one's next meal was coming from, as Edwards observed. The most that could, or should, be said was that the 11 million and the 430,000 might constitute possible candidates for victims of hunger.

On the basis of this cursory verification of the articles' own sources, Chris Edwards concluded that hunger in America ranked with crocodiles in the sewers as an urban myth.

We can update Edwards' work. The USDA figures come from survey questions distributed and tabulated by the Census Bureau. The most recent data available were released in December 2013 for calendar year 2012. About 14.5% of households were classified as food-insecure ("low food security" or worse) and about 5.7% of households were in the "very low food security" pigeonhole. Assuming the current average of roughly 2.58 persons per household, this translates to approximately 34 million people in the first category and just under 13.5 million people in the second category. If we assume the same fraction of children in these at-risk households as in 2008, that would imply about 635,000 children in the high-risk category, or less than 0.9 of 1% of the nation's children. That is a far cry from the 17% of the nation's children mentioned in the Parade Magazine article of 2008. It is a farther cry from the 17,000,000 children mentioned in current ads, which would be over 20% of America's children.

The USDA’s Work is From Hunger

It should occur to us to wonder why the Department of Agriculture – Agriculture, yet – should now reign as the nation's arbiter of hunger. As it happens, economists are well situated to answer that question. They know that the federal food-stamp program began in the 1940s primarily as a way of disposing of troublesome agricultural surpluses. The federal government spent the decade of the 1930s throwing everything but the kitchen sink at the problem of economic depression. Farmers were suffering because world trade had imploded; each nation was trying to protect its own businesses by taxing the imports of foreign producers. Since the U.S. was the world's leading exporter of foodstuffs, its farmers were staggering under this impact. They were swimming in surpluses and bled so dry by the resulting low prices that they burned, buried or slaughtered their own output without bringing it to market in an effort to raise food prices.

The Department of Agriculture devised various programs to raise agricultural prices, most of which involved government purchases of farm goods to support prices at artificially high levels. Of course, that left the government with lots of surplus food on its hands, which it stored in Midwestern caves in a futile effort to prevent spoilage. Food distribution to the poor was one way of ridding itself of these surpluses, and this was handled by the USDA which was already in possession of the food.

Just because the USDA runs the food-stamp program (now administered as a debit-card operation) doesn't make it an expert on hunger, though. Hunger is a medical and nutritional phenomenon, not an agricultural one. Starvation is governed by the intake of sufficient calories to sustain life; malnutrition is caused by the maldistribution of nutrients, vitamins and minerals. Does the Census Bureau survey doctors on the nutritional status of their patients to provide the USDA with its data on "food insecurity?"

Not hardly. The Census Bureau simply asks people questions about their food intake and solicits their own evaluation of their nutritional status. Short of requiring everybody to undergo a medical evaluation and submit the findings to the government, it could hardly be otherwise. But this poses king-sized problems of credibility for the USDA. Asking people whether they ever feel hungry or sometimes don’t get “enough” food is no substitute for a medical evaluation of their status.

People can and do feel hungry without coming even close to being hungry in the sense of risking starvation or even suffering a nutritional deficit. Even more to the point, their feelings of hunger may signal a nutritional problem that cannot be cured by money, food pantries, shelters or even higher wages and salaries. The gap between the “low food security” category identified by the USDA and starving peoples in Africa or Asia is probably a chasm the size of the Grand Canyon.

The same America that is supposedly suffering rampant hunger among both adults and children is also supposedly suffering epidemics of both obesity and diabetes. There is only one way to reconcile these contradictions: by recognizing that our "hunger" is not the traditional sort of starvation or malnutrition but rather the kind associated with diabetes (and hence obesity). Over-ingestion of simple carbohydrates and starches can cause upward spikes in blood sugar among susceptible populations, triggering the release of insulin that stores the carbohydrate as fat. Since the carbohydrate is stored as fat rather than burned for energy, the body remains starved for energy and hungry even though it is getting fat. Thus do hunger and obesity coexist.

The answer is not more government programs, food stamps, food pantries and shelters. Nor, for that matter, is it more donations to non-profit agencies like Feeding America. It is not more food at all, in the aggregate. Instead, the answer is a better diet – something that millions of Americans have found out for themselves in the last decade or so. In the meantime, there is no comparison between the “hunger” the USDA is supposedly measuring and the mental picture we form in our minds when we think of hunger.

This is not the only blatant contradiction raised by the "hunger in America" claims. University of Chicago economist Casey Mulligan, in his prize-winning 2012 book The Redistribution Recession, has uncovered over a dozen government program and rule changes that reduced the incentive to work and earn. He assigns these changes primary blame for the huge drop in employment and lag in growth that the U.S. has suffered since 2007. High on his list are the changes in the food-stamp program that substituted a debit card for stamps, eliminated means tests and allowed recipients to remain on the program indefinitely. A wealthy nation in which 46 million out of 315 million citizens are on the food dole cannot simultaneously be suffering a problem of hunger. Other problems, certainly – but not that one.

What About the Real Hunger?

That is not to say that real hunger is completely nonexistent in America. Great Britain’s BBC caught word of our epidemic of hunger and did its own story on it, following the New York Times, Washington Post, Parade Magazine party line all the way. The BBC even located a few appropriately dirty, ragged children for website photos. But the question to ask when confronted with actual specimens of hunger is not “why has capitalism failed?” or “why isn’t government spending enough money on food-security programs?” The appropriate question is “why do we keep fooling ourselves into thinking that more government spending is the answer when the only result is that the problem keeps getting bigger?” After all, the definition of insanity is doing the same thing over and over again and expecting a different result.

The New York Times article in late 2013 quoted two academic sources that were termed “critical” of the ad campaign. But they said nothing about its blatant lies and complete inaccuracy. No, their complaint was that it promoted “charity” as the solution rather than their own pet remedies, a higher minimum wage and more government programs. This calls to mind the old-time wisecrack uttered by observers of the Great Society welfare programs in the 1960s and 70s: “This year, the big money is in poverty.” The real purpose of the ad campaign is to promote the concept of hunger in America in order to justify big-spending government programs and so-called private programs that piggyback on the government programs. And the real beneficiaries of the programs are not the poor and hungry but the government employees, consultants and academics whose jobs depend on the existence of “problems” that government purports to “solve” but that actually get bigger in order to justify ever-more spending for those constituencies.

That was the conclusion reached, ever so indirectly and delicately, by Chris Edwards of Cato Institute in his 2008 piece pooh-poohing the “hunger in America” movement. It applies with equal force to the current campaign launched by non-profits like the Ad Council and Feeding America, because the food banks, food pantries and shelters are supported both directly and indirectly by government programs and the public perception of problems that necessitate massive government intervention. It is the all-too-obvious answer to the cry for enlightenment made earlier in this essay.

In this context, it is clear that the answer to any remaining pockets of hunger is indeed charity. Only private, voluntary charity escapes the moral hazard posed by the bureaucrat/consultant class that has no emotional stake in the welfare of the poor and unfortunate but a big stake in milking taxpayers. This is the moral answer because it does not force people to contribute against their will but does allow them to exercise free will in choosing to help their fellow man. A moral system that works must be better than an immoral one that fails.

Where is the Protest?

The upshot of our inquiry is that the radio ads promoting "hunger in America" and suggesting that America's children don't know where their next meal is coming from are an intellectual fraud. There is no evidence that such children exist in large numbers, and to whatever extent they do exist, their plight indicts the current system. Rather than rewarding the failure of our current immoral system, we should be abandoning it in favor of one that works.

Our failure to protest these ads and publicize the truth is grim testimony to how far America has fallen from its origins and ideals. In the first colonial settlements at Jamestown and Plymouth, colonists learned the bitter lesson that entitlement was not a viable basis for civilization and work was necessary for survival. We are in the process of re-learning that lesson very slowly and painfully.